1
Idris M, Gay CC, Woods IG, Sullivan M, Gaughan JB, Phillips CJC. Automated Quantification of the Behaviour of Beef Cattle Exposed to Heat Load Conditions. Animals (Basel) 2023; 13:ani13061125. [PMID: 36978665] [PMCID: PMC10044595] [DOI: 10.3390/ani13061125]
Abstract
Cattle change their behaviour in response to hot temperatures, including by engaging in stepping that indicates agitation. The automated recording of these responses would be helpful in the timely diagnosis of animals experiencing heat loading. Behavioural responses of beef cattle to hot environmental conditions were studied to investigate whether it was possible to assess behavioural responses by video-digitised image analysis. Open-source automated behavioural quantification software was used to record pixel changes in 13 beef cattle video-recorded in a climate-controlled chamber during exposure to a simulated typical heat event in Queensland, Australia. Increased digitised movement was observed during the heat event, which was related to stepping and grooming/scratching activities in standing animals. The 13 cattle were exposed in two cohorts: the first cohort (n = 6) was fed a standard finisher diet based on a high percentage of cereal grains, and the second cohort (n = 7) received a substituted diet in which 8% of the grains were replaced by lucerne hay. The second cohort displayed a smaller increase in digitised movement on exposure to heat than the first, suggesting less discomfort under hot conditions. The results suggest that cattle exposed to heat display increased movement that can be detected automatically by video digitisation software, and that replacing some cereal grain with forage in the diet of feedlot cattle may reduce the measured activity responses to heat.
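The pixel-change measure this abstract describes can be sketched as simple frame differencing (a generic illustration, not the specific open-source software the study used): frames are flattened lists of grey values, and the movement index is the fraction of pixels whose change exceeds a noise threshold.

```python
def movement_index(prev_frame, curr_frame, threshold=25):
    """Fraction of pixels whose grey-level change exceeds `threshold`.

    Frames are flat lists of 0-255 grey values; a rough stand-in for the
    per-frame pixel-change measure a video-digitisation tool computes.
    The threshold value is an illustrative assumption.
    """
    changed = sum(1 for p, c in zip(prev_frame, curr_frame)
                  if abs(p - c) > threshold)
    return changed / len(curr_frame)
```

A rising per-frame index over a heat event would correspond to the increased digitised movement the study reports.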
Affiliation(s)
- Musadiq Idris, Faculty of Veterinary and Animal Sciences, The Islamia University of Bahawalpur, Punjab 63100, Pakistan
- Caitlin C Gay, School of Veterinary Science, Gatton Campus, The University of Queensland, Gatton, QLD 4343, Australia
- Ian G Woods, Department of Biology, Ithaca College, Ithaca, NY 14850, USA
- Megan Sullivan, School of Agriculture and Food Sciences, Gatton Campus, The University of Queensland, Gatton, QLD 4343, Australia
- John B Gaughan, School of Agriculture and Food Sciences, Gatton Campus, The University of Queensland, Gatton, QLD 4343, Australia
- Clive J C Phillips, Institute of Veterinary Medicine and Animal Sciences, Estonian University of Life Sciences, Kreutzwaldi 1, 51014 Tartu, Estonia; Curtin University Sustainability Policy (CUSP) Institute, Curtin University, Perth, WA 6845, Australia
2
Hu C, Wang Z, Liu B, Huang H, Zhang N, Xu Y. Validation of a system for automatic quantitative analysis of laboratory mice behavior based on locomotor pose. Comput Biol Med 2022; 150:105960. [PMID: 36122441] [DOI: 10.1016/j.compbiomed.2022.105960]
Abstract
Automatic recognition and accurate quantitative analysis of rodent behavior play an important role in brain neuroscience, pharmacology, and toxicology. Currently, most behavior recognition systems used in experiments focus on indirect measurements of animal movement trajectories, while neglecting changes in animal body pose that can indicate additional psychological factors. This paper therefore developed and validated an hourglass network-based behavioral quantification system (HNBQ), which uses a combination of body pose and movement parameters to quantify the activity of mice in an enclosed experimental chamber. In addition, the HNBQ was employed to record behavioral abnormalities of head scanning in the presence of food gradients in the open field test (OFT). The results showed that HNBQ scores in the novel object recognition (NOR) experiment were highly correlated with the scores of manual observers for exploration latency and cumulative exploration time. Moreover, in the OFT, the HNBQ was able to capture subtle differences in the head scanning behavior of mice in the gradient experimental groups. These satisfactory results support the combination of body pose and motor parameters as a new alternative approach for the quantification of animal behavior in the laboratory.
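One way pose estimates feed a movement parameter can be sketched as the mean displacement of matched keypoints between frames; a minimal illustration, assuming poses arrive as lists of (x, y) tuples (the input format is hypothetical, not the HNBQ's actual interface).

```python
import math

def mean_keypoint_displacement(pose_a, pose_b):
    """Mean Euclidean displacement between matched body keypoints of two
    consecutive frames; poses are equal-length lists of (x, y) tuples.
    Large values indicate locomotion, small values a pose change in place.
    """
    dists = [math.dist(a, b) for a, b in zip(pose_a, pose_b)]
    return sum(dists) / len(dists)
```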
Affiliation(s)
- Chunhai Hu, School of Electrical Engineering, Yanshan University, Qinhuangdao, 066044, China
- Zhongjian Wang, School of Electrical Engineering, Yanshan University, Qinhuangdao, 066044, China
- Bin Liu, School of Electrical Engineering, Yanshan University, Qinhuangdao, 066044, China
- Hong Huang, Centre for Pharmacological and Toxicological Research, Institute of Medicinal Plants, Beijing, 100193, China
- Ning Zhang, School of Electrical Engineering, Yanshan University, Qinhuangdao, 066044, China
- Yanguang Xu, School of Electrical Engineering, Yanshan University, Qinhuangdao, 066044, China
3
Hatton-Jones KM, Christie C, Griffith TA, Smith AG, Naghipour S, Robertson K, Russell JS, Peart JN, Headrick JP, Cox AJ, du Toit EF. A YOLO based software for automated detection and analysis of rodent behaviour in the open field arena. Comput Biol Med 2021; 134:104474. [PMID: 34058512] [DOI: 10.1016/j.compbiomed.2021.104474]
Abstract
Rodent models are important in mechanistic studies of the physiological and pathophysiological determinants of behaviour. The Open Field Test (OFT) is one of the most commonly utilised tests to assess rodent behaviour in a novel open environment. The key variables assessed in an OFT are general locomotor activity and exploratory behaviours, which can be assessed manually or by automated systems. Although several automated systems exist, they are often expensive, difficult to use, or limited in the type of video that can be analysed. Here we describe a machine-learning algorithm, dubbed Cosevare, that uses a trained YOLOv3 DNN to identify and track movement of mice in the open-field arena. We validated Cosevare's capacity to accurately track locomotive and exploratory behaviour in 10 videos, comparing outputs generated by Cosevare with analysis by 5 manual scorers. Behavioural differences between control mice and those with diet-induced obesity (DIO) were also documented. We found the YOLOv3-based tracker to be accurate at identifying and tracking the mice within the open-field arena, including in instances with variable backgrounds. Additionally, kinematic and spatial analysis demonstrated highly consistent scoring of locomotion, centre square duration (CSD) and centre square entries (CSE) between Cosevare and the manual scorers. The automated analysis was also able to distinguish behavioural differences between healthy control and DIO mice. The study found that a YOLOv3-based tracker is able to easily track mouse behaviour in the open field arena, and it supports machine learning as a potential future alternative for the assessment of animal behaviour across a wide range of species, environments and behavioural tests.
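Centre square duration and entries can be derived from a detector's per-frame centroid outputs roughly as follows; an illustrative post-processing sketch, not Cosevare's implementation, and the centre square spanning the middle half of each axis is an assumption.

```python
def centre_square_stats(track, arena=100.0, frame_dt=1 / 30):
    """Centre-square duration (seconds) and entry count from a centroid track.

    `track` is a list of (x, y) positions in an `arena` x `arena` field
    sampled every `frame_dt` seconds. The centre square here is the middle
    half of each axis (an illustrative choice).
    """
    lo, hi = arena * 0.25, arena * 0.75
    inside = [lo <= x <= hi and lo <= y <= hi for x, y in track]
    duration = sum(inside) * frame_dt
    # An entry is a frame inside the square whose predecessor was outside.
    entries = sum(1 for prev, cur in zip([False] + inside, inside)
                  if cur and not prev)
    return duration, entries
```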
Affiliation(s)
- Tia A Griffith, School of Medical Science, Griffith University, Southport, 4217, Australia
- Amanda G Smith, School of Medical Science, Griffith University, Southport, 4217, Australia
- Saba Naghipour, School of Medical Science, Griffith University, Southport, 4217, Australia
- Kai Robertson, School of Medical Science, Griffith University, Southport, 4217, Australia
- Jake S Russell, School of Biomedical Science, University of Queensland, Brisbane, 4072, Australia
- Jason N Peart, School of Medical Science, Griffith University, Southport, 4217, Australia
- John P Headrick, School of Medical Science, Griffith University, Southport, 4217, Australia
- Amanda J Cox, School of Medical Science, Griffith University, Southport, 4217, Australia
- Eugene F du Toit, School of Medical Science, Griffith University, Southport, 4217, Australia
4
Shokaku T, Moriyama T, Murakami H, Shinohara S, Manome N, Morioka K. Development of an automatic turntable-type multiple T-maze device and observation of pill bug behavior. Rev Sci Instrum 2020; 91:104104. [PMID: 33138567] [DOI: 10.1063/5.0009531]
Abstract
In recent years, various animal observation instruments have been developed to support long-term measurement and analysis of animal behaviors. This study proposes an automatic observation instrument specialized for the turning behaviors of pill bugs, with the aim of obtaining new knowledge in the field of ethology. Pill bugs tend strongly to turn in the direction opposite to a preceding turn; this alternation of turning is called the turn alternation reaction. A repetition of turns in the same direction, by contrast, is called the turn repetition reaction and has been considered a malfunction of turn alternation. In this research, the authors developed an automatic turntable-type multiple T-maze device and observed the turning behavior of 34 pill bugs for 6 h to investigate whether turn repetition is a malfunction. Most of the pill bug movements fell into three categories: sub-diffusion, Brownian motion, and Lévy walk. This result suggests that pill bugs do not continue turn alternation mechanically but produce turn repetition at a moderate rate, which results in varied movement patterns. In organisms with relatively simple nervous systems such as pill bugs, stereotypical behaviors such as turn alternation have been considered mechanical reactions, and variant behaviors such as turn repetition have been considered malfunctions. Our results suggest instead that a moderate generation of turn repetition contributes to the variety of movement patterns. This study is expected to provide a new perspective on the conventional view of the behavior of simple organisms.
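The alternation-versus-repetition distinction reduces to a simple statistic over the sequence of junction choices; a sketch, assuming turns are recorded as 'L'/'R' characters (the recording format is an assumption, not the device's actual output).

```python
def alternation_rate(turns):
    """Proportion of consecutive turn pairs that alternate direction.

    `turns` is a sequence of 'L'/'R' choices at successive T-junctions
    (at least two choices). 1.0 means strict turn alternation; 0.0 means
    pure turn repetition; intermediate values mix the two reactions.
    """
    pairs = list(zip(turns, turns[1:]))
    return sum(1 for a, b in pairs if a != b) / len(pairs)
```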
Affiliation(s)
- Takaharu Shokaku, Department of Network Design, Meiji University, Nakano, Tokyo 164-8525, Japan
- Toru Moriyama, Faculty of Textile Science and Technology, Shinshu University, Ueda, Nagano 386-8567, Japan
- Hisashi Murakami, Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
- Shuji Shinohara, Faculty of Engineering, The University of Tokyo, Bunkyo, Tokyo 113-8656, Japan
- Nobuhito Manome, Faculty of Engineering, The University of Tokyo, Bunkyo, Tokyo 113-8656, Japan
- Kazuyuki Morioka, Department of Network Design, Meiji University, Nakano, Tokyo 164-8525, Japan
5
6
Madan CR, Spetch ML. Visualizing and quantifying movement from pre-recorded videos: The spectral time-lapse (STL) algorithm. F1000Res 2014; 3:19. [PMID: 25580219] [PMCID: PMC4038320] [DOI: 10.12688/f1000research.3-19.v1]
Abstract
When studying animal behaviour within an open environment, movement-related data are often important for behavioural analyses. Simple and efficient techniques are therefore needed to present and analyze such movement data. However, it is challenging to present both the spatial and the temporal information of movements within a two-dimensional image. To address this challenge, we developed the spectral time-lapse (STL) algorithm, which re-codes an animal's position at every time point with a time-specific color and overlays it on a reference frame of the video to produce a summary image. We additionally incorporated automated motion tracking, such that the animal's position can be extracted and summary statistics such as path length and duration can be calculated, along with instantaneous velocity and acceleration. Here we describe the STL algorithm and offer a freely available MATLAB toolbox that implements it and allows a large degree of end-user control and flexibility.
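The time-specific colour re-coding at the heart of the STL summary image can be illustrated as a hue sweep over the frame index; a minimal sketch in which the exact colormap (red at the start, blue at the end) is an assumption rather than the toolbox's actual choice.

```python
import colorsys

def time_colour(frame_idx, n_frames):
    """Time-specific RGB colour for a frame: the hue sweeps from red
    (start of video) toward blue (end), so each position in the summary
    image encodes when the animal was there. Sweep range is illustrative.
    """
    hue = 0.7 * frame_idx / max(n_frames - 1, 1)  # 0.0 = red ... 0.7 = blue
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return round(r * 255), round(g * 255), round(b * 255)
```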
Affiliation(s)
- Christopher R Madan, Department of Psychology, University of Alberta, Edmonton, Alberta T6G 2E9, Canada
- Marcia L Spetch, Department of Psychology, University of Alberta, Edmonton, Alberta T6G 2E9, Canada
7
Chen HC, Jia W, Yue Y, Li Z, Sun YN, Fernstrom JD, Sun M. Model-based measurement of food portion size for image-based dietary assessment using 3D/2D registration. Meas Sci Technol 2013; 24:105701. [PMID: 24223474] [PMCID: PMC3819104] [DOI: 10.1088/0957-0233/24/10/105701]
Abstract
Dietary assessment is important in health maintenance and intervention in many chronic conditions, such as obesity, diabetes, and cardiovascular disease. However, there is currently a lack of convenient methods for measuring the volume of food (portion size) in real-life settings. We present a computational method to estimate food volume from a single photographic image of food contained in a typical dining plate. First, we calculate the food location with respect to a 3D camera coordinate system using the plate as a scale reference. Then, the food is segmented automatically from the background in the image. Adaptive thresholding and snake modeling are implemented based on several image features, such as color contrast, regional color homogeneity and curve bending degree. Next, a 3D model representing the general shape of the food (e.g., a cylinder or a sphere) is selected from a pre-constructed shape model library. The position, orientation and scale of the selected shape model are determined by registering the projected 3D model against the food contour in the image, where the properties of the reference are used as constraints. Experimental results using various realistically shaped foods with known volumes demonstrated satisfactory performance of our image-based food volume measurement method, even when the 3D geometric surface of the food is not completely represented in the input image.
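The shape-model idea can be illustrated in a heavily simplified form: once the plate's known diameter fixes the image scale, a fitted cylinder model's dimensions convert directly to a volume. This is a flat-scale sketch that ignores the paper's full 3D/2D registration; the 27 cm plate diameter is an illustrative assumption.

```python
import math

def cylinder_food_volume(diameter_px, height_px,
                         plate_diameter_px, plate_diameter_cm=27.0):
    """Estimate food volume (cm^3) under a cylinder shape model, using the
    plate's known real-world diameter as the scale reference. Assumes the
    camera views the scene without perspective distortion (a strong
    simplification of the paper's registration step).
    """
    cm_per_px = plate_diameter_cm / plate_diameter_px
    radius_cm = diameter_px * cm_per_px / 2
    height_cm = height_px * cm_per_px
    return math.pi * radius_cm ** 2 * height_cm
```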
Affiliation(s)
- Hsin-Chen Chen, Department of Electrical & Computer Engineering and Department of Neurosurgery, University of Pittsburgh, Pittsburgh, PA, USA; Department of Computer Science and Information Engineering, National Cheng Kung University, Tainan City, Taiwan, R.O.C.
- Wenyan Jia, Department of Neurosurgery, University of Pittsburgh, Pittsburgh, PA, USA
- Yaofeng Yue, Department of Electrical & Computer Engineering, University of Pittsburgh, Pittsburgh, PA, USA
- Zhaoxin Li, School of Computer Science and Technology, Harbin Institute of Technology, China
- Yung-Nien Sun, Department of Computer Science and Information Engineering, National Cheng Kung University, Tainan City, Taiwan, R.O.C.
- John D Fernstrom, Departments of Psychiatry and Pharmacology, University of Pittsburgh, Pittsburgh, PA, USA
- Mingui Sun, Department of Electrical & Computer Engineering and Department of Neurosurgery, University of Pittsburgh, Pittsburgh, PA, USA
8
Crispim Junior CF, Pederiva CN, Bose RC, Garcia VA, Lino-de-Oliveira C, Marino-Neto J. ETHOWATCHER: validation of a tool for behavioral and video-tracking analysis in laboratory animals. Comput Biol Med 2012; 42:257-64. [DOI: 10.1016/j.compbiomed.2011.12.002]
9
Yeh YS, Huang KN, Jen SL, Li YC, Young MS. Development of a multitarget tracking system for paramecia. Rev Sci Instrum 2010; 81:074302. [PMID: 20687744] [DOI: 10.1063/1.3460266]
Abstract
This investigation develops a multitarget tracking system for the motile protozoan Paramecium. The system can recognize, track, and record the orbits of swimming paramecia within a circular experimental pool 4 mm in diameter. The proposed system is implemented using an optical microscope, a charge-coupled device camera, and a software tool, Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW). An algorithm for processing the images and analyzing the traces of the paramecia was developed in LabVIEW. It focuses on extracting meaningful data in an experiment and recording them to elucidate the behavior of paramecia. The algorithm can continue to track paramecia even if they are transposed or collide with each other. The experiment demonstrates that this multitarget tracking design can track more than five paramecia and simultaneously yield meaningful data from the moving paramecia at a maximum speed of 1.7 mm/s.
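Keeping identities when targets cross or collide is at heart an assignment problem between existing tracks and new detections; a minimal greedy nearest-neighbour sketch (the paper's LabVIEW algorithm is more elaborate than this).

```python
import math

def assign_detections(tracks, detections):
    """Greedy nearest-neighbour assignment of new detections to existing
    track positions: the globally closest unassigned (track, detection)
    pair is matched first, then the next closest, and so on. Returns a
    dict mapping track index -> detection index.
    """
    pairs = sorted(
        (math.dist(t, d), ti, di)
        for ti, t in enumerate(tracks)
        for di, d in enumerate(detections)
    )
    used_tracks, used_dets, assignment = set(), set(), {}
    for _, ti, di in pairs:
        if ti not in used_tracks and di not in used_dets:
            assignment[ti] = di
            used_tracks.add(ti)
            used_dets.add(di)
    return assignment
```

Greedy matching can fail in crowded scenes where a globally optimal assignment (e.g. the Hungarian algorithm) would not, which is one reason dedicated trackers go further than this sketch.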
Affiliation(s)
- Yu-Sing Yeh, Department of Electrical Engineering, National Cheng Kung University, Tainan, 701 Taiwan