1
Samiappan S, Krishnan BS, Dehart D, Jones LR, Elmore JA, Evans KO, Iglay RB. Aerial Wildlife Image Repository for animal monitoring with drones in the age of artificial intelligence. Database (Oxford) 2024; 2024:baae070. [PMID: 39043628] [PMCID: PMC11265857] [DOI: 10.1093/database/baae070]
Abstract
Drones (unoccupied aircraft systems) have become effective tools for wildlife monitoring and conservation. Automated animal detection and classification using artificial intelligence (AI) can substantially reduce logistical and financial costs and improve drone surveys. However, the lack of annotated animal imagery for training AI is a critical bottleneck in achieving accurate performance of AI algorithms compared to other fields. To bridge this gap for drone imagery and help advance and standardize automated animal classification, we have created the Aerial Wildlife Image Repository (AWIR), which is a dynamic, interactive database with annotated images captured from drone platforms using visible and thermal cameras. The AWIR provides the first open-access repository for users to upload, annotate, and curate images of animals acquired from drones. The AWIR also provides annotated imagery and benchmark datasets that users can download to train AI algorithms to automatically detect and classify animals, and compare algorithm performance. The AWIR contains 6587 animal objects in 1325 visible and thermal drone images of predominantly large birds and mammals of 13 species in open areas of North America. As contributors increase the taxonomic and geographic diversity of available images, the AWIR will open future avenues for AI research to improve animal surveys using drones for conservation applications. Database URL: https://projectportal.gri.msstate.edu/awir/.
Affiliation(s)
- Sathishkumar Samiappan
- Geosystems Research Institute, Mississippi State University, 2 Research Blvd, Starkville, MS 39759, United States
- B. Santhana Krishnan
- Geosystems Research Institute, Mississippi State University, 2 Research Blvd, Starkville, MS 39759, United States
- Damion Dehart
- Geosystems Research Institute, Mississippi State University, 2 Research Blvd, Starkville, MS 39759, United States
- Computer Sciences and Computer Engineering, University of Southern Mississippi, 118 College Drive, Hattiesburg, MS 39406, United States
- Landon R Jones
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Stone Blvd, Mississippi State, MS 39762, United States
- Jared A Elmore
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Stone Blvd, Mississippi State, MS 39762, United States
- Kristine O Evans
- Geosystems Research Institute, Mississippi State University, 2 Research Blvd, Starkville, MS 39759, United States
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Stone Blvd, Mississippi State, MS 39762, United States
- Raymond B Iglay
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Stone Blvd, Mississippi State, MS 39762, United States
2
Luo W, Zhang G, Shao Q, Zhao Y, Wang D, Zhang X, Liu K, Li X, Liu J, Wang P, Li L, Wang G, Wang F, Yu Z. An efficient visual servo tracker for herd monitoring by UAV. Sci Rep 2024; 14:10463. [PMID: 38714785] [DOI: 10.1038/s41598-024-60445-4]
Abstract
It is a challenging and meaningful task to carry out UAV-based livestock monitoring in the high-altitude (more than 4500 m on average), cold (annual average -4 °C) regions of the Qinghai-Tibet Plateau. Artificial intelligence (AI) delivers its full value only when software is combined with a hardware carrier into an integrated device that executes automated tasks and solves practical problems in real applications. In this paper, a real-time tracking system with dynamic target tracking ability is proposed. It is built on a tracking-by-detection architecture that uses YOLOv7 for target detection and Deep SORT for tracking. To address the problems encountered when tracking in complex and dense scenes, our work (1) uses optical flow to compensate the Kalman filter (KF), resolving the mismatch between the KF-predicted target bounding box and the detector input when detection in the current frame is difficult, thereby improving prediction accuracy; (2) applies a low-confidence trajectory filtering method to reduce the false positive trajectories generated by Deep SORT, mitigating the impact of unreliable detections on target tracking; and (3) designs a visual servo controller for the unmanned aerial vehicle (UAV) that reduces the impact of rapid movement on tracking and keeps the target within the field of view of the UAV camera, enabling automatic tracking tasks. Finally, the system was tested on Tibetan yaks on the Qinghai-Tibet Plateau, and the results show that it achieves real-time multi-target tracking and satisfactory visual servo performance in complex and dense scenes.
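The low-confidence trajectory filtering step described in point (2) can be sketched as a simple post-processing pass over candidate tracks. This is a minimal illustration of the general idea, not the authors' implementation; the track data structure, the function name, and the threshold values are assumptions:

```python
def filter_low_confidence_tracks(tracks, min_mean_conf=0.4, min_hits=3):
    """Keep only trajectories that were matched in enough frames and whose
    mean detection confidence clears a threshold; the rest are treated as
    likely false positives produced by the association step."""
    kept = []
    for track in tracks:
        confs = track["confidences"]  # per-frame detection confidences
        if len(confs) >= min_hits and sum(confs) / len(confs) >= min_mean_conf:
            kept.append(track)
    return kept

tracks = [
    {"id": 1, "confidences": [0.9, 0.8, 0.85]},  # strong, consistent track
    {"id": 2, "confidences": [0.2, 0.3, 0.25]},  # low confidence -> dropped
    {"id": 3, "confidences": [0.9, 0.95]},       # too few matches -> dropped
]
print([t["id"] for t in filter_low_confidence_tracks(tracks)])  # → [1]
```

Filtering on mean confidence rather than a single frame's score makes the decision robust to one-off detector glitches, which matters in the dense-herd scenes the paper targets.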
Affiliation(s)
- Wei Luo
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, 100101, China
- Aerospace Remote Sensing Information Processing and Application Collaborative Innovation Center of Hebei Province, Langfang, 065000, China
- National Joint Engineering Research Center of Space Remote Sensing Information Application Technology, Langfang, 065000, China
- Guoqing Zhang
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Quanqin Shao
- Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, 100101, China
- University of Chinese Academy of Sciences, Beijing, 101407, China
- Yongxiang Zhao
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Dongliang Wang
- Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, 100101, China
- Xiongyi Zhang
- Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, 100101, China
- Ke Liu
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Xiaoliang Li
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Jiandong Liu
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Penggang Wang
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Lin Li
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Guanwu Wang
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Fulong Wang
- North China Institute of Aerospace Engineering, Langfang, 065000, China
- Zhongde Yu
- North China Institute of Aerospace Engineering, Langfang, 065000, China
3
Alexeenko V, Jeevaratnam K. Artificial intelligence: Is it wizardry, witchcraft, or a helping hand for an equine veterinarian? Equine Vet J 2023; 55:719-722. [PMID: 37551620] [DOI: 10.1111/evj.13969]
Affiliation(s)
- Vadim Alexeenko
- School of Veterinary Medicine, University of Surrey, Surrey, UK
4
Krishnan BS, Jones LR, Elmore JA, Samiappan S, Evans KO, Pfeiffer MB, Blackwell BF, Iglay RB. Fusion of visible and thermal images improves automated detection and classification of animals for drone surveys. Sci Rep 2023; 13:10385. [PMID: 37369669] [DOI: 10.1038/s41598-023-37295-7]
Abstract
Visible and thermal images acquired from drones (unoccupied aircraft systems) have substantially improved animal monitoring. Combining complementary information from both image types provides a powerful approach for automating detection and classification of multiple animal species to augment drone surveys. We compared eight image fusion methods using thermal and visible drone images combined with two supervised deep learning models, to evaluate the detection and classification of white-tailed deer (Odocoileus virginianus), domestic cow (Bos taurus), and domestic horse (Equus caballus). We classified visible and thermal images separately and compared them with the results of image fusion. Fused images provided minimal improvement for cows and horses compared to visible images alone, likely because the size, shape, and color of these species made them conspicuous against the background. For white-tailed deer, which were typically cryptic against their backgrounds and often in shadows in visible images, the added information from thermal images improved detection and classification in fusion methods from 15 to 85%. Our results suggest that image fusion is ideal for surveying animals that are inconspicuous against their backgrounds, and our approach requires fewer training image pairs than typical machine-learning methods. We discuss computational and field considerations to improve drone surveys using our fusion approach.
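Pixel-level fusion of co-registered visible and thermal frames can be illustrated with a simple weighted blend. This is a generic sketch assuming aligned, single-channel inputs; it is not one of the eight fusion methods the paper compares, and the function name and weighting are assumptions:

```python
import numpy as np

def weighted_fusion(visible, thermal, alpha=0.5):
    """Blend co-registered visible and thermal images of identical shape.
    Each band is min-max normalised to [0, 1] before mixing so the two
    sensors' differing dynamic ranges contribute comparably."""
    v = visible.astype(float)
    t = thermal.astype(float)
    v = (v - v.min()) / (np.ptp(v) + 1e-10)  # ptp = max - min
    t = (t - t.min()) / (np.ptp(t) + 1e-10)
    return alpha * v + (1.0 - alpha) * t

vis = np.array([[10, 200], [50, 120]], dtype=np.uint8)     # toy visible band
thm = np.array([[300, 290], [310, 280]], dtype=np.uint16)  # toy thermal counts
fused = weighted_fusion(vis, thm, alpha=0.6)
```

Shifting `alpha` toward the thermal band is the kind of choice that would favour cryptic, warm-bodied targets such as deer in shadow, per the paper's findings.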
Affiliation(s)
- B Santhana Krishnan
- Geosystems Research Institute, Mississippi State University, Mississippi State, MS, 39762, USA
- Landon R Jones
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Box 9690, Mississippi State, MS, 39762, USA
- Jared A Elmore
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Box 9690, Mississippi State, MS, 39762, USA
- Department of Forestry and Environmental Conservation, Clemson University, Clemson, SC, 29634, USA
- Sathishkumar Samiappan
- Geosystems Research Institute, Mississippi State University, Mississippi State, MS, 39762, USA
- Kristine O Evans
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Box 9690, Mississippi State, MS, 39762, USA
- Morgan B Pfeiffer
- U.S. Department of Agriculture, Animal and Plant Health Inspection Service, Wildlife Services, National Wildlife Research Center, Ohio Field Station, Sandusky, OH, 44870, USA
- Bradley F Blackwell
- U.S. Department of Agriculture, Animal and Plant Health Inspection Service, Wildlife Services, National Wildlife Research Center, Ohio Field Station, Sandusky, OH, 44870, USA
- Raymond B Iglay
- Department of Wildlife, Fisheries, and Aquaculture, Mississippi State University, Box 9690, Mississippi State, MS, 39762, USA
5
Mancuso D, Castagnolo G, Porto SMC. Cow Behavioural Activities in Extensive Farms: Challenges of Adopting Automatic Monitoring Systems. Sensors (Basel) 2023; 23:3828. [PMID: 37112171] [PMCID: PMC10143811] [DOI: 10.3390/s23083828]
Abstract
Animal welfare is an increasingly important requirement in the livestock sector for improving the quality and healthiness of food production. By monitoring animal behaviours such as feeding, rumination, walking, and lying, it is possible to understand an animal's physical and psychological status. Precision Livestock Farming (PLF) tools offer a good solution for assisting the farmer in managing the herd, overcoming the limits of human supervision and enabling early reaction to animal health issues. The purpose of this review is to highlight the key concerns that arise in the design and validation of IoT-based systems for monitoring grazing cows in extensive farming systems, which pose many more, and more complicated, problems than indoor farms. In this context, the most common concerns relate to device battery life, the sampling frequency used for data collection, the need for adequate network coverage and transmission range, the site of computation, and the performance of the algorithms embedded in IoT systems in terms of computational cost.
Affiliation(s)
- Dominga Mancuso
- Department of Agriculture, Food and Environment (Di3A), Building and Land Engineering Section, University of Catania, Via S. Sofia 100, 95123 Catania, Italy
- Giulia Castagnolo
- Department of Electrical, Electronic and Computer Engineering (DIEEI), University of Catania, Viale A. Doria 6, 95125 Catania, Italy
- Simona M. C. Porto
- Department of Agriculture, Food and Environment (Di3A), Building and Land Engineering Section, University of Catania, Via S. Sofia 100, 95123 Catania, Italy
6
Besson M, Alison J, Bjerge K, Gorochowski TE, Høye TT, Jucker T, Mann HMR, Clements CF. Towards the fully automated monitoring of ecological communities. Ecol Lett 2022; 25:2753-2775. [PMID: 36264848] [PMCID: PMC9828790] [DOI: 10.1111/ele.14123]
Abstract
High-resolution monitoring is fundamental to understanding ecosystem dynamics in an era of global change and biodiversity declines. While real-time and automated monitoring of abiotic components has been possible for some time, monitoring biotic components-for example, individual behaviours and traits, and species abundance and distribution-is far more challenging. Recent technological advancements offer potential solutions to achieve this through: (i) increasingly affordable high-throughput recording hardware, which can collect rich multidimensional data, and (ii) increasingly accessible artificial intelligence approaches, which can extract ecological knowledge from large datasets. However, automating the monitoring of facets of ecological communities via such technologies has primarily been achieved at low spatiotemporal resolutions within limited steps of the monitoring workflow. Here, we review existing technologies for data recording and processing that enable automated monitoring of ecological communities. We then present novel frameworks that combine such technologies, forming fully automated pipelines to detect, track, classify and count multiple species, and record behavioural and morphological traits, at resolutions which have previously been impossible to achieve. Based on these rapidly developing technologies, we illustrate a solution to one of the greatest challenges in ecology: the ability to rapidly generate high-resolution, multidimensional and standardised data across complex ecologies.
Affiliation(s)
- Marc Besson
- School of Biological Sciences, University of Bristol, Bristol, UK
- Sorbonne Université, CNRS UMR Biologie des Organismes Marins (BIOM), Banyuls-sur-Mer, France
- Jamie Alison
- Department of Ecoscience, Aarhus University, Aarhus, Denmark
- UK Centre for Ecology & Hydrology, Bangor, UK
- Kim Bjerge
- Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark
- Thomas E. Gorochowski
- School of Biological Sciences, University of Bristol, Bristol, UK
- BrisEngBio, School of Chemistry, University of Bristol, Cantock's Close, Bristol, BS8 1TS, UK
- Toke T. Høye
- Department of Ecoscience, Aarhus University, Aarhus, Denmark
- Arctic Research Centre, Aarhus University, Aarhus, Denmark
- Tommaso Jucker
- School of Biological Sciences, University of Bristol, Bristol, UK
- Hjalte M. R. Mann
- Department of Ecoscience, Aarhus University, Aarhus, Denmark
- Arctic Research Centre, Aarhus University, Aarhus, Denmark
7
Pandey S, Kumari N. Prediction and monitoring of LULC shift using cellular automata-artificial neural network in Jumar watershed of Ranchi District, Jharkhand. Environ Monit Assess 2022; 195:130. [PMID: 36409418] [DOI: 10.1007/s10661-022-10623-6]
Abstract
The Jumar watershed of Ranchi district is agrarian in nature. Unplanned, exponentially growing urban sprawl has become a probable threat to achieving Sustainable Development Goal 15 (SDG-15). The purpose of this study is to monitor urban sprawl in the Jumar watershed over three decades, from 1990 to 2021. Land use land cover (LULC) change was monitored using satellite data from Landsat 4, 5, and 8. Several indices were calculated, including the normalised difference vegetation index (NDVI), normalised difference built-up index (NDBI), normalised difference water index (NDWI), and built-up index (BUI). For prediction of urban sprawl, a cellular automata-artificial neural network (CA-ANN) technique was applied within a GIS environment, and the model was validated using the Kappa coefficient. The prediction results showed an increase in built-up area of 8.23 sq. km in the next decade, with built-up and barren land together growing from 34.61 sq. km in 2021 to 42.85 sq. km by 2030. Over the three-decade period, the NDVI showed a significant decrease in healthy vegetation and an increase in sparse vegetation; the NDBI showed a slight increase in urban area but a massive increase in uncultivated and barren land; and the NDWI showed a decrease in surface water area. The LULC analysis revealed a major shift from healthy vegetation to agriculture and then to barren land. To assess the impact of urbanisation on water quality, water samples were collected seasonally from sampling locations J1 to J11 and analysed following APHA procedures, with sites classified as urban, semi-urban, or rural according to their location. The water quality index (WQI) varied between 42.14 and 61.42 during pre-monsoon, 62.20 and 68.80 during monsoon, and 43.48 and 60.12 during post-monsoon. Water quality was found to be poor in all seasons at all sampling sites, and the water was highly turbid and alkaline throughout the year. Overall, it can be concluded that the water needs to be pre-treated for drinking purposes throughout the year.
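The spectral indices named in the abstract are standard normalised band ratios. A minimal sketch follows; the band variable names and the epsilon guard are illustrative assumptions, not details from the paper:

```python
import numpy as np

EPS = 1e-10  # guards against division by zero in dark or masked pixels

def ndvi(nir, red):
    """Normalised difference vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + EPS)

def ndbi(swir, nir):
    """Normalised difference built-up index: (SWIR - NIR) / (SWIR + NIR)."""
    return (swir - nir) / (swir + nir + EPS)

def ndwi(green, nir):
    """Normalised difference water index: (Green - NIR) / (Green + NIR)."""
    return (green - nir) / (green + nir + EPS)

# Toy reflectances: healthy vegetation reflects strongly in NIR, so NDVI is high.
nir = np.array([0.5])
red = np.array([0.1])
print(round(float(ndvi(nir, red)[0]), 3))  # → 0.667
```

Each index ranges from -1 to 1, so per-class thresholds (e.g. high NDVI for healthy vegetation, high NDBI for built-up land) can be compared directly across the three decades of imagery.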
Affiliation(s)
- Soumya Pandey
- Department of Civil and Environmental Engineering, Birla Institute of Technology, Mesra, Ranchi, Jharkhand, India, 835215
- Neeta Kumari
- Department of Civil and Environmental Engineering, Birla Institute of Technology, Mesra, Ranchi, Jharkhand, India, 835215