1
Nguyen Hong Duc P, Torterotot M, Samaran F, White PR, Gérard O, Adam O, Cazau D. Assessing inter-annotator agreement from collaborative annotation campaign in marine bioacoustics. Ecol Inform 2021. DOI: 10.1016/j.ecoinf.2020.101185
2
Shiu Y, Palmer KJ, Roch MA, Fleishman E, Liu X, Nosal EM, Helble T, Cholewiak D, Gillespie D, Klinck H. Deep neural networks for automated detection of marine mammal species. Sci Rep 2020; 10:607. PMID: 31953462; PMCID: PMC6969184; DOI: 10.1038/s41598-020-57549-y
Abstract
Deep neural networks have advanced the field of detection and classification and allowed for effective identification of signals in challenging data sets. Numerous time-critical conservation needs may benefit from these methods. We developed and empirically studied a variety of deep neural networks to detect the vocalizations of endangered North Atlantic right whales (Eubalaena glacialis). We compared the performance of these deep architectures to that of traditional detection algorithms for the primary vocalization produced by this species, the upcall. We show that deep-learning architectures are capable of producing false-positive rates that are orders of magnitude lower than alternative algorithms while substantially increasing the ability to detect calls. We demonstrate that a deep neural network trained with recordings from a single geographic region recorded over a span of days is capable of generalizing well to data from multiple years and across the species’ range, and that the low false positives make the output of the algorithm amenable to quality control for verification. The deep neural networks we developed are relatively easy to implement with existing software, and may provide new insights applicable to the conservation of endangered species.
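The detection pipeline the abstract summarizes (spectrogram input, convolutional feature extraction, a per-clip call probability) can be sketched minimally in plain numpy. Everything below is an illustrative assumption, not the architecture from the paper: the kernel size, the single conv layer, the global-average pooling, and the toy upsweep stand-in for an upcall are all placeholders.

```python
import numpy as np

def spectrogram(signal, n_fft=256, hop=128):
    """Hann-windowed frames -> magnitude STFT, shape (freq, time)."""
    frames = [signal[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(signal) - n_fft + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T

def conv2d_valid(x, k):
    """Naive 2-D valid convolution: one layer of the kind such detectors stack."""
    kh, kw = k.shape
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def detect(signal, kernel, bias=0.0):
    """Score a clip: conv -> ReLU -> global average pooling -> sigmoid probability."""
    spec = spectrogram(signal)
    feat = np.maximum(conv2d_valid(spec, kernel), 0.0)  # ReLU
    score = feat.mean() + bias                          # global average pooling
    return 1.0 / (1.0 + np.exp(-score))                 # sigmoid "call probability"

# Toy demo: an upward frequency sweep (crude upcall stand-in) vs. noise,
# scored with untrained random weights purely to show the data flow.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
sweep = np.sin(2 * np.pi * (100 + 150 * t) * t)
noise = rng.standard_normal(4000) * 0.1
kernel = rng.standard_normal((3, 3)) * 0.01
p_sweep, p_noise = detect(sweep, kernel), detect(noise, kernel)
```

In a trained detector the kernel weights (and many more layers) would be fit to labeled upcall examples; thresholding the output probability is what produces the detection/false-positive trade-off the abstract reports.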
Affiliation(s)
- Yu Shiu: Center for Conservation Bioacoustics, Cornell Lab of Ornithology, Cornell University, Ithaca, NY, 14850, USA
- K J Palmer: Department of Computer Science, San Diego State University, San Diego, CA, 92182, USA
- Marie A Roch: Department of Computer Science, San Diego State University, San Diego, CA, 92182, USA
- Erica Fleishman: Department of Fish, Wildlife and Conservation Biology, Colorado State University, Fort Collins, CO, 80523, USA
- Xiaobai Liu: Department of Computer Science, San Diego State University, San Diego, CA, 92182, USA
- Eva-Marie Nosal: Department of Ocean and Resources Engineering, University of Hawai'i at Mānoa, Honolulu, HI, 96822, USA
- Tyler Helble: US Navy, Space and Naval Warfare Systems Command, System Center Pacific, San Diego, CA, 92152, USA
- Danielle Cholewiak: Northeast Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Woods Hole, MA, 02543, USA
- Douglas Gillespie: Sea Mammal Research Unit, Scottish Oceans Institute, University of St. Andrews, St Andrews, Fife, KY16 8LB, Scotland
- Holger Klinck: Center for Conservation Bioacoustics, Cornell Lab of Ornithology, Cornell University, Ithaca, NY, 14850, USA