
Please use this link when citing or linking to this document: https://nbn-resolving.org/urn:nbn:de:gbv:9-opus-106440

BatNet: a deep learning-based tool for automated bat species identification from camera trap images

Automated monitoring technologies can increase the efficiency of ecological data collection and support data-driven conservation. Camera traps coupled with infrared light barriers can be used to monitor temperate-zone bat assemblages at underground hibernacula, where thousands of individuals of multiple species can aggregate in winter. However, the broad-scale adoption of such photo-monitoring techniques is limited by the time-consuming bottleneck of manual image processing. Here, we present BatNet, an open-source, deep learning-based tool for the automated identification of 13 European bat species from camera trap images. BatNet includes a user-friendly graphical interface, through which it can be retrained to identify new bat species or to create site-specific models that improve detection accuracy at new sites. Model accuracy was evaluated on images from both trained and untrained sites, as well as in an ecological context, where community- and species-level metrics (species diversity, relative abundance, and species-level activity patterns) were compared between human experts and BatNet. At trained sites, model performance was high across all species (F1-score: 0.98–1). At untrained sites, overall classification accuracy remained high (96.7–98.2%) when camera placement was comparable to the training images (<3 m from the entrance; <45° angle relative to the opening). For atypical camera placements (>3 m or >45° angle), retraining the detector model with 500 site-specific annotations achieved an accuracy of over 95% at all sites. In the ecological case study, all investigated metrics were nearly identical between human experts and BatNet. Finally, we demonstrate the ability to retrain BatNet to identify a new bat species, achieving an F1-score of 0.99 while maintaining high classification accuracy for all original species. BatNet can be implemented directly to scale up the deployment of camera traps in Europe and enhance bat population monitoring. Moreover, the pretrained model can serve as a baseline for transfer learning to automate the image-based identification of bat species worldwide.

Metadata
Author: Gabriella Krivek, Alexander Gillert, Martin Harder, Marcus Fritze, Karina Frankowski, Luisa Timm, Liska Meyer-Olbersleben, Uwe Freiherr von Lukas, Gerald Kerth, Jaap van Schaik
URN: urn:nbn:de:gbv:9-opus-106440
DOI: https://doi.org/10.1002/rse2.339
ISSN: 2056-3485
Parent Title (English): Remote Sensing in Ecology and Conservation
Publisher: Wiley
Place of publication: Hoboken, NJ
Document Type: Article
Language: English
Date of Publication (online): 2023/05/09
Date of first Publication: 2023/12/01
Release Date: 2024/02/21
Tag: Automated monitoring; Chiroptera; bat conservation; camera trap; deep learning; infrared light barrier
Volume: 9
Issue: 6
First Page: 759
Last Page: 774
Faculties: Mathematisch-Naturwissenschaftliche Fakultät / Zoologisches Institut und Museum
Collections: Articles from the DFG-funded publication fund
Licence (German): Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)