Article

Bat2Web: A Framework for Real-Time Classification of Bat Species Echolocation Signals Using Audio Sensor Data

1 Department of Computer Science and Engineering, American University of Sharjah, Sharjah 26666, United Arab Emirates
2 Nature & Ecosystem Restoration, Soudah Development, Riyadh 13519, Saudi Arabia
* Author to whom correspondence should be addressed.
Sensors 2024, 24(9), 2899; https://doi.org/10.3390/s24092899
Submission received: 27 February 2024 / Revised: 9 April 2024 / Accepted: 26 April 2024 / Published: 1 May 2024
(This article belongs to the Section Environmental Sensing)

Abstract

Bats play a pivotal role in maintaining ecological balance, and studying their behaviors offers vital insights into environmental health and aids in conservation efforts. Determining the presence of various bat species in an environment is essential for many bat studies. Specialized audio sensors can record bat echolocation calls, which can then be used to identify bat species. However, the complexity of bat calls presents a significant challenge, necessitating expert analysis and extensive time for accurate interpretation. Recent advances in neural networks can help identify bat species automatically from their echolocation calls. Such neural networks can be integrated into a complete end-to-end system that leverages recent Internet of Things (IoT) technologies with long-range, low-power communication protocols to implement automated acoustic monitoring. This paper presents the design and implementation of such a system that uses a tiny neural network for interpreting sensor data derived from bat echolocation signals. A highly compact convolutional neural network (CNN) model was developed that demonstrated excellent performance in bat species identification, achieving an F1-score of 0.9578 and an accuracy of 97.5%. The neural network was deployed, and its performance was evaluated on several alternative edge devices, including the NVIDIA Jetson Nano and Google Coral.
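The abstract does not specify the framework or exact architecture of the compact CNN. The sketch below is illustrative only: it assumes a Keras/TensorFlow-style model operating on single-channel mel-spectrogram patches, with the input shape, layer sizes, and species count (NUM_SPECIES) chosen for illustration rather than taken from the paper.

```python
import tensorflow as tf

NUM_SPECIES = 5            # assumption: number of bat species in the training data
INPUT_SHAPE = (64, 64, 1)  # assumption: single-channel mel-spectrogram patch

def build_compact_cnn(input_shape=INPUT_SHAPE, num_classes=NUM_SPECIES):
    """Build a small CNN suitable for edge deployment (e.g., Jetson Nano, Coral)."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),  # avoids large dense layers, keeping the model tiny
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_compact_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

A compact model of this kind can typically be converted to TensorFlow Lite and quantized before deployment on edge accelerators such as the Google Coral Edge TPU, which keeps inference latency and power consumption low for in-field acoustic monitoring.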
Keywords: IoT; bioacoustics; bat echolocation analysis; bat species classification; LoRaWAN; machine learning; NVIDIA Jetson; Google Coral

Share and Cite

MDPI and ACS Style

Mahbub, T.; Bhagwagar, A.; Chand, P.; Zualkernan, I.; Judas, J.; Dghaym, D. Bat2Web: A Framework for Real-Time Classification of Bat Species Echolocation Signals Using Audio Sensor Data. Sensors 2024, 24, 2899. https://doi.org/10.3390/s24092899
