Search Results (14)

Search Parameters:
Keywords = muzzle detection

17 pages, 798 KB  
Article
Factors Affecting the Applicability of Infrared Thermography as a Measure of Temperament in Cattle
by Paolo Mongillo, Elisa Giaretta, Enrico Fiore, Giorgia Fabbri, Bruno Stefanon, Lorenzo Degano, Daniele Vicario and Gianfranco Gabai
Vet. Sci. 2025, 12(9), 913; https://doi.org/10.3390/vetsci12090913 - 19 Sep 2025
Viewed by 444
Abstract
Animal temperament, defined as consistent behavioral and physiological responses to stressors, plays a crucial role in cattle welfare, productivity, and safety during handling. This motivates researchers to identify objective, non-invasive methods for temperament assessment. Infrared thermography (IRT) has emerged as a promising tool to detect superficial temperature changes associated with stress and temperament in cattle. This study aimed to evaluate how superficial temperature variations measured by IRT in fattening bulls are influenced by environmental temperature, humidity, and temperament. The study involved 223 bulls approximately 7.5 months old; thermal images of the eye and muzzle regions were captured at baseline and during restraint in a squeeze chute. Temperament was assessed using chute score and flight time, and environmental conditions were recorded via a temperature–humidity index (THI). Results showed significant increases in eye and muzzle temperatures during handling. Notably, changes in eye temperature were independent of environmental THI but correlated with flight time, with more temperamental bulls displaying larger temperature increases. In contrast, changes in muzzle temperature were strongly influenced by ambient THI and its variation at handling, consistent with the region’s thermoregulatory function. Temperament explained a small proportion of temperature variation. A follow-up experiment on a subset of 104 bulls around 11 months old showed no significant age effects on the IRT–temperament relationship. These findings indicate that IRT, particularly of the eye region, holds promise as a non-invasive, objective method to assess stress responses related to temperament in cattle. Careful selection of thermal regions and accounting for environmental influences are critical. While IRT alone accounts for limited variability, its integration with other behavioral and physiological measures could enhance temperament evaluation. This approach offers novel opportunities for improving animal welfare and management by identifying highly temperamental individuals without invasive procedures. Future research with higher temporal resolution and varied stressors is warranted to further elucidate temperature dynamics associated with temperament.
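The abstract relies on a temperature–humidity index (THI) without stating which formulation was used. As a reference point, here is a minimal sketch of one widely used cattle THI formula (an NRC-style definition with ambient temperature in °C and relative humidity in %); the authors may have used a different variant.

```python
def thi(temp_c: float, rh_percent: float) -> float:
    """Temperature-humidity index from ambient temperature (deg C) and relative
    humidity (%), using one common cattle-welfare formulation (assumed here)."""
    return 0.8 * temp_c + (rh_percent / 100.0) * (temp_c - 14.4) + 46.4

# Example: 25 deg C at 60 % relative humidity
print(round(thi(25.0, 60.0), 1))  # ~72.8
```

Values above roughly 72 are commonly treated as the onset of mild heat stress in cattle, which is one reason changes in ambient THI at handling can confound muzzle-temperature readings.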

13 pages, 2486 KB  
Article
Identification and Characterization of MmuPV1 Causing Papillomatosis Outbreak in an Animal Research Facility
by Vladimir Majerciak, Kristin E. Killoran, Lulu Yu, Deanna Gotte, Elijah Edmondson, Matthew W. Breed, Renee E. King, Melody E. Roelke-Parker, Paul F. Lambert, Joshua A. Kramer and Zhi-Ming Zheng
Viruses 2025, 17(9), 1204; https://doi.org/10.3390/v17091204 - 1 Sep 2025
Viewed by 886
Abstract
Mouse papillomavirus (MmuPV1) is the first papillomavirus known to infect laboratory mice, making it an irreplaceable tool for research on papillomaviruses. Despite wide use, standardized techniques for conducting MmuPV1 animal research are lacking. In this report, we describe an unexpected MmuPV1 outbreak causing recurrent papillomatosis in a specific pathogen-free animal research facility. The infected mice displayed characteristic papillomatosis lesions on the muzzles, tails, and feet, with histological signs including anisocytosis, epithelial dysplasia, and typical koilocytosis. Etiology studies showed that the papilloma tissues exhibited MmuPV1 infection, with expression of viral early and late genes detected by RNA-ISH using an MmuPV1 antisense probe to the viral E6E7 region and an antisense probe to the viral L1 region. The viral L1 protein was detected by an anti-MmuPV1 L1 antibody. PCR amplification and cloning of the entire viral genome showed that the origin of the outbreak virus, named the MmuPV1 Bethesda strain (GenBank Acc. No. PX123224), could be traced to the MmuPV1 virus previously used in studies at the same facility. Our data indicate that MmuPV1 can persist in a contaminated environment for long periods, and a standardized international animal protocol for handling MmuPV1 studies is urgently needed.
(This article belongs to the Section Animal Viruses)

23 pages, 28830 KB  
Article
Micro-Expression-Based Facial Analysis for Automated Pain Recognition in Dairy Cattle: An Early-Stage Evaluation
by Shuqiang Zhang, Kashfia Sailunaz and Suresh Neethirajan
AI 2025, 6(9), 199; https://doi.org/10.3390/ai6090199 - 22 Aug 2025
Viewed by 1005
Abstract
Timely, objective pain recognition in dairy cattle is essential for welfare assurance, productivity, and ethical husbandry, yet it remains elusive because evolutionary pressure renders bovine distress signals brief and inconspicuous. Without verbal self-reporting, cows suppress overt cues, so automated vision is indispensable for on-farm triage. Although earlier systems tracked whole-body posture or static grimace scales, frame-level detection of facial micro-expressions has not been fully explored in livestock. We translate micro-expression analytics from automotive driver monitoring to the barn, linking modern computer vision with veterinary ethology. Our two-stage pipeline first detects faces and 30 landmarks using a custom You Only Look Once (YOLO) version 8-Pose network, achieving a 96.9% mean average precision (mAP) at an Intersection over Union (IoU) threshold of 0.50 for detection and 83.8% Object Keypoint Similarity (OKS) for keypoint placement. Cropped eye, ear, and muzzle patches are encoded using a pretrained MobileNetV2, generating 3840-dimensional descriptors that capture millisecond muscle twitches. Sequences of five consecutive frames are fed into a 128-unit Long Short-Term Memory (LSTM) classifier that outputs pain probabilities. On a held-out validation set of 1700 frames, the system records 99.65% accuracy and an F1-score of 0.997, with only three false positives and three false negatives. Tested on 14 unseen barn videos, it attains 64.3% clip-level accuracy (i.e., overall accuracy for the whole video clip) and 83% precision for the pain class, using a hybrid aggregation rule that combines a 30% mean probability threshold with micro-burst counting to temper false alarms. As an early exploration from our proof-of-concept study on a subset of our custom dairy farm datasets, these results show that micro-expression mining can deliver scalable, non-invasive pain surveillance across variations in illumination, camera angle, background, and individual morphology. Future work will explore attention-based temporal pooling, curriculum learning for variable window lengths, domain-adaptive fine-tuning, and multimodal fusion with accelerometry on the complete datasets to elevate performance toward clinical deployment.
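A minimal sketch of the temporal half of such a pipeline, assuming the 3840-D per-frame descriptors are already available from the cropped eye/ear/muzzle patches. The burst-counting rule is a hypothetical reading of the abstract's hybrid aggregation; its exact definition is not given, so the thresholds below are illustrative only.

```python
import torch
import torch.nn as nn

class PainLSTM(nn.Module):
    """5-frame sequences of 3840-D patch descriptors -> per-sequence pain probability."""
    def __init__(self, feat_dim=3840, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, 5, 3840)
        _, (h_n, _) = self.lstm(x)         # h_n: (1, batch, 128)
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)

def clip_decision(frame_probs, mean_thr=0.30, burst_thr=0.5, min_burst_len=3, min_bursts=1):
    """Hypothetical clip-level rule: flag pain if the mean frame probability exceeds
    mean_thr, or if enough sustained 'micro-bursts' of high-probability frames occur."""
    probs = list(frame_probs)
    if sum(probs) / len(probs) >= mean_thr:
        return True
    bursts, run = 0, 0
    for p in probs:
        run = run + 1 if p >= burst_thr else 0
        if run == min_burst_len:
            bursts += 1
    return bursts >= min_bursts
```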

9 pages, 5139 KB  
Proceeding Paper
You Only Look Once v8 Cattle Identification Based on Muzzle Print Pattern Using ORB and Fast Library for Approximate Nearest Neighbor Algorithms
by Allan Josef Balderas, Kaila Mae A. Pangilinan and Meo Vincent C. Caya
Eng. Proc. 2025, 92(1), 53; https://doi.org/10.3390/engproc2025092053 - 7 May 2025
Viewed by 1060
Abstract
Cattle identification is important in livestock management, and advanced techniques are required to identify cattle without ear tagging, branding, or any identification method that harms the cattle. This study aims to develop computer vision techniques to identify cattle based on their unique muzzle print features. The developed method employed the YOLOv8 object detection model to detect the cattle’s muzzle. Following detection, the captured muzzle image underwent image processing. Contrast-limited adaptive histogram equalization (CLAHE) was used to enhance the image quality and obtain a prominent and detailed image of the muzzle print. The feature extraction algorithm Oriented FAST and Rotated BRIEF (ORB) was applied to extract keypoints and compute descriptors that are crucial for the cattle identification process. The Fast Library for Approximate Nearest Neighbors (FLANN) was also employed to identify individual cattle by comparing descriptors of query images with those stored in the database. To validate the developed method, its performance was evaluated on 25 different cattle. In total, 22 out of 25 were correctly identified, resulting in an overall accuracy of 88%.
(This article belongs to the Proceedings of 2024 IEEE 6th Eurasia Conference on IoT, Communication and Engineering)
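A minimal OpenCV sketch of the enhancement-and-matching stages, assuming the YOLOv8 step has already produced a grayscale muzzle crop. The CLAHE, ORB, and FLANN-LSH parameters shown are illustrative defaults, not the paper's settings.

```python
import cv2

def muzzle_descriptors(gray):
    """CLAHE enhancement followed by ORB keypoint/descriptor extraction on a muzzle crop."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    orb = cv2.ORB_create(nfeatures=1000)
    _, descriptors = orb.detectAndCompute(enhanced, None)
    return descriptors

def match_score(query_desc, db_desc, ratio=0.75):
    """FLANN with an LSH index (suited to ORB's binary descriptors) plus a ratio test;
    returns the number of 'good' matches as a similarity score."""
    index_params = dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1)  # FLANN_INDEX_LSH
    flann = cv2.FlannBasedMatcher(index_params, dict(checks=50))
    matches = flann.knnMatch(query_desc, db_desc, k=2)
    good = [m for pair in matches if len(pair) == 2
            for m, n in [pair] if m.distance < ratio * n.distance]
    return len(good)

# Identification (sketch): the database entry with the highest match_score wins,
# ideally above a minimum-match threshold so that unknown animals can be rejected.
```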

8 pages, 3278 KB  
Communication
Spectral Measurements of Muzzle Flash with a Temporally and Spatially Modulated LWIR-Imaging Fourier Transform Spectrometer
by Zhixiong Yang, Kun Li, Chunchao Yu, Mingyao Yuan, Boyang Wang and Jie Feng
Sensors 2023, 23(8), 3862; https://doi.org/10.3390/s23083862 - 10 Apr 2023
Cited by 2 | Viewed by 2085
Abstract
It is important to obtain information on instantaneous targets. A high-speed camera can capture an image of a transient scene, but it cannot retrieve spectral information about the object. Spectrographic analysis is a key tool for identifying chemicals, and detecting dangerous gases quickly can help ensure personal safety. In this paper, a temporally and spatially modulated long-wave infrared (LWIR)-imaging Fourier transform spectrometer was used to realize hyperspectral imaging. The spectral range was 700~1450 cm⁻¹ (7~14.5 μm), and the frame rate of infrared imaging was 200 Hz. The muzzle-flash areas of guns with calibers of 5.56 mm, 7.62 mm, and 14.5 mm were detected. LWIR images of muzzle flash were obtained, and spectral information on muzzle flash was obtained from instantaneous interferograms. The main peak of the muzzle-flash spectrum appeared at 970 cm⁻¹ (10.31 μm), with two secondary peaks near 930 cm⁻¹ (10.75 μm) and 1030 cm⁻¹ (9.71 μm). Radiance and brightness temperature were also measured. The spatiotemporal modulation of the LWIR-imaging Fourier transform spectrometer provides a new method for rapid spectral detection, and the high-speed identification of hazardous gas leakage can help ensure personal safety.
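The abstract quotes spectral positions in both wavenumber and wavelength; the conversion is simply λ[μm] = 10⁴ / ν[cm⁻¹], as in this small check of the quoted peaks.

```python
def wavenumber_to_wavelength_um(nu_cm1: float) -> float:
    """Convert a wavenumber in cm^-1 to a wavelength in micrometres: lambda = 1e4 / nu."""
    return 1e4 / nu_cm1

for nu in (930, 970, 1030):
    print(f"{nu} cm^-1 -> {wavenumber_to_wavelength_um(nu):.2f} um")
# 930 -> 10.75 um, 970 -> 10.31 um, 1030 -> 9.71 um (matching the peaks quoted above)
```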

22 pages, 7614 KB  
Article
Deep Transfer Learning-Based Animal Face Identification Model Empowered with Vision-Based Hybrid Approach
by Munir Ahmad, Sagheer Abbas, Areej Fatima, Ghassan F. Issa, Taher M. Ghazal and Muhammad Adnan Khan
Appl. Sci. 2023, 13(2), 1178; https://doi.org/10.3390/app13021178 - 16 Jan 2023
Cited by 21 | Viewed by 6538
Abstract
The importance of accurate livestock identification for the success of modern livestock industries cannot be overstated, as it is essential for a variety of purposes, including the traceability of animals for food safety, disease control, the prevention of false livestock insurance claims, and breeding programs. Biometric identification technologies, such as thumbprint recognition, facial feature recognition, and retina pattern recognition, have been traditionally used for human identification but are now being explored for animal identification as well. Muzzle patterns, which are unique to each animal, have shown promising results as a primary biometric feature for identification in recent studies. Muzzle pattern image scanning is a widely used method in biometric identification, but there is a need to improve the efficiency of real-time image capture and identification. This study presents a novel identification approach using a state-of-the-art object detector, YOLO (v7), to automate the identification process. The proposed system consists of three stages: detection of the animal’s face and muzzle, extraction of muzzle pattern features using the SIFT algorithm, and identification of the animal using the FLANN algorithm if the extracted features match those previously registered in the system. The YOLO (v7) object detector has a mean average precision of 99.5% and 99.7% for face and muzzle point detection, respectively. The proposed system demonstrates the capability to accurately recognize animals using the FLANN algorithm and has the potential to be used for a range of applications, including animal security and health concerns, as well as livestock insurance. In conclusion, this study presents a promising approach for the real-time identification of livestock animals from muzzle patterns via a combination of automated detection and feature extraction algorithms.
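A minimal sketch of the SIFT-plus-FLANN identification stage, assuming the YOLO (v7) detector has already cropped the muzzle region and that registered descriptors are kept in a simple dictionary. The KD-tree index and the 0.7 ratio threshold are illustrative choices; SIFT's float descriptors use a KD-tree index, unlike the binary ORB descriptors in the previous entry.

```python
import cv2

sift = cv2.SIFT_create()
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5),   # FLANN_INDEX_KDTREE for float SIFT descriptors
                              dict(checks=50))

def identify(query_crop, registry, ratio=0.7):
    """Return the registered animal whose stored descriptors best match the query
    muzzle crop (Lowe's ratio test); registry maps animal_id -> SIFT descriptors."""
    _, q_desc = sift.detectAndCompute(query_crop, None)
    best_id, best_good = None, 0
    for animal_id, db_desc in registry.items():
        matches = flann.knnMatch(q_desc, db_desc, k=2)
        good = sum(1 for pair in matches if len(pair) == 2
                   and pair[0].distance < ratio * pair[1].distance)
        if good > best_good:
            best_id, best_good = animal_id, good
    return best_id, best_good
```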

20 pages, 2021 KB  
Article
Facial Expressions of Horses Using Weighted Multivariate Statistics for Assessment of Subtle Local Pain Induced by Polylactide-Based Polymers Implanted Subcutaneously
by Júlia R. G. Carvalho, Pedro H. E. Trindade, Gabriel Conde, Marina L. Antonioli, Michelli I. G. Funnicelli, Paula P. Dias, Paulo A. Canola, Marcelo A. Chinelatto and Guilherme C. Ferraz
Animals 2022, 12(18), 2400; https://doi.org/10.3390/ani12182400 - 13 Sep 2022
Cited by 12 | Viewed by 3076
Abstract
Facial-expression-based analysis has been widely applied as a pain coding system in horses. Herein, we aimed to identify pain in horses undergoing subcutaneous implantation of polylactide-based polymers. The sham group underwent only the surgical incision. The horses were filmed before and 24 and 48 h after implantation. Five statistical methods for evaluating their facial expressions (FEs) were tested. Primarily, three levels of scores (0, 1, and 2) were applied to the seven FEs (ear movements, eyebrow tension, orbicularis tension, dilated nostrils, eye opening, muzzle tension, and masticatory muscles tension). Subsequently, the scores of the seven FEs were added (SUM). Afterwards, principal component analysis (PCoA) was performed using the scores of the seven FEs obtained using the first method. Subsequently, weights were created for each FE, based on each variable’s contribution to the variability obtained from the PCoA (SUM.W). Lastly, we applied a general score (GFS) to the animal’s face (0 = without pain; 1 = moderate pain; 2 = severe pain). The mechanical nociceptive threshold (MNT) and cutaneous temperature (CT) values were collected at the same moments. The results show no intra- or intergroup differences when evaluating each FE separately or in the GFS. In the intragroup comparison 48 h after implantation, the control group showed higher values for SUM, PCoA, and SUM.W, although the horses implanted with polymers displayed more obvious alterations in the CT and MNT. Our findings show that the five statistical strategies used to analyze the faces of the horses were not able to detect low-grade inflammatory pain.
(This article belongs to the Section Equids)
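A sketch of one plausible reading of the SUM and SUM.W scores, assuming the weights are taken from each facial expression's contribution to the first principal component; the paper's exact weighting scheme is not specified in the abstract, so treat this as illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA

def facial_scores(scores: np.ndarray):
    """scores: (n_observations, 7) matrix of FE scores in {0, 1, 2}.
    SUM  = plain sum of the seven FE scores.
    SUM.W = sum weighted by each variable's (assumed) contribution to the first component."""
    plain_sum = scores.sum(axis=1)
    pca = PCA(n_components=1).fit(scores)
    loadings = np.abs(pca.components_[0])   # contribution of each FE to component 1
    weights = loadings / loadings.sum()     # normalise so the weights sum to 1
    weighted_sum = scores @ weights
    return plain_sum, weighted_sum
```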

13 pages, 4130 KB  
Article
A Fast Identification Method of Gunshot Types Based on Knowledge Distillation
by Jian Li, Jinming Guo, Xiushan Sun, Chuankun Li and Lingpeng Meng
Appl. Sci. 2022, 12(11), 5526; https://doi.org/10.3390/app12115526 - 29 May 2022
Cited by 6 | Viewed by 3141
Abstract
To reduce the large size of gunshot recognition network models and to improve insufficient real-time detection in urban combat, this paper proposes a fast gunshot type recognition method based on knowledge distillation. First, the muzzle blast and the shock wave generated by the gunshot are preprocessed, and the quality of the gunshot recognition dataset is improved using the Log-Mel spectra corresponding to these two signals. Second, a teacher network is constructed using 10 two-dimensional residual modules, and a student network is designed using depthwise separable convolutions. Third, the lightweight student network is made to learn the gunshot features under the guidance of the pre-trained large-scale teacher network. Finally, the network’s accuracy, model size, and recognition time are tested using the AudioSet dataset and the NIJ Grant 2016-DN-BX-0183 gunshot dataset. The findings demonstrate that the proposed algorithm achieved 95.6% and 83.5% accuracy on the two datasets, was 0.5 s faster, and reduced the model size to 2.5 MB. The proposed method is of good practical value in the field of gunshot recognition.
(This article belongs to the Special Issue Computer Vision and Pattern Recognition Based on Deep Learning)
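The distillation step itself typically reduces to a combined loss in which the student matches the teacher's softened outputs while also fitting the hard labels. A minimal PyTorch sketch follows; the temperature and mixing weight are assumptions, not the paper's values.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target knowledge distillation: KL divergence between the temperature-softened
    teacher and student distributions, blended with the usual cross-entropy on hard labels."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```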

6 pages, 353 KB  
Article
Investigation of the Use of Non-Invasive Samples for the Molecular Detection of EHV-1 in Horses with and without Clinical Infection
by Danielle Price, Samantha Barnum, Jenny Mize and Nicola Pusterla
Pathogens 2022, 11(5), 574; https://doi.org/10.3390/pathogens11050574 - 13 May 2022
Cited by 9 | Viewed by 2398
Abstract
The purpose of this study was to explore sampling options for a reliable and logistically more feasible protocol during a large EHV-1 outbreak. Seventeen horses with clinical infection, as well as nineteen healthy herdmates, all part of an EHM outbreak, were enrolled in the study. Each horse was sampled two to four times at intervals of 2–6 days during the outbreak. All samples were collected using 6-inch rayon-tipped swabs. Nasal secretions were used as the diagnostic sample of choice. Additional samples, including swabs from the muzzle/nares, swabs from the front limbs, rectal swabs, swabs of the feed bin, and swabs of the water troughs, were collected as well. All swabs were tested for the presence of EHV-1 by qPCR. With the exception of two EHV-1 qPCR-positive swabs from two different horses, all remaining swabs collected from healthy herdmates tested qPCR-negative for EHV-1. For horses with clinical infection, EHV-1 was detected in 31 nasal swabs, 30 muzzle/nares swabs, 7 front limb swabs, 7 feeders, 6 water troughs, and 6 rectal swabs. Not all positive muzzle/nares swabs correlated with a positive nasal swab from the same set; however, all other positive swabs did correlate with a positive nasal swab in their respective sets. The agreement between nasal swabs and muzzle/nares swabs was 74%. The sampling of non-invasive swabs from the muzzle/nares should facilitate the identification of EHV-1 shedders during an outbreak, allowing for prompt isolation and implementation of biosecurity measures.
(This article belongs to the Section Viral Pathogens)

18 pages, 1229 KB  
Article
Behavioural Classification of Cattle Using Neck-Mounted Accelerometer-Equipped Collars
by Dejan Pavlovic, Mikolaj Czerkawski, Christopher Davison, Oskar Marko, Craig Michie, Robert Atkinson, Vladimir Crnojevic, Ivan Andonovic, Vladimir Rajovic, Goran Kvascev and Christos Tachtatzis
Sensors 2022, 22(6), 2323; https://doi.org/10.3390/s22062323 - 17 Mar 2022
Cited by 19 | Viewed by 5063
Abstract
Monitoring and classification of dairy cattle behaviours is essential for optimising milk yields. Early detection of illness, days before critical conditions occur, together with automatic detection of the onset of oestrus cycles, is crucial for obviating prolonged cattle treatments and improving pregnancy rates. Accelerometer-based sensor systems are becoming increasingly popular, as they automatically provide information about key cattle behaviours such as the level of restlessness and the time spent ruminating and eating, proxy measurements that indicate the onset of heat events and overall welfare at an individual animal level. This paper reports on an approach to the development of algorithms that classify key cattle states based on a systematic dimensionality reduction process through two feature selection techniques. These are based on Mutual Information and Backward Feature Elimination and are applied to knowledge-specific and generic time-series features extracted from raw accelerometer data. The extracted features are then used to train classification models based on a Hidden Markov Model, Linear Discriminant Analysis, and Partial Least Squares Discriminant Analysis. The proposed feature engineering methodology permits model deployment within the computing and memory restrictions imposed by operational settings. The models were based on measurement data from 18 steers, each animal equipped with an accelerometer-based neck-mounted collar and a muzzle-mounted halter, the latter providing the ground-truth data. A total of 42 time-series features were initially extracted, and the trade-off between model performance, computational complexity, and memory footprint was explored. Results show that the classification model that best balances performance and computational complexity is based on Linear Discriminant Analysis using features selected through Backward Feature Elimination. The final model requires 1.83 ± 1.00 ms for feature extraction and 0.05 ± 0.01 ms for inference, with an overall balanced accuracy of 0.83.
(This article belongs to the Collection Sensors and Robotics for Digital Agriculture)
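A minimal scikit-learn sketch of the two feature-selection routes described, applied independently to the 42 extracted time-series features; the subset size k and the final-model choice shown here are assumptions for illustration.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, SequentialFeatureSelector
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def select_features(X, y, k=10, method="mi"):
    """X: (n_windows, 42) accelerometer-derived features; y: behaviour labels.
    'mi'  - keep the k features with the highest mutual information with the label;
    'bfe' - backward feature elimination wrapped around an LDA classifier."""
    if method == "mi":
        return np.argsort(mutual_info_classif(X, y))[::-1][:k]
    lda = LinearDiscriminantAnalysis()
    bfe = SequentialFeatureSelector(lda, n_features_to_select=k, direction="backward").fit(X, y)
    return np.where(bfe.get_support())[0]

# Final model matching the abstract's best trade-off: LDA on the BFE-selected features.
# cols = select_features(X, y, k=10, method="bfe")
# model = LinearDiscriminantAnalysis().fit(X[:, cols], y)
```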

16 pages, 5127 KB  
Article
Automated Muzzle Detection and Biometric Identification via Few-Shot Deep Transfer Learning of Mixed Breed Cattle
by Ali Shojaeipour, Greg Falzon, Paul Kwan, Nooshin Hadavi, Frances C. Cowley and David Paul
Agronomy 2021, 11(11), 2365; https://doi.org/10.3390/agronomy11112365 - 22 Nov 2021
Cited by 60 | Viewed by 13475
Abstract
Livestock welfare and management could be greatly enhanced by the replacement of branding or ear tagging with less invasive visual biometric identification methods. Biometric identification of cattle from muzzle patterns has previously indicated promising results. Significant barriers exist in the translation of these initial findings into a practical precision livestock monitoring system, which can be deployed at scale for large herds. The objective of this study was to investigate and address key limitations to the autonomous biometric identification of cattle. The contributions of this work are fourfold: (1) provision of a large publicly available dataset of cattle face images (300 individual cattle) to facilitate further research in this field, (2) development of a two-stage YOLOv3-ResNet50 algorithm that first detects and extracts the cattle muzzle region in images and then applies deep transfer learning for biometric identification, (3) evaluation of model performance across a range of cattle breeds, and (4) utilizing few-shot learning (five images per individual) to greatly reduce both the data collection requirements and the duration of model training. Results indicated excellent model performance. Muzzle detection accuracy was 99.13% (1024 × 1024 image resolution) and biometric identification achieved 99.11% testing accuracy. Overall, the proposed two-stage YOLOv3-ResNet50 algorithm has substantial potential to form the foundation of a highly accurate automated cattle biometric identification system applicable in livestock farming systems. The obtained results indicate that utilizing livestock biometric monitoring in an advanced manner for resource management at multiple scales of production is possible for future agriculture decision support systems, including providing useful information to forecast acceptable stocking rates of pastures.
(This article belongs to the Special Issue Data-Driven Agricultural Innovations)
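A minimal sketch of the second, identification stage, assuming the YOLOv3 detector already supplies cropped muzzle images. Freezing the pretrained backbone and training only a new classification head is one common way to make five images per animal sufficient, though the paper's exact fine-tuning recipe is not given in the abstract; the learning rate and freezing strategy below are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_muzzle_identifier(num_cattle=300):
    """ImageNet-pretrained ResNet50 with the classification head replaced so that
    each output class corresponds to one individual animal (muzzle crops as input)."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    for p in model.parameters():
        p.requires_grad = False                      # freeze the backbone for few-shot training
    model.fc = nn.Linear(model.fc.in_features, num_cattle)  # new head stays trainable
    return model

model = build_muzzle_identifier()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```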

13 pages, 2800 KB  
Article
Emotion Recognition in Horses with Convolutional Neural Networks
by Luis A. Corujo, Emily Kieson, Timo Schloesser and Peter A. Gloor
Future Internet 2021, 13(10), 250; https://doi.org/10.3390/fi13100250 - 28 Sep 2021
Cited by 22 | Viewed by 5346
Abstract
Creating intelligent systems capable of recognizing emotions is a difficult task, especially when looking at emotions in animals. This paper describes the process of designing a “proof of concept” system to recognize emotions in horses. This system is formed by two elements: a detector and a model. The detector is a fast region-based convolutional neural network that detects horses in an image. The model is a convolutional neural network that predicts the emotions of those horses. These two elements were trained with multiple images of horses until they achieved high accuracy in their tasks. In total, 400 images of horses were collected and labeled to train both the detector and the model, while 40 were used to test the system. Once the two components were validated, they were combined into a testable system that would detect equine emotions based on established behavioral ethograms indicating emotional affect through the head, neck, ear, muzzle, and eye position. The system showed an accuracy of 80% on the validation set and 65% on the test set, demonstrating that it is possible to predict emotions in animals using autonomous intelligent systems. Such a system has multiple applications, including further studies in the growing field of animal emotions as well as in the veterinary field to determine the physical welfare of horses or other livestock.
(This article belongs to the Section Big Data and Augmented Intelligence)
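A minimal sketch of the detector half of such a detect-then-classify system, using torchvision's COCO-pretrained Faster R-CNN (which already includes a 'horse' category). The authors trained their own detector on labeled horse images, so this only illustrates the overall structure; the score threshold is an assumption.

```python
import torch
from torchvision import models
from torchvision.transforms.functional import to_tensor

detector = models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def horse_boxes(image, score_thr=0.8, horse_label=19):
    """Run the pretrained COCO detector and keep confident 'horse' boxes
    (label 19 in the COCO category indexing used by torchvision)."""
    with torch.no_grad():
        out = detector([to_tensor(image)])[0]
    keep = (out["scores"] > score_thr) & (out["labels"] == horse_label)
    return out["boxes"][keep]

# Each returned box would then be cropped and passed to a small CNN that maps
# head/ear/muzzle/eye appearance to an emotion class (second stage, not shown).
```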

18 pages, 1526 KB  
Article
Classification of Cattle Behaviours Using Neck-Mounted Accelerometer-Equipped Collars and Convolutional Neural Networks
by Dejan Pavlovic, Christopher Davison, Andrew Hamilton, Oskar Marko, Robert Atkinson, Craig Michie, Vladimir Crnojević, Ivan Andonovic, Xavier Bellekens and Christos Tachtatzis
Sensors 2021, 21(12), 4050; https://doi.org/10.3390/s21124050 - 12 Jun 2021
Cited by 48 | Viewed by 6916
Abstract
Monitoring cattle behaviour is core to the early detection of health and welfare issues and to optimising the fertility of large herds. Accelerometer-based sensor systems that provide activity profiles are now used extensively on commercial farms and have evolved to identify behaviours such as the time spent ruminating and eating at an individual animal level. Acquiring this information at scale is central to informing on-farm management decisions. The paper presents the development of a Convolutional Neural Network (CNN) that classifies cattle behavioural states (‘rumination’, ‘eating’ and ‘other’) using data generated from neck-mounted accelerometer collars. During three farm trials in the United Kingdom (Easter Howgate Farm, Edinburgh, UK), 18 steers were monitored to provide raw acceleration measurements, with ground truth data provided by muzzle-mounted pressure sensor halters. A range of neural network architectures is explored and rigorous hyper-parameter searches are performed to optimise the network. The computational complexity and memory footprint of CNN models are not readily compatible with deployment on low-power processors, which are both memory- and energy-constrained. Thus, progressive reductions of the CNN were executed with minimal loss of performance in order to address the practical implementation challenges, defining the trade-off between model performance and computational complexity and memory footprint to permit deployment on micro-controller architectures. The proposed methodology achieves a compression factor of 14.30 compared to the unpruned architecture but is nevertheless able to accurately classify cattle behaviours, with an overall F1 score of 0.82 for both FP32 and FP16 precision, while achieving a reasonable battery lifetime in excess of 5.7 years.
(This article belongs to the Section Internet of Things)
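A minimal PyTorch sketch of the kind of post-training reduction the abstract alludes to, combining magnitude pruning with an FP16 cast; the pruning amount and one-shot schedule here are illustrative, not the paper's progressive procedure.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

def compress(model: nn.Module, amount=0.9):
    """Remove the smallest-magnitude weights from conv/linear layers, then cast to FP16.
    The abstract reports ~14x compression with F1 ~0.82 at both FP32 and FP16, so the
    pruning fraction chosen here is purely illustrative."""
    for module in model.modules():
        if isinstance(module, (nn.Conv1d, nn.Conv2d, nn.Linear)):
            prune.l1_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")   # make the pruning permanent
    return model.half()                      # FP16 weights for the micro-controller target
```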

16 pages, 791 KB  
Article
An Artificial Vibrissa-Like Sensor for Detection of Flows
by Moritz Scharff, Philipp Schorr, Tatiana Becker, Christian Resagk, Jorge H. Alencastre Miranda and Carsten Behn
Sensors 2019, 19(18), 3892; https://doi.org/10.3390/s19183892 - 10 Sep 2019
Cited by 7 | Viewed by 3947
Abstract
In nature, there are several examples of sophisticated sensory systems for sensing flows, e.g., the vibrissae of mammals. Seals can detect the flow generated by their prey, and rats are able to perceive the flow of the surrounding air. The vibrissae are arranged around the muzzle of an animal. A vibrissa consists of two major components: a shaft (infector) and a follicle–sinus complex (receptor), whereby the base of the shaft is supported by the follicle–sinus complex. The vibrissa shaft collects and transmits stimuli, e.g., flows, while the follicle–sinus complex transduces them for further processing. Besides detecting flows, the animals can also recognize the size of an object or determine its surface texture. Here, the combination of these functionalities in a single sensory system serves as a paragon for artificial tactile sensors. The detection of flows becomes important with regard to the measurement of flow characteristics, e.g., velocity, as well as the influence of the flow on the sensor during the scanning of objects. These aspects are closely related to each other, but how can the characteristics of a flow be represented by the signals at the base of a vibrissa shaft, or of an artificial vibrissa-like sensor, respectively? In this work, the structure of a natural vibrissa shaft is simplified to a slender, cylindrical/tapered elastic beam. The model is analyzed in simulation and experiment in order to identify the observables necessary to evaluate flows based on the quasi-static large deflection of the sensor shaft inside a steady, non-uniform, laminar, incompressible flow.
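For reference, the quasi-static large-deflection (elastica) description of a clamped slender beam that such an analysis typically rests on can be written as

$$EI(s)\,\frac{d\varphi}{ds} = M(s), \qquad \frac{dx}{ds} = \cos\varphi, \qquad \frac{dy}{ds} = \sin\varphi,$$

where φ(s) is the slope along the arc length s, EI(s) is the (possibly tapered) bending stiffness, and M(s) is the bending moment produced by the distributed drag load acting on the part of the shaft beyond s, with φ fixed at the clamped base and M = 0 at the free tip. This is the standard form such beam models take, not necessarily the authors' exact formulation.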