Article

Automated Observations of Dogs’ Resting Behaviour Patterns Using Artificial Intelligence and Their Similarity to Behavioural Observations

by
Ivana Schork
1,
Anna Zamansky
2,
Nareed Farhat
2,
Cristiano Schetini de Azevedo
3 and
Robert John Young
1,*
1
School of Sciences, Engineering & Environment, University of Salford, Manchester M5 4WT, UK
2
Information Systems Department, University of Haifa, Haifa 31905, Israel
3
Department of Evolution, Biodiversity and Environment, Institute of Exact and Biological Sciences, Federal University of Ouro Preto, Ouro Preto 35402-136, Brazil
*
Author to whom correspondence should be addressed.
Animals 2024, 14(7), 1109; https://doi.org/10.3390/ani14071109
Submission received: 12 February 2024 / Revised: 25 March 2024 / Accepted: 3 April 2024 / Published: 4 April 2024
(This article belongs to the Section Animal Welfare)

Simple Summary

Our research team has developed an automated computer system that uses convolutional neural networks (CNNs) to monitor and analyse the sleep patterns of dogs. Traditional methods of recording animal behaviour, such as direct observation of either live or video-recorded behaviour, can be time-consuming and error-prone, making studies difficult to replicate. Sleep may be a crucial indicator of an animal’s well-being, but it has been overlooked in animal welfare research because measuring it is so time-consuming. Compared to direct behavioural observations of the same videos, our system achieved an 89% similarity score in automatically detecting and quantifying sleep duration and fragmentation in dogs. Although there was no significant difference in the percentage of time observed asleep, the system recorded more total sleep time than human observers making direct observations from the same footage. The automated system could become a valuable tool for animal behaviour and welfare research.

Abstract

Although direct behavioural observations are widely used, they are time-consuming, prone to error, require knowledge of the observed species, and depend on intra-/inter-observer consistency. As a result, they pose challenges to the reliability and repeatability of studies. Automated video analysis is becoming popular for behavioural observations. Sleep is a biological metric with the potential to become a reliable, broad-spectrum indicator of quality of life, and understanding sleep patterns can contribute to identifying and addressing potential welfare concerns, such as stress, discomfort, or health issues, thus promoting the overall welfare of animals; however, due to the laborious process of quantifying sleep patterns, sleep has been overlooked in animal welfare research. This study presents a system comparing convolutional neural networks (CNNs) with direct behavioural observation methods on the same data to detect and quantify dogs’ sleeping patterns. A total of 13,688 videos were used to develop and train the model to quantify sleep duration and sleep fragmentation in dogs. To evaluate its similarity to the direct behavioural observations made by a single human observer, 6000 previously unseen frames were used. The system correctly classified 5340 of these frames, a similarity rate of 89% when compared to the manually recorded observations. There was no significant difference in the percentage of time observed asleep between the system and the human observer (p > 0.05). However, a significant difference was found in the total sleep time recorded, with the automated system capturing more hours than the observer (p < 0.05). This highlights the potential of using a CNN-based system in animal welfare and behaviour research.

1. Introduction

The study of animal welfare is often carried out through the measurement of animal behaviour. This is because an animal’s behaviour (i.e., welfare output) directly corresponds to its environmental conditions (i.e., welfare inputs) and the attempts of individuals to adapt [1]. Furthermore, behavioural observations are frequently preferred over other methods, such as physiological measures, due to their non-invasive nature and lower probability of interfering with individuals’ responses [2,3].
Despite being a simple method, behavioural observations (assessments) are not without limitations. First, observing behaviour demands time from a human observer to quantify the behaviour. Second, a basic knowledge of the species’ behaviour is necessary to answer specific questions about welfare. Third, results are highly dependent on the reliability of scoring the same behaviours consistently over time, which demands the training of human observers to ensure inter-observer reliability if multiple observers are used. Lastly, not all animal-holding institutions allow researchers to observe animals outside working/daytime hours, which causes a loss of important information over time; for example, zoo studies are biased towards daylight hours [3,4,5].
With advances in technology, it is possible to try to mitigate some of the problems of direct behavioural observations; for example, video monitoring of animals in different environments is used as an alternative to direct observations, but images still need to be quantified by a human observer [6,7]. Additionally, software has been developed to help score behaviour from videos to expedite the processing of images (e.g., [8]). Despite this, human-observed measurements of animal behaviour, even when computer-assisted, remain slow, labour-intensive, and prone to errors [4,6,9]. Such tedious and lengthy processes reduce the number of experiments that can be conducted, reduce the opportunity to work with larger sample sizes, and can limit the statistical power of the results [6,9].
Recent advancements in computer vision and deep learning could lead to the development of automated tracking and behaviour analysis systems that could revolutionise how behavioural variables are recorded. Examples of how these automated systems can help handle larger data sets can be found in the field of Ecology and Conservation. Using camera traps is a well-known and cost-effective methodology for monitoring populations without interference [10]. However, they also generate vast amounts of data from the photos and videos acquired during sampling [11,12]. Deep learning models have been used in all stages of data processing, from the system classification of photos with and without individuals to the categorisation of behavioural repertoires, thus providing an efficient tool for analysing large-scale camera-trap data [11,12,13].
Therefore, automated videos can increase scoring accuracy, replicability, and the number and nature of measured variables. This allows for the generation of larger data sets and more significant sample sizes, which increases statistical power [14,15,16]. These are the aims of the emergent field of computational behaviour analysis, also called computational ethology [6].
Automated video systems to record animal behaviour already exist for wild animals [17], farm animals [18], laboratory rodents [19], insects [20], and fish [21], and there are even well-developed commercial systems such as Ethovision [22]. Furthermore, in the field of animal welfare, such systems have been used to monitor pregnant cows before calving [23], aggression in pigs [24], and the activity of broiler chickens with different gait scores [25].
However, among the behaviours that automation has yet to explore is sleep, a well-researched indicator of humans’ good health and well-being [26], which has been mostly overlooked in animal welfare research.
The decades-long comprehensive study of the sleep/wake cycle, using both human and several non-human models, led to the conclusion that sleep is not a simple resting state but an essential physiological process that mediates individuals’ physiological and psychological functions and is an intrinsic part of the homeostatic process [27,28]. Moreover, the environment and events experienced during waking hours can affect the quantity and quality of sleep, and stress remains the main factor impairing sleep in both human and non-human animals [29,30,31,32]. As stressors are pervasive in captivity [33], understanding sleep characteristics, especially sleep changes, is relevant to the health and well-being of animals under human care.
Still, the use of sleep as a metric is limited, most likely due to the difficulties in measuring such behaviour [34]. Not only is sleep behaviour challenging to measure due to the time-consuming and intensive nature of observations, but sleep has also been believed to provide accurate information about an animal’s biology only if assessed using EEGs.
Nonetheless, previous studies have attempted to quantify sleep and rest behaviour through video monitoring and this has proven to be an effective non-invasive technique. Some studies have scored over 90% confidence between observations and EEG, demonstrating the reliability of this method (e.g., [35,36]). Even though video monitoring is based on human observations, it still shows that measuring sleep using a video-based methodology is possible. Additionally, video monitoring is invaluable for such studies because cameras provide a spatial and temporal metric that can be used to assess most aspects of animals’ behaviour without interfering with the individuals [6,21].
Sleep also has desirable characteristics that may facilitate its use as a target behaviour for automated monitoring. Despite mammal species differing in some of the characteristics of their sleep patterns, such as the number of bouts and time of day, sleep still fulfils the same biological purposes. All species follow comparable sleep cycles, which commence with slow-wave sleep, followed by REM sleep (Rapid Eye Movement), and then wakefulness [37,38]. Moreover, virtually all mammals (except some marine mammals [39]) are immobile when sleeping or have sleep postures which can be linked to specific phases of their sleep cycle (e.g., horses [40] and cows [41]). Therefore, facilitating the assessment of sleep behaviour using automation should provide a valuable tool for animal welfare assessment.
In recent years, research has demonstrated that behavioural sleep information can provide valuable insights into an individual’s welfare (e.g., dogs [42], horses [40], giraffes [43], chickens [44]). Dogs, in particular, can be useful as models to study sleep welfare due to the unique relationship developed through coevolution with humans; dogs present certain cognitive and behavioural traits that enable them to have similar responses to the environment and towards other individuals, much more like humans than any other existent species [45]. Hence, they also make valuable models when studying complex subjects such as brain development, cognition, and sleep disorders (e.g., narcolepsy) [46,47]. Moreover, several characteristics make the domestic dog an ideal model species for animal welfare studies. First, they have well-known physiology and behaviour, including sleep parameters [48]. Second, dogs are accessible in large sample sizes and easily trained [49]. Third, pet dogs coexist with humans, which means they can provide information on how they cope with a world designed for humans [45,50,51].
Studies aiming to automate the investigation of dog behaviour have relied on wearable technology for pets to measure dog behaviour [52]. While these devices can measure activity and sleeping patterns, scientific validation is often lacking [53,54]. Furthermore, using this technology in clinical or scientific settings is not always appropriate.
Similarly, although some of these sensor-based activity trackers have achieved good accuracy [53,55,56], they are limited to a small number of behaviours (e.g., resting) and postures (e.g., lying down, sitting, etc.), which compromises their use in animal welfare assessment.
Despite being a well-studied species, only a few studies address the automatic video-based analysis of dog behaviour [14,57,58,59]. These studies automatically tracked individuals and detected dogs’ body parts using machine learning classifiers. However, the experiments used videos taken from 3D Microsoft Kinect cameras (Microsoft, Redmond, WA, USA) or street (security) surveillance systems, whose installation is not trivial and whose devices are expensive.
In this study, we present a system that compares convolutional neural networks (CNNs) with direct human behavioural observation methods to detect and quantify dogs’ sleeping patterns using the same dataset. A CNN is a deep learning algorithm trained to classify behaviours directly from images. These networks learn how to use patterns to identify objects, faces, and scenes from image data [60]. Unlike previous studies using automatic video analysis, requiring specialised equipment, this system was designed to work on video footage obtained from low-quality, cheap, readily available cameras. It also has a user-friendly interface that produces a summary output of the variables measured, which removes the need for any advanced knowledge from the user to be able to use the system.

2. Materials and Methods

2.1. Ethical Statement

This study underwent review and approval by two ethics panels: the Science and Technology Research Ethics Panel at the University of Salford, Manchester (STR1617-80), and the Commission of Ethical Use of Animals in Research at the Universidade Federal de Ouro Preto, Minas Gerais, Brazil (Protocol 2017/04).

2.2. Video Acquisition of Dogs

For the study, cameras were installed in the kennel facilities of the University of Ouro Preto, Minas Gerais, Brazil. Thirteen mixed-breed adult dogs were observed during eight months in five-day recording periods, totalling 130 nights of recordings and 13,668 videos captured by the cameras. These were then used to develop the system at the Tech4Animals lab at the University of Haifa, Israel.
The videos were recorded on a domestic CCTV system (Swann SWDVK-845504, Santa Fe Springs, CA, USA) with night vision capability. The cameras were able to capture videos in two modes: full-colour mode, when the sun or a lamp illuminates the space, and grey-scale mode, during the night (dark) or with very low light levels, which triggers the infrared light and automatically switches the camera to night vision. Despite the camera’s HD resolution (1280 × 720), the video footage is technically considered low quality.

2.3. BlyzerDS System Overview

The BlyzerDS (Behaviour Analyzer—Dog Sleep) system is an extension of a previously developed system for the automatic tracking of dog movement [57], adapted specifically for the needs of this project. The system takes raw digital video footage as input and generates a summary of sleep parameters for that video: the amount of sleep of each dog in the footage and the number of sleeping bouts. The video is transmitted to the system server and processed frame by frame, with the neural network performing two main tasks: marking the position of the dog or dogs and classifying their state as awake or asleep. The analysis is divided into two stages. In stage 1, the system application sends the raw digital video frame by frame to the server. The server application uses a ResNet neural network [61], trained beforehand to detect dogs in a frame, which outputs a bounding box around them (as shown in Figure 1). The frames containing the bounding box are then sent back to the system application to be processed in the second stage.
In stage 2, the system detects movement in a series of frames. If no movement is detected, the dog is classified as asleep. The following steps are used to detect movement:
  • The content of the bounding box is converted into black-and-white images.
  • The image is blurred.
  • The change (delta) between consecutive frames is calculated.
  • The computed delta is binarized with a threshold.
  • The binarized image is dilated to fill in the gaps.
  • Contours are detected, and their area is computed.
After these steps, the system scores the dog as asleep or awake and returns a summary of sleep parameters for that video. Detailed information on the system architecture, including pre- and post-processing training algorithms, can be found in [62].

2.4. Training Data Set and System Evaluation

The system was trained using 80,000 frames extracted from the data set. The developers manually reviewed the output to ensure each frame contained two attributes: a bounding box surrounding each identifiable dog and the dog’s state as awake or asleep. Any frames with unclear images of dogs, or with no dogs, were discarded from the analysis.
The system’s accuracy was evaluated using ten videos of 600 s each. The video set included videos with 0–2 dogs, day/night, and different dogs and kennels. The system processed the videos, and a testing set of 6000 frames annotated with the system’s predictions was manually checked for correctness by the developers.
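This evaluation amounts to a per-frame agreement rate between the system's predictions and the developers' manual check. A minimal sketch (the function name is ours, not from the paper):

```python
def agreement_rate(predicted: list, verified: list) -> float:
    """Fraction of frames on which the system's prediction matched
    the manually verified label. The sequences must align frame by
    frame."""
    if len(predicted) != len(verified):
        raise ValueError("sequences must align frame by frame")
    correct = sum(p == v for p, v in zip(predicted, verified))
    return correct / len(predicted)
```

With the study's numbers, 5340 matching frames out of 6000 give an agreement rate of 0.89 (89%).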

2.5. Similarity of the System against Standard Behavioural Observations

To compare the system’s similarity to the manual recordings of behaviour, 15 random nights were selected to be evaluated by the system and a human observer. Behavioural observations were carried out using focal sampling with continuous recordings of behaviour [63]. Sleep duration was recorded at any time the animal was in a resting position, with eyes closed, and/or with no perceivable movement. Additionally, the number of sleeping bouts was recorded for the observed period. A sleeping bout was determined as the shifting between wakefulness, sleeping, and wakefulness, regardless of sleep duration. Data were summarised for the observed periods as the sleeping duration in seconds and the number of observed sleeping bouts. These are the metrics that were also summarised by the system output, and which were used to compare the similarity of methods. Behavioural data were scored in the software Boris v.7.0.12 (Behavioural Observation Research Interactive Software) [8].
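The two summary metrics above, total sleep duration and number of sleeping bouts, can be derived from a chronological sequence of awake/asleep states. The sketch below assumes one state label per sampled second; the function name and state labels are ours, for illustration only.

```python
def summarise_sleep(states: list, seconds_per_sample: float = 1.0):
    """Summarise a chronological sequence of 'asleep'/'awake' states.

    Returns (total_sleep_seconds, number_of_bouts). Following the
    definition above, a bout is counted on each transition from
    wakefulness into sleep, regardless of how long the sleep lasts.
    """
    total_sleep = sum(s == "asleep" for s in states) * seconds_per_sample
    bouts = 0
    previous = "awake"
    for state in states:
        if state == "asleep" and previous == "awake":
            bouts += 1
        previous = state
    return total_sleep, bouts
```

For example, the sequence awake, asleep, asleep, awake, asleep contains three seconds of sleep split across two bouts.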

2.6. Data Analysis

Summary data from the automated system and from the direct behavioural observations (summarised identically) were tested for normality using Anderson–Darling tests. All statistical tests were considered significant at p < 0.05. Results are presented as either total measured time or count totals. The system’s similarity to the human-recorded observations of the same observation sessions was tested using paired t-tests [64]. Statistical analyses were carried out using SPSS 27.0 [65].
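The analyses were run in SPSS; an equivalent workflow in Python with SciPy would look like the following. The nightly totals here are made up for illustration (eleven nights, matching the study's final sample size), not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical nightly sleep totals in hours (NOT the study's raw data)
system_hours = np.array(
    [10.2, 11.5, 9.8, 12.0, 10.6, 11.1, 10.9, 12.3, 9.5, 10.8, 11.4])
observer_hours = np.array(
    [9.5, 10.1, 9.0, 10.8, 9.6, 9.9, 9.7, 10.9, 8.8, 9.6, 10.2])

# Anderson-Darling test for normality of the paired differences
ad_result = stats.anderson(system_hours - observer_hours, dist="norm")

# Paired t-test comparing the two methods on the same nights
t_stat, p_value = stats.ttest_rel(system_hours, observer_hours)
```

With paired data like this, `ttest_rel` tests whether the mean per-night difference between methods is zero, which is the comparison reported in the Results.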

3. Results

During the evaluation of the system using the testing data set, the system correctly classified 5340 of the 6000 frames tested, scoring 89% accuracy. Of the 15 days submitted, three had to be excluded from the final analysis because poor weather conditions caused the cameras to move and the lights to switch on and off, leading to an inaccurate analysis. An additional day was excluded due to the loss of three hours of footage, which prevented comparison between the methods.
The automated system recorded an average of 10.9 ± 2.2 h of sleep per night, against 9.7 ± 1.6 h recorded manually by the human observer. Moreover, the system found an average of 15 ± 5 bouts per night, while the human observations returned 16 ± 3.5 bouts per night. The differences between the methods ranged from 0.13% to 2.68%, with a mean difference of 0.88% (Table 1).
There was a significant difference between the computer system and the manual recordings for the duration of sleep behaviour in seconds (t = 2.805, df = 10, p = 0.019). However, when the duration was converted to a percentage of time spent asleep versus awake, no statistical difference was found between the two methods of observation (p > 0.05). Similarly, no difference was found for the number of bouts recorded (p > 0.05).

4. Discussion

In this study, we have demonstrated that automating the monitoring of behaviours is possible and reliable. It, therefore, offers a practical solution for mitigating common problems associated with measuring animal behaviour.
Despite observing a significant difference in sleep duration, the system showed a high similarity score when classifying sleep behaviour. It was just as precise as a human when recording the number of bouts and percentage of time spent asleep. Additionally, the system was able to record more sleep than manual observations in some cases, which highlights the problems associated with manually going through thousands of hours of recordings. Human observers typically take up to three times longer than the actual video duration to score, which can cause fatigue, leading to measurement inconsistencies over time [6,9].
Using a system based on recent advances in artificial intelligence (AI), it is possible to optimise data collection, increasing precision, replicability, and experimental throughput [7,14,16]. Furthermore, these systems could become a valuable instrument for monitoring behaviours that are laborious to assess, such as resting [15,16]. As AI continues to learn while it is being used, it could come to detect patterns of small changes in activity (e.g., tiny twitches) or changes in body posture that are associated with shifts between sleep phases (i.e., from non-REM to REM). In humans, systems integrating remote EEG detection with AI prediction detect sleep stages accurately (range 76–96% [66,67]). Additionally, AI algorithms can be developed to integrate data from multiple sources, such as sleep posture, heart rate variability, and electrocardiogram (ECG) signals, to provide a more comprehensive understanding of sleep stages and their physiological correlates [67,68]. In non-human species, investigations using remote EEG and sleep postures have been attempted in cows; however, posture was not a good predictor of light and deep NREM sleep, indicating that further research and technological development are still necessary to improve results [69].
The downside of using behavioural observations in our sleep research was the extensive hours of video monitoring required, for which the BlyzerDS system provided an efficient and accurate solution. The system is still being developed to make it more accurate and precise in measuring sleep parameters. The problems associated with image processing were mostly related to image quality and ‘noise’ in the footage. For example, as the cameras are exposed to the weather, on windy days they move significantly, blurring the images and leading to inaccurate measurements. The poor lighting conditions at the kennels also made it challenging to identify the dogs when they were in darker corners. These conditions can be mitigated by adjusting the CNN algorithm to compensate for movement and light, although this would require several months of further work.
CNN-based systems such as the BlyzerDS system, with further improvements and new training data sets, can be modified to record more categories of behaviour and the behaviour of other species. This could lead to a system that can be universally used by people working with animals in many different environments.

5. Conclusions

The use of CNN-based systems such as the BlyzerDS system can potentially improve how we conduct research into animal behaviour. Even in its initial stages, the system showed good precision when evaluating the sleep behaviour of dogs. Developing an autonomous system for behaviour analysis may help mitigate common problems associated with processing large video data sets manually: an extremely time-consuming, tedious, and error-prone task.

Author Contributions

Conceptualization, I.S., A.Z., C.S.d.A. and R.J.Y.; methodology, I.S., A.Z., C.S.d.A. and R.J.Y.; software, A.Z. and N.F.; validation, A.Z. and N.F.; formal analysis, I.S., A.Z. and N.F.; data curation, I.S., A.Z. and C.S.d.A.; writing—original draft preparation, I.S., C.S.d.A. and R.J.Y.; writing—review and editing, all authors; supervision, C.S.d.A. and R.J.Y.; project administration, I.S. and A.Z.; funding acquisition, I.S. and A.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by CNPq, grant no. 202351/2015-7.

Institutional Review Board Statement

This study was submitted to and approved by the Science & Technology Research Ethics Panel of the University of Salford Manchester (STR1617-80) and by the Commission of Ethical Use of Animals in Research of the Universidade Federal de Ouro Preto, Minas Gerais–Brazil (Protocol 2017/04).

Informed Consent Statement

Not applicable.

Data Availability Statement

Raw behavioural data used in the analysis are available online (Mendeley Data, V1, https://doi.org/10.17632/vfd7m2x38k.1, accessed on 1 June 2018).

Acknowledgments

The authors would like to express their gratitude to Hugo Costa, DVM, Chief Veterinarian of the CCA Kennels, and his staff for providing constant support during data collection. They also extend their thanks to Aleksandr Sintica and Dmitry Kaplun for their support in the early stages of the system development. Additionally, I.S. thanks CNPq for providing scholarship funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Fraser, D.; Weary, D.M.; Pajor, E.A.; Milligan, B.N. A Scientific Conception of Animal Welfare That Reflects Ethical Concerns. Anim. Welf. 1997, 6, 187–205. [Google Scholar] [CrossRef]
  2. Hill, S.P.; Broom, D.M. Measuring Zoo Animal Welfare: Theory and Practice. Zoo Biol. 2009, 28, 531–544. [Google Scholar] [CrossRef]
  3. Mason, G.; Mendl, M. Why Is There No Simple Way of Measuring Animal Welfare? Anim. Welf. 1993, 2, 301–319. [Google Scholar] [CrossRef]
  4. Fonio, E.; Golani, I.; Benjamini, Y. Measuring Behavior of Animal Models: Faults and Remedies. Nat. Methods 2012, 9, 1167–1170. [Google Scholar] [CrossRef]
  5. Levitis, D.A.; Lidicker, W.Z.; Freund, G. Behavioural Biologists Do Not Agree on What Constitutes Behaviour. Anim. Behav. 2009, 78, 103–110. [Google Scholar] [CrossRef]
  6. Anderson, D.J.; Perona, P. Toward a Science of Computational Ethology. Neuron 2014, 84, 18–31. [Google Scholar] [CrossRef]
  7. Egnor, S.E.R.; Branson, K. Computational Analysis of Behavior. Annu. Rev. Neurosci. 2016, 39, 217–236. [Google Scholar] [CrossRef]
  8. Friard, O.; Gamba, M. BORIS: A Free, Versatile Open-Source Event-Logging Software for Video/Audio Coding and Live Observations. Methods Ecol. Evol. 2016, 7, 1325–1330. [Google Scholar] [CrossRef]
  9. Button, K.S.; Ioannidis, J.P.A.; Mokrysz, C.; Nosek, B.A.; Flint, J.; Robinson, E.S.J.; Munafò, M.R. Power Failure: Why Small Sample Size Undermines the Reliability of Neuroscience. Nat. Rev. Neurosci. 2013, 14, 365–376. [Google Scholar] [CrossRef]
  10. Burton, A.C.; Neilson, E.; Moreira, D.; Ladle, A.; Steenweg, R.; Fisher, J.T.; Bayne, E.; Boutin, S. Wildlife Camera Trapping: A Review and Recommendations for Linking Surveys to Ecological Processes. J. Appl. Ecol. 2015, 52, 675–685. [Google Scholar] [CrossRef]
  11. Vélez, J.; McShea, W.; Shamon, H.; Castiblanco-Camacho, P.J.; Tabak, M.A.; Chalmers, C.; Fergus, P.; Fieberg, J. An Evaluation of Platforms for Processing Camera-trap Data Using Artificial Intelligence. Methods Ecol. Evol. 2023, 14, 459–477. [Google Scholar] [CrossRef]
  12. Leorna, S.; Brinkman, T. Human vs. Machine: Detecting Wildlife in Camera Trap Images. Ecol. Inform. 2022, 72, 101876. [Google Scholar] [CrossRef]
  13. Lu, W.; Zhao, Y.; Wang, J.; Zheng, Z.; Feng, L.; Tang, J. MammalClub: An Annotated Wild Mammal Dataset for Species Recognition, Individual Identification, and Behavior Recognition. Electronics 2023, 12, 4506. [Google Scholar] [CrossRef]
  14. Barnard, S.; Calderara, S.; Pistocchi, S.; Cucchiara, R.; Podaliri-Vulpiani, M.; Messori, S.; Ferri, N. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour. PLoS ONE 2016, 11, e0158748. [Google Scholar] [CrossRef] [PubMed]
  15. Pons, P.; Jaen, J.; Catala, A. Assessing Machine Learning Classifiers for the Detection of Animals’ Behavior Using Depth-Based Tracking. Expert Syst. Appl. 2017, 86, 235–246. [Google Scholar] [CrossRef]
  16. Valletta, J.J.; Torney, C.; Kings, M.; Thornton, A.; Madden, J. Applications of Machine Learning in Animal Behaviour Studies. Anim. Behav. 2017, 124, 203–220. [Google Scholar] [CrossRef]
  17. Gomez Villa, A.; Salazar, A.; Vargas, F. Towards Automatic Wild Animal Monitoring: Identification of Animal Species in Camera-Trap Images Using Very Deep Convolutional Neural Networks. Ecol. Inform. 2017, 41, 24–32. [Google Scholar] [CrossRef]
  18. Rushen, J.; Chapinal, N.; De Passillé, A.M. Automated Monitoring of Behavioural-Based Animal Welfare Indicators. Anim. Welf. 2012, 21, 339–350. [Google Scholar] [CrossRef]
  19. Van De Weerd, H.A.; Bulthuis, R.J.A.; Bergman, A.F.; Schlingmann, F.; Tolboom, J.; Van Loo, P.L.P.; Remie, R.; Baumans, V.; Van Zutphen, L.F.M. Validation of a New System for the Automatic Registration of Behaviour in Mice and Rats. Behav. Process. 2001, 53, 11–20. [Google Scholar] [CrossRef]
  20. Noldus, L.P.J.J.; Spink, A.J.; Tegelenbosch, R.A.J. Computerised Video Tracking, Movement Analysis and Behaviour Recognition in Insects. Comput. Electron. Agric. 2002, 35, 201–227. [Google Scholar] [CrossRef]
  21. Fontaine, E.; Lentink, D.; Kranenbarg, S.; Müller, U.K.; Van Leeuwen, J.L.; Barr, A.H.; Burdick, J.W. Automated Visual Tracking for Studying the Ontogeny of Zebrafish Swimming. J. Exp. Biol. 2008, 211, 1305–1316. [Google Scholar] [CrossRef]
  22. Noldus, L.P.J.J.; Spink, A.J.; Tegelenbosch, R.A.J. Ethovision Video Tracking System. Behav. Res. Methods Instrum. Comput. 2001, 33, 398–414. [Google Scholar] [CrossRef]
  23. Cangar, Ö.; Leroy, T.; Guarino, M.; Vranken, E.; Fallon, R.; Lenehan, J.; Mee, J.; Berckmans, D. Automatic Real-Time Monitoring of Locomotion and Posture Behaviour of Pregnant Cows Prior to Calving Using Online Image Analysis. Comput. Electron. Agric. 2008, 64, 53–60. [Google Scholar] [CrossRef]
  24. Oczak, M.; Ismayilova, G.; Costa, A.; Viazzi, S.; Sonoda, L.T.; Fels, M.; Bahr, C.; Hartung, J.; Guarino, M.; Berckmans, D.; et al. Analysis of Aggressive Behaviours of Pigs by Automatic Video Recordings. Comput. Electron. Agric. 2013, 99, 209–217. [Google Scholar] [CrossRef]
  25. Dawkins, M.S.; Cain, R.; Roberts, S.J. Optical Flow, Flock Behaviour and Chicken Welfare. Anim. Behav. 2012, 84, 219–223. [Google Scholar] [CrossRef]
  26. Luyster, F.S.; Strollo, P.J.; Zee, P.C.; Walsh, J.K. Sleep: A Health Imperative. Sleep 2012, 35, 727–734. [Google Scholar] [CrossRef]
  27. Siegel, J.M. Sleep in Animals: A State of Adaptive Inactivity. In Principles and Practice of Sleep Medicine; Kryger, M.H., Dement, W.C., Roth, T., Eds.; Elsevier: Philadelphia, USA, 2011; pp. 126–138. [Google Scholar]
  28. Vassalli, A.; Dijk, D.J. Sleep Function: Current Questions and New Approaches. Eur. J. Neurosci. 2009, 29, 1830–1841. [Google Scholar] [CrossRef]
  29. Jun, J.C.; Polotsky, V.Y. Stressful Sleep. Eur. Respir. J. 2016, 47, 366–368. [Google Scholar] [CrossRef]
  30. Sadeh, A.; Keinan, G.; Daon, K. Effects of Stress on Sleep: The Moderating Role of Coping Style. Health Psychol. 2004, 23, 542–545. [Google Scholar] [CrossRef]
  31. Langford, F.M.; Cockram, M.S. Is Sleep in Animals Affected by Prior Waking Experiences? Anim. Welf. 2010, 19, 215–222. [Google Scholar] [CrossRef]
  32. Guillaumin, M.C.C.; McKillop, L.E.; Cui, N.; Fisher, S.P.; Foster, R.G.; De Vos, M.; Peirson, S.N.; Achermann, P.; Vyazovskiy, V.V. Cortical Region-Specific Sleep Homeostasis in Mice: Effects of Time of Day and Waking Experience. Sleep 2018, 41, zsy079. [Google Scholar] [CrossRef]
  33. Morgan, K.N.; Tromborg, C.T. Sources of Stress in Captivity. Appl. Anim. Behav. Sci. 2007, 102, 262–302. [Google Scholar] [CrossRef]
  34. Lesku, J.A.; Roth, T.C.; Rattenborg, N.C.; Amlaner, C.J.; Lima, S.L. History and Future of Comparative Analyses in Sleep Research. Neurosci. Biobehav. Rev. 2009, 33, 1024–1036. [Google Scholar] [CrossRef]
  35. Balzamo, E.; Van Beers, P.; Lagarde, D. Scoring of Sleep and Wakefulness by Behavioral Analysis from Video Recordings in Rhesus Monkeys: Comparison with Conventional EEG Analysis. Electroencephalogr. Clin. Neurophysiol. 1998, 106, 206–212. [Google Scholar] [CrossRef] [PubMed]
  36. McShane, B.B.; Galante, R.J.; Biber, M.; Jensen, S.T.; Wyner, A.J.; Pack, A.I. Assessing REM Sleep in Mice Using Video Data. Sleep 2012, 35, 433–442. [Google Scholar] [CrossRef]
  37. Siegel, J.M. Clues to the Functions of Mammalian Sleep. Nature 2005, 437, 1264–1271. [Google Scholar] [CrossRef]
  38. Frank, M.G. Mammalian Sleep. In Encyclopedia of Sleep; Elsevier: Amsterdam, The Netherlands, 2013; pp. 63–65. [Google Scholar]
  39. Madan, V.; Jha, S.K. Sleep Alterations in Mammals: Did Aquatic Conditions Inhibit Rapid Eye Movement Sleep? Neurosci. Bull. 2012, 28, 746–758. [Google Scholar] [CrossRef]
  40. Greening, L.; McBride, S. A Review of Equine Sleep: Implications for Equine Welfare. Front. Vet. Sci. 2022, 9, 916737. [Google Scholar] [CrossRef] [PubMed]
  41. Ternman, E.; Pastell, M.; Agenäs, S.; Strasser, C.; Winckler, C.; Nielsen, P.P.; Hänninen, L. Agreement between Different Sleep States and Behaviour Indicators in Dairy Cows. Appl. Anim. Behav. Sci. 2014, 160, 12–18. [Google Scholar] [CrossRef]
42. Owczarczak-Garstecka, S.C.; Burman, O.H.P. Can Sleep and Resting Behaviours Be Used as Indicators of Welfare in Shelter Dogs (Canis lupus familiaris)? PLoS ONE 2016, 11, e0163620. [Google Scholar] [CrossRef]
43. Takagi, N.; Saito, M.; Ito, H.; Tanaka, M.; Yamanashi, Y. Sleep-Related Behaviors in Zoo-Housed Giraffes (Giraffa camelopardalis reticulata): Basic Characteristics and Effects of Season and Parturition. Zoo Biol. 2019, 38, 490–497. [Google Scholar] [CrossRef] [PubMed]
44. Yngvesson, J.; Wedin, M.; Gunnarsson, S.; Jönsson, L.; Blokhuis, H.; Wallenbeck, A. Let Me Sleep! Welfare of Broilers (Gallus gallus domesticus) with Disrupted Resting Behaviour. Acta Agric. Scand. Sect. A Anim. Sci. 2017, 67, 123–133. [Google Scholar] [CrossRef]
45. Udell, M.A.R.; Wynne, C.D.L. A Review of Domestic Dogs’ (Canis familiaris) Human-Like Behaviors: Or Why Behavior Analysts Should Stop Worrying and Love Their Dogs. J. Exp. Anal. Behav. 2008, 89, 247–261. [Google Scholar] [CrossRef] [PubMed]
  46. Feuerbacher, E.; Wynne, C. A History of Dogs as Subjects in North American Experimental Psychological Research. Comp. Cogn. Behav. Rev. 2011, 6, 46–71. [Google Scholar] [CrossRef]
  47. Toth, L.A.; Bhargava, P. Animal Models of Sleep Disorders. Comp. Med. 2013, 63, 91–104. [Google Scholar] [PubMed]
  48. Bódizs, R.; Kis, A.; Gácsi, M.; Topál, J. Sleep in the Dog: Comparative, Behavioral and Translational Relevance. Curr. Opin. Behav. Sci. 2020, 33, 25–33. [Google Scholar] [CrossRef]
  49. Arden, R.; Bensky, M.K.; Adams, M.J. A Review of Cognitive Abilities in Dogs, 1911 Through 2016. Curr. Dir. Psychol. Sci. 2016, 25, 307–312. [Google Scholar] [CrossRef]
  50. Benz-Schwarzburg, J.; Monsó, S.; Huber, L. How Dogs Perceive Humans and How Humans Should Treat Their Pet Dogs: Linking Cognition With Ethics. Front. Psychol. 2020, 11, 584037. [Google Scholar] [CrossRef] [PubMed]
51. Udell, M.A.R.; Dorey, N.R.; Wynne, C.D.L. What Did Domestication Do to Dogs? A New Account of Dogs’ Sensitivity to Human Actions. Biol. Rev. 2010, 85, 327–345. [Google Scholar] [CrossRef]
  52. Jukan, A.; Masip-Bruin, X.; Amla, N. Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review. ACM Comput. Surv. 2017, 50, 10. [Google Scholar] [CrossRef]
  53. Belda, B.; Enomoto, M.; Case, B.C.; Lascelles, B.D.X. Initial Evaluation of PetPace Activity Monitor. Vet. J. 2018, 237, 63–68. [Google Scholar] [CrossRef]
  54. Weiss, G.M.; Nathan, A.; Kropp, J.B.; Lockhart, J.W. WagTag: A Dog Collar Accessory for Monitoring Canine Activity Levels. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 8–12 September 2013; pp. 405–414. [Google Scholar] [CrossRef]
  55. Ladha, C.; Hoffman, C.L. A Combined Approach to Predicting Rest in Dogs Using Accelerometers. Sensors 2018, 18, 2649. [Google Scholar] [CrossRef]
  56. Olsen, A.; Evans, R.; Duerr, F. Evaluation of Accelerometer Inter-Device Variability and Collar Placement in Dogs. Vet. Evid. 2016, 1, 1–9. [Google Scholar] [CrossRef]
  57. Amir, S.; Zamansky, A.; van der Linden, D. K9-Blyzer—Towards Video-Based Automatic Analysis of Canine Behavior. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–5. [Google Scholar]
  58. Baba, M.; Pescaru, D.; Gui, V.; Jian, I. Stray Dogs Behavior Detection in Urban Area Video Surveillance Streams. In Proceedings of the 2016 12th IEEE International Symposium on Electronics and Telecommunications (ISETC), Timisoara, Romania, 27–28 October 2016; pp. 313–316. [Google Scholar]
  59. Mealin, S.; Domínguez, I.X.; Roberts, D.L. Semi-Supervised Classification of Static Canine Postures Using the Microsoft Kinect. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–4. [Google Scholar]
  60. Karpathy, A.; Toderici, G.; Shetty, S.; Leung, T.; Sukthankar, R.; Li, F.F. Large-Scale Video Classification with Convolutional Neural Networks. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1725–1732. [Google Scholar] [CrossRef]
  61. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
  62. Zamansky, A.; Sinitca, A.M.; Kaplun, D.I.; Plazner, M.; Schork, I.G.; Young, R.J.; de Azevedo, C.S. Analysis of Dogs’ Sleep Patterns Using Convolutional Neural Networks. In Lecture Notes in Computer Science; Tetko, I.V., Kůrková, V., Karpov, P., Theis, F., Eds.; Springer International Publishing: Cham, Switzerland, 2019; Volume 11729, pp. 472–483. ISBN 978-3-030-30507-9. [Google Scholar]
  63. Bateson, M.; Martin, P. Measuring Behaviour, 3rd ed.; Cambridge University Press: Cambridge, UK, 2021; ISBN 9781108776462. [Google Scholar]
  64. Dytham, C. Choosing and Using Statistics: A Biologist’s Guide, 3rd ed.; Wiley-Blackwell: Oxford, UK, 2011; ISBN 978-1-405-19839-4. [Google Scholar]
  65. IBM Corp. IBM SPSS Statistics for Windows, Version 26.0. IBM Corp: Armonk, NY, USA, 2019.
  66. Nakamura, T.; Goverdovsky, V.; Morrell, M.J.; Mandic, D.P. Automatic Sleep Monitoring Using Ear-EEG. IEEE J. Transl. Eng. Health Med. 2017, 5, 2800108. [Google Scholar] [CrossRef]
  67. Watson, N.F.; Fernandez, C.R. Artificial Intelligence and Sleep: Advancing Sleep Medicine. Sleep Med. Rev. 2021, 59, 101512. [Google Scholar] [CrossRef]
  68. Tripathi, P.; Ansari, M.A.; Gandhi, T.K.; Mehrotra, R.; Heyat, M.B.; Akhtar, F.; Ukwuoma, C.C.; Muaad, A.Y.M.; Kadah, Y.M.; Al-Antari, M.A.; et al. Ensemble Computational Intelligent for Insomnia Sleep Stage Detection via the Sleep ECG Signal. IEEE Access 2022, 10, 108710–108721. [Google Scholar] [CrossRef]
  69. Hunter, L.B.; O’Connor, C.; Haskell, M.J.; Langford, F.M.; Webster, J.R.; Stafford, K.J. Lying Posture Does Not Accurately Indicate Sleep Stage in Dairy Cows. Appl. Anim. Behav. Sci. 2021, 242, 105427. [Google Scholar] [CrossRef]
Figure 1. Detection of dogs by the BlyzerDS system using a neural network. Different colours in the bounding boxes show the system correctly scoring two individuals (blue = asleep; green = awake).
Table 1. Summary of sleep metrics recorded by the automated system (BlyzerDS) compared with manual (human) observations for 11 nights of behavioural observations. Sleep is reported in hours:minutes:seconds; bouts are reported as counts.
Sleep System | Sleep Manual | % Difference | Bouts System | Bouts Manual | % Difference
12:05:56 | 09:31:33 | 1.15 | 14 | 15 | 2.94
12:52:49 | 10:41:49 | 0.71 | 7 | 15 | 23.53
11:42:26 | 09:59:51 | 0.39 | 19 | 17 | 5.88
11:15:11 | 10:04:42 | 0.06 | 22 | 20 | 5.88
12:37:12 | 09:41:49 | 1.42 | 12 | 15 | 8.82
05:26:04 | 04:58:31 | 0.13 | 10 | 8 | 5.88
11:43:39 | 09:58:31 | 0.43 | 15 | 15 | 0.00
10:48:40 | 11:05:58 | 1.38 | 23 | 19 | 11.76
11:56:16 | 11:15:38 | 0.59 | 12 | 21 | 26.47
12:20:21 | 10:10:57 | 0.74 | 13 | 16 | 8.82
08:05:47 | 10:05:20 | 2.68 | 18 | 18 | 0.00
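The article does not publish the code behind its system-versus-manual comparison. Purely as an illustrative sketch, the hh:mm:ss sleep totals in Table 1 can be parsed and differenced as below (the function name `parse_hms` is an assumption for this example; the table's own % Difference columns use the authors' metric, which is not reproduced here):

```python
from datetime import timedelta

def parse_hms(s: str) -> timedelta:
    """Convert an hh:mm:ss string, as reported in Table 1, to a timedelta."""
    h, m, sec = map(int, s.split(":"))
    return timedelta(hours=h, minutes=m, seconds=sec)

# First row of Table 1: total sleep time scored by the system vs. a human observer.
system = parse_hms("12:05:56")
manual = parse_hms("09:31:33")
gap = system - manual
print(gap)  # 2:34:23 — the system scored about 2.5 h more sleep on this night
```

This kind of absolute difference is what underlies the observation in the text that the system recorded more total sleep time than the human observers, even though the percentage of time asleep did not differ significantly.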