**Preface to "Enhancing Farm-Level Decision Making through Innovation"**

Enhanced farm-level data sources and information in agricultural systems allow farmers to make more timely and informed interventions that ultimately improve productivity. More sustainable production practices are also important for securing future food supplies. This Special Issue explores applications that implement modelling approaches for different animal and plant systems.

Models of biological systems can be used to explore changes in climatic conditions and inform plant management options [1]. In this Special Issue, it was notable that five of the six papers investigated image analysis approaches as a means to monitor animals or plants. Image analysis has gained interest due to continued developments in image capture technology and processing. Wu and Xu [2] found that image generation can improve disease detection in tomato leaves. Furthermore, Cavendish et al. [3], Waters et al. [4] and McDonagh et al. [5] showed how camera surveillance technology can be used for the high-level detection of different behaviors in cattle and sheep, with the potential to enhance management at critical periods of productive life, such as parturition. Compared to other sensor technologies, camera surveillance allows both the mother and her offspring to be monitored remotely. Ghajar and Tracy [6] discussed how developments in proximal sensing techniques now provide opportunities to collect detailed grassland data that were previously lacking. The authors highlight that proximal sensing technologies, such as handheld sensors or sensors mounted on unmanned aerial vehicles, can provide a range of measures, such as plant species, height, biomass and nutritional content. However, the cost and complexity of new hardware and software solutions can often be barriers that hinder adoption.

This collection of papers highlights how innovation in farming systems can support the sustainable development of food production.

Funding: This research received no external funding.

Conflicts of Interest: The authors declare no conflicts of interest.

## **References**

1. Yang, J.; He, Y.; Luo, S.; Ma, X.; Li, Z.; Lin, Z.; Zhang, Z. Optimizing the Optimal Planting Period for Potato Based on Different Water-Temperature Year Types in the Agro-Pastoral Ecotone of North China. *Agriculture* **2021**, *11*, 1061. https://doi.org/10.3390/agriculture11111061

2. Wu, Y.; Xu, L. Image Generation of Tomato Leaf Disease Identification Based on Adversarial-VAE. *Agriculture* **2021**, *11*, 981. https://doi.org/10.3390/agriculture11100981

3. Cavendish, B.; McDonagh, J.; Tzimiropoulos, G.; Slinger, K.R.; Huggett, Z.J.; Bell, M.J. Changes in Dairy Cow Behavior with and without Assistance at Calving. *Agriculture* **2021**, *11*, 722. https://doi.org/10.3390/agriculture11080722

4. Waters, B.E.; McDonagh, J.; Tzimiropoulos, G.; Slinger, K.R.; Huggett, Z.J.; Bell, M.J. Changes in Sheep Behavior before Lambing. *Agriculture* **2021**, *11*, 715. https://doi.org/10.3390/agriculture11080715

5. McDonagh, J.; Tzimiropoulos, G.; Slinger, K.R.; Huggett, Z.J.; Down, P.M.; Bell, M.J. Detecting Dairy Cow Behavior Using Vision Technology. *Agriculture* **2021**, *11*, 675. https://doi.org/10.3390/agriculture11070675

6. Ghajar, S.; Tracy, B. Proximal Sensing in Grasslands and Pastures. *Agriculture* **2021**, *11*, 740. https://doi.org/10.3390/agriculture11080740

**Matt J. Bell**

*Editor*

## *Communication* **Detecting Dairy Cow Behavior Using Vision Technology**

**John McDonagh <sup>1,\*</sup>, Georgios Tzimiropoulos <sup>2</sup>, Kimberley R. Slinger <sup>3</sup>, Zoë J. Huggett <sup>3</sup>, Peter M. Down <sup>4</sup> and Matt J. Bell <sup>5</sup>**


**Abstract:** The aim of this study was to investigate using existing image recognition techniques to predict the behavior of dairy cows. A total of 46 individual dairy cows were monitored continuously under 24 h video surveillance prior to calving. The video was annotated for the behaviors of standing, lying, walking, shuffling, eating, drinking and contractions for each cow from 10 h prior to calving. A total of 19,191 behavior records were obtained, and a non-local neural network was trained and validated on video clips of each behavior. This study showed that the non-local network correctly classified the seven behaviors 80% or more of the time on the validation dataset. In particular, birth contractions were correctly predicted 83% of the time, which in itself can serve as an early warning calving alert, as all cows start contractions several hours prior to giving birth. This approach to behavior recognition using video cameras can assist livestock management.

**Keywords:** dairy cows; computer vision; behaviors; monitoring; management

## **1. Introduction**

At a time when the general public has concerns about how livestock are managed and their welfare, tools that can improve animal welfare standards and increase the public acceptance of farming are required. In recent years, each stockperson has been expected to look after more animals, as input costs (including labor) have risen, skilled farm workers have become harder to find, and the average dairy herd has grown in size. Alongside these challenges have come high-quality digital camera systems that provide 24 h video surveillance, giving farmers the opportunity to monitor their livestock remotely while carrying out other farm tasks. Cameras have been used to monitor animals and their behaviors manually for decades, with animal behavior and welfare concerns commonly directed at housed livestock production, such as dairy cows [1,2]. The monitoring of animals is essential for their welfare and survival [3].

**Citation:** McDonagh, J.; Tzimiropoulos, G.; Slinger, K.R.; Huggett, Z.J.; Down, P.M.; Bell, M.J. Detecting Dairy Cow Behavior Using Vision Technology. *Agriculture* **2021**, *11*, 675. https://doi.org/10.3390/agriculture11070675

Academic Editor: Eva Voslarova

Received: 11 June 2021; Accepted: 15 July 2021; Published: 17 July 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Automated image analysis techniques have been developed that allow continuous monitoring during the day and night, and require no prior training by the user other than interpreting the output. Such continuous monitoring is not possible for a stockperson. Recent technological advances in computer vision based on deep learning [4,5] now make the automated monitoring of video feeds feasible. Computer vision combined with artificial intelligence (neural networks) can be used for a number of animal monitoring tasks, such as recognizing the type of animal (recognition), detecting where the animals (and any other objects of interest) are located in the image (detection), localizing their body parts, and even segmenting their exact shape (silhouette) from the image. Furthermore, adaptations of neural networks for analyzing video can be used for tasks such as the recognition of specific animal behaviors (e.g., standing, lying, walking, eating, and drinking) [6]. Major benefits of image analysis are that it does not rely on human interpretation or intervention, transponder attachments or invasive equipment (e.g., boluses and collars). Furthermore, it may provide more information than other monitoring systems at a relatively low cost. However, the technology does rely on obtaining a large number of high-quality images, a need that has been recognized by others developing agricultural solutions [7]. Vision-based monitoring can detect and track not only individuals but also groups of animals (i.e., a herd, flock or mother with offspring). Vision technology that continuously monitors individual animals can potentially provide an objective assessment of an abnormal behavioral state, allowing early intervention and improved awareness by a stockperson.

The objective of this study was to investigate using existing image recognition techniques to predict the behavior of dairy cows. The study collected a large number of high-quality video images for a range of cow behaviors, as such a dataset was found to be lacking but was required to train a computer vision model.

## **2. Materials and Methods**

Approval for this study was obtained from the University of Nottingham animal ethics committee before commencement (approval number 151, 2017).

## *2.1. Data*

Video cameras (5 Mp, 30 m IR. Hikvision HD Bullet; Hangzhou, China) were used to record Holstein–Friesian dairy cows at the Nottingham University Dairy Centre (Sutton Bonington, Leicestershire, UK) prior to calving. Cameras recorded at 20 frames per second, with a frame width of 640 pixels and a height of 360 pixels. Three calving pens, with two surveillance cameras looking into each pen, were used to obtain 24 h video footage of 46 individual cows between April and June 2018. The two cameras on each pen together gave full coverage of the area (10 m × 7 m) and were mounted at a height of 4 m, at approximately a 45-degree angle looking into the pen. Each calving pen held a maximum of eight cows. Several days prior to calving, each cow was moved into one of the three calving pens so that the entire calving process could be monitored.

## *2.2. Image Annotation*

The video recording for each cow was annotated from 10 h before calving by three observers using custom-made scripts in the PyTorch 1.5 framework to label video clips. The PyTorch framework was used as it allows several steps in the processing of images to be carried out, such as behavioral annotations, video segmentation and model development using the Python programming language as discussed below. The start of the observation period was determined as 10 h from when the calf was fully expelled at birth using the video recording. Seven behaviors were recorded (Table 1).


**Table 1.** Studied behaviors and their description.

A total of 19,191 individual behavioral observations were obtained from all 46 cows. For the analysis, 15 video clips of each behavior, ranging between three and ten seconds in length, were extracted from individual cow footage to provide a total of 3969 video clips for analysis (Table 2). If more than 15 clips were available for a behavior, 15 were sampled evenly from the available data. There were 248–686 video clips for each behavior for training and validation. To ensure the accuracy of the video annotation and the subsequently extracted behavioral video clips, each clip was checked by a single trained observer and any labelling errors were corrected.
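The even-sampling step described above can be sketched as follows (the function name and indexing are illustrative, not from the study's scripts):

```python
def sample_clips_evenly(clips, max_clips=15):
    """Keep at most max_clips clips, sampled evenly across the available list.

    If a behavior has more than max_clips annotated clips, indices are spread
    uniformly over the full range so the retained clips span all the footage.
    """
    if len(clips) <= max_clips:
        return list(clips)
    # Evenly spaced indices over [0, len(clips) - 1]
    step = (len(clips) - 1) / (max_clips - 1)
    indices = [round(i * step) for i in range(max_clips)]
    return [clips[i] for i in indices]

# Example: 40 annotated clips for one behavior, keep 15
selected = sample_clips_evenly(list(range(40)), max_clips=15)
```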

**Table 2.** Number of video clips for each behavior class in the training and validation datasets.


The output of the behavior annotations from each video clip was described in a *N\*3* matrix, where *N* is the total number of behaviors in the video (Table 3). Start and end frames for annotated behaviors are recorded for each video clip. Each of the retained video clips were cropped to remove excessive background and to focus on a single cow (Figure 1).

**Table 3.** Example matrix of behavior annotations.


**Figure 1.** Example of cropped and scaled videos. Top row shows a cow walking, middle row shows a cow shuffling and bottom row is of a cow eating.
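The *N*×3 annotation matrix described above can be represented as rows of (behavior, start frame, end frame); the labels and frame numbers below are invented for illustration, not taken from the dataset:

```python
# Each row: (behavior label, start frame, end frame), one row per behavior bout.
# Behavior names and frame numbers here are illustrative only.
annotations = [
    ("standing", 0, 1199),
    ("walking", 1200, 1499),
    ("lying", 1500, 4999),
]

def bout_duration_s(row, fps=20):
    """Duration of one annotated bout in seconds (video recorded at 20 fps)."""
    behavior, start, end = row
    return (end - start + 1) / fps

total = sum(bout_duration_s(r) for r in annotations)
```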

To be compliant with the non-local network [8], we used a fixed-size bounding box that fully covered the cow over all frames (this is to emulate [8], who used the entire frame). We used the image annotation tool ViTBAT [9] to generate the bounding boxes. The steps taken to process images for model development are illustrated in Figure 2.
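A fixed-size box covering the cow in every frame can be obtained as the union of the per-frame boxes; a minimal sketch, with the (x1, y1, x2, y2) box format and values assumed rather than taken from ViTBAT:

```python
def union_box(per_frame_boxes):
    """Smallest axis-aligned box covering all per-frame (x1, y1, x2, y2) boxes.

    Using one fixed box for the whole clip emulates the full-frame input of
    the non-local network while discarding most of the background.
    """
    x1 = min(b[0] for b in per_frame_boxes)
    y1 = min(b[1] for b in per_frame_boxes)
    x2 = max(b[2] for b in per_frame_boxes)
    y2 = max(b[3] for b in per_frame_boxes)
    return (x1, y1, x2, y2)

# Hypothetical per-frame boxes for three frames of one clip
boxes = [(100, 80, 300, 240), (110, 85, 310, 250), (95, 90, 305, 245)]
fixed = union_box(boxes)  # one crop window reused for every frame
```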

**Figure 2.** Illustration of steps in data acquisition and image processing.

## *2.3. Computer Vision Model Used for Behavior Recognition*

A custom-made script in the PyTorch 1.5 framework was used to combine the behavioral matrix with cropped video images. This was performed using a non-local network [8] using the ResNet-50 architecture [10]. Further detailed explanations are discussed in prior research [8,10]. As shown in Equation (1), the non-local block computes the response at a position as a weighted sum of the features at all positions in the input feature maps and is defined as follows:

$$y\_i = \frac{1}{C(\mathbf{x})} \sum\_{\forall j} f(\mathbf{x}\_i, \mathbf{x}\_j) \, g(\mathbf{x}\_j), \tag{1}$$

where **x** is the input features, **y** is the output features (same size as **x**), *i* is the current position of interest and *j* enumerates over all possible positions. $C(\mathbf{x}) = \sum\_{\forall j} f(\mathbf{x}\_i, \mathbf{x}\_j)$ is the normalization factor, $g(\mathbf{x}\_j) = W\_g \mathbf{x}\_j$ is a linear embedding with learned weight matrix $W\_g$, and $f(\mathbf{x}\_i, \mathbf{x}\_j)$ is a pairwise function that computes the correlation between the feature at location *i* and those at all possible positions *j*.
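As a toy illustration of Equation (1), the non-local operation with the embedded dot-product (softmax) form of *f* from [8] can be sketched in NumPy; this is a simplified sketch with assumed array shapes, not the authors' ResNet-50-based implementation:

```python
import numpy as np

def non_local_block(x, w_g):
    """Simplified non-local operation: y_i = (1/C(x)) * sum_j f(x_i, x_j) g(x_j).

    x:   (N, d) array of input features, one row per position.
    w_g: (d, d) learned weight matrix of the linear embedding g(x_j) = W_g x_j.
    Here f(x_i, x_j) = exp(x_i . x_j), so dividing by C(x) = sum_j f(x_i, x_j)
    gives a softmax weighting over all positions j.
    """
    f = np.exp(x @ x.T)                          # pairwise similarities f(x_i, x_j)
    weights = f / f.sum(axis=1, keepdims=True)   # normalize by C(x)
    g = x @ w_g.T                                # linear embedding g(x_j)
    return weights @ g                           # weighted sum over all positions j

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 4))    # 5 positions with 4-dimensional features
y = non_local_block(x, np.eye(4))  # identity W_g for illustration
```

With an identity embedding, each output row is a convex combination of the input rows, which is what lets every position attend to all others.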

The non-local network [8] is initialized using weights pre-trained on the Kinetics dataset [11], which includes 400 human behaviors. This approach has been shown by [12] to improve action recognition accuracy by providing a pre-trained initialization starting point for modelling. To decrease training and testing times, the current study used 8-frame input clips. The 8-frame clips were generated by randomly cropping 64 consecutive frames from the training video and then keeping 8 frames evenly separated by a stride of 8 frames (Figure 3). Additionally, during training the spatial size was fixed at 224 × 224 pixels, randomly cropped from a video or its horizontal flip, whose shorter side was randomly scaled between 256 and 320 pixels.
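The training-clip sampling just described (a random 64-frame crop, then 8 frames at a stride of 8) can be sketched as follows (zero-based frame indexing assumed):

```python
import random

def sample_training_clip(num_frames, block_len=64, keep=8, stride=8, rng=random):
    """Pick a random 64-consecutive-frame crop from the video, then keep
    8 frame indices spaced a stride of 8 frames apart within that crop."""
    start = rng.randrange(num_frames - block_len + 1)
    return [start + i * stride for i in range(keep)]

random.seed(0)
clip = sample_training_clip(num_frames=200)  # e.g. a 10 s clip at 20 fps
```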

## *2.4. Computer Vision Model Validation*

To validate the performance of the model, we performed spatially fully convolutional inference as described by [8]. Briefly, the shorter side is resized to 256 pixels and three crops of 256 × 256 pixels are used to cover the entire spatial domain. The final predicted output is the average score of 10 evenly spaced 8-frame clips sampled along the temporal dimension of the full-length video (Figure 4).

**Figure 4.** Ten clips of eight frames are sampled from blocks (64 frames) which are evenly sampled over the entire video. Each clip produces its own score, and the final output is the average of all the scores (a total of 5 blocks are shown for illustration purposes).
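The validation-time sampling and score averaging of Figure 4 can be sketched as follows (the clip scores in the example are invented for illustration):

```python
def evenly_spaced_starts(num_frames, num_clips=10, block_len=64):
    """Start indices of num_clips 64-frame blocks evenly spaced over the video."""
    span = num_frames - block_len
    return [round(i * span / (num_clips - 1)) for i in range(num_clips)]

def video_score(clip_scores):
    """Final prediction: the average of the per-clip class scores."""
    n = len(clip_scores)
    num_classes = len(clip_scores[0])
    return [sum(s[c] for s in clip_scores) / n for c in range(num_classes)]

starts = evenly_spaced_starts(num_frames=600)        # e.g. 30 s at 20 fps
avg = video_score([[0.75, 0.25], [0.25, 0.75]])      # toy two-class example
```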

## **3. Results and Discussion**

Despite its scientific value, the pressing need and the direct impact on animal health and welfare, very little attention has been paid to developing an annotated video dataset of dairy cow behaviors. Most research to date has been based on wearable accelerometer-based activity monitoring sensors [13–15]. We introduce a new large-scale video dataset for the purpose of cow behavior classification. Image banks containing a large number of high-quality (i.e., accurate and high-resolution) images for different applications are needed to develop vision-based technologies, such as behavior recognition in animals, as suggested by other studies [7]. This study showed that automated monitoring of the cow during parturition is possible, which for a high-value animal is beneficial to assist the stockperson and enhance animal welfare.

Our dataset consisted of almost 4000 video clips of individual animal behaviors, each between 3 and 10 s in length, recorded from pregnant dairy cows prior to calving. Over 9 h and 42 min of video data were captured, split into approximately 7 h and 48 min for training and 1 h and 54 min for validation. In the field of computer vision, action recognition has been applied to humans with a high degree of success [8]. We show that the same model pre-trained on a dataset devised for human action recognition, namely Kinetics [11], can be successfully adapted to detect the behavior of dairy cows. As shown in Table 4, the accuracy of identifying contractions while lying was 83%; this in itself is sufficient to predict the birth of a calf, as a cow will generally start contractions approximately 1 to 2 h prior to giving birth. Standing, lying, eating and drinking behaviors all scored greater than 84% and can also help with the monitoring of animal well-being. Furthermore, changes in the duration or frequency of the behaviors studied may help identify abnormal behavior patterns that can assist in animal management. For example, eating and drinking can be detected with a high level of accuracy, at over 90%, and these behaviors can be used to identify health problems [16].


**Table 4.** Evaluation of model predictions against validation dataset.

<sup>1</sup> The target row shows how many video clips were tested for each behavior. <sup>2</sup> Output row shows how many behavior video clips the model classified correctly. <sup>3</sup> The percentage of target behavior video clips correctly classified.
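The percentages in Table 4 follow directly from the target and output counts; a minimal sketch with invented counts (not the table's actual values):

```python
def percent_correct(target, output):
    """Percentage of target clips the model classified correctly, per behavior."""
    return {b: round(100 * output[b] / target[b]) for b in target}

# Hypothetical counts: validation clips tested (target) vs. correctly classified (output)
target = {"lying": 100, "contractions": 60}
output = {"lying": 85, "contractions": 50}
acc = percent_correct(target, output)
```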

As well as working with cows, the proposed computer vision approach could be adapted for other livestock species, such as pigs, poultry, sheep and horses, to predict birth and identify behavior patterns or behaviors that occur over many hours, which may be missed by subjective and observational sampling. Furthermore, because the calving pen is continuously monitored, it should also be possible to detect and track the behaviors of the mother and its newborn offspring, which is not feasible using the standard predictive animal monitoring applications currently used by the livestock industry.

The development of behavior recognition using continuous camera surveillance within the farm environment is challenging. The current study identified several potential causes of error in computer model predictions, which are limitations of current vision-based monitoring (Table 5).

**Table 5.** Potential causes of error in animal vision-based model predictions.


#### **4. Conclusions**

We show that computer vision can be successfully applied to predict individual dairy cow behaviors with an accuracy of 80% or more for the behaviors studied. This approach could be used for early detection of abnormal behavior in animals, birth events and the need for assistance. Computer vision technology may help a stockperson make more timely decisions based on the continuous tracking of individuals within groups of animals.

**Author Contributions:** Conceptualization, M.J.B. and G.T.; methodology, M.J.B. and G.T.; software, J.M.; validation, J.M.; formal analysis, J.M.; investigation, J.M.; resources, M.J.B.; data curation, M.J.B., K.R.S., Z.J.H. and J.M.; writing—original draft preparation, J.M. and M.J.B.; writing—review and editing, J.M., M.J.B., G.T. and P.M.D.; visualization, J.M.; supervision, M.J.B. and G.T.; project administration, M.J.B.; funding acquisition, M.J.B. and G.T. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by the Douglas Bomford Trust, the Engineering and Physical Sciences Research Council and the Biotechnology and Biological Sciences Research Council.

**Institutional Review Board Statement:** The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Animal Ethics Committee at the University of Nottingham (approval number 151, 2017).

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** The analyzed datasets are available from the corresponding author on request.

**Conflicts of Interest:** The authors declare no conflict of interest.

## **References**


## *Article* **Changes in Sheep Behavior before Lambing**

**Beatrice E. Waters <sup>1</sup>, John McDonagh <sup>2</sup>, Georgios Tzimiropoulos <sup>3</sup>, Kimberley R. Slinger <sup>1</sup>, Zoë J. Huggett <sup>1</sup> and Matt J. Bell <sup>4,\*</sup>**


**Abstract:** The aim of this study was to assess the duration and frequency of behavioral observations of pregnant ewes as they approached lambing. An understanding of behavioral changes before birth may provide opportunities for enhanced visual monitoring at this critical stage in the animal's life. Behavioral observations for 17 ewes in late pregnancy were recorded during two separate time periods: 4 to 6 weeks before lambing, and immediately before giving birth. It was normal farm procedure for the sheep to come indoors for 6 weeks of close monitoring before lambing. The behaviors of standing, lying, walking, shuffling and contractions were recorded for each animal during both time periods. Over both time periods, the ewes spent a large proportion of their time either lying (0.40) or standing (0.42), with a higher frequency of standing (0.40) and shuffling (0.28) bouts than of other behaviors. In the time period before giving birth, the frequency of lying and contraction bouts increased and that of standing and walking bouts decreased, with a higher frequency of walking bouts in ewes that had an assisted lambing. The monitoring of behavioral patterns, such as lying and contractions, could be used as an alert to the progress of parturition.

**Keywords:** behavior; birth; management; observations; sheep

## **1. Introduction**

A stockperson's ability to assess animal behavior is a key component of their ability to recognize and treat ill-health, and evaluate the wellbeing of their livestock [1,2]. The visual assessment of livestock by humans is subjective and has several limitations such as the cost of labor and time to regularly observe individual animals. Hence, several monitoring technologies have been proposed in recent years that predict animal behaviors from movement sensors on cattle or sheep [3–8]. New technologies that provide an objective measure of animal behavior, such as sensors and cameras, could provide an aid to improve animal management [9]. Furthermore, monitoring equipment can continuously and remotely track livestock, something that would be unrealistic and too costly for human observers to replicate [3].

**Citation:** Waters, B.E.; McDonagh, J.; Tzimiropoulos, G.; Slinger, K.R.; Huggett, Z.J.; Bell, M.J. Changes in Sheep Behavior before Lambing. *Agriculture* **2021**, *11*, 715. https://doi.org/10.3390/agriculture11080715

Academic Editors: Claudia Arcidiacono and Eva Voslarova

Received: 17 June 2021; Accepted: 27 July 2021; Published: 29 July 2021

Lambing is a critical time in the productive life of sheep and in the development of the newborn offspring that will eventually be sold or retained as flock replacements. Sheep often have multiple lambs at birth, which can be a physically challenging, stressful and painful process for the mother and offspring that may require a farmer's intervention [10]. The parturition period is associated with several physiological, hormonal and behavioral changes in the pregnant animal, with restlessness exhibited through nesting and reduced appetite, along with birth contractions that increase in frequency and intensity as birth progresses [10,11]. Studying cows, Huzzey et al. [12] found a dramatic increase in the number of positional changes, such as lying or standing, at calving, and reported that the animals tended to isolate themselves from the rest of the herd. Although there are several studies on changes in cattle behavior before calving [10,12–15], there are few studies on pregnant sheep [16,17]. The need for further research into enhanced approaches for monitoring sheep during parturition, to assist a stockperson during this critical period, has been identified by others [17]. The hypothesis of the current study was that there is a change in sheep behavior before giving birth, and that this change can be visually observed. This insight may assist lambing management and future monitoring technologies.

The objective of this study was to assess the duration and frequency of behavioral observations of pregnant sheep as they approached lambing. The sheep studied followed normal husbandry procedure of being housed as a group at about 6 weeks before lambing to allow for closer monitoring.

### **2. Materials and Methods**

Approval for this study was obtained from the University of Nottingham animal ethics committee before the study commenced (approval number 198, 2018).

#### *2.1. Data*

A total of 17 pregnant ewes were monitored using video camera surveillance (5 Mp, 30 m IR. Hikvision HD Bullet; Hangzhou, China) at the Nottingham University Farm (Sutton Bonington, Leicestershire, UK; 52.8282° N, 1.2485° W, 48 m a.s.l.) when indoors before lambing, from February to March 2019. The study was designed to have similar numbers of primiparous and multiparous ewes. The ewes monitored were predominantly of the Lleyn breed, with 9 primiparous and 8 multiparous. Normal husbandry procedures for the flock were followed, whereby all sheep came indoors for closer monitoring as a single group at about 6 weeks before lambing, and returned to pasture after lambing. The sheep were group housed on straw bedding, with an open feed trough for forage and supplementary feed and a single water trough. A single camera was used to obtain continuous video footage of each ewe. The camera position allowed full coverage of the area, at an approximate 45-degree angle looking into the sheep pen. When the sheep were housed at the start of the study, they were weighed, marked with a number for camera observations, individually identified, and vaccinated against pasteurella and clostridial disease; after this, the sheep were not handled until they had given birth. The sheep did not receive any other health treatments, such as for endo- or ectoparasites, during the study. The average age of primiparous ewes was 1.9 (s.d. 0.03) years and of multiparous ewes 4.3 (s.d. 1.0) years. The average bodyweight of primiparous ewes was 57.9 (s.d. 2.6) kg and of multiparous ewes 64.2 (s.d. 6.4) kg. Sheep were group fed *ad libitum* haylage providing 9.7 MJ/kg metabolizable energy (ME) and 486, 548, 138 and 59 g/kg for dry matter, neutral detergent fiber, crude protein and sugar, respectively (Sciantec Analytical Services, Cawood, UK; near-infrared spectroscopy analysis).
Additionally, sheep were supplemented with 350 g/day of an oats with wheat distillers grain mix (13.4 MJ/kg ME and 860, 250, 228 and 60 g/kg for dry matter, neutral detergent fiber, crude protein and sugar, respectively). The diet was about 75% forage on a dry matter basis. The feed was allocated at 2% of the average bodyweight for the group of ewes (about 1.2 kg/day), with the diet formulated to an estimated requirement of 11 MJ/kg ME and 160 g/kg crude protein in the diet [18]. The same diet was fed throughout the study, and the amount allocated to the group was reduced as sheep gave birth and were removed from the lambing pen. The need for a birth to be assisted by farm staff was recorded for each ewe. The average daily temperature was 5.7 °C, rainfall was 2.9 mm and humidity was 90% during the study.

## *2.2. Observations*

Two observation periods were used to investigate changes in behavior: Period 1 (4–6 weeks before lambing) and Period 2 (at lambing). There were 10 h of annotated video recordings for each ewe in Period 1, and three hours before the first lamb was born in Period 2. Tracking the sheep in Period 2 before lambing was challenging because of their similar appearance and behavioral changes. Three observers used custom-made scripts in the PyTorch 1.5 framework to record the behavior profile of each ewe over time. A total of 8257 individual behavioral observations were recorded from all 17 ewes. To ensure the accuracy of the video behavior annotations, the video was segmented into short clips for each behavior, and all video clips were subsequently checked for accuracy by one of the three observers. Five behaviors were recorded, which were:


#### *2.3. Statistical Analysis*

The duration of behaviors in seconds and behavior frequency were determined for both time periods. A total of 170 behavior records were obtained from 17 ewes (17 × 5 behaviors × 2 time periods).

Behavior records were analyzed using a generalized linear mixed model in Genstat Version 19.1 (Lawes Agricultural Trust, 2018). A binomial error distribution and a logit link function were used, with the fixed effects of assistance, time period, behavior and parity fitted for the dependent variables of the duration and frequency of behaviors, as in Equation (1):

$$Y\_{ijkl} = \mu + A\_i \times T\_j \times B\_k + P\_l + E\_{ijkl} \tag{1}$$

where Y<sub>ijkl</sub> is the dependent variable of behavior duration or frequency; μ is the overall mean; A<sub>i</sub> is the fixed effect of assistance at lambing (i = 0 for unassisted or 1 for assisted); T<sub>j</sub> is the fixed effect of time period (j = 1 or 2); B<sub>k</sub> is the fixed effect of behavior (k = standing, lying, walking, shuffling or contractions); P<sub>l</sub> is the fixed effect of parity (l = primiparous or multiparous); and E<sub>ijkl</sub> is the random error term.

The back-transformed predicted means for behavior duration and frequency were expressed as the proportion of total time or count during each time period. Significance was attributed at *p* < 0.05.
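Back-transforming a predicted mean from the logit scale to a proportion uses the inverse of the logit link; a minimal sketch (the input value is illustrative):

```python
import math

def inv_logit(eta):
    """Inverse of the logit link: converts a linear-predictor value back to a proportion."""
    return 1.0 / (1.0 + math.exp(-eta))

# A predicted mean of 0.0 on the logit scale corresponds to a proportion of 0.5
p = inv_logit(0.0)
```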

## **3. Results**

Of the 17 lambings, two were triplets, 13 were twins and two were singles. There were four primiparous and two multiparous ewes that required assistance by the farm stockperson, with all other lambings being unassisted.

Differences were found in the duration of behaviors (*p* < 0.001) with most of the time spent either lying (0.40) or standing (0.42), with other behaviors being 0.08 or less across time periods studied (Figure 1).

There was no effect of parity, time period or lambing assistance on the duration of behaviors (*p* > 0.05; Table 1 and Figure 2).

**Figure 1.** Predicted mean (±SEM) proportion of time spent performing different behaviors.

**Table 1.** Effects of parity, time period, and lambing assistance on the duration of behaviors as a proportion of time.


<sup>1</sup> Period 1 was observations obtained 4 to 6 weeks before lambing when the sheep came indoors for close monitoring, and Period 2 was observations obtained before the ewe gave birth. <sup>2</sup> Births were either assisted or unassisted by farm staff. <sup>3</sup> Predicted mean values shown in Figure 1. <sup>4</sup> Predicted mean values shown in Figure 2. <sup>5</sup> Significance was attributed at *p* < 0.05.

**Figure 2.** Predicted mean (±SEM) proportion of time spent in (**a**) contractions, (**b**) lying, (**c**) shuffling, (**d**) standing and (**e**) walking behavior for assisted (dashed line) or unassisted (solid line) lambing in time Periods 1 or 2, with Period 2 ending with the lambing event.

Differences were also found in the frequency of behaviors (*p* < 0.001) with standing (0.40) and shuffling (0.28) being the most frequent, with other behaviors being 0.09 or less across time periods studied (Figure 3).

In the time period before lambing, the frequency of lying and contraction bouts increased and the standing and walking bouts decreased (*p* < 0.001; Table 2), with a higher frequency of walking bouts in sheep that had an assisted lambing (*p* < 0.01; Figure 4). There was no effect of parity on frequency of behaviors.

**Figure 3.** Predicted mean (±SEM) proportion of observations for different behaviors.

**Table 2.** Effects of parity, time period, and lambing assistance on the frequency of behaviors as a proportion of observations.


<sup>1</sup> Period 1 was observations obtained 4 to 6 weeks before lambing when the sheep came indoors for close monitoring, and Period 2 was observations obtained before the ewe gave birth. <sup>2</sup> Births were either assisted or unassisted by farm staff. <sup>3</sup> Predicted mean values shown in Figure 3. <sup>4</sup> Predicted mean values shown in Figure 4. <sup>5</sup> Significance was attributed at *p* < 0.05.

**Figure 4.** Predicted mean (±SEM) proportion of observations of (**a**) contraction, (**b**) lying, (**c**) shuffling, (**d**) standing and (**e**) walking behavior for assisted (dashed line) or unassisted (solid line) lambings in time Periods 1 or 2, with Period 2 ending with the lambing event. \*\* *p* < 0.01.

#### **4. Discussion**

The current study found that sheep spend most of their time either standing or lying during pregnancy. There are surprisingly few studies investigating the duration and frequency of behaviors of pregnant sheep. The current study suggests pregnant ewes spend about 10 h per day lying and 10 h per day standing. This is similar to other ruminant animals such as cattle when housed indoors, which have been found to spend 10–12 h per day lying [19]. The duration of behaviors did not appear to change during parturition; however, the frequency of lying bouts, including contractions, increased in the period before lambing. The sheep in the current study were within the last trimester of pregnancy, and a lack of general movement can be expected in heavily pregnant animals housed a short distance from their food. Previous studies have shown that there are few differences in the behavior of ewes before and during parturition attributable to factors such as breed, age of ewe, nutrition, climate or location [16]. The current study also found no significant effect of parity. In cows, Barrier et al. [13] found that assisted animals displayed more frequent contractions than those that were unassisted, often because of awkward positioning of the calf at birth. A general state of restlessness is common in animals as parturition approaches and can also be seen in cow studies, characterized by an increase in lying frequency as observed in the current study, increased general activity, reduced feed intake, and an intensified stress response [12,14]. In the current study, inclusion of additional behaviors, such as eating and drinking, would have provided useful information, as these behaviors are known to change as lambing approaches. However, they could not be reliably observed in group-housed sheep such as those in the current study. A study by Black and Krawczel [20] found that a higher lying frequency was associated with difficult calvings and that cows that were not exercised were more uncomfortable during parturition. The sheep with an assisted birth in the current study had a higher frequency of walking bouts compared to those with unassisted births, which suggests restless behavior patterns. Generally, the sheep in the current study reduced their standing and walking bouts before lambing as lying bouts increased. Fogarty et al. [17] found a general increase in walking behavior and in the frequency of posture changes (i.e., standing and lying) in ewes before lambing. The changes in the frequencies of standing, lying and walking may provide useful indicators for tracking the progression of birth.

Sheep are often managed in large groups. This makes close inspection of individual animals difficult and the timing of observations important for animal husbandry. Therefore, to enhance a farmer's capacity to manage individual animals in large groups and to detect animals that are ill or injured, animal tracking technology has been developed [21,22]. Sensor technologies are not, however, free from challenges; these devices are sensitive and can be prone to damage from the dirt and dust of farm environments. Their success will rely on the cost-benefit for livestock farming and the added value to farm operations, such as supporting the intense monitoring of parturition during day or night [23]. Sensor technologies could benefit sheep production by allowing more frequent and effective behavior observations at this key stage. Increased behavior monitoring would be particularly beneficial during parturition, as mortality affects both animal welfare and farm productivity. In sheep, accelerometers have previously been used to detect behavioral states such as high and low general activity or combinations of lying, standing, grazing, walking and/or running [3,17,24]. Use of an accelerometer with machine learning has been found to accurately predict 91% of lambing events within 3 h of birth based on body posture alone [25]. Sensors detecting key behaviors, such as those studied here, present new opportunities for continuous, real-time objective measurement in farm animals [23].
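As an illustration of how accelerometer data can separate postures such as lying and standing, a simplified threshold on the sensor's static gravity component can serve as a stand-in for the machine-learning approaches cited above. This is a hypothetical sketch, not the method of reference [25]; the threshold, axis convention and readings are assumptions.

```python
import math

def pitch_degrees(ax, ay, az):
    """Pitch of the sensor relative to gravity, from static acceleration in g units."""
    return math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))

def classify_posture(ax, ay, az, threshold_deg=40.0):
    """Label a reading 'lying' when the sensor tilts past the (hypothetical) threshold."""
    return "lying" if abs(pitch_degrees(ax, ay, az)) > threshold_deg else "standing"

# Hypothetical readings (ax, ay, az) in g: near-upright, then strongly tilted
readings = [(0.05, 0.02, 0.99), (0.85, 0.10, 0.50)]
labels = [classify_posture(*r) for r in readings]
```

In practice, classifiers are trained on windowed features (mean, variance, signal magnitude area) from annotated recordings rather than a single tilt threshold, but the underlying signal is the same.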

The current study involved a relatively small flock of 17 ewes because of the challenges of maintaining complete video surveillance of animals over two time periods and the obstruction of view in group housing. Although the number of animals studied may have affected the results, they appear consistent with the other animal studies discussed above. Multiple cameras might have increased surveillance coverage and the number of animals studied. Nevertheless, the results from this study suggest that observing changes in lying bouts and detecting contractions could assist farmers in monitoring parturition and enhance sheep husbandry.

#### **5. Conclusions**

The current study of group-housed ewes during late pregnancy found an increased frequency of lying bouts, including contractions, before lambing. Pregnant ewes spent a large proportion of their time either lying or standing, with standing and shuffling bouts being the most frequent. Ewes that required assistance at lambing had more walking bouts than ewes that were unassisted. The monitoring of behavioral patterns, such as lying and contractions, could be used as an alert to the progress of parturition.

**Author Contributions:** Conceptualization, M.J.B. and G.T.; methodology, M.J.B. and G.T.; software, J.M.; validation, J.M.; formal analysis, B.E.W. and M.J.B.; investigation, B.E.W. and M.J.B.; resources, M.J.B.; data curation, M.J.B., K.R.S., Z.J.H. and J.M.; writing—original draft preparation, B.E.W. and M.J.B.; writing—review and editing, B.E.W. and M.J.B.; visualization, J.M.; supervision, M.J.B.; project administration, M.J.B.; funding acquisition, M.J.B. and G.T. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by the Douglas Bomford Trust, the Engineering and Physical Sciences Research Council and the Biotechnology and Biological Sciences Research Council.

**Institutional Review Board Statement:** The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Animal Ethics Committee at the University of Nottingham (approval number 198, 2018).

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** The analyzed datasets are available from the corresponding author on request.

**Conflicts of Interest:** The authors declare no conflict of interest.

## **References**

