
CageView: A Smart Food Control and Monitoring System for Phenotypical Research In Vivo

Mohammad Saeedi, Ali Maddahi, Amir Mahdi Nassiri, Michael Jackson and Kourosh Zareinia
1 Department of Mechanical and Industrial Engineering, Faculty of Engineering and Architectural Science, Ryerson University, Toronto, ON M5B 2K3, Canada
2 Department of Occupational Therapy, College of Rehabilitation Sciences, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, MB R3E 0T6, Canada
3 Department of Computer Science, Faculty of Science, University of Manitoba, Winnipeg, MB R3E 0T6, Canada
4 Small Animal and Materials Imaging Core Facility, Central Animal Care Services, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, MB R3E 0T6, Canada
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(10), 4966; https://doi.org/10.3390/app12104966
Submission received: 1 April 2022 / Revised: 9 May 2022 / Accepted: 11 May 2022 / Published: 14 May 2022

Abstract: The present work introduces an automated and smart system (named CageView) used to monitor a mouse, detect motion, and control access to food in accordance with experimental schedules. We describe the components of the CageView platform and briefly explain how we employed a convolutional neural network to detect and recognize a mouse in real time before presenting the results of a case study. In particular, CageView is a programmable and remotely operable system such that (1) an experimenter at a remote workstation may set up a feeding and fasting schedule that is executed without requiring the physical presence of a staff member, (2) the experimenter can control access to food in real time regardless of the preset schedule, (3) the experimenter has real-time access to a live video feed to assess the mouse, (4) an artificial intelligence system tracks the mouse’s location and physical activity, and (5) a record of activity is kept, which can be displayed as a 2D representation of mouse movement or as a histogram of mouse movement in 15-min blocks for the duration of the experiment.

1. Introduction

Experimental studies on rodents require housing under highly regulated and controlled conditions. A major requirement for all animal housing in experimental facilities is health monitoring, which relies on one or more assessment techniques. One of the most basic techniques in animal health monitoring is a simple assessment of the degree and freedom of animal movement. This is particularly important when an experiment reaches humane endpoints (the point at which an experiment must be terminated to prevent unnecessary suffering) or follows surgical procedures, as reduced physical activity is a valuable surrogate for discomfort in experimental animals. Reduced physical activity can also signal the onset of labor in pregnant rodents, indicating the approximate time of birth, which may be of value in some experiments. While physical monitoring of experimental animals is straightforward during scheduled working hours, the standards of animal care often require additional monitoring outside scheduled working hours.
Scheduling of staff outside regular working hours is also required when a fasting regime is part of an experimental design. Under normal housing conditions, rodents are given access to food ad libitum (i.e., as desired). However, there are many experimental circumstances where food intake must be restricted, such as positron emission tomography (PET) experiments utilizing glucose analogs as a tracer. Such food restrictions must often be initiated outside regularly scheduled work hours.
There are thus many circumstances in which animal experimentation requires scheduling staff outside regular working hours. This requirement can cause staffing difficulties, additional overtime costs, and significant inconvenience. Experimenters need an affordable system that can automate fasting and rodent activity monitoring while allowing real-time remote visual access to a rodent cage.
A number of systems have been described that can be used to control access to food and monitor rodent activity. However, these systems are generally relatively complex, difficult to integrate into normal animal housing, or expensive. For example, Matikainen-Ankney et al. [1] introduced an open source system for monitoring and measuring food intake and motivation in rodent cages. Their system consists of a rotary pellet dispenser for small food pieces, two nose poke sensors for operant behavior, visual and auditory stimuli, and a small screen for experimenter feedback. This device can measure circadian patterns of food intake over multiple days.
Singh et al. [2] offered a behavior-monitoring solution that can be incorporated into available rodent home cages. In their solution, an infrared (IR) Pi Camera module, a wide-angle lens, and IR LEDs are placed on the home cage lid through a custom-made hole at its centroid and sealed with hot glue. The system uses a Raspberry Pi connected to the computer network via either a local area network (LAN) cable or Wi-Fi so that the user can view a live feed of the animal activity in the home cage through a customized web interface. Ingley et al. [3] described an automatic cage surveillance system placed proximate to one or more cages to detect the levels of carbon dioxide and ammonia, the rate of airflow, and the temperature and humidity within the animal cage. Ingley et al. [4] proposed an animal containment system that accommodates an external device for rodent monitoring. Coiro et al. [5] described a system providing automatic remote monitoring and control of the environment of ventilated racks of animal cages. They also incorporated means for querying the status parameters of a ventilated rack, an alerting mechanism that notifies a digital system when certain status parameters are reached, and a receiving subsystem that obtains control input through a wireless communication interface. Tecott and Goulding [6] developed a behavioral monitoring system useful for the analysis of complex behaviors in several animal species. Their system allows continuous monitoring of the feeding, drinking, and movement of animals with high temporal and spatial resolution.
Iannello [7] developed a non-intrusive cage monitoring system based on electrical capacitance sensing technology. The system can provide 24/7 animal activity metrics, including distance walked, average speed, activation density, and occupancy, directly from the home cage while keeping the cages in conventional racks, eliminating the need for dedicated personnel or labs. Hong et al. [8] introduced a high-throughput system for behavior recognition and motion tracking of rats and mice. However, their system requires the home cage to be placed on top of its motion-sensing table to acquire the animal’s behavioral data. Their technology is based on measuring the forces induced by the animal’s movement and uses pattern recognition algorithms to translate the measured forces into understandable behaviors and tracking parameters. Brown et al. [9] introduced a cage-monitoring system based on pyroelectric or passive infrared sensors. Their system can be used for phenotypical measurement of circadian rhythms and sleep in laboratory mice. Flores et al. [10] proposed a non-invasive approach utilizing piezoelectric films for rodent sleep and motion detection. In their system, piezoelectric film strips placed on the floor of the cage produce electrical outputs proportional to the distortion of the strips. As they postulated, the predominant body movement during sleep is associated with breathing, while other motor activities play more important roles while awake. They also developed a pattern recognition system to identify periods of sleep and waking by analyzing the generated signals.
Genewsky et al. [11] introduced a simplified microwave-based motion detector system for home cage activity monitoring. In their system, the wave generator is located next to the mouse cage and emits electromagnetic waves toward it. The reflected waves, with modulated frequencies, are then sent to the processing unit to detect and record the history of the mouse’s motion. Shrestha [12] developed a system based on RFID detection, in which an RFID tag attached to the animal’s body is read at the food access point. This system was able to monitor each individual animal in a group-housed cage with high temporal resolution.
Most of the available solutions for food control and cage monitoring are either too expensive to incorporate into laboratory equipment or lack the flexibility to be incorporated within existing cages. In many cases, they require an alteration in the current structure of the entire cage and laboratory facility. Furthermore, some of the existing methodologies such as electromyography (EMG), electroencephalography (EEG), and radio frequency identification (RFID) are invasive methods and are not ideal for high-throughput screening [13]. Consequently, food control and activity monitoring for animals in biomedical research labs is usually performed by trained technicians and can be both inconvenient and labor-intensive.
In the current article, we introduce a technology named CageView, through which food control and activity monitoring of rodents can be accomplished using an automated, AI-enabled, programmable, and remotely controllable system. This system enables the experimenters to conduct their experiments with greater convenience and increased capacity.

2. Structure, Arrangement, and Mechanisms

In this section, we briefly describe the structure and framework of CageView for food control and activity monitoring. Three components are required for such a system:
  • Programmable food hopper;
  • On-line visual monitoring capability;
  • Motion sensing and tracking system.
CageView accomplishes this with:
  • An actuator for linear displacements of the food access door controlled by a custom-designed interface to set the feeding and fasting schedule;
  • A vision unit with visible and near-infrared cameras and a near mid-infrared LED for day and nighttime monitoring, which forms a video streaming and data transmission system using wireless or wired communication networks;
  • A trained convolutional neural network (CNN) that detects the animal position in the image which enables movement measurement.
Figure 1 presents a schematic representation of CageView and the communication between its different subsystems. As shown in the figure, food access and animal activity data are transmitted from the laboratory location to the experimenter’s desktop application through the communication network. In parallel, the experimenter can send control commands to the laboratory location where the mouse cage is housed.
The placement of the feeding mechanism and the vision system is shown in Figure 2a. The feeding mechanism is mounted on the food-holding portion of a standard wire cage rack and placed in a standard cage. The vision unit is placed next to the plastic cage to detect the animal and monitor and record its activity. Figure 2b shows the feeding mechanism, which can be mounted on or within a mouse cage and comprises a custom-designed prismatic mechanism for opening and closing the food access door. The actuation system embedded in the custom-designed prismatic mechanism generates the necessary inputs to drive the main shaft connected to the food access door, and a processing unit using a custom-designed electronic board is arranged to operate and control the actuation system of the prismatic mechanism. In Figure 2c, the vision unit, which should be placed in proximity of a cage, is shown. The vision unit comprises a Raspberry Pi single-board computer, a visible-near infrared camera system to capture the animal’s image, a sensory system, and a data transmission and video streaming system to send and receive the data, photos, and videos to and from the interface software through a wired or wireless connection, including Wi-Fi or Bluetooth. In the vision unit, the processing unit receives the sensory system data, controls the camera, and analyzes the light level.
The feeding mechanism is designed to be mounted on conventional cages used in most research centers and has a sliding mechanism that can enable or disable the animal’s access to food. The sliding mechanism operates based on a rack-and-pinion configuration attached to a 6-V geared DC motor. The DC motor is commanded by hardware located inside the feeder. The hardware is equipped with a Bluetooth Low Energy (BLE) module and a 10–12-V power supply. The BLE module is responsible for receiving the feeding and fasting schedule from the app and running the DC motor accordingly. Therefore, the feeder mechanism changes the food access according to the schedule set by the experimenter. The experimenter is also able to override the programmed food access by opening or closing the food access door using the built-in override function of the interface program. The interface program also provides the history of the animal’s access to food and allows the food access schedule to be changed conveniently according to the experimenter’s needs.
The vision unit includes a set of light sensors, a camera, and a trained animal detection algorithm based on artificial intelligence or a neural network. The camera may be used for remote monitoring of the animal, which is needed in many applications such as post-surgical monitoring, humane endpoint monitoring, and pregnancy or mating strategy monitoring. The vision unit is connected to the interface program and provides a live video stream of the animal on the experimenter’s computer. Additionally, as is required in many biomedical research studies, the vision unit is able to detect the animal and its location within the cage and provide a history of the animal’s activity, its location, and the required statistics and analytics for the experimenter.
CageView also includes a software interface installed on a desktop computer through which the feeding and fasting schedule is set up, the status of access to food can be changed, the history of food access may be viewed, the animal may be monitored in real time, and the record of the animal’s activity and its metrics may be viewed. Figure 3 shows a sample screenshot of the interface software. It is worth mentioning that the software can be simultaneously connected to any number of feeding mechanisms and vision units without limitation. The software allows the experimenter to set the feeding and fasting schedule for 7 consecutive days in 30-min time windows; if required, however, the experimenter can change the animal’s access to food regardless of the preset schedule. The interface software records the animal’s activity for the entire duration of the experiment with no time limit.
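To make the scheduling concrete, the weekly feeding and fasting plan can be viewed as a 7 × 48 grid of 30-min windows that, together with an optional manual override, determines whether the food access door should be open at any given moment. The following is a minimal sketch of that logic; the data structure, names, and example schedule are our own illustration, not the actual CageView firmware or interface code.

```python
# Minimal sketch (assumed representation, not the actual CageView code): a 7-day feeding
# schedule stored as 7 x 48 booleans (True = food accessible in that 30-min window)
# drives the open/close command for the food access door.
from datetime import datetime

DAYS, SLOTS_PER_DAY = 7, 48  # 48 half-hour windows per day

# Example schedule: food accessible from 6:00 p.m. to 8:00 a.m., fasting during the day.
schedule = [[not (16 <= slot < 36) for slot in range(SLOTS_PER_DAY)] for _ in range(DAYS)]

manual_override = None  # None, "open", or "closed"; set from the interface program


def door_should_be_open(now: datetime) -> bool:
    """Return True if the food access door should be open at time `now`."""
    if manual_override is not None:  # experimenter override beats the preset schedule
        return manual_override == "open"
    slot = now.hour * 2 + (1 if now.minute >= 30 else 0)
    return schedule[now.weekday()][slot]


if __name__ == "__main__":
    print("Door open now?", door_should_be_open(datetime.now()))
```

In the actual system, the resulting open/close command is transmitted over BLE to the feeder hardware, which drives the rack-and-pinion door accordingly.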

3. Trained Convolutional Neural Network

In order to automate the animal activity monitoring step, we utilized a deep learning technique where we collected a dataset of streamed images of mice inside the cages and trained a convolutional neural network (CNN) to detect the mouse in the images. This section briefly discusses the dataset and the CNN model used for object detection.

3.1. Dataset

To collect a dataset covering the various situations that the cage and the mice can present, we captured images across numerous days with different cages and different mice of varying colors (white, black, and brown). We gathered images during all hours of the day to cover the different lighting conditions that the experiment environment might introduce. Overall, we collected a dataset of 27,735 images, of which 24,398 contained a mouse and were manually annotated with bounding boxes. The remaining 3337 images with no mouse were used as background images to aid the training of the network.
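Although the annotation format is not specified above, YOLOv5 (used in Section 3.2) conventionally expects one plain-text label file per image, with one line per object giving the class index and the normalized box center and size, plus a small dataset YAML file. The snippet below is an illustrative example of that convention (file names and values are ours, not from the original dataset); background images simply carry no label file or an empty one.

```
# labels/train/img_000001.txt — one mouse (class 0): x_center y_center width height, normalized to [0, 1]
0 0.512 0.431 0.180 0.095

# mouse.yaml — dataset description consumed by the YOLOv5 training script
train: dataset/images/train
val: dataset/images/val
nc: 1              # one object class
names: ['mouse']
```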

3.2. Mouse Detection Using YOLOv5

YOLOv5 [14] is a modern open-source object detection solution with a set of CNN architectures that perform the task with high accuracy and in real time. For our purposes, we trained the YOLOv5s model, which generated adequate results for our dataset. We tried both transfer learning, by freezing the first 10 layers of the network, and training the network from scratch. As the dataset gradually grew (currently more than 27,000 images), we found that training all the layers of the network from scratch was the better option. Although transfer learning had satisfactory results during training, the model trained from scratch tended to generalize better to new pictures in the real environment. The training converged successfully at around 200 epochs with a precision of 0.9825 and a mean average precision (mAP@0.5) of 0.995. Combined with the satisfactory accuracy of YOLOv5s, the network’s fast real-time execution made it well suited to our monitoring purposes.
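For reference, training in the YOLOv5 repository is driven by its standard command-line script, and trained weights can be loaded for real-time inference via PyTorch Hub. The sketch below illustrates both steps under stated assumptions; the dataset file mouse.yaml, the weight paths, and the hyperparameter values are illustrative, not the exact settings used in this work.

```python
# Training is done with the YOLOv5 repository's own command-line script, e.g. (illustrative):
#   transfer learning: python train.py --img 640 --epochs 200 --data mouse.yaml --weights yolov5s.pt --freeze 10
#   from scratch:      python train.py --img 640 --epochs 200 --data mouse.yaml --weights '' --cfg yolov5s.yaml
# The resulting weights can then be loaded for real-time inference, for example via PyTorch Hub:
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="runs/train/exp/weights/best.pt")


def detect_mouse(frame):
    """Run the detector on one video frame (RGB numpy array) and return [x1, y1, x2, y2, conf, cls] rows."""
    results = model(frame)
    return results.xyxy[0].tolist()  # one row per detected mouse bounding box
```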

3.3. Activity Measurement

With the neural network detecting the bounding box of the mouse in each frame (see Figure 4), the detections then had to be mapped to rodent activity. For that purpose, we used the generated bounding boxes in the following two steps (a minimal code sketch of both steps follows the list):
(1) Activity heatmap: To summarize the recorded movement trajectory of the mouse in a way that illustrates which parts of the cage were visited more often, we first divided the space inside the cage into 1 × 1 cm cells, forming a grid. Once the grid was defined, we calculated the number of times the mouse was detected in each cell, using the center of each bounding box as an estimate of the mouse’s position. Finally, the detection counts were illustrated in an activity heatmap, where the count of each cell is mapped to a color scale: the higher the count, the brighter the color (see Figure 3). The activity heatmap helps researchers better understand how the mouse moves inside the cage during the course of the experiment.
(2) Distance measurement: Although the activity heatmap illustrates the location distribution of the mouse during the experiment, it does not reflect the distance of the recorded movement trajectory. To estimate the distance traveled by the mouse, we computed the difference in position between each pair of consecutive bounding boxes in image pixels. With the cage held at a fixed position relative to the camera and the physical dimensions of the cage known, the CageView app converts the pixel distance into a physical distance in centimeters. The approximate traveled distance is a further indicator of how active the mouse was during the recorded movement trajectory.
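The sketch below illustrates both post-processing steps under simple assumptions: a fixed camera, a 1 × 1 cm grid over the cage floor, and a single pixel-to-centimeter scale factor calibrated from the known cage dimensions. The cage size, scale factor, and function names are illustrative and not taken from the CageView implementation.

```python
# Minimal sketch of the two activity metrics, assuming the camera and cage are fixed so
# that one pixel-to-centimeter scale factor applies across the whole image.
import numpy as np

CAGE_W_CM, CAGE_H_CM = 30, 18   # assumed physical floor dimensions of the cage
CM_PER_PIXEL = 0.05             # assumed scale factor, calibrated from the cage edges


def box_centers(boxes):
    """boxes: array of [x_min, y_min, x_max, y_max] in pixels -> (N, 2) box centers."""
    boxes = np.asarray(boxes, dtype=float)
    return np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                     (boxes[:, 1] + boxes[:, 3]) / 2], axis=1)


def activity_heatmap(centers):
    """Count detections per 1 x 1 cm cell of the cage floor (brighter = visited more often)."""
    grid = np.zeros((CAGE_H_CM, CAGE_W_CM), dtype=int)
    for cx, cy in centers * CM_PER_PIXEL:                 # convert pixels to centimeters
        grid[min(int(cy), CAGE_H_CM - 1), min(int(cx), CAGE_W_CM - 1)] += 1
    return grid


def traveled_distance_cm(centers):
    """Sum of center-to-center displacements between consecutive frames, in centimeters."""
    steps = np.diff(centers, axis=0)                      # per-frame pixel displacements
    return float(np.sum(np.linalg.norm(steps, axis=1)) * CM_PER_PIXEL)


if __name__ == "__main__":
    boxes = [[100, 80, 160, 120], [120, 90, 180, 130], [150, 100, 210, 140]]
    centers = box_centers(boxes)
    print(activity_heatmap(centers).sum(), "detections;",
          round(traveled_distance_cm(centers), 1), "cm traveled")
```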

4. A Case Study

To demonstrate the effectiveness of CageView in conducting experiments, here we considered a common procedure that required fasting and monitoring of mice. This information was taken from a real-life experiment conducted in the Small Animal and Materials Imaging Core Facility at the University of Manitoba.
Positron emission tomography (PET) imaging is a 3D imaging modality requiring the injection of experimental animals with a radioactive tracer. The most common tracer used in PET imaging is fluorodeoxyglucose labeled with fluorine-18 (18F-FDG). This tracer is taken up by cells in the same way as glucose. However, once inside cells, 18F-FDG cannot enter the normal metabolic pathways that glucose enters and cannot be transported out of cells. This leads to an accumulation of 18F-FDG that is dependent upon metabolic activity, making it a useful marker of tumor activity.
Prior to a PET scan, a period of fasting is required to reduce blood glucose levels. This reduces competition between endogenous glucose and 18F-FDG for transport into cells, improving 18F-FDG uptake and thus improving the signal in PET scans. For experimental rodents, this period of fasting is typically 8–10 h. As 18F-FDG has a very short half-life (110 min), it is not feasible to start PET experiments late in the day, as the quantity of activity required to ensure an adequate supply after many hours of decay would be prohibitive. This means that PET imaging is usually started as early as possible in the morning, necessitating that animal fasting commences in the early hours of the morning. Currently, this fasting is achieved by manually removing food from the cage. Because mice are typically scanned singly and sequentially, food removal must also be performed singly and sequentially, with the time between food removals dictated by the duration of the PET scan. This can entail a worker removing food at 12:00 a.m., 1:00 a.m., 2:00 a.m., 3:00 a.m., and so on. A typical experimental workflow is shown in Table 1, assuming that 6 mice are imaged and that 18F-FDG is available at 9:30 a.m.
Assuming 18F-FDG is available at 9:30 a.m., the first mouse can be injected at 10:00 a.m. A 10-h fasting period requires the food to be removed from this mouse at midnight. The animal is then placed on a warming pad at 9:00 a.m. for 1 h (warming reduces 18F-FDG uptake in brown adipose tissue, which would otherwise reduce tumor uptake and produce hot spots on images related to brown adipose tissue). After the warming period, the animal is injected with 18F-FDG, returned to the warm cage for 30 min, and then imaged for 15–30 min. Following imaging, the animal is returned to the designated holding room (a holding room covered by an institutional radiation safety permit), where it must remain for 10 half-lives before it can be returned to normal housing. In the case of 18F-FDG, this in effect requires a 24-h period in the holding room.
As these experiments are time-sensitive, a 30-min window is typically scheduled between scans to accommodate any unexpected problems during imaging, so that delays in one scan do not impact the next scheduled scan. Following this schedule, fasting is initiated at midnight and then every hour until 5:00 a.m. Manually removing food according to this schedule is clearly problematic; CageView controls food access on this schedule without manual intervention.
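The timing arithmetic behind Table 1 can be reproduced directly from the durations described above: mice are injected one hour apart, fasting starts 10 h before injection, warming starts 1 h before injection, scanning starts 30 min after injection, and transfer to the holding room follows 30 min later. The short sketch below is an illustrative reconstruction of that schedule, not part of the CageView software.

```python
# Illustrative reconstruction of the Table 1 schedule: six mice injected one hour apart,
# with fasting 10 h before injection, warming 1 h before, scanning 30 min after injection,
# and transfer to the holding room 30 min after the scan starts.
from datetime import datetime, timedelta

first_injection = datetime(2022, 1, 1, 10, 0)   # 18F-FDG assumed available at 9:30 a.m.

for i in range(6):                              # six mice, scanned sequentially
    inject = first_injection + timedelta(hours=i)
    events = {
        "Fast":    inject - timedelta(hours=10),
        "Warm":    inject - timedelta(hours=1),
        "Inject":  inject,
        "Scan":    inject + timedelta(minutes=30),
        "Holding": inject + timedelta(hours=1),
    }
    row = ", ".join(f"{name} {t:%I:%M %p}" for name, t in events.items())
    print(f"Mouse {i + 1}: {row}, Monitoring: next day")
```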
A further advantage of CageView in such PET experiments is the ability to remotely monitor the activity of a mouse. All animals must be monitored by qualified staff at least once per day. If the experiment described above were to be conducted on a Friday, this necessitates monitoring of the animals in the dedicated holding room on Saturday. Access to this holding room is restricted to staff trained in radiation safety and named on the institutional radiation safety permit associated with the holding room. Such trained staff are generally not available on weekends, so many imaging facilities will not allow PET experiments in which the animal recovers (and therefore requires housing over the weekend) to be conducted on Fridays. Remote access to the live video stream and activity logs allows qualified staff to assess the well-being of the animal from home. Therefore, recovery PET experiments may be conducted on Fridays.

5. Conclusions

A smart food control and monitoring system was introduced that helps experimenters conveniently schedule and control an animal’s access to food and monitor its status. It is a programmable, sensorized, and easy-to-handle system designed for experiments requiring tightly controlled conditions in which the food intake, behavior, and intensity of the animal’s activity must be accurately measured. It eliminates the need for technicians to manually control the feeding and fasting schedule of the animal, and it gives the experimenter remote access for condition control and monitoring.
The feeding mechanism is designed to be mounted on conventional cages used in most research centers and has a sliding door that can enable and disable the animal’s access to food. The mechanism changes the food access according to the schedule set by the experimenter; regardless of the preset schedule, the experimenter can also change the food access by opening and closing the food access door using the built-in override function of the interface program. The vision unit takes advantage of a set of light sensors, a camera, and a trained YOLOv5s CNN real-time object detection model. We can confirm that a modern CNN model can provide satisfactory results (0.9825 precision) when provided with a large enough dataset (~27,000 images). The camera can be used for remote monitoring of the animal, which is needed in many applications, such as post-surgical monitoring, humane endpoint monitoring, and pregnancy monitoring. The vision unit also provides a live video stream of the animal on the experimenter’s computer. In addition, as is required in many biomedical research studies, the vision unit can detect the animal’s location within the cage, provide a history of the animal’s activity, and compile statistics and analytics for the experimenter. The feeding mechanism and the vision unit are connected to the interface program using wired or wireless connections.
In short, the key values of the described technology can be summarized as follows:
  • Eliminating the need for an in-person presence to change the animals’ access to food;
  • No scheduling of staff at irregular hours;
  • Remote visual monitoring;
  • The ability to check on animal well-being at any time (monitoring anywhere);
  • Availability of recorded and logged animal activities for analysis and comparison to other procedures;
  • Automated and reliable food control;
  • Reduced need for safety-trained staff to risk exposure to radiation through remote monitoring;
  • Increased capacity of conducting the experiments on different days of the week.
Future work will focus on describing, validating, and testing the algorithm used to approximate the distance traveled by a mouse, providing more information on the dataset, and comparing the performance of different object detection CNNs for detecting mice inside the cage.

6. Patents

The CageView technology has been disclosed in: Maddahi, Y.; Maddahi, A. Methods and Apparatus for Monitoring, Feeding, and Checking Animals. United States Patent US 63/321,368, 2022.

Author Contributions

Conceptualization, M.S., A.M. and M.J.; formal analysis, M.S. and A.M.; investigation, M.J.; methodology, M.J. and K.Z.; resources, M.J. and K.Z.; software, A.M. and A.M.N.; supervision, K.Z.; validation, all authors; writing—original draft, M.S. and A.M.; writing—review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Tactile Robotics Ltd. for providing in-kind contributions and free sets of CageView for testing and conducting the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
EMG      Electromyography
EEG      Electroencephalography
18F-FDG  18F-fluorodeoxyglucose
GTT      Glucose tolerance test
IR       Infrared
LAN      Local area network
PET      Positron emission tomography
RFID     Radio frequency identification

References

  1. Matikainen-Ankney, B.A.; Earnest, T.; Ali, M.; Casey, E.; Wang, J.G.; Sutton, A.K.; Legaria, A.A.; Barclay, K.M.; Murdaugh, L.B.; Norris, M.R.; et al. An open-source device for measuring food intake and operant behavior in rodent home-cages. Elife 2021, 10, e66173.
  2. Singh, S.; Bermudez-Contreras, E.; Nazari, M.; Sutherland, R.J.; Mohajerani, M.H. Low-cost solution for rodent home-cage behavior monitoring. PLoS ONE 2019, 14, e0220751.
  3. Ingley, H.A.; Hahn, D.W.; Battles, A.H. Microfield Interface Device for Monitoring Animal Cage Environments. U.S. Patent 6,998,980, 14 February 2006.
  4. Ingley, H.A.; Hahn, D.W.; Battles, A.H. Rodent Cage to Accommodate Monitoring Devices. U.S. Patent 7,497,187, 3 March 2009.
  5. Coiro, M.A.; Miller, S.J.; Curtin, D.L. Remote Animal Cage Environmental Monitoring and Control System. U.S. Patent 11/473,171, 8 November 2007.
  6. Tecott, L.H.; Goulding, E. Animal Cage Behavior System. U.S. Patent 7,086,350, 8 August 2006.
  7. Iannello, F. Non-intrusive high throughput automated data collection from the home cage. Heliyon 2019, 5, e01454.
  8. Hong, J.I.; Park, I.Y.; Kim, H.A. Understanding the molecular mechanisms underlying the pathogenesis of arthritis pain using animal models. Int. J. Mol. Sci. 2020, 21, 533.
  9. Brown, L.A.; Hasan, S.; Foster, R.G.; Peirson, S.N. COMPASS: Continuous open mouse phenotyping of activity and sleep status. Wellcome Open Res. 2017, 1, 2.
  10. Flores, A.E.; Flores, J.E.; Deshpande, H.; Picazo, J.A.; Xie, X.; Franken, P.; Heller, H.C.; Grahn, D.A.; O’Hara, B.F. Pattern recognition of sleep in rodents using piezoelectric signals generated by gross body movements. IEEE Trans. Biomed. Eng. 2007, 54, 228–233.
  11. Genewsky, A.; Heinz, D.E.; Kaplick, P.M.; Kilonzo, K.; Wotjak, C.T. A simplified microwave-based motion detector for home cage activity monitoring in mice. J. Biol. Eng. 2017, 11, 36.
  12. Shrestha, Y.B. System for Monitoring Feeding Behavior of Each Individual Animal in a Group-Housed Cage. U.S. Patent 16/036,901, 17 January 2019.
  13. Pack, A.I.; Galante, R.J.; Maislin, G.; Cater, J.; Metaxas, D.; Lu, S.; Zhang, L.; Smith, R.V.; Kay, T.; Lian, J.; et al. Novel method for high-throughput phenotyping of sleep in mice. Physiol. Genom. 2007, 2, 232–238.
  14. Jocher, G.; Stoken, A.; Chaurasia, A.; Borovec, J.; Kwon, Y.; Michael, K.; Liu, C.; Fang, J.; Abhiram, V.; Skalski, S.P.; et al. Ultralytics/yolov5: v6.0—YOLOv5n ‘Nano’ Models, Roboflow Integration, TensorFlow Export, OpenCV DNN Support. Zenodo 2021.
Figure 1. Schematic representation of CageView, including the system, tested in the laboratory of the University of Manitoba Department of Animal Science (right box), schematic of the communication system (middle box), and the experimenter’s desktop-based site in the remote location (left box).
Figure 2. (a) Overall scheme of the apparatus that consists of the plastic frame, feeding mechanism, vision unit, food container, and power system. (b) Feeding mechanism and (c) vision unit mounted on and placed next to the mouse cage, respectively.
Figure 3. Sample screenshot of the interface software of the desktop application showing the live stream of the cage, feeding and fasting schedule, mouse trajectory, history of the animal’s access to food, and the experimenter’s options to open or close the food access door.
Figure 4. Sample outputs generated by the neural network. The confidence level of the network is shown for each detected mouse bounding box. The top left image shows a white mouse, while the rest show black mice. Different lighting conditions are also illustrated.
Table 1. A typical experimental workflow for PET scan of a group of 6 mice.

            Mouse 1      Mouse 2      Mouse 3      Mouse 4      Mouse 5      Mouse 6
Fast        12:00 a.m.   01:00 a.m.   02:00 a.m.   03:00 a.m.   04:00 a.m.   05:00 a.m.
Warm        09:00 a.m.   10:00 a.m.   11:00 a.m.   12:00 p.m.   1:00 p.m.    2:00 p.m.
Inject      10:00 a.m.   11:00 a.m.   12:00 p.m.   1:00 p.m.    2:00 p.m.    3:00 p.m.
Scan        10:30 a.m.   11:30 a.m.   12:30 p.m.   1:30 p.m.    2:30 p.m.    3:30 p.m.
Holding     11:00 a.m.   12:00 p.m.   1:00 p.m.    2:00 p.m.    3:00 p.m.    4:00 p.m.
Monitoring  Next day     Next day     Next day     Next day     Next day     Next day

