Computer Methods for Automatic Locomotion and Gesture Tracking in Mice and Small Animals for Neuroscience Applications: A Survey
Abstract
1. Introduction
2. Problem Statement
- Spatial resolution in most consumer-grade video cameras is insufficient for effective tracking once the temporal resolution increases; cameras typically raise the frame rate by lowering the image resolution.
- Limbs may move quickly at one instant and remain stationary at the next, which makes it impossible to build a single uniform motion model (see the Kalman-filter sketch after this list).
- Limbs may overlap with each other or with other body parts, producing occlusions.
- Some settings require specific lighting conditions, which may make automated gesture recognition more difficult.
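To make the second difficulty concrete, the following is a minimal Python sketch (not taken from any surveyed system) of a constant-velocity Kalman filter for a single 2-D keypoint such as a paw; the frame rate and noise parameters are assumed values. Because limb motion alternates between rapid swings and stationary stance phases, the constant-velocity assumption encoded in F is repeatedly violated, which is exactly why a single uniform motion model is insufficient.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for one 2-D keypoint (e.g., a paw).
# State x = [px, py, vx, vy]; dt, q (process noise) and r (measurement noise)
# are illustrative values, not taken from any surveyed method.
dt, q, r = 1 / 30.0, 1e-2, 1.0                       # 30 fps video assumed
F = np.array([[1, 0, dt, 0],                          # constant-velocity dynamics
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
H = np.array([[1, 0, 0, 0],                           # only the position is observed
              [0, 1, 0, 0]])
Q, R = q * np.eye(4), r * np.eye(2)

def track(measurements):
    """Filter a sequence of (px, py) detections (None when occluded)."""
    x, P = np.zeros(4), np.eye(4) * 10.0
    out = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        if z is not None:                             # update only when the paw is visible
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.asarray(z) - H @ x)
            P = (np.eye(4) - K @ H) @ P
        out.append(x[:2].copy())
    return out
```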
3. Motion Tracking Principles in Videos
3.1. Background Subtraction-Based Approaches
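As a brief illustration of this family of methods, the following is a minimal Python/OpenCV sketch (a generic example, not the pipeline of any surveyed system) that models the static arena, extracts the foreground animal, and records its centroid per frame; the video file name and the blob-area threshold are placeholders.

```python
import cv2
import numpy as np

# Minimal background-subtraction tracker: model the static arena with a MOG2
# background, keep the largest moving blob as the animal, and record its
# centroid per frame. "arena.avi" and the 500-pixel area threshold are
# illustrative placeholders.
cap = cv2.VideoCapture("arena.avi")
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=False)

centroids = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                                            # foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 500]         # drop small noise blobs
    if blobs:
        m = cv2.moments(max(blobs, key=cv2.contourArea))
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
cap.release()
```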
3.2. Statistical and Learning-Based Approaches
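As one concrete instance of a statistical tracking approach, the following sketch applies mean-shift tracking with a hue-histogram appearance model (in the spirit of [33]); the video path and the initial bounding box around the animal are assumed placeholder values, not parameters from any surveyed method.

```python
import cv2

# Mean-shift tracking with a hue-histogram appearance model: build a color
# histogram of the animal once, then shift a window toward the mode of the
# back-projected likelihood in every frame. Video path and initial box are
# placeholders.
cap = cv2.VideoCapture("arena.avi")
ok, frame = cap.read()
x, y, w, h = 150, 100, 60, 40                         # assumed initial box around the animal
roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

track_window = (x, y, w, h)
positions = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)    # per-pixel likelihood
    _, track_window = cv2.meanShift(back_proj, track_window, term)    # shift to density mode
    positions.append(track_window[:2])
cap.release()
```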
4. Major Trends
- Human gait parameters and motion patterns are inherently different from those of four-legged laboratory animals such as rodents. Human gait can be represented almost entirely by an inverted pendulum model, whereas in rats and other quadrupeds the inverted pendulum model accounts for only part of the gait (up to 70% according to some researchers; see the equation after this list). Moreover, the degrees of freedom of human gait differ from those of four-legged animals [44,45,46,47].
- Gesture tracking techniques developed for humans are mostly optimized for the environments in which humans dwell; therefore, they cannot be directly transferred to laboratory environments.
- Human subjects do not need to be trained to perform a supervised task. For example, a neuroscientist who wants to investigate the effect of a certain neurophysiological regime on physical activity can simply ask the test subject to walk or exercise. The same cannot be said for rodents and other small animals: they must first be trained, for example on a treadmill, so tracking methods developed for humans may not reach the same efficiency for rodents.
- For reliable behavioral phenotyping, gesture tracking/pose estimation must be highly accurate; such methods therefore often need additional fine-tuning.
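For reference, the inverted-pendulum description of gait mentioned in the first bullet reduces the stance phase to a point mass vaulting over a rigid leg; a standard textbook formulation (not specific to any cited study) is:

```latex
% Point mass m on a rigid leg of length \ell; \theta is the leg angle from vertical.
% During stance, kinetic and gravitational potential energy exchange out of phase,
% which captures most of human walking but only part of quadrupedal gait.
\ddot{\theta} = \frac{g}{\ell}\,\sin\theta, \qquad
E_{\mathrm{mech}} = \tfrac{1}{2}\, m \ell^{2} \dot{\theta}^{2} + m g \ell \cos\theta
```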
5. Hardware-Based Methods
6. Video Tracking Aided by Hardware
6.1. Semi-Automated
6.2. Completely Automated
6.2.1. Background Subtraction-Based Approaches
6.2.2. Statistical/Learning-Based Approaches
7. Video Tracking Methods Mostly Dependent on Software-Based Tracking
7.1. Semi-Automated
Background Subtraction-Based Approaches
7.2. Completely Automated
7.2.1. Background Subtraction-Based Approaches
7.2.2. Statistical/Learning-Based Approaches
8. Applications
- Research on behavioral phenotyping needs large volumes of annotated data to understand and classify rodent and animal behaviors. By examining the current state of the art in gesture tracking/pose estimation, a researcher working on behavioral phenotyping can choose the method most suitable to their needs [6].
- Research on depression analysis in normal and transgenic mice and other animals can also benefit from this survey: reliable quantification of depression requires understanding animal behavior, and once behavior is quantified in terms of pose, locomotion, and gait patterns, researchers can study how it changes in response to genetic mutations. Scientists can now breed genetically altered mice ("transgenic mice") that carry genes similar to those that cause human diseases; likewise, selected genes can be turned off or made inactive to create "knockout mice", which, according to the Foundation for Biomedical Research (FBR), can be used to evaluate the effects of cancer-causing chemicals (carcinogens) and to assess drug safety [123,124,125,126,127,128,129].
- Research on the effects of drugs and cancer on locomotion can also benefit from the methods described in this survey, since gesture tracking/pose estimation can be used to quantify changes in the locomotion patterns of rats and other animals in response to tumors and drugs [134,135,136,137,138,139,140,141,142].
- Researchers working on systems biology can also benefit from the methods described in this survey: with a proper gesture/gait analysis method, they can obtain numerical evidence for understanding the behavior of biological systems under different physical and pathological conditions [153].
9. Conclusions
Future Research
- One of the most relevant shortcomings of the field is the lack of public databases for validating new algorithms. Different approaches are tested on the (usually private) data of the lab developing the solution. Building a standardized gesture tracking dataset to serve as a benchmark would benefit the community in the same way that large object recognition databases (PASCAL, ImageNet, or MS COCO) enabled significant progress in the computer vision literature.
- Large amounts of unlabeled data (thousands of video hours) already exist. Using unsupervised learning algorithms to aid the parameter learning of supervised methods is one of the most challenging future research lines. Since unsupervised and weakly supervised gesture tracking/pose estimation is being researched for other species, extending it to rodents and small animals would make these large volumes of unlabeled data useful [154,155,156,157,158,159].
- Data augmentation using synthetic samples. Methods based on generative adversarial networks (GANs) are currently obtaining remarkable results in computer vision, and GANs can help generate large amounts of annotated training data (see the GAN sketch after this list). The synthetic annotated data can be used to validate gesture tracking/pose estimation techniques for rodents and small animals, and those techniques can then be fine-tuned with a small set of human-annotated data.
- Combining hardware-based methods with markers to create large-scale databases for subsequent automated learning from images alone. Until now, physical markers and specialized hardware have been used only in specific settings; with careful data acquisition they could generate large volumes of annotated data, since the markers can be reliably tracked by the specialized hardware.
- In addition, semi-supervised and weakly supervised learning algorithms could benefit the community. The challenge in this case is to minimize user intervention (supervision) while maximizing the gain in accuracy.
- Very few of the approaches surveyed in the software-based methods section consider temporal coherence when developing a solution for gesture tracking/pose estimation of rodents and small animals. Since locomotion is temporally coherent, machine learning methods such as Long Short-Term Memory (LSTM) networks can be trained efficiently to track a rodent's pose from its recent pose history (see the LSTM sketch after this list).
- Finally, deep learning methods have been shown to outperform classical approaches in many computer vision tasks, for instance in gesture tracking/pose estimation for humans. Exploring these validated approaches can increase the reliability of gesture tracking/pose estimation in rodents and small animals [160,161].
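Regarding the data-augmentation bullet above, the following is a minimal PyTorch sketch of GAN training for synthesizing pose vectors; the network sizes, the 10-keypoint layout, and the training loop are illustrative assumptions rather than a validated recipe.

```python
import torch
import torch.nn as nn

# Toy GAN that synthesizes 2-D rodent pose vectors (10 keypoints -> 20 values).
# Architecture sizes and the keypoint layout are illustrative assumptions.
Z, P = 16, 20
G = nn.Sequential(nn.Linear(Z, 64), nn.ReLU(), nn.Linear(64, P))
D = nn.Sequential(nn.Linear(P, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_poses):                     # real_poses: (batch, 20) annotated poses
    b = real_poses.size(0)
    fake = G(torch.randn(b, Z))

    # Discriminator: real poses -> 1, generated poses -> 0
    loss_d = bce(D(real_poses), torch.ones(b, 1)) + bce(D(fake.detach()), torch.zeros(b, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: try to fool the discriminator
    loss_g = bce(D(fake), torch.ones(b, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```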
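Regarding the temporal-coherence bullet, the following is a minimal PyTorch sketch of an LSTM that predicts the pose in the next frame from a short pose history; the dimensions and the one-step-ahead objective are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

# Predict the next pose (10 keypoints, x/y) from the preceding T poses.
# Hidden size and the one-step-ahead objective are illustrative assumptions.
class PoseLSTM(nn.Module):
    def __init__(self, n_kp=10, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2 * n_kp, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2 * n_kp)

    def forward(self, pose_seq):                # pose_seq: (batch, T, 20)
        out, _ = self.lstm(pose_seq)
        return self.head(out[:, -1])            # predicted pose at frame T+1

model = PoseLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

seq = torch.randn(8, 16, 20)                    # stand-in for 16-frame pose histories
target = torch.randn(8, 20)                     # stand-in for the frame that follows
loss = loss_fn(model(seq), target)
opt.zero_grad(); loss.backward(); opt.step()
```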
Author Contributions
Funding
Conflicts of Interest
Appendix A. Summary of Selected Approaches
Reference | Type | Code Availability | Performance | Real Time or Offline | Needs Specialized Setup & Invasiveness
---|---|---|---|---|---
[59] | Commercial | Paid | Comparison with ground truth not provided. One paper reports the reproducibility: 2.65% max SD | Yes | Yes |
[71] | Commercial | Paid | Comparison with ground truth not provided. One paper reports the reproducibility: 1.57% max SD | Yes | Yes |
[49] | Research | data and code for demo available at http://bit.do/eTTai | tracking performance not reported, behavioral classification of 12 traits reported to be max at 71% | Tracking real time, classification offline | yes |
[51] | Research | not available | tracking: SD of only 0.034% when compared with ground truth, Max SD of 1.71 degrees in estimating joint angle | real time legs and joints tracking | yes, invasive |
[52] | Research | not available | tracking performance not reported explicitly | real time whisker tracking | yes, semi-invasive |
[53] | Research | available on request | whisker tracking performance not reported explicitly | real time single whisker tracking | yes, semi-invasive |
[54] | Research | not available | head motion tracked correctly with a max false positive of 13% | real time head and snout tracking | yes, semi-invasive |
[55] | Research | not available | head motion tracked continuously with a reported SD of only 0.5 mm | real time head and snout tracking | yes, semi-invasive |
[57] | Research | not available | head motion tracked with an accuracy of 96.3% and the tracking can be reproduced over multiple studies with a correlation coefficient of 0.78 | real time head tracking | yes, semi-invasive |
[89] | Research | code and demo data available at https://goo.gl/vYaYPy | they reported a correlation between whisking amplitude and velocity as a measure of reliability, R = 0.89 | Offline head and whisker tracking | no, invasive |
[90] | Research | not available | Tracking and gait prediction with confidence of 95%, deviation between human annotator and computer at 8% | Offline | yes, semi-invasive |
[93] | Research | not available | Paw tracked with an accuracy of 88.5% on a transparent floor and 83.2% on an opaque floor | Offline | yes, semi-invasive
[95] | Research | code available at https://goo.gl/58DQij | tail and paws tracked with an accuracy >90% | Real time | yes, semi-invasive |
[97] | Research | not available | 5-class behavioral classification problem, accuracy of 95.34% in bright conditions and 89.4% in dark conditions | offline | yes, non-invasive
[100] | Research | not available | 6 behavioral class accuracy: 66.9%, 4 behavioral class accuracy: 76.3% | offline | yes, non-invasive |
[98] | Research | code available at https://goo.gl/eY2Yza | whisker detection rate: 76.9%, peak spatial error in whisker detection: 10 pixels | offline | yes, non-invasive |
[101] | Research | not available | Peak deviation between human annotator and automated annotation: 0.5 mm with a camera of 6 pixel/mm resolution | offline | yes, non-invasive |
[92] | Research | not available | Tracking accuracy >90% after the algorithm was assisted by human users in 3–5% of the frames | offline | yes, semi-invasive |
[105] | Research | code available at https://goo.gl/Gny89o | A max deviation of 17.7% between human and automated whisker annotation | offline | yes, non-invasive |
[107] | Research | not available | Maximum paw detection error: 5.9%, minimum error: 0.4% | offline | no, non-invasive
[110] | Research | Source code at https://goo.gl/zesyez, demo data at https://goo.gl/dn2L3y | Behavioral classification: 1% false positive rate | offline | no, semi-invasive |
[108] | Research | Source code available at https://goo.gl/JCv3AV | Whisker tracing accuracy: max error of 0.45 pixels | offline | no, non-invasive |
[116] | Research | not available | Correlation with annotated data; for whiskers r = 0.78, for limbs r = 0.85 | real time | no, non-invasive |
[109] | Research | code available at https://goo.gl/V54mpL | Velocity calculated by AGATHA was off from manually calculated velocity by 1.5% | real time | no, non-invasive |
[117] | Research | code available at http://bit.ly/2vgJUbr | Detected pose matched ground truth with an accuracy of pixels | real time on GPUs | no, non-invasive |
[119] | Research | code available at https://bit.ly/2XuJmPv | No performance metric reported | offline | no, non-invasive |
References
- Deori, B.; Thounaojam, D.M. A Survey on Moving Object Tracking in Video. Int. J. Inf. Theory 2014, 3, 31–46. [Google Scholar] [CrossRef]
- Yilmaz, A.; Javed, O.; Shah, M. Object tracking: A survey. ACM Comput. Surv. (CSUR) 2006, 38, 13. [Google Scholar] [CrossRef]
- Wei, J.; Yang, M.; Liu, F. Learning Spatio-Temporal Information for Multi-Object Tracking. IEEE Access 2017, 5, 3869–3877. [Google Scholar] [CrossRef]
- Kratz, L.; Nishino, K. Tracking with local spatio-temporal motion patterns in extremely crowded scenes. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; IEEE: Piscataway, NJ, USA, 2010. [Google Scholar]
- Teng, Z.; Xing, J.; Wang, Q.; Lang, C.; Feng, S.; Jin, Y. Robust object tracking based on temporal and spatial deep networks. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017. [Google Scholar]
- Sousa, N.; Almeida, O.F.X.; Wotjak, C.T. A hitchhiker’s guide to behavioral analysis in laboratory rodents. Genes Brain Behav. 2006, 5, 5–24. [Google Scholar] [CrossRef] [PubMed]
- Crawley, J.N. Behavioral phenotyping of rodents. Comp. Med. 2003, 53, 140–146. [Google Scholar] [PubMed]
- Crawley, J.N. Behavioral phenotyping of transgenic and knockout mice: Experimental design and evaluation of general health, sensory functions, motor abilities, and specific behavioral tests. Brain Res. 1999, 835, 18–26. [Google Scholar] [CrossRef]
- Crawley, J.N. What’s Wrong with My Mouse? Behavioral Phenotyping of Transgenic and Knockout Mice; John Wiley & Sons: Hoboken, NJ, USA, 2007. [Google Scholar]
- Tekalp, A.M. Digital Video Processing; Prentice Hall PTR: Upper Saddle River, NJ, USA, 1995; Volume 1. [Google Scholar]
- Bovik, A.C. Handbook of Image and Video Processing; Academic Press: Cambridge, MA, USA, 2010. [Google Scholar]
- Borst, A.; Egelhaaf, M. Principles of visual motion detection. Trends Neurosci. 1989, 12, 297–306. [Google Scholar] [CrossRef] [Green Version]
- Hu, W.; Tan, T.; Wang, L.; Maybank, S. A survey on visual surveillance of object motion and behaviors. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2004, 34, 334–352. [Google Scholar] [CrossRef]
- Welch, G.; Foxlin, E. Motion tracking survey. IEEE Comput. Graph. Appl. 2002, 22, 24–38. [Google Scholar] [CrossRef]
- De-gui, X.; Sheng-sheng, Y.; Jing-li, Z. Motion tracking with fast adaptive background subtraction. Wuhan Univ. J. Nat. Sci. A 2003, 8, 35–40. [Google Scholar]
- Zhang, R.; Ding, J. Object tracking and detecting based on adaptive background subtraction. Procedia Eng. 2012, 29, 1351–1355. [Google Scholar] [CrossRef]
- Saravanakumar, S.; Vadivel, A.; Ahmed, C.G.S. Multiple human object tracking using background subtraction and shadow removal techniques. In Proceedings of the 2010 International Conference on Signal and Image Processing (ICSIP), Chennai, India, 15–17 December 2010; IEEE: Piscataway, NJ, USA, 2010. [Google Scholar]
- Kim, I.; Awan, T.W.; Soh, Y. Background subtraction-based multiple object tracking using particle filter. In Proceedings of the 2014 International Conference on Systems, Signals and Image Processing (IWSSIP), Dubrovnik, Croatia, 12–15 May 2014; IEEE: Piscataway, NJ, USA, 2014. [Google Scholar]
- Zhang, L.; Liang, Y. Motion human detection based on background subtraction. In Proceedings of the 2010 Second International Workshop on Education Technology and Computer Science (ETCS), Wuhan, China, 6–7 March 2010; IEEE: Piscataway, NJ, USA, 2010; Volume 1. [Google Scholar]
- Shuigen, W.; Zhen, C.; Hua, D. Motion detection based on temporal difference method and optical flow field. In Proceedings of the Second International Symposium on Electronic Commerce and Security (ISECS’09), Nanchang, China, 22–24 May 2009; IEEE: Piscataway, NJ, USA, 2009; Volume 2. [Google Scholar]
- Singla, N. Motion detection based on frame difference method. Int. J. Inf. Comput. Technol. 2014, 4, 1559–1565. [Google Scholar]
- Lu, N.; Wang, J.; Wu, Q.H.; Yang, L. An Improved Motion Detection Method for realtime Surveillance. IAENG Int. J. Comput. Sci. 2008, 35. [Google Scholar] [CrossRef]
- Jing, G.; Siong, C.E.; Rajan, D. Foreground motion detection by difference-based spatial temporal entropy image. In Proceedings of the 2004 IEEE Region 10 Conference TENCON 2004, Chiang Mai, Thailand, 24 November 2004; IEEE: Piscataway, NJ, USA, 2004. [Google Scholar]
- Shaikh, S.H.; Saeed, K.; Chaki, N. Moving Object Detection Approaches, Challenges and Object Tracking. Moving Object Detection Using Background Subtraction; Springer: Cham, Switzerland, 2014; pp. 5–14. [Google Scholar]
- Denzler, J.; Schless, V.; Paulus, D.; Niemann, H. Statistical approach to classification of flow patterns for motion detection. In Proceedings of the International Conference on Image Processing 1996, Lausanne, Switzerland, 19 September 1996; IEEE: Piscataway, NJ, USA, 1996; Volume 1. [Google Scholar]
- Paragios, N.; Tziritas, G. Adaptive detection and localization of moving objects in image sequences. Signal Process. Image Commun. 1999, 14, 277–296. [Google Scholar] [CrossRef]
- Hu, W.; Xiao, X.; Fu, Z.; Xie, D.; Tan, T.; Maybank, S. A system for learning statistical motion patterns. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 1450–1464. [Google Scholar] [PubMed] [Green Version]
- El Abed, A.; Dubuisson, S.; Béréziat, D. Comparison of statistical and shape-based approaches for non-rigid motion tracking with missing data using a particle filter. In International Conference on Advanced Concepts for Intelligent Vision Systems; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Chellappa, R.; Sankaranarayanan, A.C.; Veeraraghavan, A.; Turaga, P. Statistical methods and models for video-based tracking, modeling, and recognition. Found. Trends Signal Process. 2010, 3, 1–151. [Google Scholar] [CrossRef]
- Paragios, N.; Deriche, R. Geodesic active contours and level sets for the detection and tracking of moving objects. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 266–280. [Google Scholar] [CrossRef]
- Pless, R.; Brodsky, T.; Aloimonos, Y. Detecting independent motion: The statistics of temporal continuity. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 768–773. [Google Scholar] [CrossRef]
- Isard, M. Visual Motion Analysis by Probabilistic Propagation of Conditional Density. Ph.D. Thesis, University of Oxford, Oxford, UK, 1998. [Google Scholar]
- Comaniciu, D.; Meer, P. Mean shift: A robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 603–619. [Google Scholar] [CrossRef]
- Lucas, B.D.; Kanade, T. An iterative image registration technique with an application to stereo vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence, Vancouver, BC, Canada, 24–28 August 1981; pp. 674–679. [Google Scholar]
- Horn, B.K.P.; Schunck, B.G. Determining optical flow. Artif. Intell. 1981, 17, 185–203. [Google Scholar] [CrossRef] [Green Version]
- Wixson, L. Detecting salient motion by accumulating directionally consistent flow. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 774–780. [Google Scholar] [CrossRef]
- Shafie, A.A.; Hafiz, F.; Ali, M.H. Motion detection techniques using optical flow. World Acad. Sci. Eng. Technol. 2009, 56, 559–561. [Google Scholar]
- Aslani, S.; Mahdavi-Nasab, H. Optical flow based moving object detection and tracking for traffic surveillance. Int. J. Electr. Comput. Energ. Electron. Commun. Eng. 2013, 7, 1252–1256. [Google Scholar]
- Barron, J.L.; Fleet, D.J.; Beauchemin, S.S. Performance of optical flow techniques. Int. J. Comput. Vis. 1994, 12, 43–77. [Google Scholar] [CrossRef]
- Sun, S.; Kuang, Z.; Sheng, L.; Ouyang, W.; Zhang, W. Optical Flow Guided Feature: A Fast and Robust Motion Representation for Video Action Recognition. arXiv 2017, arXiv:1711.11152. [Google Scholar]
- Li, P.; Wang, D.; Wang, L.; Lu, H. Deep visual tracking: Review and experimental comparison. Pattern Recognit. 2018, 76, 323–338. [Google Scholar] [CrossRef]
- Wang, N.; Yeung, D.Y. Learning a deep compact image representation for visual tracking. In Proceedings of the Advances in Neural Information Processing System, Lake Tahoe, NV, USA, 5–10 December 2013. [Google Scholar]
- Feng, X.; Mei, W.; Hu, D. A Review of Visual Tracking with Deep Learning. Adv. Intell. Syst. Res. 2016, 133, 231–234. [Google Scholar] [CrossRef]
- Irschick, D.J.; Jayne, B.C. Comparative three-dimensional kinematics of the hindlimb for high-speed bipedal and quadrupedal locomotion of lizards. J. Exp. Biol. 1999, 202, 1047–1065. [Google Scholar] [PubMed]
- Alexander, R.M. The gaits of bipedal and quadrupedal animals. Int. J. Robot. Res. 1984, 3, 49–59. [Google Scholar] [CrossRef]
- Berillon, G.; Daver, G.; D’août, K.; Nicolas, G.; de La Villetanet, B.; Multon, F.; Digrandi, G.; Dubreuil, G. Bipedal versus quadrupedal hind limb and foot kinematics in a captive sample of Papio anubis: Setup and preliminary results. Int. J. Primatol. 2010, 31, 159–180. [Google Scholar] [CrossRef]
- Liu, Y.; Ao, L.J.; Lu, G.; Leong, E.; Liu, Q.; Wang, X.H.; Zhu, X.L.; Sun, T.F.D.; Fei, Z.; Jiu, T. Quantitative gait analysis of long-term locomotion deficits in classical unilateral striatal intracerebral hemorrhage rat model. Behav. Brain Res. 2013, 257, 166–177. [Google Scholar] [CrossRef] [PubMed]
- Available online: https://nc3rs.org.uk/crackit/locowhisk-quantifying-rodent-locomotion-behaviours (accessed on 24 July 2019).
- Kain, J.; Stokes, C.; Gaudry, Q.; Song, X.; Foley, J.; Wilson, R.; De Bivort, B. Leg-tracking and automated behavioural classification in Drosophila. Nat. Commun. 2013, 4, 1910. [Google Scholar] [CrossRef] [PubMed]
- Roy, S.; Bryant, J.L.; Cao, Y.; Heck, D.H. High-precision, three-dimensional tracking of mouse whisker movements with optical motion capture technology. Front. Behav. Neurosci. 2011, 5, 27. [Google Scholar] [CrossRef] [PubMed]
- Tashman, S.; Anderst, W. In-vivo measurement of dynamic joint motion using high speed biplane radiography and CT: Application to canine ACL deficiency. J. Biomech. Eng. 2003, 125, 238–245. [Google Scholar] [CrossRef] [PubMed]
- Harvey, M.A.; Bermejo, R.; Zeigler, H.P. Discriminative whisking in the head-fixed rat: Optoelectronic monitoring during tactile detection and discrimination tasks. Somatosens. Mot. Res. 2001, 18, 211–222. [Google Scholar] [PubMed]
- Bermejo, R.; Houben, D.; Zeigler, H.P. Optoelectronic monitoring of individual whisker movements in rats. J. Neurosci. Methods 1998, 83, 89–96. [Google Scholar] [CrossRef]
- Kyme, A.; Meikle, S.; Baldock, C.; Fulton, R. Tracking and characterizing the head motion of unanaesthetized rats in positron emission tomography. J. R. Soc. Interface 2012, 9, 3094–3107. [Google Scholar] [CrossRef] [PubMed]
- Kyme, A.; Zhou, V.; Meikle, S.; Fulton, R. Realtime 3D motion tracking for small animal brain PET. Phys. Med. Biol. 2008, 53, 2651–2666. [Google Scholar] [CrossRef]
- Kyme, A.Z.; Zhou, V.W.; Meikle, S.R.; Baldock, C.; Fulton, R.R. Optimised motion tracking for positron emission tomography studies of brain function in awake rats. PLoS ONE 2011, 6, E21727. [Google Scholar] [CrossRef]
- Pasquet, M.O.; Tihy, M.; Gourgeon, A.; Pompili, M.N.; Godsil, B.P.; Léna, C.; Dugué, G.P. Wireless inertial measurement of head kinematics in freely-moving rats. Sci. Rep. 2016, 6, 35689. [Google Scholar] [CrossRef]
- Hamers, F.P.; Lankhorst, A.J.; van Laar, T.J.; Veldhuis, W.B.; Gispen, W.H. Automated quantitative gait analysis during overground locomotion in the rat: Its application to spinal cord contusion and transection injuries. J. Neurotrauma 2001, 18, 187–201. [Google Scholar] [CrossRef] [PubMed]
- Dorman, C.W.; Krug, H.E.; Frizelle, S.P.; Funkenbusch, S.; Mahowald, M.L. A comparison of DigiGait™ and TreadScan™ imaging systems: Assessment of pain using gait analysis in murine monoarthritis. J. Pain Res. 2014, 7, 25. [Google Scholar] [PubMed]
- Xiao, J.; Vemula, S.R.; Xue, Y.; Khan, M.M.; Kuruvilla, K.P.; Marquez-Lona, E.M.; Cobb, M.R.; LeDoux, M.S. Motor phenotypes and molecular networks associated with germline deficiency of Ciz1. Exp. Neurol. 2016, 283, 110–120. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Connell, J.W.; Allison, R.; Reid, E. Quantitative gait analysis using a motorized treadmill system sensitively detects motor abnormalities in mice expressing ATPase defective spastin. PLoS ONE 2016, 11, E0152413. [Google Scholar] [CrossRef] [PubMed]
- Sashindranath, M.; Daglas, M.; Medcalf, R.L. Evaluation of gait impairment in mice subjected to craniotomy and traumatic brain injury. Behav. Brain Res. 2015, 286, 33–38. [Google Scholar] [CrossRef] [PubMed]
- Neckel, N.D. Methods to quantify the velocity dependence of common gait measurements from automated rodent gait analysis devices. J. Neurosci. Methods 2015, 253, 244–253. [Google Scholar] [CrossRef] [Green Version]
- Lambert, C.; Philpot, R.; Engberg, M.; Johns, B.; Kim, S.; Wecker, L. Gait analysis and the cumulative gait index (CGI): Translational tools to assess impairments exhibited by rats with olivocerebellar ataxia. Behav. Brain Res. 2014, 274, 334–343. [Google Scholar] [CrossRef] [Green Version]
- Takano, M.; Komaki, Y.; Hikishima, K.; Konomi, T.; Fujiyoshi, K.; Tsuji, O.; Okano, H.; Toyama, Y.; Nakamura, M. In vivo tracing of neural tracts in tiptoe-walking yoshimura mice by diffusion tensor tractography. In Neuroprotection and Regeneration of the Spinal Cord; Springer: Tokyo, Japan, 2014; pp. 107–117. [Google Scholar]
- Hampton, T.G.; Amende, I. Treadmill gait analysis characterizes gait alterations in Parkinson’s disease and amyotrophic lateral sclerosis mouse models. J. Mot. Behav. 2009, 42, 1–4. [Google Scholar] [CrossRef]
- Beare, J.E.; Morehouse, J.R.; DeVries, W.H.; Enzmann, G.U.; Burke, D.A.; Magnuson, D.S.; Whittemore, S.R. Gait analysis in normal and spinal contused mice using the TreadScan system. J. Neurotrauma 2009, 26, 2045–2056. [Google Scholar] [CrossRef]
- Gellhaar, S.; Marcellino, D.; Abrams, M.; Galter, D. Chronic L-DOPA induces hyperactivity, normalization of gait and dyskinetic behavior in MitoPark mice. Genes Brain Behav. 2015, 14, 260–270. [Google Scholar] [CrossRef]
- Beare, J. Kinematic Analysis of Treadmill Walking in Normal and Contused Mice Using the TreadScan System; University of Louisville: Louisville, KY, USA, 2007. [Google Scholar]
- McMackin, M.Z.; Henderson, C.K.; Cortopassi, G.A. Neurobehavioral deficits in the KIKO mouse model of Friedreich’s ataxia. Behav. Brain Res. 2017, 316, 183–188. [Google Scholar] [CrossRef] [PubMed]
- Available online: http://cleversysinc.com/CleverSysInc/csi_products/gaitscan/ (accessed on 24 July 2019).
- Adamah-Biassi, E.B.; Stepien, I.; Hudson, R.L.; Dubocovich, M.L. Automated Video Analysis System Reveals Distinct Diurnal Behaviors in C57BL/6 and C3H/HeN Mice. Behav. Brain Res. 2013, 243, 306–312. [Google Scholar] [CrossRef] [PubMed]
- Kyzar, E.J.; Pham, M.; Roth, A.; Cachat, J.; Green, J.; Gaikwad, S.; Kalueff, A.V. Alterations in grooming activity and syntax in heterozygous SERT and BDNF knockout mice: The utility of behavior-recognition tools to characterize mutant mouse phenotypes. Brain Res. Bull. 2012, 89, 168–176. [Google Scholar] [CrossRef] [PubMed]
- Adamah-Biassi, E.B.; Stepien, I.; Hudson, R.L.; Dubocovich, M.L. Effects of the Melatonin Receptor Antagonist (MT2)/Inverse Agonist (MT1) Luzindole on Re-entrainment of Wheel Running Activity and Spontaneous Homecage Behaviors in C3H/HeN Mice. FASEB J. 2012, 26, 1042–1045. [Google Scholar]
- Kyzar, E.; Gaikwad, S.; Roth, A.; Green, J.; Pham, M.; Stewart, A.; Liang, Y.; Kobla, V.; Kalueff, A.V. Towards high-throughput phenotyping of complex patterned behaviors in rodents: Focus on mouse self-grooming and its sequencing. Behav. Brain Res. 2011, 225, 426–431. [Google Scholar] [CrossRef] [PubMed]
- Ou-Yang, T.H.; Tsai, M.L.; Yen, C.T.; Lin, T.T. An infrared range camera-based approach for three-dimensional locomotion tracking and pose reconstruction in a rodent. J. Neurosci. Methods 2011, 201, 116–123. [Google Scholar] [CrossRef] [PubMed]
- Cupido, A. Detecting Cerebellar Phenotypes with the Erasmus Ladder. Ph.D. Thesis, Erasmus MC, Department of Clinical Genetics, Rotterdam, The Netherlands, 2009. [Google Scholar]
- Ha, S.; Lee, D.; Cho, Y.S.; Chung, C.; Yoo, Y.E.; Kim, J.; Lee, J.; Kim, W.; Kim, H.; Bae, Y.C.; et al. Cerebellar Shank2 regulates excitatory synapse density, motor coordination, and specific repetitive and anxiety-like behaviors. J. Neurosci. 2016, 36, 12129–12143. [Google Scholar] [CrossRef]
- Peter, S.; Michiel, M.; Stedehouder, J.; Reinelt, C.M.; Wu, B.; Zhou, H.; Zhou, K.; Boele, H.J.; Kushner, S.A.; Lee, M.G. Dysfunctional cerebellar Purkinje cells contribute to autism-like behaviour in Shank2-deficient mice. Nat. Commun. 2016, 7, 12627. [Google Scholar] [CrossRef]
- De Zeeuw, C.I.; Hoogland, T.M. Reappraisal of Bergmann glial cells as modulators of cerebellar circuit function. Front. Cell. Neurosci. 2015, 9, 246. [Google Scholar] [CrossRef]
- Sepulveda-Falla, D.; Barrera-Ocampo, A.; Hagel, C.; Korwitz, A.; Vinueza-Veloz, M.F.; Zhou, K.; Schonewille, M.; Zhou, H.; Velazquez-Perez, L.; Rodriguez-Labrada, R.; et al. Familial Alzheimer’s disease-associated presenilin-1 alters cerebellar activity and calcium homeostasis. J. Clin. Investig. 2014, 124, 1552–1567. [Google Scholar] [CrossRef]
- Veloz, M.F.V.; Zhou, K.; Bosman, L.W.; Potters, J.W.; Negrello, M.; Seepers, R.M.; Strydis, C.; Koekkoek, S.K.; De Zeeuw, C.I. Cerebellar control of gait and interlimb coordination. Brain Struct. Funct. 2015, 220, 3513–3536. [Google Scholar] [CrossRef] [PubMed]
- Available online: https://goo.gl/ippZrg (accessed on 24 July 2019).
- Vidal, P.M.; Karadimas, S.K.; Ulndreaj, A.; Laliberte, A.M.; Tetreault, L.; Forner, S.; Wang, J.; Foltz, W.D.; Fehlings, M.G. Delayed decompression exacerbates ischemia-reperfusion injury in cervical compressive myelopathy. JCI Insight 2017, 2. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Tatenhorst, L.; Eckermann, K.; Dambeck, V.; Fonseca-Ornelas, L.; Walle, H.; da Fonseca, T.L.; Koch, J.C.; Becker, S.; Tönges, L.; Bähr, M.; et al. Fasudil attenuates aggregation of α-synuclein in models of Parkinson disease. Acta Neuropathol. Commun. 2016, 4, 39. [Google Scholar] [CrossRef] [PubMed]
- Zhou, M.; Zhang, W.; Chang, J.; Wang, J.; Zheng, W.; Yang, Y.; Wen, P.; Li, M.; Xiao, H. Gait analysis in three different 6-hydroxydopamine rat models of Parkinson’s disease. Neurosci. Lett. 2015, 584, 184–189. [Google Scholar] [CrossRef] [PubMed]
- Chen, Y.J.; Cheng, F.C.; Sheu, M.L.; Su, H.L.; Chen, C.J.; Sheehan, J.; Pan, H.C. Detection of subtle neurological alterations by the Catwalk XT gait analysis system. J. Neuroeng. Rehabil. 2014, 11, 62. [Google Scholar] [CrossRef] [PubMed]
- Hou, J.; Nelson, R.; Nissim, N.; Parmer, R.; Thompson, F.J.; Bose, P. Effect of combined treadmill training and magnetic stimulation on spasticity and gait impairments after cervical spinal cord injury. J. Neurotrauma 2014, 31, 1088–1106. [Google Scholar] [CrossRef] [PubMed]
- Knutsen, P.M.; Derdikman, D.; Ahissar, E. Tracking whisker and head movements in unrestrained behaving rodents. J. Neurophysiol. 2005, 93, 2294–2301. [Google Scholar] [CrossRef] [PubMed]
- Gravel, P.; Tremblay, M.; Leblond, H.; Rossignol, S.; de Guise, J.A. A semi-automated software tool to study treadmill locomotion in the rat: From experiment videos to statistical gait analysis. J. Neurosci. Methods 2010, 190, 279–288. [Google Scholar] [CrossRef] [PubMed]
- Lenhoff, M.W.; Santner, T.J.; Otis, J.C.; Peterson, M.G.; Williams, B.J.; Backus, S.I. Bootstrap prediction and confidence bands: A superior statistical method for analysis of gait data. Gait Posture 1999, 9, 10–17. [Google Scholar] [CrossRef]
- Bender, J.A.; Simpson, E.M.; Ritzmann, R.E. Computer-assisted 3D kinematic analysis of all leg joints in walking insects. PLoS ONE 2010, 5, E13617. [Google Scholar] [CrossRef]
- Nakamura, A.; Funaya, H.; Uezono, N.; Nakashima, K.; Ishida, Y.; Suzuki, T.; Wakana, S.; Shibata, T. Low-cost three-dimensional gait analysis system for mice with an infrared depth sensor. Neurosci. Res. 2015, 100, 55–62. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Plagemann, C.; Ganapathi, V.; Koller, D.; Thrun, S. Realtime identification and localization of body parts from depth images. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation (ICRA), Anchorage, Alaska, 3–8 May 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 3108–3113. [Google Scholar]
- Mendes, C.S.; Bartos, I.; Márka, Z.; Akay, T.; Márka, S.; Mann, R.S. Quantification of gait parameters in freely walking rodents. BMC Biol. 2015, 13, 50. [Google Scholar] [CrossRef] [PubMed]
- MouseWalker. Available online: http://biooptics.markalab.org/MouseWalker/ (accessed on 24 July 2019).
- Wang, Z.; Mirbozorgi, S.A.; Ghovanloo, M. Towards a kinect-based behavior recognition and analysis system for small animals. In Proceedings of the 2015 Biomedical Circuits and Systems Conference (BioCAS), Atlanta, GA, USA, 22–24 October 2015; IEEE: Piscataway, NJ, USA, 2015. [Google Scholar]
- Voigts, J.; Sakmann, B.; Celikel, T. Unsupervised whisker tracking in unrestrained behaving animals. J. Neurophysiol. 2008, 100, 504–515. [Google Scholar] [CrossRef] [PubMed]
- Nashaat, M.A.; Oraby, H.; Peña, L.B.; Dominiak, S.; Larkum, M.E.; Sachdev, R.N. Pixying behavior: A versatile realtime and post hoc automated optical tracking method for freely moving and head fixed animals. eNeuro 2017, 4. [Google Scholar] [CrossRef]
- Monteiro, J.P.; Oliveira, H.P.; Aguiar, P.; Cardoso, J.S. A depth-map approach for automatic mice behavior recognition. In Proceedings of the 2014 IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; IEEE: Piscataway, NJ, USA, 2014. [Google Scholar]
- Petrou, G.; Webb, B. Detailed tracking of body and leg movements of a freely walking female cricket during phonotaxis. J. Neurosci. Methods 2012, 203, 56–68. [Google Scholar] [CrossRef]
- Xu, Q.; Cai, C.; Zhou, H.; Ren, H. A video tracking system for limb motion measurement in small animals. In Proceedings of the 2010 International Conference on Optoelectronics and Image Processing (ICOIP), Haikou, China, 11–12 November 2010; IEEE: Piscataway, NJ, USA, 2010; Volume 1. [Google Scholar]
- Hwang, S.; Choi, Y. Tracking the joints of arthropod legs using multiple images and inverse kinematics. Int. J. Precis. Eng. Manuf. 2015, 16, 669–675. [Google Scholar] [CrossRef]
- Aristidou, A.; Lasenby, J. FABRIK: A fast, iterative solver for the inverse kinematics problem. Graph. Models 2011, 73, 243–260. [Google Scholar] [CrossRef]
- Gyory, G.; Rankov, V.; Gordon, G.; Perkon, I.; Mitchinson, B.; Grant, R.; Prescott, T. An algorithm for automatic tracking of rat whiskers. In Proceedings of the 20th International Conference on Pattern Recognition (ICPR 2010), Istanbul, Turkey, 22 August 2010; Volume 2010. [Google Scholar]
- da Silva Aragão, R.; Rodrigues, M.A.B.; de Barros, K.M.F.T.; Silva, S.R.F.; Toscano, A.E.; de Souza, R.E.; Manhães-de Castro, R. Automatic system for analysis of locomotor activity in rodents—A reproducibility study. J. Neurosci. Methods 2011, 195, 216–221. [Google Scholar] [CrossRef]
- Leroy, T.; Stroobants, S.; Aerts, J.M.; D’Hooge, R.; Berckmans, D. Automatic analysis of altered gait in arylsulphatase A-deficient mice in the open field. Behav. Res. Methods 2009, 41, 787–794. [Google Scholar] [CrossRef] [Green Version]
- Clack, N.G.; O’Connor, D.H.; Huber, D.; Petreanu, L.; Hires, A.; Peron, S.; Svoboda, K.; Myers, E.W. Automated tracking of whiskers in videos of head fixed rodents. PLoS Comput. Biol. 2012, 8, E1002591. [Google Scholar] [CrossRef]
- Kloefkorn, H.E.; Pettengill, T.R.; Turner, S.M.; Streeter, K.A.; Gonzalez-Rothi, E.J.; Fuller, D.D.; Allen, K.D. Automated Gait Analysis Through Hues and Areas (AGATHA): A method to characterize the spatiotemporal pattern of rat gait. Ann. Biomed. Eng. 2017, 45, 711–725. [Google Scholar] [CrossRef]
- Dankert, H.; Wang, L.; Hoopfer, E.D.; Anderson, D.J.; Perona, P. Automated monitoring and analysis of social behavior in Drosophila. Nat. Methods 2009, 6, 297. [Google Scholar] [CrossRef] [PubMed]
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2007; p. 738. [Google Scholar]
- Otsu, N. A threshold selection method from gray level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
- Kim, H.J.; Shi, T.; Akdagli, S.; Most, S.; Yan, Y. Semi-Automated Tracking of Vibrissal Movements in Free-Moving Rodents Captured by High-Speed Videos. World Acad. Sci. Eng. Technol. Int. J. Biol. Biomol. Agric. Food Biotechnol. Eng. 2015, 9, 565–569. [Google Scholar]
- Palmér, T.; Åström, K.; Enqvist, O.; Ivica, N.; Petersson, P. Rat Paw Tracking for Detailed Motion Analysis. In Proceedings of the Visual observation and analysis of Vertebrate And Insect Behavior 2014, Stockholm, Sweden, 24 August 2014. [Google Scholar]
- Palmér, T.; Tamtè, M.; Halje, P.; Enqvist, O.; Petersson, P. A system for automated tracking of motor components in neurophysiological research. J. Neurosci. Methods 2012, 205, 334–344. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Giovannucci, A.; Pnevmatikakis, E.A.; Deverett, B.; Pereira, T.; Fondriest, J.; Brady, M.; Wang, S.H.; Abbas, W.; Parés, P.; Masip, D. Automated gesture tracking in head-fixed mice. J. Neurosci. Methods 2018, 300, 184–195. [Google Scholar] [CrossRef] [PubMed]
- Mathis, A.; Mamidanna, P.; Cury, K.M.; Abe, T.; Murthy, V.N.; Mathis, M.W.; Bethge, M. DeepLabCut: Markerless Pose Estimation of User-Defined Body Parts with Deep Learning; Nature Publishing Group: London, UK, 2018. [Google Scholar]
- Insafutdinov, E.; Pishchulin, L.; Andres, B.; Andriluka, M.; Schiele, B. Deepercut: A deeper, stronger, and faster multi-person pose estimation model. In Proceedings of the European Conference on Computer Vision; Springer: Cham, Switzerland, 2016. [Google Scholar]
- Arac, A.; Zhao, P.; Dobkin, B.H.; Carmichael, S.T.; Golshani, P. DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Front. Syst. Neurosci. 2019, 13, 20. [Google Scholar] [CrossRef]
- Available online: https://github.com/Russell91/TensorBox (accessed on 24 July 2019).
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
- Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime multi-person 2d pose estimation using part affinity fields. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Dirks, A.; Groenink, L.; Verdouw, M.P.; Gugten, J.v.d.; Hijzen, T.H.; Olivier, B. Behavioral analysis of transgenic mice overexpressing corticotropin-releasing hormone in paradigms emulating aspects of stress, anxiety, and depression. Int. J. Comp. Psychol. 2001, 14, 123–135. [Google Scholar]
- de Paula Nascimento-Castro, C.; Wink, A.C.; da Fônseca, V.S.; Bianco, C.D.; Winkelmann-Duarte, E.C.; Farina, M.; Rodrigues, A.L.S.; Gil-Mohapel, J.; de Bem, A.F.; Brocardo, P.S. Antidepressant effects of probucol on early-symptomatic YAC128 transgenic mice for Huntington’s disease. Neural Plast. 2018, 2018, 4056383. [Google Scholar] [CrossRef]
- Yun, S.; Donovan, M.H.; Ross, M.N.; Richardson, D.R.; Reister, R.; Farnbauch, L.A.; Fischer, S.J.; Riethmacher, D.; Gershenfeld, H.K.; Lagace, D.C.; et al. Stress-induced anxiety- and depressive-like phenotype associated with transient reduction in neurogenesis in adult nestin-CreERT2/diphtheria toxin fragment A transgenic mice. PLoS ONE 2016, 11, E0147256. [Google Scholar] [CrossRef]
- Krishnan, V.; Nestler, E.J. Animal Models of Depression: Molecular Perspectives. Molecular and Functional Models in Neuropsychiatry; Springer: Berlin/Heidelberg, Germany, 2011; pp. 121–147. [Google Scholar]
- Shimogawa, T.; Sakaguchi, H.; Kikuchi, T.; Tsuchimochi, R.; Sano, N.; Torikoshi, S.; Ito, A.; Aoyama, T.; Iihara, K.; Takahashi, J. Therapeutic effects of combined cell transplantation and locomotor training in rats with brain injury. NPJ Regen. Med. 2019, 4, 13. [Google Scholar] [CrossRef] [PubMed]
- Lazar, J.; Moreno, C.; Jacob, H.J.; Kwitek, A.E. Impact of genomics on research in the rat. Genome Res. 2005, 15, 1717–1728. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lemieux, M.; Josset, N.; Roussel, M.; Couraud, S.; Bretzner, F. Speed-dependent modulation of the locomotor behavior in adult mice reveals attractor and transitional gaits. Front. Neurosci. 2016, 10, 42. [Google Scholar] [CrossRef] [PubMed]
- Lister, R.G. The use of a plus-maze to measure anxiety in the mouse. Psychopharmacology 1987, 92, 180–185. [Google Scholar] [CrossRef] [PubMed]
- Fraser, L.M.; Brown, R.E.; Hussin, A.; Fontana, M.; Whittaker, A.; O’Leary, T.P.; Lederle, L.; Holmes, A.; Ramos, A. Measuring anxiety- and locomotion-related behaviours in mice: A new way of using old tests. Psychopharmacology 2010, 211, 99–112. [Google Scholar] [CrossRef] [PubMed]
- Seibenhener, M.L.; Wooten, M.C. Use of the Open Field Maze to measure locomotor and anxiety-like behavior in mice. JoVE (J. Vis. Exp.) 2015, 96, E52434. [Google Scholar] [CrossRef]
- Tatem, K.S.; Quinn, J.L.; Phadke, A.; Yu, Q.; Gordish-Dressman, H.; Nagaraju, K. Behavioral and locomotor measurements using an open field activity monitoring system for skeletal muscle diseases. JoVE (J. Vis. Exp.) 2014, 91, E51785. [Google Scholar] [CrossRef]
- Lira, F.S.; Esteves, A.M.; Pimentel, G.D.; Rosa, J.C.; Frank, M.K.; Mariano, M.O.; Budni, J.; Quevedo, J.; dos Santos, R.V.; De Mello, M.T. Sleep pattern and locomotor activity are impaired by doxorubicin in non-tumor-bearing rats. Sleep Sci. 2016, 9, 232–235. [Google Scholar] [CrossRef] [Green Version]
- Schramm-Sapyta, N.L.; Cha, Y.M.; Chaudhry, S.; Wilson, W.A.; Swartzwelder, H.S.; Kuhn, C.M. Differential anxiogenic, aversive, and locomotor effects of THC in adolescent and adult rats. Psychopharmacology 2007, 191, 867–877. [Google Scholar] [CrossRef]
- Javadi-Paydar, M.; Nguyen, J.D.; Vandewater, S.A.; Dickerson, T.J.; Taffe, M.A. Locomotor and reinforcing effects of pentedrone, pentylone and methylone in rats. Neuropharmacology 2018, 134, 57–64. [Google Scholar] [CrossRef]
- Walker, R.B.; Fitz, L.D.; Williams, L.M.; McDaniel, Y.M. The effect on ephedrine prodrugs on locomotor activity in rats. Gen. Pharmacol. 1996, 27, 109–111. [Google Scholar] [CrossRef]
- Wellman, P.J.; Davis, K.W.; Clifford, P.S.; Rothman, R.B.; Blough, B.E. Changes in feeding and locomotion induced by amphetamine analogs in rats. Drug Alcohol Depend. 2009, 100, 234–239. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Zhang, J.J.; Kong, Q. Locomotor activity: A distinctive index in morphine self-administration in rats. PLoS ONE 2017, 12, E0174272. [Google Scholar] [CrossRef] [PubMed]
- Wiechman, B.E.; Wood, T.E.; Spratto, G.R. Locomotor activity in morphine-treated rats: Effects of and comparisons between cocaine, procaine, and lidocaine. Pharmacol. Biochem. Behav. 1981, 15, 425–433. [Google Scholar] [CrossRef]
- Walker, Q.D.; Schramm-Sapyta, N.L.; Caster, J.M.; Waller, S.T.; Brooks, M.P.; Kuhn, C.M. Novelty-induced locomotion is positively associated with cocaine ingestion in adolescent rats; anxiety is correlated in adults. Pharmacol. Biochem. Behav. 2009, 91, 398–408. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Marin, M.T.; Zancheta, R.; Paro, A.H.; Possi, A.P.; Cruz, F.C.; Planeta, C.S. Comparison of caffeine-induced locomotor activity between adolescent and adult rats. Eur. J. Pharmacol. 2011, 660, 363–367. [Google Scholar] [CrossRef] [Green Version]
- Dipoppa, M.; Ranson, A.; Krumin, M.; Pachitariu, M.; Carandini, M.; Harris, K.D. Vision and locomotion shape the interactions between neuron types in mouse visual cortex. Neuron 2018, 98, 602–615. [Google Scholar] [CrossRef] [PubMed]
- Dadarlat, M.C.; Stryker, M.P. Locomotion enhances neural encoding of visual stimuli in mouse V1. J. Neurosci. 2017, 37, 3764–3775. [Google Scholar] [CrossRef]
- Tresch, M.C.; Kiehn, O. Synchronization of motor neurons during locomotion in the neonatal rat: Predictors and mechanisms. J. Neurosci. 2002, 22, 9997–10008. [Google Scholar] [CrossRef]
- Vinck, M.; Batista-Brito, R.; Knoblich, U.; Cardin, J.A. Arousal and locomotion make distinct contributions to cortical activity patterns and visual encoding. Neuron 2015, 86, 740–754. [Google Scholar] [CrossRef]
- Dunn, T.W.; Mu, Y.; Narayan, S.; Randlett, O.; Naumann, E.A.; Yang, C.T.; Schier, A.F.; Freeman, J.; Engert, F.; Ahrens, M.B. Brain-wide mapping of neural activity controlling zebrafish exploratory locomotion. eLife 2016, 5, E12741. [Google Scholar] [CrossRef] [PubMed]
- Hesse, S. Locomotor therapy in neurorehabilitation. NeuroRehabilitation 2001, 16, 133–139. [Google Scholar] [PubMed]
- Turner, D.L.; Murguialday, A.R.; Birbaumer, N.; Hoffmann, U.; Luft, A. Neurophysiology of robot-mediated training and therapy: A perspective for future use in clinical populations. Front. Neurol. 2013, 4, 184. [Google Scholar] [CrossRef] [PubMed]
- Shik, M.L.; Orlovsky, G.N. Neurophysiology of locomotor automatism. Physiol. Rev. 1976, 56, 465–501. [Google Scholar] [CrossRef] [PubMed]
- Wernig, A. Locomotor programs versus ‘conventional’ physical therapy? Locomotor training. Spinal Cord 2012, 50, 641. [Google Scholar] [CrossRef]
- Leech, K.A.; Kinnaird, C.R.; Holleran, C.L.; Kahn, J.; Hornby, T.G. Effects of locomotor exercise intensity on gait performance in individuals with incomplete spinal cord injury. Phys. Ther. 2016, 96, 1919–1929. [Google Scholar] [CrossRef]
- Van Meer, P.; Raber, J. Mouse behavioural analysis in systems biology. Biochem. J. 2005, 389, 593–610. [Google Scholar] [CrossRef]
- O’Hara, S.; Lui, Y.M.; Draper, B.A. Unsupervised learning of human expressions, gestures, and actions. In Face Gesture; IEEE: Piscataway, NJ, USA, 2011. [Google Scholar]
- Lala, D.; Mohammad, Y.; Nishida, T. Unsupervised gesture recognition system for learning manipulative actions in virtual basketball. In Proceedings of the 1st International Conference on Human-Agent Interaction, Sapporo, Japan, 7–9 August 2013. [Google Scholar]
- Rhodin, H.; Salzmann, M.; Fua, P. Unsupervised geometry-aware representation for 3d human pose estimation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018. [Google Scholar]
- Simão, M.A.; Neto, P.; Gibaru, O. Unsupervised gesture segmentation by motion detection of a real-time data stream. IEEE Trans. Ind. Inform. 2016, 13, 473–481. [Google Scholar] [CrossRef]
- Dubost, F.; Adams, H.; Yilmaz, P.; Bortsova, G.; van Tulder, G.; Ikram, M.A.; Niessen, W.; Vernooij, M.; de Bruijne, M. Weakly Supervised Object Detection with 2D and 3D Regression Neural Networks. arXiv 2019, arXiv:1906.01891. [Google Scholar]
- Chen, C.H.; Tyagi, A.; Agrawal, A.; Drover, D.; MV, R.; Stojanov, S.; Rehg, J.M. Unsupervised 3D Pose Estimation with Geometric Self-Supervision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
- Asadi-Aghbolaghi, M.; Clapes, A.; Bellantonio, M.; Escalante, H.J.; Ponce-López, V.; Baró, X.; Guyon, I.; Kasaei, S.; Escalera, S. A survey on deep learning based approaches for action and gesture recognition in image sequences. In Proceedings of the 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar]
- Toshev, A.; Szegedy, C. Deeppose: Human pose estimation via deep neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; IEEE: Piscataway, NJ, USA, 2014. [Google Scholar]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).