Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot
Abstract
1. Introduction
- robustness to light level variations, defined by the number of irradiance decades in which the visual sensor can operate;
- range of OF angular speeds (or magnitudes) covered, defined by the minimum and maximum values measured;
- accuracy and precision, defined by systematic errors and coefficients of variation;
- output refresh rate, defined by the instantaneous output frequency.
- in an OF range from 25 °/s to 1000 °/s;
- under irradiance conditions varying from 6 W·cm⁻² to 1.6 W·cm⁻²;
- with sampling rates between 100 Hz and 1 kHz;
- in real flight when fitted onto a 350-gram MAV.
2. Optics and Front-End Pixels of the MAPix Sensor
3. Optical Flow Computed by Time-of-Travel-Based Algorithms
3.1. Time-of-Travel Based on Signal Thresholding
3.2. Time-of-Travel Based on Signals’ Cross-Correlation
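The cross-correlation variant estimates the same inter-receptor delay differently: instead of detecting threshold crossings, it searches for the lag that maximizes the cross-correlation between the two photoreceptor signals. The following sketch illustrates that generic principle under stated assumptions (s2 is a delayed copy of s1, integer-sample lags only); the function name and `max_lag` parameter are hypothetical, and the paper's on-board implementation may differ.

```python
import numpy as np

def time_of_travel_xcorr(s1, s2, fs, delta_phi_deg, max_lag):
    """Estimate the optical flow magnitude (deg/s) by locating the lag
    that maximizes the cross-correlation between two adjacent
    photoreceptor signals (s2 assumed to lag s1)."""
    s1 = np.asarray(s1, dtype=float) - np.mean(s1)
    s2 = np.asarray(s2, dtype=float) - np.mean(s2)
    lags = np.arange(1, max_lag + 1)
    # C(k) = sum_n s1[n] * s2[n + k]: largest when s2 lags s1 by k samples
    corr = np.array([np.dot(s1[:-k], s2[k:]) for k in lags])
    k_best = int(lags[np.argmax(corr)])
    dt = k_best / fs                     # time of travel (s)
    return delta_phi_deg / dt            # OF magnitude (deg/s)
```

Searching over lags costs one dot product per candidate lag, which is consistent with the table below showing the cross-correlation method loading the CPU far more heavily than thresholding at equal sampling rates, while achieving better precision.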
4. Measuring Optical Flow with a Moving Texture
4.1. Method
4.2. Results
5. Measuring Optical Flow in Flight
5.1. Method
5.2. In-Flight Results
5.3. Offline Results at a Low Sampling Rate
6. Discussion and Conclusions
Supplementary Materials
Acknowledgments
Author Contributions
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| OF | Optical flow |
| LMS | Local motion sensor |
| MAPix | Michaelis–Menten auto-adaptive pixel |
References
|  | Thresholding Method |  |  |  | Cross-Correlation Method |  |  |  |
|---|---|---|---|---|---|---|---|---|
| Sampling Rate | 1 kHz | 500 Hz | 250 Hz | 100 Hz | 1 kHz | 500 Hz | 250 Hz | 100 Hz |
| CPU Load (%) | 2.2 | 1.1 * | * | * | overload | 52.5 | 26.3 * | 10.5 * |
| Precision σ (°/s) | 43 | 44 | 49 | 47 | - | 16 | 17 | 20 |
| Refresh rate (Hz/10 LMSs) | 99 | 51 | 48 | 36 | - | 1195 | 701 | 264 |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Vanhoutte, E.; Mafrica, S.; Ruffier, F.; Bootsma, R.J.; Serres, J. Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot. Sensors 2017, 17, 571. https://doi.org/10.3390/s17030571