Robotics in Extreme Environments

A special issue of Robotics (ISSN 2218-6581).

Deadline for manuscript submissions: closed (15 December 2019) | Viewed by 36635

Special Issue Editor


Prof. Rustam Stolkin
Guest Editor
Royal Society Industry Fellow for Nuclear Robotics
Director, National Centre for Nuclear Robotics
Director, Birmingham Extreme Robotics Lab
Director, A.R.M Robotics Ltd.
Chair in Robotics, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK
Interests: autonomous grasping and manipulation; computer vision, sensing and perception; AI and machine learning; mechanical design, kinematics and dynamics; variable autonomy, shared control and mixed-initiative systems; human factors and human-robot interfaces

Special Issue Information

Dear Colleagues,

We are pleased to invite you to submit your papers to this Special Issue of Robotics, "Robotics in Extreme Environments". Extreme environments can be defined as those that are so hazardous that it would be undesirable or impossible to send a human worker into them. Such applications are of special importance to the robotics research community because they demand the use of robots and often cannot be carried out at all without major new advances in robotics. In contrast, while research on, for example, household helper robots is certainly interesting, such jobs can, for now, still be done by human workers if needed.

Extreme environment applications are also of major societal and economic importance, which makes them central to enabling advanced robotics research to have real and substantial societal and economic impact. For this reason, the UK’s new Industrial Strategy has recently invested £93 million in R&D on robotics and AI for extreme environments, the European Commission has funded several major multi-million-euro robotics projects in this area through its Horizon 2020 robotics programme, and other nations are increasingly following suit.

A particularly important example of extreme environment robotics is nuclear decommissioning. The UK alone contains an estimated 4.9 million tons of legacy nuclear waste, the largest environmental remediation task in Europe, which is expected to cost up to £220 billion and take more than 100 years to complete. Worldwide decommissioning costs are on the order of $1 trillion. Cleaning up the environment for future generations has major societal impact, and this is also an industry of major economic importance in which the use of advanced robotics will be key. It is a high-consequence industry, which is understandably very conservative about new technologies. Robotics technologies have so far achieved remarkably little penetration in the industry, and autonomy has not been used at all. However, recent landmark work has achieved autonomous robot control inside radioactive environments for the first time, and the industry is now starting to embrace these new technologies.

Other extreme environment applications of robotics include: inspection and maintenance of underwater and offshore infrastructure; space and planetary exploration; exploitation of increasingly deep mines; bomb disposal; rescue robotics; asbestos removal from older buildings; replacing human workers on construction sites; and numerous others.

This Special Issue aims to introduce the latest research progress in the field of robotics for Extreme Environments. A very wide variety of highly interdisciplinary work is needed, and research in these diverse areas is welcomed for submission to the Special Issue. Areas for submission include, but are not limited to:

  • Both autonomous methods and teleoperation, which remains critical to extreme environment applications.
  • Grasping and manipulation.
  • Mobility systems (walking, flying, swimming, crawling, etc.).
  • Exploration, mapping and navigation in unknown or partially known environments.
  • Vision, sensing and perception.
  • AI and learning for industrial objects and scenes in extreme applications.
  • Mechanical, materials and other design issues of novel robots and sensors.
  • Human-robot interfaces, including teleoperation, VR/AR and multi-modal information cues.
  • Variable autonomy, shared control, and mixed-initiative systems.
  • Communication with remote robots in extreme environments.
  • Measurement and modelling of the effects of extreme environments (radiation, extreme temperatures, pressures, chemical and other issues) on sensors, embedded systems and other key robotics components, materials and design considerations.
  • Design of robots and critical components for increased resilience in extreme environments.

This Special Issue particularly welcomes papers which describe practical applications, implementations and real deployments of robotic systems into extreme environments.

Prof. Rustam Stolkin
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Robotics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • extreme environment 
  • nuclear robotics 
  • space robotics 
  • underwater robotics 
  • radiation 
  • teleoperation 
  • autonomy 
  • manipulation 
  • grasping 
  • SLAM 
  • AI and learning
  • vision and perception
  • human-robot interaction (HRI)

Published Papers (4 papers)


Research

18 pages, 7368 KiB  
Article
Design and Implementation of a Quadruped Amphibious Robot Using Duck Feet
by Saad Bin Abul Kashem, Shariq Jawed, Jubaer Ahmed and Uvais Qidwai
Robotics 2019, 8(3), 77; https://doi.org/10.3390/robotics8030077 - 05 Sep 2019
Cited by 15 | Viewed by 10439
Abstract
Roaming complexity in terrains and unexpected environments pose significant difficulties in robotic exploration of an area. In a broader sense, robots have to face two common tasks during exploration, namely walking on dry land and swimming through water. This research aims to design and develop an amphibious robot which incorporates a webbed duck-feet design to walk on different terrains, swim in the water, and tackle obstructions on its way. The designed robot is compact, easy to use, and also has the ability to work autonomously. Such a mechanism is implemented by designing a novel robotic webbed foot consisting of two hinged plates. Because of this design, the webbed feet are able to open and close with the help of water pressure. Klann linkages have been used to convert rotational motion into the walking and swimming gaits. Because of its amphibian nature, the designed robot can be used for exploring tight caves and closed spaces, and for moving on uneven, challenging terrains such as sand, mud, or water. It is envisaged that the proposed design will be appreciated in the industry to design amphibious robots in the near future.
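The abstract describes a rotary crank input being converted into walking and swimming gaits via Klann linkages, but does not give the linkage dimensions. As a hedged illustration only, the sketch below traces the coupler point of a plain crank-rocker four-bar (the Klann mechanism is a six-bar relative of this idea) to show how a single rotating input yields a closed foot-path curve; all link lengths and the coupler-point offset are arbitrary values, not the authors' design.

```python
import numpy as np

def circle_intersection(c0, r0, c1, r1):
    """Return the two intersection points of circles (c0, r0) and (c1, r1)."""
    d = np.linalg.norm(c1 - c0)
    if d > r0 + r1 or d < abs(r0 - r1):
        raise ValueError("circles do not intersect")
    a = (r0**2 - r1**2 + d**2) / (2.0 * d)   # distance from c0 to the chord midpoint
    h = np.sqrt(max(r0**2 - a**2, 0.0))      # half chord length
    u = (c1 - c0) / d                        # unit vector from c0 towards c1
    mid = c0 + a * u
    perp = np.array([-u[1], u[0]])           # unit normal to u
    return mid + h * perp, mid - h * perp

def foot_path(n=180, crank=1.0, coupler=4.0, rocker=3.0, ground=4.0,
              cp_along=0.5, cp_offset=1.5):
    """Trace a coupler ('foot') point of a crank-rocker four-bar over one crank turn.

    The illustrative link lengths satisfy the Grashof condition, so the crank
    rotates fully. The coupler point sits cp_along of the way from A to B,
    offset cp_offset perpendicular to the coupler, and sweeps a closed curve.
    """
    o2 = np.array([0.0, 0.0])       # crank ground pivot
    o4 = np.array([ground, 0.0])    # rocker ground pivot
    pts = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n, endpoint=False):
        a_pt = o2 + crank * np.array([np.cos(theta), np.sin(theta)])
        b_pt, _ = circle_intersection(a_pt, coupler, o4, rocker)  # one assembly branch
        u = (b_pt - a_pt) / np.linalg.norm(b_pt - a_pt)
        perp = np.array([-u[1], u[0]])
        pts.append(a_pt + cp_along * coupler * u + cp_offset * perp)
    return np.array(pts)

if __name__ == "__main__":
    path = foot_path()
    print("foot-path bounding box:", path.min(axis=0).round(2), "to", path.max(axis=0).round(2))
```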
(This article belongs to the Special Issue Robotics in Extreme Environments)

17 pages, 10685 KiB  
Article
MallARD: An Autonomous Aquatic Surface Vehicle for Inspection and Monitoring of Wet Nuclear Storage Facilities
by Keir Groves, Andrew West, Konrad Gornicki, Simon Watson, Joaquin Carrasco and Barry Lennox
Robotics 2019, 8(2), 47; https://doi.org/10.3390/robotics8020047 - 18 Jun 2019
Cited by 22 | Viewed by 7981
Abstract
Inspection and monitoring of wet nuclear storage facilities such as spent fuel pools or wet silos is performed for a variety of reasons, including nuclear security and characterisation of storage facilities prior to decommissioning. Until now such tasks have been performed by personnel or, if the risk to health is too high, avoided. Tasks are often repetitive, time-consuming and potentially dangerous, making them suited to being performed by an autonomous robot. Previous autonomous surface vehicles (ASVs) have been designed for operation in natural outdoor environments and lack the localisation and tracking accuracy necessary for operation in a wet nuclear storage facility. In this paper the environmental and operational conditions are analysed, applicable localisation technologies selected and a unique aquatic autonomous surface vehicle (ASV) is designed and constructed. The ASV developed is holonomic, uses a LiDAR for localisation and features a robust trajectory tracking controller. In a series of experiments the mean error between the present ASV’s planned path and the actual path is approximately 1 cm, which is two orders of magnitude lower than previous ASVs. As well as lab testing, the ASV has been used in two deployments, one of which was in an active spent fuel pool.
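The MallARD abstract reports a holonomic ASV with LiDAR localisation and a robust trajectory tracking controller, but the control law itself is not given there. The following is only a minimal sketch of a generic proportional trajectory tracker for a holonomic surface vehicle, assuming a pose estimate from the localiser and a reference trajectory point with a feedforward velocity; the gain values are placeholders, not the paper's.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def holonomic_tracking_cmd(pose, ref_pose, ref_vel_world, kp_xy=1.0, kp_yaw=1.5):
    """One step of a proportional trajectory tracker for a holonomic ASV.

    pose          : (x, y, yaw) estimate from the localisation system
    ref_pose      : (x_r, y_r, yaw_r) point on the planned trajectory
    ref_vel_world : (vx_r, vy_r) feedforward velocity of the reference point
    returns       : (vx_b, vy_b, wz) body-frame velocity command
    """
    x, y, yaw = pose
    x_r, y_r, yaw_r = ref_pose

    # World-frame position error plus feedforward, rotated into the body frame.
    err_world = np.array([x_r - x, y_r - y])
    c, s = np.cos(yaw), np.sin(yaw)
    world_to_body = np.array([[c, s], [-s, c]])
    vx_b, vy_b = world_to_body @ (kp_xy * err_world + np.asarray(ref_vel_world))

    # Heading is controlled independently, which a holonomic platform allows.
    wz = kp_yaw * wrap_angle(yaw_r - yaw)
    return vx_b, vy_b, wz

if __name__ == "__main__":
    cmd = holonomic_tracking_cmd(pose=(0.0, 0.0, 0.0),
                                 ref_pose=(0.5, 0.2, 0.1),
                                 ref_vel_world=(0.1, 0.0))
    print("body-frame command (vx, vy, wz):", np.round(cmd, 3))
```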
(This article belongs to the Special Issue Robotics in Extreme Environments)

24 pages, 2687 KiB  
Article
Vision-Based Assisted Tele-Operation of a Dual-Arm Hydraulically Actuated Robot for Pipe Cutting and Grasping in Nuclear Environments
by Manuel Bandala, Craig West, Stephen Monk, Allahyar Montazeri and C. James Taylor
Robotics 2019, 8(2), 42; https://doi.org/10.3390/robotics8020042 - 04 Jun 2019
Cited by 40 | Viewed by 10723
Abstract
This article investigates visual servoing for a hydraulically actuated dual-arm robot, in which the user selects the object of interest from an on-screen image, whilst the computer control system implements via feedback control the required position and orientation of the manipulators. To improve on the current joystick direct tele-operation commonly used as standard in the nuclear industry, which is slow and requires extensive operator training, the proposed assisted tele-operation makes use of a single camera mounted on the robot. Focusing on pipe cutting as an example, the new system ensures that one manipulator automatically grasps the user-selected pipe, and appropriately positions the second for a cutting operation. Initial laboratory testing (using a plastic pipe) shows the efficacy of the approach for positioning the manipulators, and suggests that for both experienced and inexperienced users, the task is completed significantly faster than via tele-operation.
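The control law that drives the manipulators towards the user-selected pipe is not spelled out in the abstract, so the sketch below shows only a textbook image-based visual servoing step rather than the authors' implementation: point features are compared with their desired image locations and a 6-DoF camera velocity is computed from the stacked interaction matrix. The feature coordinates, depths and gain are made-up illustrative values.

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of one point feature.

    (x, y) are normalised image coordinates and Z the estimated depth of the
    point in the camera frame. The rows map the 6-DoF camera velocity
    (vx, vy, vz, wx, wy, wz) to the feature's image velocity (xdot, ydot).
    """
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,       -(1.0 + x * x),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y * y, -x * y,         -x],
    ])

def ibvs_step(features, desired, depths, gain=0.5):
    """One classical IBVS iteration: v_c = -gain * pinv(L) @ (s - s*)."""
    L_rows, err = [], []
    for (x, y), (xd, yd), Z in zip(features, desired, depths):
        L_rows.append(point_interaction_matrix(x, y, Z))
        err.extend([x - xd, y - yd])
    L = np.vstack(L_rows)
    return -gain * np.linalg.pinv(L) @ np.array(err)

if __name__ == "__main__":
    # Four made-up point features on a pipe, their desired positions, and depths.
    s  = [(0.12, 0.05), (0.18, 0.05), (0.12, 0.11), (0.18, 0.11)]
    sd = [(0.00, 0.00), (0.06, 0.00), (0.00, 0.06), (0.06, 0.06)]
    Zs = [1.2, 1.2, 1.25, 1.25]
    v_cam = ibvs_step(s, sd, Zs)
    print("camera velocity twist (vx, vy, vz, wx, wy, wz):", np.round(v_cam, 3))
```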
(This article belongs to the Special Issue Robotics in Extreme Environments)

21 pages, 3383 KiB  
Article
The Auto-Complete Graph: Merging and Mutual Correction of Sensor and Prior Maps for SLAM
by Malcolm Mielle, Martin Magnusson and Achim J. Lilienthal
Robotics 2019, 8(2), 40; https://doi.org/10.3390/robotics8020040 - 29 May 2019
Cited by 8 | Viewed by 6163
Abstract
Simultaneous Localization And Mapping (SLAM) usually assumes the robot starts without knowledge of the environment. While prior information, such as emergency maps or layout maps, is often available, integration is not trivial since such maps are often out of date and have uncertainty in local scale. Integration of prior map information is further complicated by sensor noise, drift in the measurements, and incorrect scan registrations in the sensor map. We present the Auto-Complete Graph (ACG), a graph-based SLAM method merging elements of sensor and prior maps into one consistent representation. After optimizing the ACG, the sensor map’s errors are corrected thanks to the prior map, while the sensor map corrects the local scale inaccuracies in the prior map. We provide three datasets with associated prior maps: two recorded in campus environments, and one from a fireman training facility. Our method handled up to 40% of noise in odometry, was robust to varying levels of details between the prior and the sensor map, and could correct local scale errors of the prior. In field tests with ACG, users indicated points of interest directly on the prior before exploration. We did not record failures in reaching them.
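The abstract's key idea, sensor-map and prior-map constraints living as edges in a single optimisable graph so that each source corrects the other, can be illustrated with a toy example. The sketch below is not the ACG formulation; it is a minimal 1-D weighted least-squares graph in which drifting odometry edges and coarse but drift-free prior-map anchors are stacked into one system, with made-up measurements and noise levels.

```python
import numpy as np

def solve_toy_graph():
    """Fuse odometry edges and prior-map anchors for four 1-D poses x0..x3.

    Each edge contributes one row of a weighted linear system J x = z:
      odometry:  x_{i+1} - x_i = d_i   (precise locally, but errors accumulate)
      prior map: x_i = p_i             (no drift, but coarse / locally mis-scaled)
    Weighting rows by 1/sigma gives information-weighted least squares.
    """
    n = 4
    rows, meas, sigmas = [], [], []

    # Odometry: true spacing is 1.0, but each measurement drifts by +5%.
    for i, d in enumerate([1.05, 1.05, 1.05]):
        r = np.zeros(n); r[i], r[i + 1] = -1.0, 1.0
        rows.append(r); meas.append(d); sigmas.append(0.02)

    # Prior map: absolute positions, correct on average but less certain.
    for i, p in zip([0, 3], [0.0, 3.0]):
        r = np.zeros(n); r[i] = 1.0
        rows.append(r); meas.append(p); sigmas.append(0.10)

    J, z, w = np.array(rows), np.array(meas), 1.0 / np.array(sigmas)
    x, *_ = np.linalg.lstsq(J * w[:, None], z * w, rcond=None)
    return x

if __name__ == "__main__":
    print("fused pose estimates:", np.round(solve_toy_graph(), 3))
```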
(This article belongs to the Special Issue Robotics in Extreme Environments)
