
New Insights into Collaborative Robotics

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Robotics and Automation".

Deadline for manuscript submissions: closed (20 November 2022) | Viewed by 7499

Special Issue Editor


Prof. Dr. Loris Roveda
Guest Editor
Faculty of Engineering, Scuola Universitaria Professionale della Svizzera Italiana, 11, 6928 Manno, Switzerland
Interests: collaborative robotics; robot control; human–robot interaction; dynamic modeling and identification

Special Issue Information

Dear Colleagues,

Collaborative robots are capable of working and interacting with humans. They increase efficiency and productivity and break new ground in manufacturing and related fields. After continuous design and optimization, collaborative robots are now widely used in production and everyday life. This Special Issue is dedicated to collecting manuscripts related to collaborative robotics, exploring new technologies and insights in the field.

Prof. Dr. Loris Roveda
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • collaborative robotics
  • cobot
  • robotic manipulator
  • industry 4.0
  • human–robot interaction

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

18 pages, 4607 KiB  
Article
Anthropomorphic Grasping of Complex-Shaped Objects Using Imitation Learning
by Jae-Bong Yi, Joonyoung Kim, Taewoong Kang, Dongwoon Song, Jinwoo Park and Seung-Joon Yi
Appl. Sci. 2022, 12(24), 12861; https://doi.org/10.3390/app122412861 - 14 Dec 2022
Cited by 7 | Viewed by 2272
Abstract
This paper presents an autonomous grasping approach for complex-shaped objects using an anthropomorphic robotic hand. Although human-like robotic hands have a number of distinctive advantages, most of the current autonomous robotic pickup systems still use relatively simple gripper setups such as a two-finger gripper or even a suction gripper. The main difficulty of utilizing human-like robotic hands lies in the sheer complexity of the system; it is inherently tough to plan and control the motions of the high degree of freedom (DOF) system. Although data-driven approaches have been successfully used for motion planning of various robotic systems recently, it is hard to directly apply them to high-DOF systems due to the difficulty of acquiring training data. In this paper, we propose a novel approach for grasping complex-shaped objects using a high-DOF robotic manipulation system consisting of a seven-DOF manipulator and a four-fingered robotic hand with 16 DOFs. Human demonstration data are first acquired using a virtual reality controller with 6D pose tracking and individual capacitive finger sensors. Then, the 3D shape of the manipulation target object is reconstructed from multiple depth images recorded using the wrist-mounted RGBD camera. The grasping pose for the object is estimated using a residual neural network (ResNet), K-means clustering, and a point-set registration algorithm. Then, the manipulator moves to the grasping pose following the trajectory created by dynamic movement primitives (DMPs). Finally, the robot performs one of the object-specific grasping motions learned from human demonstration. The suggested system is evaluated by an official tester using five objects with promising results.
(This article belongs to the Special Issue New Insights into Collaborative Robotics)
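The last motion-generation step of the pipeline above relies on dynamic movement primitives. As a rough illustration of that technique only (not the authors' implementation; all gains and step sizes here are assumed values), a 1-D discrete DMP can be integrated like this:

```python
def dmp_rollout(y0, goal, tau=1.0, dt=0.01, alpha=25.0, beta=6.25,
                alpha_x=3.0, forcing=None):
    """Integrate a 1-D discrete dynamic movement primitive (DMP).

    With forcing=None the system is a critically damped point attractor
    that converges smoothly from y0 to goal; a forcing term f(x) learned
    from a demonstration would be passed in to shape the trajectory.
    """
    y, v, x = y0, 0.0, 1.0              # position, scaled velocity, phase
    traj = [y]
    for _ in range(int(tau / dt)):
        f = forcing(x) if forcing else 0.0
        # transformation system: tau*v' = alpha*(beta*(g - y) - v) + f*(g - y0)
        v += (alpha * (beta * (goal - y) - v) + f * (goal - y0)) / tau * dt
        y += v / tau * dt
        x += -alpha_x * x / tau * dt    # canonical phase decays from 1 to 0
        traj.append(y)
    return traj

traj = dmp_rollout(y0=0.0, goal=0.5)    # smooth approach toward the goal
```

Scaling the forcing term by (goal − y0) is the standard way to make a demonstrated motion generalize to new start and goal positions.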

41 pages, 11434 KiB  
Article
Implementation and Evaluation of Dynamic Task Allocation for Human–Robot Collaboration in Assembly
by Christoph Petzoldt, Dario Niermann, Emily Maack, Marius Sontopski, Burak Vur and Michael Freitag
Appl. Sci. 2022, 12(24), 12645; https://doi.org/10.3390/app122412645 - 9 Dec 2022
Cited by 10 | Viewed by 4793
Abstract
Human–robot collaboration is becoming increasingly important in industrial assembly. In view of high cost pressure, resulting productivity requirements, and the trend towards human-centered automation in the context of Industry 5.0, a reasonable allocation of individual assembly tasks to humans or robots is of central importance. Therefore, this article presents a new approach for dynamic task allocation, its integration into an intuitive block-based process planning framework, and its evaluation in comparison to both manual assembly and static task allocation. For evaluation, a systematic methodology for comprehensive assessment of task allocation approaches is developed, followed by a corresponding user study. The results of the study show for the dynamic task allocation on the one hand a higher fluency in the human–robot collaboration with good adaptation to process delays, and on the other hand a reduction in the cycle time for assembly processes with sufficiently high degrees of parallelism. Based on the study results, we draw conclusions regarding assembly scenarios in which manual assembly or collaborative assembly with static or dynamic task allocation is most appropriate. Finally, we discuss the implications for process planning when using the proposed task allocation framework.
(This article belongs to the Special Issue New Insights into Collaborative Robotics)
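The core idea evaluated above, assigning each assembly task to the human or the robot at runtime rather than in advance, can be sketched as a greedy scheduler. This is not the authors' method (the paper integrates allocation into a block-based planning framework); the task names, durations, and precedence constraints below are invented for illustration:

```python
def dynamic_allocation(tasks):
    """Greedy dynamic task allocation between a human and a robot.

    tasks: dict name -> {"dur": {"human": h, "robot": r}, "deps": set()},
    where a duration of None marks a task that agent cannot perform.
    Whenever an agent becomes available, it takes the precedence-feasible
    task it can finish soonest. Returns (start, end, agent, task) tuples.
    """
    free_at = {"human": 0.0, "robot": 0.0}      # time each agent is free
    done_at, schedule = {}, []
    remaining = list(tasks)                     # preserves insertion order
    while remaining:
        best = None
        for t in remaining:
            if not tasks[t]["deps"] <= done_at.keys():
                continue                        # precedence not yet met
            dep_end = max((done_at[d] for d in tasks[t]["deps"]), default=0.0)
            for agent, dur in tasks[t]["dur"].items():
                if dur is None:
                    continue                    # agent cannot do this task
                start = max(free_at[agent], dep_end)
                if best is None or start + dur < best[1]:
                    best = (start, start + dur, agent, t)
        start, end, agent, t = best             # earliest-finishing option
        free_at[agent], done_at[t] = end, end
        remaining.remove(t)
        schedule.append((start, end, agent, t))
    return schedule

demo = {
    "pick":   {"dur": {"human": 4, "robot": 2},    "deps": set()},
    "insert": {"dur": {"human": 3, "robot": None}, "deps": {"pick"}},
    "screw":  {"dur": {"human": 5, "robot": 4},    "deps": set()},
}
for step in dynamic_allocation(demo):
    print(step)
```

Because assignments are decided as agents become free, a delay on one side automatically shifts later tasks to the other agent where durations allow, which is the adaptation-to-delays behavior the study measures.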
