Recent Advances in Robotics, Factory Automation and Intelligent Networked Systems

A special issue of Machines (ISSN 2075-1702). This special issue belongs to the section "Industrial Systems".

Deadline for manuscript submissions: 30 August 2024

Special Issue Editor


Dr. Sergey Y. Yurish
Guest Editor
International Frequency Sensor Association (IFSA), 08860 Castelldefels, Spain
Interests: smart sensors; optical sensors; frequency measurements

Special Issue Information

Dear Colleagues,

Industry 4.0 holds considerable potential and is expected to grow substantially in the near future. It is an integrated approach that combines automation tools, robotic control, and communications. According to a recent market study, the global Industry 4.0 market is projected to grow from USD 116.14 billion in 2021 to USD 337.10 billion in 2028, at a CAGR of 16.4% over the 2021–2028 period.

This Special Issue contains extended versions of selected papers from the 4th IFSA Winter Conference on Automation, Robotics and Communications for Industry 4.0 (ARCI' 2024), held 7–9 February 2024 in Innsbruck, Austria (https://arci-conference.com/).

Topics of interest include but are not limited to:

  • Industrial automation and control;
  • Industrial robots;
  • Control devices and instruments;
  • Mechatronic systems;
  • Systems and control engineering;
  • Machine design for Industry 4.0.

Dr. Sergey Y. Yurish
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Machines is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • automation
  • robotics
  • mechatronics
  • networks
  • control

Published Papers (3 papers)

Research

21 pages, 5259 KiB  
Article
A Connective Framework for Social Collaborative Robotic System
by Syed Osama Bin Islam and Waqas Akbar Lughmani
Machines 2022, 10(11), 1086; https://doi.org/10.3390/machines10111086 - 17 Nov 2022
Cited by 2
Abstract
Social intelligence has appeared only recently in the field of artificial intelligence (AI) and robotics. It is becoming increasingly evident that social and interaction skills are essential in any application where robots need to interact with humans. As workspaces have transformed into fully shared spaces for performing collaborative tasks, human–robot collaboration (HRC) poses many challenges to the nature of interactions and to social behavior among the collaborators. The complex dynamic environment, coupled with uncertainty, anomalies, and threats, raises questions about the safety and security of the cyber-physical production system (CPPS) in which HRC takes place. Interactions in the social sphere include both physical and psychological safety issues. In this work, we propose a connective framework that can respond quickly to the changing physical and psychological safety states of a CPPS. The first layer executes the production plan and monitors changes through sensors. The second layer evaluates the severity of each situation as anxiety, applying a quantification method supported by a knowledge base. The third layer responds to the situations through optimal allocation of resources. The fourth layer decides on actions to mitigate the anxiety using the resources suggested by the optimization layer. Experimental validation of the proposed method was performed on industrial case studies involving HRC. The results demonstrate that the proposed method improves the decision-making of a CPPS facing complex situations, ensures physical safety, and effectively enhances the productivity of the human–robot team by leveraging psychological comfort.
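
As a rough illustration of how the four layers described in this abstract could be chained together, the following minimal Python sketch wires monitoring, anxiety quantification, resource allocation, and action selection into one pipeline. It is a sketch under stated assumptions: the class and function names, the weighted-sum severity score, and the greedy allocation are illustrative stand-ins and are not taken from the paper.

```python
# Hypothetical sketch of a four-layer pipeline like the one described above.
# All names and data structures are illustrative; the paper's actual
# quantification method, knowledge base, and optimizer are not reproduced here.

from dataclasses import dataclass, field


@dataclass
class Situation:
    """A monitored change in the shared workspace (layer 1 output)."""
    description: str
    sensor_values: dict
    anxiety: float = 0.0                            # severity score filled in by layer 2
    resources: list = field(default_factory=list)   # filled in by layer 3


def monitor(sensor_values: dict) -> Situation:
    """Layer 1: execute the production plan and report observed changes."""
    return Situation(description="workspace change", sensor_values=sensor_values)


def quantify_anxiety(s: Situation, knowledge_base: dict) -> Situation:
    """Layer 2: score the situation's severity; a simple weighted sum stands in
    for the paper's knowledge-base-supported quantification."""
    s.anxiety = sum(knowledge_base.get(k, 0.0) * v for k, v in s.sensor_values.items())
    return s


def allocate_resources(s: Situation, available: list) -> Situation:
    """Layer 3: pick resources; a greedy stand-in for the optimal allocation."""
    s.resources = available[: max(1, int(s.anxiety))]
    return s


def decide_actions(s: Situation) -> list:
    """Layer 4: choose mitigating actions for the allocated resources."""
    return [f"assign {r} to mitigate '{s.description}'" for r in s.resources]


# Example run with made-up sensor readings and severity weights
kb = {"human_proximity": 2.0, "robot_speed": 0.5}
situation = monitor({"human_proximity": 0.8, "robot_speed": 1.2})
situation = quantify_anxiety(situation, kb)
situation = allocate_resources(situation, ["robot_arm", "operator", "AGV"])
print(decide_actions(situation))
```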

17 pages, 22561 KiB  
Article
Autonomous Visual Navigation for a Flower Pollination Drone
by Dries Hulens, Wiebe Van Ranst, Ying Cao and Toon Goedemé
Machines 2022, 10(5), 364; https://doi.org/10.3390/machines10050364 - 10 May 2022
Cited by 8
Abstract
In this paper, we present the development of a visual navigation capability for a small drone, enabling it to autonomously approach flowers. This is an important step towards a fully autonomous flower-pollinating nanodrone. The drone we developed is fully autonomous and relies for navigation on a small on-board color camera, complemented by a simple ToF distance sensor, to detect and approach the flower. The proposed solution uses a DJI Tello drone carrying a Maix Bit processing board capable of running all deep-learning-based image processing and navigation algorithms on-board. We developed a two-stage visual servoing algorithm that first uses a highly optimized object detection CNN to localize a flower and fly towards it. The second phase, approaching the flower, is implemented by a direct visual steering CNN. This enables the drone to detect any flower in its neighborhood, steer towards it, and make its pollinating rod touch the flower. We trained all deep learning models on a dataset containing a mix of images of real flowers, artificial (synthetic) flowers, and virtually rendered flowers. Our experiments demonstrate that the approach is technically feasible: the drone is able to detect, approach, and touch flowers fully autonomously. Our 10 cm prototype is trained on sunflowers, but the methodology presented in this paper can be retrained for any flower type.
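
A minimal sketch of how such a two-stage controller might be structured is shown below, assuming the camera frame is a NumPy image array and the ToF range is reported in millimetres. The threshold, the proportional gains, and the stub functions are all assumptions; they only mark where the paper's detection and steering CNNs would run on the on-board processing board.

```python
# Hypothetical sketch of the two-stage visual servoing logic described above.
# detect_flower() and steering_command() are stubs standing in for the
# on-board CNNs; none of the values below come from the paper.

APPROACH_DISTANCE_MM = 400  # assumed range at which control switches to stage 2


def detect_flower(frame):
    """Stage 1 stand-in: an object-detection CNN returning a bounding box
    (x, y, w, h) in image coordinates, or None if no flower is visible."""
    ...


def steering_command(frame):
    """Stage 2 stand-in: a direct visual steering CNN returning
    (forward, lateral, vertical) velocity commands near the flower."""
    ...


def control_step(frame, tof_distance_mm):
    """One iteration of the navigation loop; frame is a NumPy image array."""
    if tof_distance_mm > APPROACH_DISTANCE_MM:
        box = detect_flower(frame)
        if box is None:
            return (0.0, 0.0, 0.0)                   # hover and keep searching
        x, y, w, h = box
        cx = x + w / 2 - frame.shape[1] / 2          # horizontal offset from image centre
        cy = y + h / 2 - frame.shape[0] / 2          # vertical offset from image centre
        return (0.3, -0.002 * cx, -0.002 * cy)       # fly forward while centring the flower
    # Stage 2: close to the flower, hand control to the steering CNN
    return steering_command(frame)
```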

13 pages, 1625 KiB  
Article
Obstacle Detection for Autonomous Guided Vehicles through Point Cloud Clustering Using Depth Data
by Micael Pires, Pedro Couto, António Santos and Vítor Filipe
Machines 2022, 10(5), 332; https://doi.org/10.3390/machines10050332 - 2 May 2022
Cited by 6
Abstract
Autonomous driving is one of the fastest-developing fields of robotics. With the ever-growing interest in autonomous driving, the ability to provide robots with both efficient and safe navigation capabilities is of paramount importance. With the continuous development of automation technology, higher levels of autonomous driving can be achieved with vision-based methodologies. Moreover, materials handling in industrial assembly lines can be performed efficiently using automated guided vehicles (AGVs). However, the visual perception of industrial environments is complex due to the many obstacles along pre-defined routes. Within the INDTECH 4.0 project, we aim to develop an autonomous navigation system that allows the AGV to detect and avoid obstacles based on the processing of depth data acquired with a frontal depth camera mounted on the AGV. Applying the RANSAC (random sample consensus) and Euclidean clustering algorithms to the 3D point clouds captured by the camera, we isolate obstacles from the ground plane and separate them into clusters. The clusters give information about the location of obstacles with respect to the AGV position. In experiments conducted outdoors and indoors, the results revealed that the method is very effective, returning high detection rates in most tests.
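
For readers who want a concrete picture of this processing pipeline, the Open3D-based Python sketch below removes the dominant ground plane with RANSAC and groups the remaining points into clusters. The file name and every threshold are illustrative assumptions, and DBSCAN stands in here for the Euclidean clustering reported in the paper.

```python
# Minimal sketch of RANSAC ground-plane removal followed by clustering,
# using Open3D. The input file and all parameter values are assumptions;
# DBSCAN is a closely related stand-in for PCL-style Euclidean clustering.

import numpy as np
import open3d as o3d

# Load a point cloud captured by the frontal depth camera (hypothetical file)
pcd = o3d.io.read_point_cloud("frame_000.pcd")

# RANSAC: fit the dominant plane (the ground) and keep only the points off it
plane_model, ground_idx = pcd.segment_plane(distance_threshold=0.03,
                                            ransac_n=3,
                                            num_iterations=1000)
obstacles = pcd.select_by_index(ground_idx, invert=True)

# Cluster the remaining points; each cluster is a candidate obstacle
labels = np.array(obstacles.cluster_dbscan(eps=0.15, min_points=20))

for label in range(labels.max() + 1):
    cluster = obstacles.select_by_index(np.where(labels == label)[0].tolist())
    centroid = np.asarray(cluster.points).mean(axis=0)
    print(f"obstacle {label}: {len(cluster.points)} points, centroid {centroid}")
```

In a real deployment, each cluster centroid would then be compared against the AGV's planned route to decide whether an avoidance manoeuvre is needed.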
