ORPP—An Ontology for Skill-Based Robotic Process Planning in Agile Manufacturing
Abstract
1. Introduction
2. Literature Review and Problem Description
2.1. Literature Review
- Is there an existing ontology for robotic process planning that covers the whole industrial manufacturing domain, including both lights-out manufacturing and collaborative manufacturing?
- Is there traceability from high-level process KPIs down to low-level robot control?
2.1.1. Human-Robot Collaboration
2.1.2. Autonomous Robotics
2.1.3. Industrial Manufacturing
2.2. Problem Definition
- There is no existing ontology for robotic process planning that covers the entire manufacturing process. Most existing ontologies focus on the service robotics domain and aim at robot autonomy.
- A growing amount of effort has been put into Human–Robot Collaboration (HRC); in the industrial robotics area, however, the body of work is less extensive.
- No key performance indicators (KPIs) were included for robotic process planning and control, because the research focus has mainly been on the robot control level, which is closest to the robot. The process, in contrast, sits at the top of the hierarchy: it decomposes into tasks, and the tasks are then performed by robots.
- The research focus is either tool-centric or product-centric.
3. Methods—Ontology Development
3.1. Ontology Purpose Identification and Requirements Elicitation (Phase 1)
3.1.1. Competency Questions
3.1.2. Modularization
- achieves modularity, reusability, extensibility, knowledge sharing, and knowledge reasoning
- contains process, task, and robot action knowledge
- creates a link between process management and robot actions
- based on upper-level ontology, extends domain-level ontology, and adds application-level ontology
- combines the tool-centric approach (a manufacturing task is described based on available hardware and software components) and the product-centric approach (describing the product and associated production steps independent from specific production resources) [22]
- supports the populating of a graphical user interface (GUI) to design, create, simulate, and run the robotic processes
- supports PDDL task planning
3.2. Ontology Conceptualization and Formalization (Phases 2 & 3)
3.2.1. Conceptualization
- Find the relevant terms for the ontology:
  - From Literature
  - From Use Cases in the ACROBA Project
- Define classes and their properties.
3.2.2. Formalization
- SUMO Concepts
- CORA(X) Concepts
- ORPP Core Concepts and Axioms
  - It adds a new process layer, which groups the robotic tasks together and gives management an overview with definable requirements/goals and KPIs. The process is the starting point of any requirements-engineering analysis and is therefore highly important from the modelling perspective. Since a process can be traced down to the primitive level, each robotic action is registered in the system and the KPIs can be calculated from the information in the primitive layer. These KPIs then provide the evaluation of the requirements/goals (a minimal roll-up sketch follows the axiom listing below).
  - It extends the task layer to three task types (Robot Task, Human Task, and HRC Task). Separating human and HRC tasks enables the application of different levels of safety controls.
  - It enriches the skill layer with skills themselves, meaning that skills can be composed not only of primitives but also of other skills. This enables the reuse of skills that have already been created.
(subclass RobotProcess Process) (=> (and (RobotProcess ?x) (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?x) (Robot ?agent))) (exists (?task) (and (part ?task ?x) (RobotTask ?task))) (exists (?task) (and (part ?task ?x) (or (HumanTask ?task) (RobotTask ?task) (HumanRobotCollaborationTask ?task))))) (instance ?x Process))
(subclass Task Process) (=> (and (Task ?x) (part ?x ?robotProcess) (or (RobotTask ?x) (HumanTask ?x) (HumanRobotCollaborationTask ?x))) (instance ?x Process))
(subclass RobotTask Task) (=> (and (RobotTask ?x) (partOf ?x ?robotProcess) (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?robotProcess) (Robot ?agent))) (not (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?robotProcess) (Human ?agent)))) (exists (?skill) (and (part ?skill ?x) (RobotSkill ?skill)))) (and (Task ?x) (instance ?x Process)))
(subclass HumanTask Task) (=> (and (HumanTask ?x) (partOf ?x ?robotProcess) (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?robotProcess) (Human ?agent))) (not (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?robotProcess) (Robot ?agent))))) (and (Task ?x) (instance ?x Process)))
(subclass HumanRobotCollaborationTask Task) (=> (and (HumanRobotCollaborationTask ?x) (partOf ?x ?robotProcess) (exists (?human ?robot) (and (Agent ?human) (Agent ?robot) (participatesIn ?human ?robotProcess) (participatesIn ?robot ?robotProcess) (Human ?human) (Robot ?robot)))) (and (Task ?x) (instance ?x Process)))
(subclass RobotSkill Process) (=> (and (RobotSkill ?x) (partOf ?x ?taskOrSkill) (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?taskOrSkill) (Robot ?agent))) (not (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?taskOrSkill) (Human ?agent)))) (exists (?primitiveOrSkill) (and (part ?primitiveOrSkill ?x) (or (Primitive ?primitiveOrSkill) (RobotSkill ?primitiveOrSkill))))) (and (Process ?x) (instance ?x Process)))
(subclass RobotPrimitive Process) (=> (and (RobotPrimitive ?x) (partOf ?x ?skill) (exists (?agent) (and (Agent ?agent) (participatesIn ?agent ?skill) (or (Robot ?agent) (RobotPart ?agent))) (not (Human ?agent)))) (and (Process ?x) (instance ?x Process)))
(subclass HumanRobotGroup Group) (=> (and (HumanRobotGroup ?group) (exists (?member) (and (Agent ?member) (memberOf ?member ?group) (or (Human ?member) (Robot ?member))))) (instance ?group Group))
(subclass Requirement Proposition) (=> (and (Requirement ?req) (or (evaluates ?req ?task) (evaluates ?req ?process))) (instance ?req Proposition))
(subclass KPI Proposition) (=> (and (KPI ?kpi) (specifies ?kpi ?requirement)) (and (subclass ?kpi Proposition) (subclass ?requirement Requirement) (disjointSubclass KPI Requirement)))
(subclass Human Agent) (=> (and (Human ?human) (participatesIn ?human ?task) (or (instance ?task HumanTask) (instance ?task HumanRobotCollaborationTask)) (actsOn ?human ?environment)) (and (Agent ?human) (canActOnOwn ?human) (producesChanges ?human) (instance ?human Human)))
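The traceability from the process layer down to the primitive layer can be illustrated with a small roll-up computation. The following Python fragment is only a sketch and not part of ORPP or its published implementation; the class and attribute names (Node, duration_s, cycle_time) are hypothetical and merely show how a cycle-time KPI could be aggregated from primitive-level records up to the process level.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical in-memory mirror of the ORPP layers: a node is a process,
# task, skill, or primitive; only primitives carry measured durations.
@dataclass
class Node:
    name: str
    layer: str                      # "process" | "task" | "skill" | "primitive"
    duration_s: float = 0.0         # measured only at the primitive layer
    parts: List["Node"] = field(default_factory=list)

def cycle_time(node: Node) -> float:
    """Roll a cycle-time KPI up from the primitive layer to any higher layer."""
    if node.layer == "primitive":
        return node.duration_s
    return sum(cycle_time(part) for part in node.parts)

# Toy example: one robot task built from one skill and three primitives.
pick = Node("move-to-pick", "primitive", 1.2)
grasp = Node("close-gripper", "primitive", 0.4)
place = Node("move-to-place", "primitive", 1.5)
pick_place = Node("pick-and-place", "skill", parts=[pick, grasp, place])
task = Node("load-cylinder", "task", parts=[pick_place])
process = Node("assembly-process", "process", parts=[task])

print(cycle_time(process))  # 3.1, to be evaluated against the process requirement
```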
- PDDL [30] Concepts (a minimal mapping sketch follows this list)
  - Types in the PDDL are represented as classes in the ontology.
  - Objects are instances of types in the PDDL, and they are mapped to individuals in the ontology.
  - The PDDL uses predicates to define relationships between objects, and these are described as object properties in the ontology.
  - The initial state specifies the state of the world before planning starts in the PDDL. The ontology represents it through object property assertions: individual instances are associated with their properties using object property assertions. The same holds for the goal state.
  - Actions in the PDDL represent possible transitions between states. They can be created as classes and properties in the ontology.
  - Constraints are rules and limitations for actions in the PDDL. In the ontology, they are described using additional axioms and properties.
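As an illustration of this mapping, the following Python/rdflib sketch turns one PDDL type, object, predicate, and initial-state fact into ontology statements. It is not taken from the paper; the namespace and the domain/range choices are assumptions made only for the example.

```python
from rdflib import Graph, Namespace, RDF, RDFS, OWL

ORPP = Namespace("http://example.org/orpp#")   # hypothetical namespace
g = Graph()
g.bind("orpp", ORPP)

# PDDL type "cylinder"                     -> ontology class Cylinder
g.add((ORPP.Cylinder, RDF.type, OWL.Class))

# PDDL object "c1 - cylinder"              -> individual c1 of class Cylinder
g.add((ORPP.c1, RDF.type, ORPP.Cylinder))

# PDDL predicate "(contains ?bin ?cylinder)" -> object property contains
g.add((ORPP.contains, RDF.type, OWL.ObjectProperty))
g.add((ORPP.contains, RDFS.domain, ORPP.Bin))
g.add((ORPP.contains, RDFS.range, ORPP.Cylinder))

# PDDL initial-state fact "(contains bin-red c1)" -> object property assertion
g.add((ORPP["bin-red"], ORPP.contains, ORPP.c1))

print(g.serialize(format="turtle"))
```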
- ACROBA Concepts
- RTMN Concepts
4. Results—Ontology Application
4.1. Ontology Implementation (Phase 4)
4.2. Ontology Testing (Phase 5)—The Use of ORPP in the ACROBA Project
- Populating the GUI (Orange Arrow 1.)
- Supporting task execution (Orange Arrow 2.)
- Supporting PDDL task planning (Orange Arrow 3.)
4.2.1. PCB Assembly Application—Example of Orange Arrows 1 & 2
4.2.2. PDDL Task Planning Application—Example of Orange Arrow 3
- Cell Setup: Figure 14 is a picture of the demonstrator; it shows the setup of the cell. The cell consists of the following components.
  - Robot: UR5e
  - Gripper: ROBOTIQ 2F-85
  - Camera: Zivid M+
  - A "grid" with three-by-three holes at positions 00, 01, 02, 10, 11, 12, 20, 21, 22
  - "Cylinders" and "circular-containers": a cylinder can be put into a circular-container. Both cylinders and circular-containers come in three colors: red, yellow, and green.
  - Three bins for the cylinders (bin red, bin yellow, and bin green); each bin contains six cylinders of a single color.
  - Three hangers for the circular-containers (hanger red, hanger yellow, and hanger green); each hanger holds circular-containers of a single color.
- The Scenario
  - The Initial State
  - The Goal State
  - The Constraints
- The PDDL (Planning Domain Definition Language) Definitions
  - PDDL Domain Description
(define (domain assembly-dom)
  (:types hole cylinder circular-container bin color)
  (:predicates
    (hasColor ?obj - cylinder ?color - color)
    (contains ?bin - bin ?cylinder - cylinder)
    (empty ?bin - bin)
    (onGrid ?container - circular-container ?hole - hole)
    (placed ?cylinder - cylinder ?container - circular-container)
    (goal-state-reached))
  (:action pick
    :parameters (?cylinder - cylinder ?bin - bin)
    :precondition (and (contains ?bin ?cylinder) (empty ?cylinder))
    :effect (and (not (contains ?bin ?cylinder)) (not (empty ?cylinder))))
  (:action place
    :parameters (?cylinder - cylinder ?container - circular-container ?hole - hole)
    :precondition (and (empty ?container) (onGrid ?container ?hole) (not (placed ?cylinder ?container)))
    :effect (and (not (empty ?container)) (placed ?cylinder ?container))))
  - PDDL Problem Description
(define (problem assembly-prob)
  (:domain assembly-dom)
  (:objects
    h00 h01 h02 h10 h11 h12 h20 h21 h22 - hole
    c1 c2 c3 c4 c5 c6 c7 c8 c9 c10 c11 c12 c13 c14 c15 c16 c17 c18 - cylinder
    cc00 cc01 cc02 cc10 cc11 cc12 cc20 cc21 cc22 - circular-container
    bin-red bin-yellow bin-green - bin
    red yellow green - color)
  (:init
    ; initial state of bins
    (contains bin-red c1) (hasColor c1 red)
    (contains bin-red c2) (hasColor c2 red)
    (contains bin-red c3) (hasColor c3 red)
    (contains bin-red c4) (hasColor c4 red)
    (contains bin-red c5) (hasColor c5 red)
    (contains bin-red c6) (hasColor c6 red)
    (contains bin-yellow c7) (hasColor c7 yellow)
    (contains bin-yellow c8) (hasColor c8 yellow)
    (contains bin-yellow c9) (hasColor c9 yellow)
    (contains bin-yellow c10) (hasColor c10 yellow)
    (contains bin-yellow c11) (hasColor c11 yellow)
    (contains bin-green c13) (hasColor c13 green)
    (contains bin-green c14) (hasColor c14 green)
    (contains bin-green c15) (hasColor c15 green)
    (contains bin-green c16) (hasColor c16 green)
    (contains bin-green c17) (hasColor c17 green)
    ; Initial state of circular containers on the grid
    (onGrid cc00 h00) (hasColor cc00 yellow)
    (onGrid cc01 h01) (hasColor cc01 yellow)
    (onGrid cc02 h02) (hasColor cc02 red)
    (onGrid cc10 h10) (hasColor cc10 yellow)
    (onGrid cc11 h11) (hasColor cc11 yellow)
    (onGrid cc12 h12) (hasColor cc12 red)
    (onGrid cc20 h20) (hasColor cc20 green)
    (onGrid cc21 h21) (hasColor cc21 green)
    (onGrid cc22 h22) (hasColor cc22 green)
    ; Initial state of cylinders in circular containers
    (placed c12 cc11) (placed c18 cc21)
    ; Initial state of bins
    (not (empty bin-red)) (not (empty bin-yellow)) (not (empty bin-green)))
  (:goal (and
    (placed c7 cc00) (placed c8 cc01) (placed c1 cc02)
    (placed c9 cc10) (placed c12 cc11) (placed c2 cc12)
    (placed c13 cc20) (placed c18 cc21) (placed c14 cc22)
    (goal-state-reached))))
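The domain and problem definitions above can be handed to any off-the-shelf PDDL planner. The paper does not prescribe a specific planner; the snippet below is only a sketch that assumes the two definitions have been written to files and uses Fast Downward as one possible choice (the path to fast-downward.py is an assumption about the local installation).

```python
import subprocess
from pathlib import Path

domain_pddl = Path("assembly-dom.pddl")    # assumed to contain the domain shown above
problem_pddl = Path("assembly-prob.pddl")  # assumed to contain the problem shown above

# Example invocation of Fast Downward; any other PDDL planner that accepts
# separate domain and problem files would work equally well here.
result = subprocess.run(
    ["./fast-downward.py", str(domain_pddl), str(problem_pddl),
     "--search", "astar(blind())"],
    capture_output=True, text=True,
)
print(result.stdout)   # Fast Downward writes the resulting plan to sas_plan by default
```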
- Ontology Representation for the PDDL Domain and Problem (a query sketch follows this list)
  - Types (cylinders, circular containers, bins, holes, colors, and hangers) in the PDDL are represented as classes in the ontology (Cylinder, CircularContainer, Bin, Hole, Color, and Hanger).
  - Objects are instances of types in the PDDL, and they are mapped to individuals in the ontology. For example, individual cylinders such as c1, c2, and c3 are instances of the class Cylinder, and individual circular containers such as cc1, cc2, and cc3 are instances of the class CircularContainer.
  - The PDDL uses predicates to define relationships between objects, and these are described as object properties in the ontology. For example, the object property inContainer relates cylinders to circular containers, the object property inBin relates cylinders to bins, and the object property onGrid relates circular containers to holes on the grid.
  - The initial state specifies the state of the world before planning starts in the PDDL. The ontology represents it through object property assertions: individual instances are associated with their properties using object property assertions. For example, individual cylinders such as c1, c2, and c3 are associated with their colors, bins, and circular containers using object property assertions. The same holds for the goal state.
  - Actions in the PDDL represent possible transitions between states. They can be created as classes or properties in the ontology. For example, the Pick action in the PDDL can be defined as the Pick class.
  - Constraints are rules and limitations for actions in the PDDL. In the ontology, they are described using additional axioms or properties. For example, the constraint that "the container cannot be put on the cylinder" could be represented as an axiom in the ontology.
  - The PDDL applies reasoning to create plans, while ontology reasoning engines can infer new information based on axioms and assertions. For example, if "a container is empty" is specified in the ontology, the reasoning engine can infer that the container is not holding any cylinders.
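A small sketch of the last point, again in Python/rdflib and with hypothetical individuals mirroring part of the initial state: containers that are not the target of any inContainer assertion are derived to be empty, which mimics the kind of inference a reasoner would perform over the populated ontology.

```python
from rdflib import Graph, Namespace, RDF

ORPP = Namespace("http://example.org/orpp#")   # hypothetical namespace
g = Graph()

# Assertions mirroring part of the initial state (hypothetical individuals).
g.add((ORPP.cc00, RDF.type, ORPP.CircularContainer))
g.add((ORPP.cc11, RDF.type, ORPP.CircularContainer))
g.add((ORPP.c12, ORPP.inContainer, ORPP.cc11))   # c12 already sits in cc11

# Derive "empty" containers: those that are not the object of any inContainer triple.
containers = set(g.subjects(RDF.type, ORPP.CircularContainer))
occupied = set(g.objects(None, ORPP.inContainer))
empty = containers - occupied
print(empty)   # only cc00 remains, i.e., it holds no cylinder
```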
5. Conclusions
6. Discussion
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. List of Terms
Term | Source | Description |
Physical | SUMO | An entity that has a location in space-time |
Abstract | SUMO | Properties or qualities as distinguished from any particular embodiment of the properties/qualities in a physical medium. Instances of Abstract can be said to exist in the same sense as mathematical objects such as sets and relations, but they cannot exist at a particular place and time without some physical encoding or embodiment |
Process | SUMO | The class of things that happen and have temporal parts or stages. Examples include extended events like a football match or a race, actions like pursuing and reading, and biological processes. The formal definition is anything that occurs in time but is not an object |
Object | SUMO | Corresponds roughly to the class of ordinary objects. Examples include normal physical objects, geographical regions, locations of processes, and the complement of objects in the physical class |
Region | SUMO | A topographic location. Regions encompass surfaces of objects, imaginary places, and geographic areas. Note that a region is the only kind of object that can be located at itself |
Collection | SUMO | Collections have members like classes, but, unlike classes, they have a position in space-time and members can be added and subtracted without thereby changing the identity of the collection |
Agent | SUMO | Something or someone that can act on its own and produce changes in the world |
Artifact | SUMO | An object that is the product of a making |
Group | SUMO | A collection of agents |
Device | SUMO | A device is an artifact whose purpose is to serve as an instrument in a specific subclass of a process |
Quantity | SUMO | Any specification of how many or how much of something there is. Accordingly, there are two subclasses of quantity: number (how many) and physical quantity (how much) |
Attribute | SUMO | Qualities that we cannot or choose not to reify into subclasses of object |
Set or Class | SUMO | The set or class of sets and classes—i.e., any instance of abstract that has elements or instances |
Relation | SUMO | The class of relations. There are three kinds of relation: predicate, function, and list. Predicates and functions both denote sets of ordered n-tuples. The difference between these two classes is that predicates cover formula-forming operators, while functions cover term-forming operators. A list, on the other hand, is a particular ordered n-tuple. |
Proposition | SUMO | Propositions are abstract entities that express a complete thought or a set of such thoughts |
Device | CORA(X) | Device is partitioned to Robot, Robot Part, Robot Interface, and Electric Device. Robots have other devices as parts |
Robot | CORA(X) | A robot is a Device (SUMO term) that participates as a tool in a process. A robot is also an Agent, which is something that can act on its own and produce changes. Robots perform tasks by acting on the environment/themselves |
RobotPart | CORA(X) | RobotPart is further divided into robotActuatingPart, robotCommunicatingPart, robotProcessingPart and robotSensingPart |
RobotInterface | CORA(X) | Robots interact with the world surrounding them through an interface. The RobotInterface is a device composed of other devices that play the roles of sensing device, actuating device, and communicating device. Through the interface, the robot can sense and act on the environment as well as communicate with other agents. Therefore, the robot interface can be viewed as a way to refer to all the devices that allow the robot to interact with the world. Every robot interface must have a part that is either a robot sensing part, a robot actuating part, or a robot communicating part |
RobotGroup | CORA(X) | A robot is an agent, and agents can form social groups. According to SUMO, a group is “a collection of agents”, like a pack of animals, a society, or an organization. A RobotGroup is a group whose only members are robots |
ArtificialSystem | CORA(X) | ArtificialSystem is an Artifact formed by various devices (and other objects) that interact in order to execute a function. For any part of an artificial system, there is at least one other part it interacts with |
RoboticSystem | CORA(X) | Robots and other devices can form a RoboticSystem. A RoboticSystem is an artificial system formed by robots and devices intended to support robots in carrying out their tasks. A robotic system may have one or more robots |
Interaction | CORA(X) | An interaction is a Process in which two agents participate. It is composed of two subprocesses defining action and reaction. The action subprocess initiated by agent x on a patient agent y causes a reaction subprocess having y as an agent and x as a patient |
RobotProcess | ORPP | RobotProcess is a Process. It has at least one Robot as an Agent participant. A RobotProcess consists of at least one RobotTask. It can be composed of HumanTasks, RobotTasks, and HumanRobotCollaborationTasks |
Task | ORPP | Task is a Process. It is part of a RobotProcess. It has the subclasses RobotTask, HumanTask, and HumanRobotCollaborationTask |
RobotTask | ORPP | RobotTask is a Task. It is part of a RobotProcess. It has at least one Robot as Agent. It cannot have Human as Agent. It consists of RobotSkills |
HumanTask | ORPP | HumanTask is a Task. It is part of a RobotProcess. It has at least one Human as Agent. It cannot have Robot as Agent |
HumanRobotCollaborationTask (HRC Task) | ORPP | HRCTask is a Task. It is part of a RobotProcess. It has at least one Human and one Robot as Agent. There are five types of HRC tasks: Coexistence Fence (CF), Sequential Cooperation SMS (SS), Teaching HG (TH), Parallel Cooperation SSM (PS), and Collaboration PFL (CP). |
RobotSkill | ORPP | RobotSkill is a Process. It is part of a RobotTask. It can also be part of a RobotSkill. It consists of Primitives and/or other RobotSkills. RobotSkill has Robot as its Agent. Human cannot be the Agent of RobotSkill. |
RobotPrimitive | ORPP | RobotPrimitive is a Process. It is part of a RobotSkill. RobotPrimitive is atomic. A RobotPrimitive has a Robot or RobotPart as Agent. Human cannot be the Agent of a RobotPrimitive |
HumanRobotGroup | ORPP | A Robot or a Human is an Agent, and agents can form social groups. According to SUMO, a group is “a collection of agents”. A HumanRobotGroup is a group that has humans and robots as members. It participates in HumanRobotCollaborationTasks |
Requirement | ORPP | Requirement is a subclass of Proposition. Requirements evaluate Tasks or Processes. Requirement has KPI to specify it. A Task or Process can have multiple Requirements |
KPI | ORPP | KPI is a subclass of Proposition. KPIs specify Requirements. A Requirement can have multiple KPIs. KPI and Requirement are a disjoint subclass of Proposition |
Human | ORPP | A Human is an Agent who can act on their own and produce changes. Humans participate in HumanTasks or HumanRobotCollaborationTasks and perform these tasks by acting on the environment/themselves |
References
- Diab, M.; Pomarlan, M.; Borgo, S.; Beßler, D.; Rossel, J.; Bateman, J.; Beetz, M. FailRecOnt-An Ontology-Based Framework for Failure Interpretation and Recovery in Planning and Execution. In Proceedings of the 2nd International Workshop on Ontologies for Autonomous Robotics, Bolzano, Italy, 11–18 September 2021. [Google Scholar]
- Pantano, M.; Eiband, T.; Lee, D. Capability-based Frameworks for Industrial Robot Skills: A Survey. In Proceedings of the 2022 IEEE 18th International Conference on Automation Science and Engineering (CASE), Mexico City, Mexico, 20–24 August 2022; pp. 2355–2362. [Google Scholar]
- Manzoor, S.; Rocha, Y.G.; Joo, S.H.; Bae, S.H.; Kim, E.J.; Joo, K.J.; Kuc, T.Y. Ontology-Based Knowledge Representation in Robotic Systems: A Survey Oriented toward Applications. Appl. Sci. 2021, 11, 4324. [Google Scholar] [CrossRef]
- Aguado, E.; Sanz, R.; Rossi, C. Ontologies for Run-Time Self-Adaptation of Mobile Robotic Systems. In Proceedings of the XIX Conference of the Spanish Association for Artificial Intelligence (CAEPIA), Málaga, Spain, 22–24 September 2021. [Google Scholar]
- Pang, W.; Gu, W.; Li, H. Ontology-Based Task Planning for Autonomous Unmanned System: Framework and Principle. J. Phys. Conf. Ser. 2022, 2253, 012018. [Google Scholar] [CrossRef]
- ACROBA. Available online: https://acrobaproject.eu/project-acroba/ (accessed on 28 February 2022).
- IEEE Std 1872-2015; IEEE Standard Ontologies for Robotics and Automation. IEEE: Piscataway, NJ, USA, 2015; p. 45.
- Niles, I.; Pease, A. Towards a Standard Upper Ontology. In Proceedings of the Second International Conference on Formal Ontology in Information Systems, Ogunquit, ME, USA, 17–19 October 2001; pp. 2–9. [Google Scholar] [CrossRef]
- Pedersen, M.R.; Nalpantidis, L.; Andersen, R.S.; Schou, C.; Bøgh, S.; Krüger, V.; Madsen, O. Robot Skills for Manufacturing: From Concept to Industrial Deployment. Robot. Comput.-Integr. Manuf. 2016, 37, 282–291. [Google Scholar] [CrossRef]
- Olivares-Alarcos, A.; Beßler, D.; Khamis, A.; Goncalves, P.; Habib, M.K.; Bermejo-Alonso, J.; Barreto, M.; Diab, M.; Rosell, J.; Quintas, J.; et al. A Review and Comparison of Ontology-Based Approaches to Robot Autonomy. Knowl. Eng. Rev. 2019, 34, e29. [Google Scholar] [CrossRef]
- Nilsson, A.; Muradore, R.; Nilsson, K.; Fiorini, P. Ontology for Robotics: A Roadmap. In Proceedings of the 2009 International Conference on Advanced Robotics, Munich, Germany, 22–26 June 2009. [Google Scholar]
- Buisan, G.; Sarthou, G.; Bit-Monnot, A.; Clodic, A.; Alami, R. Efficient, Situated and Ontology Based Referring Expression Generationfor Human-Robot Collaboration. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; ISBN 9781728160757. [Google Scholar]
- Olivares-Alarcos, A.; Foix, S.; Borgo, S.; Alenyà, G. OCRA—An Ontology for Collaborative Robotics and Adaptation. Comput. Ind. 2022, 138, 103627. [Google Scholar] [CrossRef]
- Umbrico, A.; Orlandini, A.; Cesta, A. An Ontology for Human-Robot Collaboration. Procedia CIRP 2020, 93, 1097–1102. [Google Scholar] [CrossRef]
- Umbrico, A.; Cesta, A.; Orlandini, A. Deploying Ontology-Based Reasoning in Collaborative Manufacturing. 2022. Available online: https://www.semantic-web-journal.net/system/files/swj3024.pdf (accessed on 12 July 2024).
- Sarthou, G.; Clodic, A.; Alami, R. Ontologenius: A Long-Term Semantic Memory for Robotic Agents. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019. [Google Scholar]
- Olszewska, J.I.; Barreto, M.; Bermejo-Alonso, J.; Carbonera, J.; Chibani, A.; Fiorini, S.; Goncalves, P.; Habib, M.; Khamis, A.; Olivares, A.; et al. Ontology for Autonomous Robotics. In Proceedings of the RO-MAN 2017 26th IEEE International Symposium on Robot and Human Interactive Communication, Lisbon, Portugal, 28 August–1 September 2017; pp. 189–194. [Google Scholar] [CrossRef]
- Booch, G.; Jacobson, I.; Rumbaugh, J. The Unified Modeling Language for Object-Oriented Development; Documentation Set Version 0.91 Addendum, UML Update; Rational Software Corporation: Santa Clara, CA, USA, 1996. [Google Scholar]
- Yuguchi, A.; Nakamura, T.; Toyoda, M.; Yamada, M.; Tulathum, P.; Aubert, M.; Garcia Ricardez, G.A.; Takamatsu, J.; Ogasawara, T. Toward Robot-Agnostic Home Appliance Operation: A Task Execution Framework Using Motion Primitives, Ontology, and GUI. Adv. Robot. 2022, 36, 548–565. [Google Scholar] [CrossRef]
- Beetz, M.; Bessler, D.; Haidu, A.; Pomarlan, M.; Bozcuoglu, A.K.; Bartels, G. Know Rob 2.0—A 2nd Generation Knowledge Processing Framework for Cognition-Enabled Robotic Agents. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2018; pp. 512–519. [Google Scholar]
- Tenorth, M.; Beetz, M. KNOWROB—Knowledge Processing for Autonomous Personal Robots. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2009, St. Louis, MO, USA, 10–15 October 2009; pp. 4261–4266. [Google Scholar] [CrossRef]
- Weser, M.; Bock, J.; Schmitt, S.; Perzylo, A.; Evers, K. An Ontology-Based Metamodel for Capability Descriptions. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; ISBN 9781728189567. [Google Scholar]
- Perzylo, A.; Grothoff, J.; Lucio, L.; Weser, M.; Malakuti, S.; Venet, P.; Aravantinos, V.; Deppe, T. Capability-Based Semantic Interoperability of Manufacturing Resources: A BaSys 4.0 Perspective. IFAC-PapersOnLine 2019, 52, 1590–1596. [Google Scholar] [CrossRef]
- Schäfer, P.M.; Steinmetz, F.; Schneyer, S.; Bachmann, T.; Eiband, T.; Lay, F.S.; Padalkar, A.; Sürig, C.; Stulp, F.; Nottensteiner, K. Flexible Robotic Assembly Based on Ontological Representation of Tasks, Skills, and Resources. In Proceedings of the 18th International Conference on Principles of Knowledge Representation and Reasoning, Online, 3–12 November 2021. [Google Scholar]
- Saxena, A.; Jain, A.; Sener, O.; Jami, A.; Misra, D.K.; Koppula, H.S. RoboBrain: Large-Scale Knowledge Engine for Robots. 2014. Available online: https://ozansener.net/papers/robobrain.pdf (accessed on 12 July 2024).
- Falbo, R.d.A. SABiO: Systematic Approach for Building Ontologies. ONTO.COM/ODISE@FOIS 2014. Available online: https://dblp.org/rec/conf/fois/Falbo14.html (accessed on 12 July 2024).
- Caro, M.F.; Cox, M.T.; Toscano-Miranda, R.E. A Validated Ontology for Metareasoning in Intelligent Systems. J. Intell. 2022, 10, 113. [Google Scholar] [CrossRef] [PubMed]
- Zhang Sprenger, C.; Ribeaud, T. Robotic Process Automation with Ontology-Enabled Skill-Based Robot Task Model and Notation (RTMN). In Proceedings of the IEEE International Conference on Robotics, Automation and Artificial Intelligence (RAAI 2022), Singapore, 9–11 December 2022. [Google Scholar]
- Zhang Sprenger, C.; Corrales Ramón, J.A.; Urs Baier, N. RTMN 2.0—An Extension of Robot Task Modeling and Notation (RTMN) Focused on Human–Robot Collaboration. Appl. Sci. 2024, 14, 283. [Google Scholar] [CrossRef]
- Ghallab, M.; Knoblock, C.A.; Wilkins, D.E.; Barrett, A.; Christianson, D.; Friedman, M.; Kwok, C.; Golden, K.; Penberthy, S.; Smith, D.E.; et al. PDDL—The Planning Domain Definition Language. 1998. Available online: https://www.cs.cmu.edu/~mmv/planning/readings/98aips-PDDL.pdf (accessed on 12 July 2024).
- Borgo, S.; Cesta, A.; Orlandini, A.; Umbrico, A. Knowledge-Based Adaptive Agents for Manufacturing Domains. Eng. Comput. 2019, 35, 755–779. [Google Scholar] [CrossRef]
- Schou, C.; Skovgaard Andersen, R.; Chrysostomou, D.; Bøgh, S.; Madsen, O. Skill-Based Instruction of Collaborative Robots in Industrial Settings. Robot. Comput.-Integr. Manuf. 2018, 53, 72–80. [Google Scholar] [CrossRef]
- Topp, E.A.; Stenmark, M.; Ganslandt, A.; Svensson, A.; Haage, M.; Malec, J. Ontology-Based Knowledge Representation for Increased Skill Reusability in Industrial Robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
- Protégé. Available online: https://protege.stanford.edu/ (accessed on 9 May 2023).
- Paulius, D.; Sun, Y. A Survey of Knowledge Representation in Service Robotics. Robot. Auton. Syst. 2019, 118, 13–30. [Google Scholar] [CrossRef]
CQ Nr | Competency Question | Answer |
---|---|---|
CQ1 | What is the purpose of the ontology? | The ontology aims to support and facilitate robotic process automation and human–robot collaboration. |
CQ2 | What is the scope? | The ontology will include information on robotic processes. This includes pick and place, inspection, assembly, robot control, task planning, etc. |
CQ3 | Who are the intended end users? | There are two groups of intended users: 1. system engineers, programmers, and software engineers for support in AI functionalities; 2. Operators and engineers for error diagnostics and system queries. |
CQ4 | What is the intended use? | The ontology is intended to populate the GUI for designing, creating, simulating, and running robotic processes, to support task execution, and to support PDDL task planning. |
CQ5 | What types of robotic processes for manufacturing exist? | There are two general types. One is applying robotics in lights-out manufacturing processes. In these processes, the main goal of robots is replacing human workers and achieving automation. The other type is applying robotics in collaborative manufacturing processes. The main goal of these processes is for robots and humans to work together safely while increasing efficiency and/or improving human work conditions. |
CQ6 | How do robotic processes work? | These are processes that involve robots in the manufacturing steps. The process must enable communication with robots and control of robots. |
CQ7 | What are the robotic processes composed of? | A robotic process consists of different tasks. It can be human tasks, robot tasks, or combined tasks that are performed by humans and robots together. |
CQ8 | What are the links between these components? | These tasks can be independent or dependent. Sequential tasks are performed one after another (often the output of the former task is the input of the later task). Parallel tasks can be performed at the same time without influencing one another. |
CQ9 | What impacts robotic processes? | The speed of the robot, the performance of the robot, the safety of the robot, human factors (such as trust, fatigue, stress), and error diagnostics |
CQ10 | How are robotic processes evaluated? | There are KPIs that measure the process performance, such as cycle time, productivity, accident rate, etc. |
CQ11 | What are the types of robotic tasks? | Human tasks, robot tasks, and human–robot collaboration tasks that are performed by a human and a robot together. |
CQ12 | What do robotic tasks consist of? | Tasks can contain subtasks. Robot tasks often consist of many robot actions that are called by different names in the research and in practice—for instance, skills, capabilities, functions, etc. |
CQ13 | What are the links between these components? | It is similar to tasks; these components can also work dependently or independently. |
CQ14 | What are types of Human–Robot Collaboration tasks? | They are different based on the task types (coexistence, sequential, parallel, cooperate, teaching, collaborate) and collaboration mode (Fence, safety-rated monitored stop, hand guide, speed and separation monitoring, power and force limiting). |
CQ15 | What impacts Human–Robot Collaboration? | Human factors (how humans feel and behave around the robot), safety of the robot, safety measures, safety control |
CQ16 | How do humans and robots work together? | Humans and robots can work together in different ways: coexistence (they work on their own and do not have anything to do with each other), sequential (they do the work sequentially one after another; they are dependent), parallel (they work at the same time but not on the same part), cooperation (they work on the same part but not at the same time), teaching (the human guides the robot to learn something; the human is in control and the robot is in teaching mode), collaboration (the human and robot work together on the same part and at the same time). |
CQ17 | How is safety ensured when humans and robots work together? | There are different safety modes based on ISO standards, and these safety modes must be enforced to guarantee safety. |
Class | Property | Condition | Cardinality |
---|---|---|---|
Process | name, goal, sequence, input, output, feedback, precondition, postcondition | has at least one task | |
Task | name, sequence, input, output, feedback, precondition, postcondition | has at least one human task or robot skill | |
Human Task | name, human agent, output | no robot involvement | has at least one human agent |
Robot Task | name, sequence, input, output, feedback, precondition, postcondition | no human involvement | no human agent |
HRC Task | name, sequence, inputs, outputs, feedback, precondition, postcondition, HRC task type, HRC mode, safety level | human and robot are both involved in completing the activity | has at least one human task and one robot task |
Robot Skill | name, sequence, input, output, feedback, precondition, postcondition | no human involvement | has at least one primitive |
Robot Primitive | name, input, output, feedback, precondition, postcondition | has robot as an agent | atomic |
KPI | name, formula | | |
Requirement | name, goal | | |
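The table above can be read as a simple data model. The following Python sketch is not the published implementation; the field names and the choice of which conditions to enforce are assumptions, shown only to illustrate how the conditions and cardinalities from the table could be checked programmatically.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RobotPrimitive:
    name: str
    agent: str = "robot"   # per the table, a primitive has a robot (part) as agent

@dataclass
class RobotSkill:
    name: str
    primitives: List[RobotPrimitive] = field(default_factory=list)
    skills: List["RobotSkill"] = field(default_factory=list)

    def __post_init__(self):
        # cardinality from the table: a skill has at least one primitive (or sub-skill)
        if not self.primitives and not self.skills:
            raise ValueError(f"RobotSkill '{self.name}' needs at least one primitive or skill")

@dataclass
class RobotTask:
    name: str
    skills: List[RobotSkill] = field(default_factory=list)
    human_agents: List[str] = field(default_factory=list)

    def __post_init__(self):
        # condition from the table: no human involvement in a robot task
        if self.human_agents:
            raise ValueError(f"RobotTask '{self.name}' must not have human agents")

# Usage example
skill = RobotSkill("pick-and-place", primitives=[RobotPrimitive("move-to-pick")])
task = RobotTask("load-cylinder", skills=[skill])
```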
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).