Article

Development of Evaluation Criteria for Robotic Process Automation (RPA) Solution Selection

Department of IT Convergence Software Engineering, Korea University of Technology & Education (KOREATECH), 1600, Chungjeol-ro, Dongnam-gu, Cheonan-si 31253, Chungcheongnam-do, Republic of Korea
Electronics 2023, 12(4), 986; https://doi.org/10.3390/electronics12040986
Submission received: 18 January 2023 / Revised: 12 February 2023 / Accepted: 15 February 2023 / Published: 16 February 2023
(This article belongs to the Topic Software Engineering and Applications)

Abstract

When introducing a robotic process automation (RPA) solution for business automation, it is extremely difficult for customers to select an RPA solution that is suitable for the automation target and goals. One reason for this difficulty is that standardised evaluation items and indicators that can support the evaluation of RPA have not been defined. The broad adoption of RPA is still in its infancy and only a few studies have been conducted on this subject. In this study, an evaluation breakdown structure for RPA selection was developed by deriving evaluation items from prior studies related to RPA selection, and its feasibility was examined. To this end, a questionnaire was administered three times, and the coefficient of variation, content validity, consensus, and convergence of the factors and criteria were measured from the survey results. All of these measurements were reflected in a final suitability value calculated to verify the stability of the evaluation system and its criteria indicators. This study is the first to develop an evaluation standard for RPA solution selection, and the proposed evaluation breakdown structure provides useful evaluation criteria and a checklist for successful RPA application and introduction.

1. Introduction

Robotic process automation (RPA) is a business-process-based software solution that automates simple and repetitive tasks using software robots [1,2,3,4,5]. Specifically, it processes an organisation's structured data and produces rule-based outputs [6,7]. The term was coined in the early 2000s by the company Blue Prism, which introduced software robots based on screen scraping technology [8]. RPA automates human behaviour and tasks, whereas artificial intelligence (AI) automates analysis and decision making by imitating intelligence and reasoning; the two technologies can enable various services individually or in combination [9]. RPA has driven productivity improvements in many industries [10]. It originated in the field of information systems as a disruptive innovation that, among other automation solutions, has had a profound effect on job descriptions and work itself [11]. Since then, RPA adoption has grown because RPA does not require dedicated software development and is a low-cost solution that requires a small workforce and minimal implementation time to automate operations. It can automate the functions of existing software, promote communication between IT and other departments, and demands less coding-related capability and knowledge than other development methods. In particular, it enables software integration across various tasks in existing environments, which keeps complexity low while delivering high efficiency and productivity [12], along with the process improvement demanded by Industry 4.0 [7,13]. Consequently, automation using AI technology has recently been widely introduced across many industrial sectors. RPA has been applied to more than 20 diverse business areas, including internal organisational operations, functional improvements, risk management audits, data analysis, and reporting [2,3], and many enterprises have successfully integrated RPA [2,6,14,15,16,17,18].
According to Gartner [19], the 2020 survey scores for RPA product and service levels, based on a five-point scale, were as follows: WorkFusion, 4.34; Microsoft, 4.20; Automation Anywhere, 4.18; Pegasystems, 4.16; UiPath, 4.15; Kofax, 4.12; Servicetrace, 4.11; NICE, 4.05; and Edge, 4.05.
According to McKinsey, the adoption rate for RPA in 2020 was 22% [20], which exceeds that for artificial-intelligence-based computer vision (18%) and deep learning (16%) solutions. Furthermore, the average annual growth from 2021 to 2024 was predicted to be in the double digits. According to Gartner [19], intelligent process automation linked to AI will further expand its market size and adoption, with 90% of global conglomerates being expected to introduce RPA by 2022. Furthermore, Samsung SDS, LG CNS, POSCO ICT, Grid One, Symation, Inzisoft, and EDENTNS have released RPA solutions and are competing with UiPath and Automation Anywhere.
However, organisations that wish to implement RPA solutions have a wide range of services to choose from and often have difficulty in selecting appropriate RPA solutions for their characteristics. This is because there are no standards or guidelines for the evaluation criteria used for selecting solutions. To address this issue, this study aimed to develop evaluation criteria for RPA solution selection. Specifically, this study was designed to evaluate both strategy and technology, and this is the first paper to propose an evaluation index for RPA solution selection. The evaluation criteria derived in this study can be used as a checklist for the introduction of RPA. Additionally, this paper is expected to serve as both a theoretical and practical reference when revising national laws and systems related to software projects.
The remainder of this paper is organised as follows. Section 2 presents the preliminary research and related research associated with this study. Section 3 presents the detailed research procedure and methodology adopted in this study. Section 4 describes a detailed development process and evaluation criteria for RPA solution selection. Finally, Section 5 summarises the main conclusions of this study.

2. Preliminary Research

2.1. Screen Scraping

In the term RPA, ‘robotic’ does not refer to a physical robot, but to a ‘computer process’, in the sense that it replaces human cognitive work [14]. This implies that perception and behaviour are connected intelligently. Therefore, when introducing RPA, it is necessary to distinguish between the role of RPA and the role of employees within existing routinised processes.
Figure 1 compares pre- and post-RPA business processing for an ‘Order Details Processing Task’. Prior to applying RPA, an employee periodically logs in directly to the system. After confirming and verifying orders, the employee applies prices and discount rates that meet specific conditions, applies any additional discounts, and then charges the post-delivery price. However, after applying RPA, the employee only needs to perform the role of verifying order information based on contract terms and the other tasks are completed entirely by the RPA software.
The technology that enables this process is called screen scraping (also known as web scraping or web harvesting). This term refers to a technique for capturing and decoding text and bitmap data on a computer screen; it is used primarily in web environments to extract structured data from output data and convert it into a human-readable form [8]. Screen scraping allows users to specify the outline of a box around icons and labels [22], which then allows robots to identify and click areas that are not accessible through existing, limited pixel-based coded screen scraping [1]. A screen scraper communicates with the system as if it were an ordinary user, explores the system's user screen, and reads information [23]. Additionally, a screen scraper can serve as a component of a larger program outside the information system [23]. A one-time scraper retrieves all the information from an old information system and stores it in a new database, whereas a continuous scraper keeps the existing system active and retrieves information from the system screen whenever it is requested. Based on this principle, RPA identifies the patterns through which users perform tasks on existing legacy system screens. Developing automated task lists from the extracted patterns allows an RPA robot to repeat tasks automatically, directly in the graphical user interface (GUI).
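To make the continuous-scraper idea concrete, the following minimal Python sketch polls a simulated legacy screen buffer and extracts structured order fields with a pattern, which is the step an RPA robot performs before replaying actions in the GUI. The screen text, field pattern, and function names are hypothetical illustrations, not code from any RPA product.

```python
# Minimal sketch of a 'continuous scraper': poll the text of a legacy
# screen and extract structured fields with a pattern. The screen text,
# pattern, and function names are hypothetical illustrations.
import re
import time

ORDER_PATTERN = re.compile(r"ORDER\s+(?P<id>\d+)\s+PRICE\s+(?P<price>[\d.]+)")

def read_legacy_screen() -> str:
    """Stand-in for reading the character buffer of a legacy GUI screen;
    a real scraper would capture this through a screen-access API."""
    return "ORDER 1042 PRICE 199.90"

def scrape_once() -> dict[str, str] | None:
    """Decode one screen into a structured record, if the pattern matches."""
    match = ORDER_PATTERN.search(read_legacy_screen())
    return match.groupdict() if match else None

# Continuous scraping keeps the legacy system active and pulls order
# data from its screen whenever the robot needs it.
for _ in range(3):
    order = scrape_once()
    if order:
        print(f"Robot processes order {order['id']} at price {order['price']}")
    time.sleep(0.1)  # poll interval
```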

2.2. Comparative Studies on RPA Solutions

Kim [2] divided RPA solution functions into robots, robot managers, and script-editing tools, as shown in Figure 2, and compared various RPA solutions. Ribeiro et al. [4] compared RPA intelligence functions between six RPA solutions with high market shares by dividing them into AI-related goals, technologies, and algorithms, as shown in Figure 3.
The results for each major RPA vendor presented in The Forrester (Cambridge, Massachusetts) Wave Evaluation [24] were divided into solution functions and strategies. Scores are provided on a scale of weak (0) to strong (5). These data represent an evaluation of the top vendors in the RPA market and do not describe the entire vendor landscape. Each vendor’s position on the vertical axis of the graphic indicates the strength of its current offering [24]. The key criteria for these solutions include task and process discovery, portfolio analysis, bot design and development, deployment and management, security and governance, scaling experience, and architecture [24]. Placement on the horizontal axis indicates the strength of vendor strategies [24]. This represents the product vision and innovation roadmap, delivery and support models, financial performance, and partner ecosystem [24]. The functional analysis results are presented in Figure 4.

2.3. Studies on RPA Solution Evaluation Elements

The US Federal RPA Community of Practice [25] has proposed evaluation elements for each department in terms of technical capabilities, process management, and operations, as shown in Table 1.
Korea’s Software Policy & Research Institute (SPRI) [2] divides evaluation elements into technology, process operational impact, and risk management categories according to the automation and maturity of vendors. Specifically, elements can be divided into the introduction stage, technical architecture, technological policy, process strategy consistency, and operational management. The details of these items are provided in Table 2.
For processes, only review items that are applicable to the RPA solution selection evaluation criteria were extracted from [8], reconstructed, and included in the listed categories.
Major RPA vendors attended the RPA Introduction Guide Seminar [26] sponsored by the Korea Electronic Newspaper and announced evaluation criteria for RPA solution selection. Here, Chan Sik Bong from KPMG proposed the selection of a solution with sufficient references to prioritise enterprise-level introduction and stable construction when introducing an RPA. Sean Lee, who is the managing director of Automation Anywhere, explained the derivation and verification of non-functional requirements, including automation functional requirements, architectural requirements, and development convenience/operability/maintenance/security requirements. Gye Kwan Kim, who is the CEO of Grid One, opined that ‘Korea’s IT environment should include not only company businesses, similar cases, business areas, and investment efficiency (return on investment, ROI), but also the ability to perform tasks in non-standardised GUI environments such as ActiveX and Flash’. Myung Su Jo, who is the managing director of Deloitte, announced that the prime considerations for RPA solution selection should be application capabilities, technical compatibility, manufacturing capabilities, and pricing.

2.4. Business Structural Optimisation Studies on Improving RPA Operational Efficiency

Algorithms that minimise the number of robots required when introducing RPA [27] and algorithms that optimise area clustering and storage location allocation in process automation cloud systems are important considerations for operational efficiency. Strategically, new capabilities in terms of the average cost of automation, total investment cost, quality control, optimisation of management, and control of automation productivity have been presented as important factors for consideration [25]. Following in-depth experimentation on the RPA tools available in the market, the authors of [28] developed a classification framework for product categorisation, and a methodology was proposed for selecting target tasks for robotic process automation using human interface records and process mining techniques [29]. Additionally, the authors of [30] developed an application to automate data acquisition process management and control by applying an RPA implementation workflow.

3. Research Procedure and Methodology

The sequence and method of implementation adopted in this study are summarised in Figure 5.

3.1. Research Procedure

The first step in this study was to structure the collected evaluation items. To this end, the items that should be evaluated when selecting RPA solutions were comprehensively collected from existing resources, including literature reviews, press releases, and seminar videos. It was necessary to consider the efficiency and productivity of the introducing organisation and to be practical from both strategic and construction perspectives. I applied a solution-lifecycle-level approach progressing from user or organisational requirements to actual construction, management, and control tasks. Consequently, the initial introduction, functional, infrastructure, and vendor support aspects were evaluated comprehensively.
The second step was to derive evaluation criteria for RPA solution selection. To this end, I developed a draft RPA solution evaluation standard based on the detailed evaluation categories and evaluation items finalised from the structured evaluation items. The proposed RPA solution selection evaluation system consists of three layers: evaluation department (category), evaluation item, and evaluation criteria. Each layer is grouped by similarity, and the group names of the evaluation departments and evaluation items were defined by referencing existing resources [2,4,8,24,25,26].
The third step was to verify the RPA solution evaluation criteria. To this end, the Delphi survey method was used to verify the proposed evaluation criteria. The validity of the questionnaire responses was evaluated on a seven-point Likert scale, and the coefficient of variation (CV), content validity ratio (CVR), consensus degree (CSD), convergence degree (CGD), and conformity assessment (CA) were calculated from the results. If all validity measurements were satisfactory, the evaluation criteria for RPA solution selection were deemed final.
(1) Stability measured using the CV
The CV measures the dispersion of the response data and is used as the basis for determining agreement between panels [31,32]. It is the ratio of the standard deviation to the mean, as defined in Equation (1) [33]. Following Khorramshahgol and Moustakis [34], a CV below 0.5 was judged stable, a value of 0.5 to 0.8 relatively stable, and a CV above 0.8 was taken to require an additional questionnaire round.

$$\mathrm{CV} = \frac{SD\ (\text{Standard Deviation})}{Mean} \quad (1)$$
(2) CVR
The CVR is computed from the number of experts who rate an item as ‘essential’ out of the total number of experts [31], as shown in Equation (2). Lawshe [35,36] determined the effective minimum CVR value as a function of the number of experts. In this study, there were 11 experts, so a CVR of 0.59 or more was judged to satisfy the condition; concretely, at least 9 of the 11 panellists must rate an item as essential, since (9 − 5.5)/5.5 ≈ 0.64, whereas (8 − 5.5)/5.5 ≈ 0.45 falls below the threshold.

$$\mathrm{CVR} = \frac{n_r - \frac{N}{2}}{\frac{N}{2}} \quad (2)$$

Here, $n_r$ refers to the number of panel members indicating an item as ‘essential’ and $N$ refers to the total number of panel members.
(3) Consensus degree (CSD) and convergence degree (CGD)
To determine whether the panel is approaching agreement, the criteria presented by Delbecq et al. [37] were applied: the CSD is required to be at least 0.75 and the CGD less than 0.5, as defined in Equations (3) and (4).

$$\mathrm{CSD} = 1 - \frac{Q_3 - Q_1}{Median} \quad (3)$$

$$\mathrm{CGD} = \frac{Q_3 - Q_1}{2} \quad (4)$$

Here, Median is the median value, $Q_1$ is the first quartile (25th percentile), and $Q_3$ is the third quartile (75th percentile).
(4) Conformity assessment (CA)
The CA applies Equation (5) to the CV, CVR, CSD, and CGD values calculated using the equations above. As shown in Equation (5), an indicator is judged ‘conforming’ for the RPA solution selection evaluation criteria only when CV, CVR, CSD, and CGD all satisfy their thresholds.

$$\mathrm{CA} = (\mathrm{CV}(x) \le 0.5) \land (\mathrm{CVR}(x) \ge 0.99) \land (\mathrm{CSD}(x) \ge 0.75) \land (\mathrm{CGD}(x) \le 0.5) \quad (5)$$
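For concreteness, the following Python sketch computes the four measures and the conformity assessment of Equations (1) to (5) for a single indicator. It is a minimal sketch under two assumptions of mine: the sample standard deviation is used for the CV, and an ‘essential’ response is operationalised as a rating of 6 or 7 on the seven-point scale, which the equations themselves do not fix.

```python
# Minimal sketch of the Delphi validity measures in Equations (1)-(5).
# Assumptions: sample standard deviation for the CV; a rating >= 6 on
# the 7-point scale counts as 'essential' for the CVR.
from statistics import mean, median, stdev, quantiles

def delphi_measures(ratings: list[int]) -> dict[str, float | bool]:
    n = len(ratings)                          # N: total panel members
    n_r = sum(1 for r in ratings if r >= 6)   # n_r: 'essential' votes (assumed cut-off)
    q1, _, q3 = quantiles(ratings, n=4)       # first and third quartiles
    cv = stdev(ratings) / mean(ratings)                  # Equation (1)
    cvr = (n_r - n / 2) / (n / 2)                        # Equation (2)
    csd = 1 - (q3 - q1) / median(ratings)                # Equation (3)
    cgd = (q3 - q1) / 2                                  # Equation (4)
    ca = cv <= 0.5 and cvr >= 0.99 and csd >= 0.75 and cgd <= 0.5  # Equation (5)
    return {"CV": cv, "CVR": cvr, "CSD": csd, "CGD": cgd, "CA": ca}

# Example: 11 experts rating one indicator on the 7-point Likert scale.
print(delphi_measures([7, 7, 6, 7, 6, 7, 7, 6, 7, 7, 7]))
```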
The final step was to define the RPA solution evaluation criteria. The Delphi survey method verifies the evaluation criteria over successive rounds to determine the appropriate evaluation criteria for RPA solutions. Accordingly, anonymous experts were questioned again about the revised evaluation criteria after the opinions of the experts from the first round had been incorporated.

3.2. Research Methodology

In this study, the Delphi methodology was employed to develop indicators for the developed RPA breakdown structure and selection criteria with help from experts. The Delphi method refers to a ‘set of procedures to guide experts’ views on the issues they want to predict and summarise them into a comprehensive judgement’ [38]. This method can be used in scenarios where relevant research is insufficient or new evaluation standards are to be developed [17,31]. This method can also be used as a judgement, decision support, or prediction tool because Delphi surveys often help in understanding problems, opportunities, and solutions, as well as developing predictions [22].
In this study, the evaluation categories, evaluation items, and evaluation standards of the RPA solution derived from existing sources were rated on a seven-point Likert scale. Currently, there are no international standards or guidelines for RPA selection criteria; to the best of my knowledge, this is the first study to address this problem. The Delphi methodology is suitable in that expert opinions are considered as much as possible, deriving meaningful items by collecting informed ideas. The stability of the factors and criteria was measured using the CV, CVR, CSD, CGD, and CA, and the indicators were then filtered accordingly.
To apply the Delphi survey method, RPA consulting and construction experts were defined as people with at least three years of relevant work experience. Delbecq et al. [37] suggested that between 10 and 15 members is an appropriate size for a Delphi group. Therefore, this study used a panel of 11 experts.

4. Developing Evaluation Criteria for RPA Solution Selection

4.1. Structuring Selected Evaluation Items

A total of 87 major criteria were derived by collecting and structuring the candidate items for RPA evaluation criteria that an organisation should consider when selecting RPA solutions, as described in previous studies. The distribution of the collected criteria across sources is presented in Table 3.

4.2. Deriving Evaluation Criteria for RPA Solution Selection

The selection criteria were divided into evaluation categories, evaluation items, and evaluation criteria. The evaluation categories were formed by converging the contents collected from existing resources [2,4,8,24,25,26] as closely as possible, resulting in ‘introduction strategy’, ‘functionality’, ‘technical architecture’, and ‘operational management’. Regarding the evaluation criteria, the considerations and evaluation criteria collected from previous studies were rearranged according to their affinity. Next, similar standard names were merged into a single name with the same meaning, and duplicate items for the same standard were removed.
Finally, evaluation items were defined as comprehensive representations of the considerations contained in each group to form a final refined evaluation criteria group. In this process, I attempted to maintain the framework of the evaluation criteria groups by referring to the results of prior research [2,4,8,24,25,26].
Consequently, seven evaluation items were derived for the introduction strategy evaluation department: economic validity, supplier maintenance support, technological compatibility, security policies, real-time decision-making support, strategic compatibility, and process.
The functionality evaluation department maps robot management and operation, analysis/categorisation/prediction, automation, and process evaluation criteria. The technical architecture evaluation department derives security and architecture evaluation items and maps detailed evaluation criteria to each item. The operational management evaluation department derives operational and standardised asset management evaluation items and defines the key evaluation criteria for RPA operations. The final evaluation criteria for RPA solution selection are presented in Table 4. For convenience in composing the questionnaires and in preparing and analysing their results, the numbers in the ‘No.’ column index the evaluation items and criteria. For the same purpose, the evaluation department, evaluation item, and evaluation criteria layers are denoted by I, II, and III, respectively.
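To illustrate how the three-layer structure can be operationalised, the sketch below encodes a fragment of Table 4 as nested Python dicts and flattens it into a checklist. The abbreviated item and criterion names are mine, and the fragment is illustrative rather than the complete table.

```python
# Hypothetical fragment of the three-layer breakdown structure
# (I. category -> II. evaluation item -> III. evaluation criteria),
# abbreviated from Table 4 for illustration.
rpa_breakdown = {
    "1. Introduction strategy": {
        "1. Economic validity": ["Price", "Cost elements", "ROI", "Investment efficiency"],
        "3. Technological compatibility": ["HW/SW requirements", "Development convenience"],
    },
    "2. Functionality": {
        "8. Robot management and operations": ["Dashboard capability", "Exception management"],
    },
}

def checklist(breakdown: dict) -> list[str]:
    """Flatten the hierarchy into 'category / item / criterion' lines."""
    return [
        f"{category} / {item} / {criterion}"
        for category, items in breakdown.items()
        for item, criteria in items.items()
        for criterion in criteria
    ]

for line in checklist(rpa_breakdown):
    print(line)
```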

4.3. RPA Solution Evaluation Criteria Verification

To verify the RPA solution evaluation criteria, experts with more than three years of experience in RPA construction and operation in Korea were invited. These experts comprised an RPA service supply group (RPA solution vendors, builders, and consultants) and a customer group that introduces and operates RPA services. Detailed information regarding the final participating experts is provided in Table 5.
A total of three Delphi surveys were conducted to verify the criteria for RPA solution selection. The questionnaire comprised 34 items in the first round and 27 evaluation indicators in the second round. The initially developed evaluation departments, evaluation items, and evaluation criteria were revised to reflect the results of the validity measurements and the experts’ suggestions for the evaluation indicators.
Validation was performed three times and the main contents of each step of verification are summarised below.
First Verification Overview: The validity of the evaluation criteria listed in Table 4 was verified for each I. evaluation category, II. evaluation item, and III. evaluation criterion. After deriving indicators satisfying the conditions of CV ≤ 0.7, CVR ≥ 0.59, CSD ≥ 0.75, and CGD ≤ 0.5, six indicators were identified as appropriate, as shown in Table 6. The column headings ⓐ, ⓑ, ⓒ, and ⓓ in Table 6 indicate the conformity of each value, where ‘0’ represents ‘suitable’ and ‘1’ represents ‘unsuitable’.
On careful review, the evaluation categories as a whole, together with the ‘Security’ and ‘Architecture’ evaluation items of the ‘Technical Architecture’ category, proved to be the most stable evaluation standards.
Next, the opinions from the first round of expert evaluation were incorporated. The RPA solution selection evaluation index was improved by reflecting the experts’ ‘proposals of evaluation criteria’ for each questionnaire item. As a result, the evaluation department names were revised: ‘introduction strategy’ became ‘customer introduction strategy’, ‘functionality’ became ‘development and operability’, and ‘operation management’ became ‘operation management system’. Regarding the evaluation items, the real-time decision support and strategic compatibility items of the introduction strategy department and the analysis/classification/prediction and process evaluation criteria of the functional department were rearranged.
Evaluation criteria for AI technology collaboration and expansion were added to the functional evaluation department, absorbing the real-time decision support and analysis/categorisation/prediction criteria from the introduction strategy department. The names of the evaluation items were revised as necessary to accommodate the revised evaluation criteria.
Consequently, the introduction strategy evaluation department revised its evaluation item names by adding ‘solution supplier capacity’ and changing ‘technical integrity’ to ‘technical policy integrity’, ‘security policy’ to ‘security policy conformity’, and ‘process’ to ‘methodology’.
In the functional evaluation department, the evaluation item names were revised by adding ‘robot management and operability’ and changing ‘automation’ to ‘automation process development ease’. In the management evaluation department, ‘management’ was changed to ‘management policy’ and ‘standardised asset management’ to ‘information asset management policy’. The criteria for selecting RPA solutions that reflect the results of the validity evaluation and the expert questionnaires are summarised in Table 7. In this table, ‘N1’ represents drafts developed via literature research and ‘N2’ represents revised drafts; ‘N3’ represents deleted drafts and ‘N4’ represents movement between evaluation items. Many of the items marked with ‘O’ in the validity evaluation were also corrected.
To the best of my knowledge, this was the first expert questionnaire on RPA solution selection criteria, so the stricter conditions of CV ≤ 0.5 and CVR ≥ 0.99 were applied in consideration of the large number of relocations among evaluation items and criteria and to incorporate expert opinions actively. To improve the accuracy of the study, the results in Table 6 were not used in their initial form; final amendments were applied against this more stringent standard. To this end, the result for I.1 was adopted from the first Delphi survey. Additionally, the four evaluation items II.5, II.6, II.9, and II.11 were deleted, and the evaluation item II.16 was added. The evaluation criteria III.5, III.6, III.9, and III.11 belonging to the four deleted evaluation items were deleted or rearranged to form other evaluation items.
Second Verification Overview: For the second-round questionnaire, the indicators to be re-evaluated were selected from the first-round results as those not yet satisfying the conditions CV ≤ 0.5 and CVR ≥ 0.99; all conforming indicators, deleted evaluation items, and deleted evaluation criteria were excluded. The second questionnaire provided the experts with the mean, standard deviation, stability, validity, consensus, convergence, and final judgement values from the first round so that they could easily revise their responses. A total of 27 questionnaire items were presented. Considering that the evaluation departments, items, and criteria already reflected the opinions collected in the first round, stability was judged by CV ≤ 0.75 and validity by CVR ≥ 0.59 (p = 0.05) (Table 8).
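A sketch of how such a per-item feedback row could be assembled is shown below; it is a hypothetical illustration that assumes the round-2 thresholds quoted above (CV ≤ 0.75 for stability, CVR ≥ 0.59 for validity) and, again, an ‘essential’ rating of 6 or 7.

```python
# Hypothetical sketch of the per-item feedback row (mean, standard
# deviation, stability, validity, consensus, convergence, and final
# judgement) reported back to experts in the second round.
from statistics import mean, median, stdev, quantiles

def round2_row(item_no: str, ratings: list[int]) -> dict:
    n = len(ratings)
    q1, _, q3 = quantiles(ratings, n=4)
    cv = stdev(ratings) / mean(ratings)                     # stability
    cvr = (sum(r >= 6 for r in ratings) - n / 2) / (n / 2)  # validity ('essential' >= 6 assumed)
    csd = 1 - (q3 - q1) / median(ratings)                   # consensus
    cgd = (q3 - q1) / 2                                     # convergence
    row = {
        "item": item_no,
        "mean": round(mean(ratings), 2),
        "sd": round(stdev(ratings), 2),
        "stable": cv <= 0.75,
        "valid": cvr >= 0.59,
        "consensus": csd >= 0.75,
        "convergent": cgd <= 0.5,
    }
    row["final"] = all(row[k] for k in ("stable", "valid", "consensus", "convergent"))
    return row

print(round2_row("II.8", [7, 6, 7, 7, 6, 7, 7, 7, 6, 7, 7]))
```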
The results of the second Delphi questionnaire yielded 11 appropriately fitting items. A detailed examination of each indicator identified ‘I.2. Development and operability’ and ‘I.3. Technical architecture’ as appropriate evaluation categories. Among the evaluation items, ‘II.2. Solution supplier capabilities’, ‘II.3. Technical policy consistency’, ‘II.8. Robot management and operability’, and ‘II.13. Architecture’ were identified as appropriate. Among the evaluation criteria, III.1, III.3, III.8, III.10, and III.13 were identified as conforming.
Next, the opinions from the second round of expert evaluation were reflected. The RPA solution selection evaluation index was refined again based on the ‘evaluation standard opinion proposals’ presented in the second questionnaire. No additional opinions were expressed regarding the evaluation categories. Regarding the evaluation items, it was suggested that development and evaluation methods are necessary among the items considered by the customer introduction strategy department. Accordingly, ‘methodology’ was revised to ‘discovery and appropriateness evaluation of automation work objects’, and ‘automation process development and evaluation methodology’ was added to the evaluation criteria. Additionally, in the development and operability department, ‘methodology’ was revised to ‘automation process development and convenience’ because ‘ease of automation process development’ did not include evaluation criteria. It was also noted that the ‘operation policy’ and ‘information asset management policy’ evaluation items of the management system department were ambiguously differentiated; because ‘operation policy’ emphasises that RPA falls under information service operation management policy, it was changed to ‘individual information service operation system’, and ‘information asset management policy’ was changed to ‘company-wide information asset management system’. Other changes, including changes to the evaluation criteria, are presented in Table 9. The contents of this table are described in the form of ‘Evaluation Item Number: Evaluation Criteria Elements’. For example, ‘4: Consistency with customer security architecture’ indicates that the criterion ‘consistency with customer security architecture’ was added to item ‘II.4 Security policy consistency assessment’.
Third Verification Overview: To develop a questionnaire for the third round of verification, only 16 indicators that were not selected in the results of the second round of verification were included. The final adoption criteria were indicators satisfying Equation (5), and the results are presented in Table 10.
Of the 16 included indicators, 12 were confirmed to be valid and four were found to be inappropriate. The CVRs of evaluation items II.7 and II.15 were −0.09 and 0.45, respectively. The CVR of the III.7 evaluation criteria was −0.09 and that of the III.15 evaluation criteria was −0.27. Although it failed to satisfy the CVR requirement, evaluation item II.15 satisfied the requirements for stability, agreement, and convergence; in other words, the experts consistently judged the associated evaluation criteria to be inappropriate. Ultimately, evaluation items II.7 and II.15 and evaluation criteria III.7 and III.15 were eliminated.
Next, the opinions from the third expert survey were incorporated. First, reflecting the expert opinions from the first and third surveys, the III.7 evaluation criteria were absorbed into the ‘II.10 automation process development and convenience’ evaluation item. The criteria that had been derived from the initial literature search in terms of departmental and enterprise unit objectives, namely calculations [8] and process engineering and evaluation [25], were deleted. The III.15 evaluation criteria were likewise removed: III.15 defined criteria corresponding to the company-wide operational management system to be observed during the operation of the ‘III.14 individual automated processes’. The III.15 criteria were therefore reworked into expressions suitable for individual automated operating systems, such as code sharing [8], RPA lifecycle management [8], licence management [8], common module standardisation, and product repository management.
The evaluation item name in II.12 was revised to ‘security management’ following the suggestion that this would strengthen the ‘consistency of security policies’ and the ‘differentiation of the introduction strategy’. The evaluation criterion of ‘character recognition ability to handle the specificity of native languages without exception [2,26], OCR’ was revised to ‘OCR (printed), OCR (written)’. The results of the other detailed opinions are presented in Table 11.

4.4. Final Validated RPA Solution Evaluation Criteria

The RPA solution evaluation criteria were divided into four evaluation departments and 10 evaluation items, each of which contains several evaluation criteria. The customer introduction strategy department comprises the ‘economic feasibility’, ‘solution supplier capabilities’, ‘technical policy consistency’, and ‘security policy consistency’ evaluation items. In the development and operability department, ‘robot management and operability’, ‘automation process development and convenience’, ‘AI technology connection and extension’, ‘security management’, and ‘architecture evaluation’ were defined. The final RPA solution selection evaluation criteria for each evaluation item are presented in Table 12. In this table, the evaluation item numbers for the evaluation categories defined in this study and the flow of results from the previous surveys are presented to help readers follow the development of the evaluation criteria.
Through the survey process, some of the criteria derived from our literature research were deleted or revised and various additional opinions were incorporated based on expert knowledge and experience.
In the evaluation of customer deployment strategies, cost items were divided into solution prices, deployment costs, licensing costs, operating costs, etc. In the evaluation of technical policy consistency, an automated process development and evaluation methodology was added, and conformity with the customer’s security architecture was added as a standard under security policy conformity.
In the development and operability evaluation department, the ‘Business Operations Analysis Function [25]’ was modified into the more intuitive and concrete ‘Robot Operational Status Summary Function [25]’. The evaluation items for automated process development and convenience were revised to provide a better understanding of the existing indicators, including customer performance procedures (manual, automated) and RPA suppliers. For the AI technology connection and extension evaluation, AI technologies and their applications were included.
In the technical architecture evaluation department, ‘virtualisation support using VM/container technology’ was modified and indicators such as ‘client versus supplier’ were added.
Finally, the operations management system evaluation department added indicators such as ‘script code shape and change management’, ‘bot management and operation data policy’, ‘supplier technical support system’, ‘operation model’, ‘operation calculation’, and ‘operation standardisation’.

4.5. Implications

RPA is applied to specific business tasks and is accompanied by robot operations. Although RPA is a software tool, the technical evaluation criteria used in general software construction cannot be applied to it directly. Therefore, organisations that wish to introduce RPA must establish appropriate criteria for selecting solutions. Standardising the selection of appropriate applications for RPA minimises the time and effort required to modify and standardise subsequent maintenance and operational processes to match the chosen solution. Moreover, because RPA can be adopted easily even without IT specialisation, so-called shadow IT may increase. Therefore, RPA management should also be considered at the enterprise architecture standardisation and integration level within an enterprise-wide resource management system.
One expert participating in the Delphi surveys suggested that organisational consideration should be given, within the RPA solution evaluation criteria, to enterprise and agency missions and objectives [8], leadership priorities, and strategies. Another vendor expert stated that ‘RPA’s information management and business data management are often independent of customer companies, and the involvement of suppliers is limited, so it is necessary to manage data standards’. This company-wide issue corresponds to one of the evaluation items of the operating management system department defined in this study and is the main reason why this department and its evaluation items were retained.
AI technology collaboration and scalability evaluation items were established in terms of development and operability, which is consistent with the current trend of selecting RPA solutions starting with the introduction of AI. In particular, ‘robot management and operation’ and ‘automation process development and operation’, which are not typically considered in the software field, are emerging as unique elements compared to other evaluation criteria.
Even if RPA is introduced based on these solution selection evaluation criteria, further efforts as a company are essential to recognise and utilise RPA in an organisation in the early stages of RPA development. Because RPA aims to automate repetitive business processes that have been standardised by companies, it is necessary to change the structure of an organisation to one that can further enhance and add value to existing human resources. Furthermore, the efficiency of operations achieved through RPA should be linked to the performance evaluation of individuals and their organisations, and the results should be shared as best practices at the company level to induce the spread of operational efficiency. Vendors and designers of RPA solutions should strive not only to promote companies that wish to introduce RPA, but also to form active partnerships that can promote the development of solutions that suit partners. Additionally, RPA lacks a consistent vocabulary. Therefore, a vendor-independent conceptualisation of RPA relationships between vocabularies is required [39].

4.6. Potential Threats to the Validity of this Study

The evaluation criteria used for selecting RPA solutions should be carefully selected according to their level of importance and the characteristics of the target industry group. Recently, several studies on various applications of RPA [40,41,42,43,44,45,46] have been published. These emphasise that modelling improvement through implementations, adaptations, changes, and tracking that meet the needs of the target business environment should be prioritised. Otherwise, the productivity paradox [47], in which productivity decreases as IT investment (including investment in RPA solutions) increases, may appear.
The current practice for developing RPA is to observe how routines are executed and then implement the executable RPA scripts necessary to automate those routines with software robots, based on evaluations by skilled human experts [48]. However, process optimisation through intelligent automation that can interpret the UI logs of routine executions and support changes to automation routines for intermediate inputs is still at the research stage. Nevertheless, some studies [49,50] have indicated the emergence of new IT trends that will pave the way for new methods of achieving sustainability, which are noteworthy for RPA adoption and selection. This implies that, when introducing RPA, one should not overlook the fact that RPA is continuing to develop and evolve. Therefore, to minimise potential risks and threats when investing in and constructing IT, including RPA, it must be clarified whether the goal of introducing RPA is simply automation, or process integration, intelligent automation, and autonomous intelligent work that enables decision making.

5. Conclusions

In recent years, RPA has been rapidly adopted by commercial organisations to automate repetitive business processes [19,20]. However, with various RPA solutions available on the market, it is difficult for companies to select RPA solutions that suit their business characteristics and processes, and no formal evaluation criteria for RPA solution selection have been developed to alleviate this issue.
In this study, I developed evaluation indicators that can be used to select an optimal RPA solution for a specific enterprise. Based on a literature review, the evaluation indicators were subdivided into evaluation departments, evaluation items, and evaluation criteria, and organised hierarchically. Eleven experts rated the validity of the derived evaluation indicators through three Delphi surveys. As a result, ten evaluation items were assigned to four evaluation departments and the evaluation criteria to be considered for each item were presented in detail. The customer deployment strategy department focuses on the ‘economic feasibility’, ‘solution supplier capabilities’, ‘technical policy consistency’, and ‘security policy consistency’ evaluation items. The development and operability department considers the ‘robot management and operability’, ‘automation process development and convenience’, and ‘AI technology collaboration and extension’ evaluation items.
The technical architecture department considers the ‘security management’ and ‘architecture’ evaluation items, and the operations management system department considers the ‘individual automated process operating system’ evaluation items.
This study is of great significance for the development of evaluation indicators for RPA solution selection. Additionally, the evaluation criteria for each evaluation item presented in the developed evaluation index can be used as a checklist when applied in practice. This should allow organisations that are introducing RPA and those who lack an understanding of RPA to select RPA solutions that are optimised for enterprise and business characteristics. Finally, the presented evaluation standard can provide a theoretical reference for revising technical evaluation laws and regulations related to national software projects such as Korea’s software technology evaluation standard.
Nevertheless, because this study did not consider the selection of RPA solutions for a specific company, the feasibility of the derived RPA solution evaluation criteria must be verified through additional studies. As a follow-up, I intend to conduct further research on weight calculation for each indicator so that the work characteristics of each company are reflected at the optimal level.

Funding

This research received no external funding.

Data Availability Statement

The data supporting the reported results are available upon request.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Madakam, S.; Holmukhe, R.M.; Kumar, J.D.K. The future digital work force: Robotic process automation (RPA). JISTEM-J. Inf. Syst. Technol. Manag. 2019, 16, 1–17.
  2. Kim, K.B. A study of convergence technology in robotic process automation for task automation. J. Converg. Inf. Technol. (JCIT) 2019, 9, 8–13.
  3. Dossier: The Choice of Leading Companies RPA, How to Choose It Well, and Use It Well. IBM Korea. Available online: https://www.ibm.com/downloads/cas/RRX5GWY1 (accessed on 6 April 2020).
  4. Ribeiro, J.; Lima, R.; Eckhardt, T.; Paiva, S. Robotic process automation and artificial intelligence in industry 4.0-A literature review. Procedia Comput. Sci. 2021, 181, 51–58.
  5. van der Aalst, W.M.P.; Bichler, M.; Heinzl, A. Robotic process automation. Bus. Inf. Syst. Eng. 2018, 60, 269–272.
  6. Aguirre, S.; Rodriguez, A. Automation of a business process using robotic process automation (RPA): A case study. Commun. Comput. Inf. Sci. 2017, 742, 65–71.
  7. Pramod, D. Robotic process automation for industry: Adoption status, benefits, challenges and research agenda. Benchmarking Int. J. 2021, 29, 1562–1586.
  8. Kang, S.H.; Lee, H.S.; Ryu, H.S. The Catalysts for Digital Transformation, Low·No-Code and RPA; Issue Report IS-117; Software Policy & Research Institute. Available online: www.spri.kr (accessed on 29 June 2021).
  9. IEEE. Available online: http://standards.ieee.org/standard/2755-2017.html (accessed on 2 April 2021).
  10. Hyun, Y.; Lee, D.; Chae, U.; Ko, J.; Lee, J. Improvement of business productivity by applying robotic process automation. Appl. Sci. 2021, 11, 10656.
  11. Sarilo-Kankaanranta, H.; Frank, L. The continued innovation-decision process-A case study of continued adoption of robotic process automation. In Proceedings of the European, Mediterranean, and Middle Eastern Conference on Information Systems, Virtual Event, 8–9 December 2021; Springer: Cham, Switzerland, 2022; pp. 737–755.
  12. Wewerka, J.; Reichert, M. Robotic process automation: A systematic literature review and assessment framework. arXiv 2020, arXiv:2012.11951.
  13. Marciniak, P.; Stanisławski, R. Internal determinants in the field of RPA technology implementation on the example of selected companies in the context of Industry 4.0 assumptions. Information 2021, 12, 222.
  14. Hyen, Y.G.; Lee, J.Y. Trends analysis and future direction of business process automation, RPA (robotic process automation) in the times of convergence. J. Digit. Converg. 2018, 16, 313–327.
  15. Asatiani, A.; Penttinen, E. Turning robotic process automation into commercial success-Case OpusCapita. J. Inf. Technol. Teach. Cases 2016, 6, 67–74.
  16. George, A.; Ali, M.; Papakostas, N. Utilising robotic process automation technologies for streamlining the additive manufacturing design workflow. CIRP Ann. 2021, 70, 119–122.
  17. Lee, T.-L.; Chuang, M.-C. Foresight for public policy of solar energy industry in Taiwan: An application of Delphi method and Q methodology. In Proceedings of PICMET’12: Technology Management for Emerging Technologies, Vancouver, BC, Canada, 29 July–2 August 2012; IEEE: New York, NY, USA, 2012.
  18. Yoon, S.; Roh, J.; Lee, J. Innovation resistance, satisfaction and performance: Case of robotic process automation. J. Digit. Converg. 2021, 19, 129–138.
  19. Gartner. Top 10 Trends in PaaS and Platform Innovation. Available online: https://discover.opscompass.com/en/top-10-trends-in-paas-and-platform-innovation-2020 (accessed on 10 October 2021).
  20. McKinsey. The State of AI in 2020. Available online: https://www.stateof.ai/ (accessed on 10 October 2021).
  21. Schatsky, D.; Muraskin, C.; Iyengar, K. Robotic Process Automation: A Path to the Cognitive Enterprise; Deloitte Consulting: New York, NY, USA, 2016.
  22. Skulmoski, G.J.; Hartman, F.T.; Krahn, J. The Delphi method for graduate research. J. Inf. Technol. Educ. Res. 2021, 6, 1.
  23. van Oostenrijk, A. Screen Scraping Web Services; Radboud University of Nijmegen, Department of Computer Science: Nijmegen, The Netherlands, 2004.
  24. Schaffrik, B. The Forrester Wave: Robotic Process Automation, Q1 2021; Forrester Research. Available online: start.uipath.com/rs/995-XLT-886/images/161538_print_DE.PDF (accessed on 10 October 2021).
  25. U.S. Federal RPA Community of Practice. RPA Program Playbook. 2020. Available online: https://www.fedscoop.com/rpa-cop-first-playbook/ (accessed on 3 September 2021).
  26. Etnews. RPA Introduction Guide Seminar. Available online: https://m.etnews.com/20190226000165?obj=Tzo4OiJzdGRDbGFzcyI6Mjp7czo3OiJyZWZlcmVyIjtOO3M6NzoiZm9yd2FyZCI7czoxMzoid2ViIHRvIG1vYmlsZSI7fQ%3D%3D (accessed on 10 February 2021).
  27. Séguin, S.; Tremblay, H.; Benkalaï, I.; Perron-Chouinard, D.; Lebeuf, X. Minimizing the number of robots required for a robotic process automation (RPA) problem. Procedia Comput. Sci. 2021, 192, 2689–2698.
  28. Agostinelli, S.; Marrella, A.; Mecella, M. Research challenges for intelligent robotic process automation. In Proceedings of the International Conference on Business Process Management, Vienna, Austria, 1–6 September 2019; Springer: Cham, Switzerland, 2019; pp. 12–18.
  29. Choi, D.; R’bigui, H.; Cho, C. Candidate digital tasks selection methodology for automation with robotic process automation. Sustainability 2021, 13, 8980.
  30. Atencio, E.; Komarizadehasl, S.; Lozano-Galant, J.A.; Aguilera, M. Using RPA for performance monitoring of dynamic SHM applications. Buildings 2022, 12, 1140.
  31. Kim, S.H. Development of satisfaction evaluation items for degree-linked high skills Meister courses using the Delphi method. J. Inst. Internet Broadcast. Commun. 2020, 20, 163–173.
  32. Mitchell, V.-W.; McGoldrick, P.J. The role of geodemographics in segmenting and targeting consumer markets. Eur. J. Mark. 1994, 28, 54–72.
  33. Na, Y.-S.; Kim, H.-B. A study of developing educational training program for flight attendants using the Delphi technique. J. Tourism Sci. Soc. Korea 2011, 35, 465–488.
  34. Khorramshahgol, R.; Moustakis, V.S. Delphic hierarchy process (DHP): A methodology for priority setting derived from the Delphi method and analytical hierarchy process. Eur. J. Oper. Res. 1988, 37, 347–354.
  35. Ayre, C.; Scally, A.J. Critical values for Lawshe’s content validity ratio: Revisiting the original methods of calculation. Meas. Eval. Couns. Dev. 2014, 47, 79–86.
  36. Lawshe, C.H. A quantitative approach to content validity. Pers. Psychol. 1975, 28, 563–575.
  37. Delbecq, A.L.; Van de Ven, A.H.; Gustafson, D.H. Group Techniques for Program Planning: A Guide to Nominal Group and Delphi Processes; Scott, Foresman: Glenview, IL, USA, 1975.
  38. Murry, J.W., Jr.; Hammons, J.O. Delphi: A versatile methodology for conducting qualitative research. Rev. Higher Educ. 1995, 18, 423–436.
  39. Völker, M.; Weske, M. Conceptualizing bots in robotic process automation. In Proceedings of the International Conference on Conceptual Modeling, Virtual, 18–21 October 2021; Springer: Cham, Switzerland, 2021; pp. 3–13.
  40. Banta, V.C. Application of RPA solutions near ERP systems in the business environment related to the production area. A case study. Ann. Univ. Craiova Econ. Sci. Ser. 2020, 1, 17–24.
  41. Banta, V.C. The current opportunities offered by AI and RPA near to the ERP solution-Proposed economic models and processes, inside production area. A case study. Ann. ‘Constantin Brancusi’ Univ. Targu-Jiu Econ. Ser. 2022, 1, 159–164.
  42. Banta, V.C. The impact of the implementation of AI and RPA type solutions in the area related to forecast and sequencing in the production area using SAP. A case study. Ann. Univ. Craiova Econ. Sci. Ser. 2020, 2, 121–126.
  43. Banta, V.C.; Turcan, C.D.; Babeanu, A. The impact of the audit activity, using AI, RPA and ML in the activity of creating the delivery list and the production plan in case of a production range. A case study. Ann. Univ. Craiova Econ. Sci. Ser. 2022, 1, 98–104.
  44. Hsiung, H.H.; Wang, J.L. Research on the introduction of a robotic process automation (RPA) system in small accounting firms in Taiwan. Economies 2022, 10, 200.
  45. E-Fatima, K.; Khandan, R.; Hosseinian-Far, A.; Sarwar, D.; Ahmed, H.F. Adoption and influence of robotic process automation in beef supply chains. Logistics 2022, 6, 48.
  46. Sobczak, A.; Ziora, L. The use of robotic process automation (RPA) as an element of smart city implementation: A case study of electricity billing document management at Bydgoszcz City Hall. Energies 2021, 14, 5191.
  47. Jaiwani, M.; Gopalkrishnan, S. Adoption of RPA and AI to enhance the productivity of employees and overall efficiency of Indian private banks: An inquiry. In Proceedings of the 2022 International Seminar on Application for Technology of Information and Communication (iSemantic), Semarang, Indonesia, 17–18 September 2022; IEEE: New York, NY, USA, 2022; pp. 191–197.
  48. Agostinelli, S.; Lupia, M.; Marrella, A.; Mecella, M. Reactive synthesis of software robots in RPA from user interface logs. Comput. Ind. 2022, 142, 103721.
  49. Vijai, C.; Suriyalakshmi, S.M.; Elayaraja, M. The future of robotic process automation (RPA) in the banking sector for better customer experience. Shanlax Int. J. Commer. 2020, 8, 61–65.
  50. Vinoth, S. Artificial intelligence and transformation to the digital age in Indian banking industry-A case study. Artif. Intell. 2022, 13, 689–695.
Figure 1. Differences between manual and robotic processes. Adaptation based on Refs. [14,21].
Figure 2. Comparison of RPA solution functions. Adaptation based on Ref. [2].
Figure 3. Comparison of technologies and goals associated with AI. Adaptation based on Ref. [4].
Figure 4. RPA scorecard (Q1 2021). Adaptation based on Ref. [24].
Figure 5. Overview of research process and methodology.
Table 1. Evaluation factors by dimensional classification of RPA product functions according to the US Federal RPA Community of Practice. Adaptation based on Ref. [25].
Dimensional Classification: Evaluation Factors
- Process: Workflow management; process recording and reproduction; self-study capability; usability; process engineering and evaluation
- Automation: Visual creation tool; instruction library; full/partial automation capability; component sharing; test/debug control method; usability
- Management and Operation: Centralised deployment, management, and scheduling; licensing structure; scalability, availability, and performance management; exception management; dashboard capability; business and operational analysis capability
Table 2. Key considerations for RPA evaluation according to Korea’s SPRI. Adaptation based on Ref. [8].
Classification: Major Consideration Items

Introduction stage
- Cost: Model deployment, licensing, and operating costs
- Usability: System interaction and integration
- Technical aspect: OS/hardware requirements, and RPA deployment and operational capabilities
- Vendor support: Support capabilities, education, customer service, contracts, etc.

Technical architecture establishment
- Vendor experience: Pre-evaluated market awareness, existing performance in the same field, customer cases, terms and conditions, and considerations
- Product function: Applies mutatis mutandis [25]
- Security: Legal compliance, account and personal identification management, risk/security assessment, authentication, data encryption/protection, process tracking
- Architecture: Review hardware/software requirements and virtual server design, multi-tenancy, on-premise/cloud, permissions, availability/disaster recovery capabilities, network capacity/performance management capabilities

Technological policy
- Security policy: Develop risk management strategies, authority management strategies, and code control strategies after evaluating the risks involved in the scope of implementation
- Account/personal identification management: Service/network and system/application-level access management required
- Privacy protection: Understand system/application lines, capabilities/functionality, and user reviews, and interact with related systems to determine whether current security policies or use cases violate privacy policies depending on the type of data the system handles

Process
- Technological compatibility [25]: Consistency of RPA program usage and objectives; appropriateness of RPA service distribution/operation model; RPA program technical policy/architecture consistency
- Strategic compatibility [25]: Corporate and institutional missions and objectives; leadership priorities, strategies, and initiatives; calculations corresponding to departmental and corporate objectives
- Effect [25]: Compliance/audit functions

Operation and management
- Operation: Operational management, change management, automation interruption response, code sharing, automation scheduling, etc.
- Standardised asset management: Licence management, technical policy updating, RPA lifecycle management
Table 3. Numbers of key criteria for the evaluation of RPA selection collected through preliminary research.

| Ref. | Kang et al. [8] | U.S. [25] | Kim [2] | Ribeiro et al. [4] | Lu et al. [24] | Etnews [26] |
| --- | --- | --- | --- | --- | --- | --- |
| Number of key criteria | 20 | 17 | 9 | 13 | 10 | 18 |
Table 4. Derived results for evaluation criteria for RPA solution selection.

| I. Category | No. | II. Evaluation Items | III. Evaluation Criteria |
| --- | --- | --- | --- |
| 1. Introduction strategy | 1 | Economic validity | Price [26], cost elements (deployment, licensing, and operating expenses) [8], ROI [24], investment efficiency [26] |
| | 2 | Supplier maintenance support | Reference customer case holder [8,26], solution provider capability [26], product vision [24], dealer market awareness [8], existing results in the same field [8], terms and conditions [8], operability [26], serviceability [26], product and service support [24], support capabilities, education and customer service [8] |
| | 3 | Technological compatibility | Hardware/software requirements [8], development convenience [26], technical elements (OS requirements and technology, and capabilities required for RPA deployment and operation) [8], technical compatibility [26], performance [24], system interaction and integration [8] |
| | 4 | Security policies | Personal information protection (consideration of personal information protection policies according to system/application lines, capabilities/functionality and user reviews, and interactive system data types) [8], account/personal identification management [8] |
| | 5 | Real-time decision-making support | Classification [4], cognition [4], information extraction [4], optimisation [4] |
| | 6 | Strategic compatibility | Portfolio [24], revolutionary roadmap [24], risk management strategies, corporate and institutional missions and objectives [8], and leadership priorities, strategies, and initiatives according to risk analysis evaluation [8] |
| | 7 | Process | Iterative and regular process identification/discovery [8], bot idea [24], delivery model [24], company business characteristics and business areas [26], calculations matching departmental and company-wide objectives [8] |
| 2. Functionality | 8 | Robot management and operations | Bot platform model, security [24], availability [25], quality analysis of business performance (graph of business performance provided) [2], management and analysis [24], dashboard capability [25], licensing structure [25], robot management functions [2], business performance management [25], exception management |
| | 9 | Analysis/categorising/predicting | Artificial neural network (ANN) [4], neuro-linguistic programming [4], decision tree [4], recommendation system [4], computer vision cognition [4], text mining [4], statistical technique [4], fuzzy logic [4], fuzzy matching [4] |
| | 10 | Automation | Excel and SAP (ERP solution) API support [2], instruction library [25], security enhancement site response [26], security character recognition [26], information security [26], bot development [24], bot design and development [24], atypical GUI infrastructure program automation (X-Internet, Active X, Flash) [2], task performance ability in standardised GUI environment [26], Hangul character recognition ability [26], Hangul character recognition [2] |
| | 11 | Process | Technology (RPA service distribution and operational model competency) [8], technology (RPA program technology policy/architecture compatibility) [8], usability [25], ease of deriving architectural requirements [26], application functions [26], workflow [25], self-learning capabilities [25], process recording and reproduction [25], process engineering and evaluation [25] |
| 3. Technical architecture | 12 | Security | Compliance with legal systems such as personal information protection [8], account and personal identification management [8], data encryption/protection [8], application security [8], risk/security evaluation [8], authentication [8], process traceability [8] |
| | 13 | Architecture | On-premise/cloud [8], virtualisation server design [8], availability/disaster recovery capabilities [8], permissions [8], network capacity [8], multi-tenancy [8], performance management capabilities review [8] |
| 4. Operational management | 14 | Operation | Change management [2], operation management [2], automation scheduling [8], automation interruption accident response [8] |
| | 15 | Standardised asset management | Code sharing method [8], technical policy update, RPA lifecycle management [8], licence management [8] |
Table 5. Information regarding experts who participated in this study. Columns ⓐ–ⓒ concern RPA construction (supplier group); columns ⓓ–ⓖ concern RPA operations (client group).

| Experts | ⓐ | ⓑ | ⓒ | ⓓ | ⓔ | ⓕ | ⓖ |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 20 | AO, AA, BR | 5 | - | - | - | - |
| 2 | 5 | BR | 3 | BR | 10 | 3 | Finance, manufacturing, MIS, IT |
| 3 | - | - | - | AO | 1 | 3 | Public |
| 4 | - | - | - | AA | 20 | 4 | MIS, manufacturing, logistics, R&D |
| 5 | 10 | BR | 3 | - | - | - | - |
| 6 | 10 | AA, BP, UP, AO | 5 | - | - | - | - |
| 7 | 7 | AA, AW | 4 | AA, AW | 250 | 3 | FCM, HR, SCM, CRM, MFG |
| 8 | 10 | AO | 3 | - | - | - | - |
| 9 | 20 | UP | 3 | - | - | - | - |
| 10 | 15 | AO | 4 | - | - | - | - |
| 11 | 13 | UP | 4 | UP | 20 | 3 | Aviation, production, pharmaceutical, retail |

1 Solution: AO (Automate One), AA (Automation Anywhere), BR (Brity RPA), BP (Blue Prism), UP (UiPath), AW (A. Works). ⓐ Number of constructions. ⓑ Construction experience solution. ⓒ Number of years of RPA construction experience. ⓓ RPA operation solution. ⓔ Number of RPA systems introduced. ⓕ Introduction and operation period. ⓖ Field of introduction operations.
Table 6. First conformity assessment results.

| Level | No. | Mean | SD | CV | CVR | CSD | CGD | ⓐ | ⓑ | ⓒ | ⓓ | Selection |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I. | 1 | 6.27 | 0.75 | 0.12 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 2 | 6.09 | 0.51 | 0.08 | 0.82 | 1.00 | 0.00 | 0 | 0 | 0 | 0 | O |
| | 3 | 5.45 | 0.78 | 0.14 | 0.64 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.82 | 1.03 | 0.18 | 0.64 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| II. | 1 | 5.73 | 1.66 | 0.29 | 0.64 | 0.75 | 0.75 | 0 | 0 | 0 | 1 | X |
| | 2 | 5.18 | 1.75 | 0.34 | 0.45 | 0.40 | 1.50 | 0 | 1 | 1 | 1 | X |
| | 3 | 4.91 | 1.50 | 0.31 | 0.45 | 0.75 | 0.75 | 0 | 1 | 0 | 1 | X |
| | 4 | 4.45 | 1.62 | 0.36 | 0.09 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 5 | 2.64 | 1.30 | 0.49 | −0.82 | 0.67 | 0.50 | 0 | 1 | 1 | 0 | X |
| | 6 | 3.64 | 1.61 | 0.44 | −0.45 | 0.63 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 7 | 4.18 | 1.80 | 0.43 | −0.27 | 0.25 | 1.50 | 0 | 1 | 1 | 1 | X |
| | 8 | 4.73 | 1.54 | 0.33 | 0.27 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 9 | 4.09 | 1.93 | 0.47 | 0.27 | 0.50 | 1.25 | 0 | 1 | 1 | 1 | X |
| | 10 | 4.82 | 1.90 | 0.39 | 0.45 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 11 | 3.73 | 1.48 | 0.40 | −0.27 | 0.38 | 1.25 | 0 | 1 | 1 | 1 | X |
| | 12 | 4.73 | 1.66 | 0.35 | 0.09 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 13 | 5.64 | 0.88 | 0.16 | 0.82 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 14 | 6.00 | 1.04 | 0.17 | 0.82 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 15 | 4.73 | 1.54 | 0.33 | 0.45 | 0.80 | 0.50 | 0 | 1 | 0 | 0 | X |
| III. | 1 | 5.55 | 1.78 | 0.32 | 0.64 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 2 | 5.64 | 0.98 | 0.17 | 0.82 | 0.70 | 0.75 | 0 | 0 | 1 | 1 | X |
| | 3 | 5.09 | 1.24 | 0.24 | 0.45 | 0.75 | 0.75 | 0 | 1 | 0 | 1 | X |
| | 4 | 4.82 | 1.53 | 0.32 | 0.45 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 5 | 2.73 | 0.96 | 0.35 | −1.00 | 0.83 | 0.25 | 0 | 1 | 0 | 0 | X |
| | 6 | 4.55 | 1.44 | 0.32 | −0.09 | 0.25 | 1.50 | 0 | 1 | 1 | 1 | X |
| | 7 | 4.18 | 1.53 | 0.37 | −0.27 | 0.38 | 1.25 | 0 | 1 | 1 | 1 | X |
| | 8 | 5.09 | 0.79 | 0.16 | 0.45 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 9 | 3.45 | 1.88 | 0.54 | −0.27 | 0.13 | 1.75 | 0 | 1 | 1 | 1 | X |
| | 10 | 5.82 | 0.94 | 0.16 | 0.82 | 0.75 | 0.75 | 0 | 0 | 0 | 1 | X |
| | 11 | 4.64 | 1.15 | 0.25 | −0.09 | 0.50 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 12 | 5.00 | 1.60 | 0.32 | 0.64 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 13 | 5.00 | 1.54 | 0.31 | 0.45 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 14 | 5.91 | 1.16 | 0.20 | 0.64 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 15 | 4.64 | 1.49 | 0.32 | 0.09 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |

ⓐ CV fitness (0: true, 1: false). ⓑ CVR fitness (0: true, 1: false). ⓒ CSD fitness (0: true, 1: false). ⓓ CGD fitness (0: true, 1: false).
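The four fitness flags follow the conventional Delphi conformity statistics: the coefficient of variation (CV), Lawshe’s content validity ratio (CVR), and the quartile-based degrees of consensus (CSD) and convergence (CGD). As a minimal sketch of how these quantities can be computed for a single evaluation item, consider the following; the “essential” cutoff on the 7-point scale, the quartile interpolation, and the pass thresholds are assumptions chosen to be consistent with common Delphi practice for an 11-member panel, not values stated in this article.

```python
import numpy as np

def delphi_statistics(ratings, essential_cutoff=5):
    """Conformity statistics for one evaluation item rated by the expert panel.

    `ratings` holds the 7-point Likert scores from all panellists. Counting a
    score >= 5 as 'essential' for the CVR is an assumption, as is the linear
    quartile interpolation used by np.percentile.
    """
    x = np.asarray(ratings, dtype=float)
    n = len(x)
    mean = x.mean()
    sd = x.std(ddof=1)                      # sample standard deviation
    cv = sd / mean                          # coefficient of variation
    n_e = int((x >= essential_cutoff).sum())
    cvr = (n_e - n / 2) / (n / 2)           # Lawshe's content validity ratio
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    csd = 1 - (q3 - q1) / med               # degree of consensus (CSD)
    cgd = (q3 - q1) / 2                     # degree of convergence (CGD)
    return {"Mean": mean, "SD": sd, "CV": cv, "CVR": cvr, "CSD": csd, "CGD": cgd}

def fitness_flags(s, cv_max=0.50, cvr_min=0.59, csd_min=0.75, cgd_max=0.50):
    """Flags (a)-(d) with 0 = fit and 1 = unfit, mirroring Tables 6, 8 and 10.
    The thresholds are illustrative (0.59 is Lawshe's critical CVR for an
    11-member panel), not values taken from the article."""
    return (int(s["CV"] > cv_max), int(s["CVR"] < cvr_min),
            int(s["CSD"] < csd_min), int(s["CGD"] > cgd_max))
```

An item is retained (marked “O”) only when all four flags are 0; otherwise it is revised, merged, or dropped before the next survey round.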
Table 7. Updated selection criteria based on the first round of Delphi evaluation opinions.

| I. Categories | No. | II. Evaluation Items | III. Evaluation Criteria |
| --- | --- | --- | --- |
| 1. Customer deployment strategy N2 | 1 | Economic validity | Expenses (solution, introduction and construction, licence, operations) [8,26] N2, investment value (ROI [24], EVA, TCO, EVS, TEI, BSC, etc.) N1 |
| | 2 | Capabilities of solution suppliers N2 | Reference customer case holder [8,26], solution provider capability [26], product vision [24], dealer market awareness [8], existing results in the same field [8], terms and conditions [8], product and service support [24], support capabilities, education and customer service [8], partner ecosystem [24] |
| | 3 | Technology policy conformity N2 | Purpose of application of RPA introduction N4, hardware/software requirements [8], technical elements (OS/hardware requirements and technical capabilities required for RPA deployment and operation) [8], technical compatibility [26], performance [24], system interaction and integration [8], RPA program application consistency [8], scalability N1, relatedness to other technologies N1, portfolio [24] N4, revolutionary roadmap [24] N4, risk management strategy based on risk analysis evaluation [8,26] N4, mission and objectives of companies and agencies [8] N4, leadership priority and strategy, initiative [8] N4, licensing structure [25] N4 |
| | 4 | Security policy conformity | Personal information protection (considering personal information protection policies based on system/application lines, capabilities, functions and user reviews, interactive system data types) [8], account/personal identification management [8] N3, establishment of code control strategies [8] |
| | 7 | Methodology N2 | Repeated and regular process identification/standardisation [8] N2, bot ideas [24] N3, business performance processing procedures [24] N2, characteristics and operations of the company [26] N3, calculations consistent with departmental and enterprise unit objectives [8], process engineering and evaluation [25] N4 |
| 2. Development and operability N2 | 8 | Robot management and operability N2 | Bot platform model and security [24], availability [25], business performance N3, quality analysis (quality transition graphs provided) [2], management and analysis [24], dashboard capabilities [25], robot management functions (scheduling, load balancing, monitoring) [2], business and operational analysis functions [25], business performance details [2] N3, performance management [25], exception management [25], centralised deployment management, scheduling [25], operability [26] N4, maintainability [26] N4, self-study capability [25] N4, derive architectural requirements N3, tenancy [8] N4 |
| | 10 | Ease of development for automated processes N2 | Excel and SAP (ERP solution) API support [2], command library [25], security enhancement site response [26], security character recognition [2], security [26], bot development [24], bot design and development [24], atypical GUI-based program automation (X-Internet, Active X, Flash) [2], performance ability under standardised GUI environment [26], usability [25], visual creation tools [25], full/partial automation capabilities [25], website automation essential security enhancements (HOMETAX, GOV24, Court, e-car) [2], convenient and intuitive creation (direct programming, flowchart, etc.) [2], component sharing [25], test/debug control methods [25], character recognition ability to handle local language specificities without exception [2,26] N2, OCR N1, development convenience [26] N4, RPA program service distribution/operation model conformity [8] N4, application functions [26] N4, workflow [25] N4, process recording and reproduction [25] N4 |
| | 16 | Collaboration and expansion of AI technology [4] N2 | Classification, cognition, information extraction, optimisation, ANN, natural language processing (NLP), decision tree, recommendation system, computer vision cognition, text mining, statistical technique, fuzzy logic, fuzzy matching N4, process mining N1, scalability [25] N4 |
| 3. Technical architecture | 12 | Security [8] | Compliance with legal systems such as personal information protection, account and personal identification management, data encryption/protection, application security, risk/security evaluation, authentication, process traceability |
| | 13 | Architecture [8] | On-premise/cloud, virtualisation support using VM/container technology N2, availability/disaster recovery capabilities, permissions, network capacity, performance management capabilities review, RPA program technical policy/architecture consistency N4, availability of duplex configuration N1, collaboration structure with customer’s internal system N1 |
| 4. Operation management system N2 | 14 | Operation policy [8] N2 | Change management, operational management, automation scheduling, automation interruption accident response, bot management, and operational data visualisation policy N1 |
| | 15 | Asset management for information assets [8] N2 | Code sharing method, technical policy update, RPA lifecycle management, licence management, standardised operating models N1 |

N1: Added in response to opinions from the first-round expert survey. N2: Modified in response to opinions from the first-round expert survey. N3: Deleted in response to opinions from the first-round expert survey. N4: Moved in response to opinions from the first-round expert survey.
Table 8. Second conformity assessment results.

| Level | No. | Mean | SD | CV | CVR | CSD | CGD | ⓐ | ⓑ | ⓒ | ⓓ | Selection |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I. | 2 | 6.18 | 0.72 | 0.12 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 3 | 5.55 | 0.99 | 0.18 | 0.64 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.55 | 1.08 | 0.19 | 0.64 | 0.70 | 0.75 | 0 | 0 | 1 | 1 | X |
| II. | 1 | 5.91 | 1.08 | 0.18 | 0.82 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 2 | 5.82 | 1.11 | 0.19 | 0.82 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 3 | 5.27 | 0.75 | 0.14 | 0.64 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.09 | 0.90 | 0.18 | 0.27 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 7 | 4.64 | 1.30 | 0.28 | 0.09 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 8 | 5.55 | 0.89 | 0.16 | 0.82 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 10 | 5.27 | 1.21 | 0.23 | 0.45 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 16 | 5.91 | 1.08 | 0.18 | 0.64 | 0.75 | 0.75 | 0 | 0 | 0 | 1 | X |
| | 12 | 5.55 | 1.08 | 0.19 | 0.45 | 0.75 | 0.75 | 0 | 1 | 0 | 1 | X |
| | 13 | 5.45 | 0.89 | 0.16 | 0.64 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 14 | 5.64 | 0.98 | 0.17 | 0.82 | 0.70 | 0.75 | 0 | 0 | 1 | 1 | X |
| | 15 | 5.09 | 1.16 | 0.23 | 0.27 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| III. | 1 | 6.36 | 0.77 | 0.12 | 1.00 | 0.86 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 2 | 6.00 | 0.85 | 0.14 | 1.00 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 3 | 5.36 | 0.77 | 0.14 | 0.64 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.00 | 0.95 | 0.19 | 0.27 | 0.70 | 0.75 | 0 | 1 | 1 | 1 | X |
| | 7 | 4.55 | 1.30 | 0.29 | −0.27 | 0.75 | 0.50 | 0 | 1 | 0 | 0 | X |
| | 8 | 5.36 | 0.64 | 0.12 | 0.82 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 10 | 5.36 | 0.77 | 0.14 | 0.82 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 16 | 5.45 | 1.23 | 0.23 | 0.45 | 0.75 | 0.75 | 0 | 1 | 0 | 1 | X |
| | 12 | 5.73 | 1.14 | 0.20 | 0.64 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 13 | 5.36 | 0.88 | 0.16 | 0.64 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 14 | 5.82 | 1.03 | 0.18 | 0.82 | 0.67 | 1.00 | 0 | 0 | 1 | 1 | X |
| | 15 | 4.91 | 1.44 | 0.29 | 0.09 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |

ⓐ CV fitness (0: true, 1: false). ⓑ CVR fitness (0: true, 1: false). ⓒ CSD fitness (0: true, 1: false). ⓓ CGD fitness (0: true, 1: false).
Table 9. Updated selection criteria following the second Delphi evaluation survey.

| Type | Evaluation Item Number (No.) of II: Evaluation Criteria Elements of III |
| --- | --- |
| Added | 4: Consistency with customer security architecture |
| | 7: Automated process development and evaluation methodology |
| | 13: Review cloud architecture implementation standards (customer versus supplier) |
| | 14: Customer feedback management, script management, logging/upgrading/migration policies |
| | 15: Company-wide common module standardisation and product repository (output/result storage) management methods |
| Modified | 4: Possibility of integration with information access authorisation, issue management, and code control systems |
| | 8: Robot operation status aggregation function |
| | 12: Application security (authentication, authorisation, encryption, logging, security testing, etc.) |
| Deleted | 2: Solution provider capabilities, terms, and conditions |
| | 3: RPA introduction objectives, technical conformity, consistency of RPA program application, risk management strategy through risk analysis assessment, corporate and organisational mission and objectives, leadership priorities and strategies, initiative |
| | 13: RPA program technology policy/architecture conformity |
| | 15: Technological policy update |
| | 16: Classification, cognition, information extraction, optimisation |
| Moved | 14: Operability |
| | 16: Character recognition ability regardless of language specialties, OCR, scalability, relatedness to other technologies |
Table 10. Third conformity assessment results.

| Level | No. | Mean | SD | CV | CVR | CSD | CGD | ⓐ | ⓑ | ⓒ | ⓓ | Selection |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I. | 4 | 5.73 | 0.62 | 0.11 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| II. | 1 | 6.00 | 0.74 | 0.12 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.27 | 0.75 | 0.14 | 0.82 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 7 | 4.91 | 1.00 | 0.20 | −0.09 | 0.60 | 1.00 | 0 | 1 | 1 | 1 | X |
| | 10 | 5.55 | 0.89 | 0.16 | 1.00 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 16 | 5.91 | 0.67 | 0.11 | 1.00 | 0.92 | 0.25 | 0 | 0 | 0 | 0 | O |
| | 12 | 5.55 | 0.78 | 0.14 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 14 | 5.55 | 0.78 | 0.14 | 0.82 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 15 | 5.27 | 0.86 | 0.16 | 0.45 | 0.80 | 0.50 | 0 | 1 | 0 | 0 | X |
| III. | 2 | 6.09 | 0.67 | 0.11 | 1.00 | 0.92 | 0.25 | 0 | 0 | 0 | 0 | O |
| | 4 | 5.36 | 0.88 | 0.16 | 0.64 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 7 | 4.45 | 0.99 | 0.22 | −0.09 | 0.75 | 0.50 | 0 | 1 | 0 | 0 | X |
| | 16 | 5.55 | 0.78 | 0.14 | 0.82 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 12 | 5.64 | 0.77 | 0.14 | 1.00 | 0.83 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 14 | 5.45 | 0.78 | 0.14 | 0.82 | 0.80 | 0.50 | 0 | 0 | 0 | 0 | O |
| | 15 | 4.55 | 0.78 | 0.17 | −0.27 | 0.75 | 0.50 | 0 | 1 | 0 | 0 | X |

ⓐ CV fitness (0: true, 1: false). ⓑ CVR fitness (0: true, 1: false). ⓒ CSD fitness (0: true, 1: false). ⓓ CGD fitness (0: true, 1: false).
Table 11. Updated selection criteria based on the third Delphi evaluation survey.

| Type | Evaluation Item Number (No.) of II: Evaluation Criteria Elements of III |
| --- | --- |
| Added | 16: AI/ML level of optimisation for cognitive automation |
| | 10: Customer’s existing business performance procedures (manual, automation), steps/tools required to automate from RPA suppliers |
| | 14: Operational model, operational product, operational standard policy, operational rules, logging/dashboard management |
| Modified | 12: Security management |
| | 16: OCR (printed), OCR (written) |
| | 14: Script code shape and change management, operational data policy |
| Deleted | 10: Process recording and reproduction [25] |
| | 16: ANN [4], NLP [4], decision tree [4], recommendation system [4], computer vision cognition [4], text mining [4], statistical technique [4], fuzzy logic [4], fuzzy matching [4] |
| | 14: Operability [26] |
| Moved | 14: Code sharing [8], RPA lifecycle management [8], licence management [8], common module standardisation and product repository (output and result storage) management, upgrading/migration |
Table 12. Final validated criteria for evaluating RPA solutions.

| I. Category | No. | II. Evaluation Items | III. Evaluation Criteria |
| --- | --- | --- | --- |
| 1. Customer deployment strategy | 1 | Economic validity | Expense (solution, introduction and construction, licence, operation expenses) [8,26], investment value (ROI, EVA, TCO, EVS, TEI, BSC, etc.) [24] |
| | 2 | Capabilities of solution suppliers | Companies with reference client cases [8,26], product vision [24], awareness of manufacturer’s market [8], existing performance in the same field [8], product and service support capabilities, education and customer service [8], partner ecosystem [24] |
| | 3 | Technology policy conformity | Hardware/software requirements [8], technological elements (technology and ability for fulfilling OS/hardware requirements and RPA deployment and operations) [8], performance [24], system interaction and integration [8], portfolio [24], innovation roadmap [24], automation process development and evaluation methodology |
| | 4 | Security policy conformity | Personal information protection (system/application line, capability and user review, information access and issue management strategies, interactive data types) [8], account/personal identification management (service/network and system/application-level access management) [8], consistency with customer security architecture |
| 2. Development and operability | 8 | Robot management and operability | Bot platform model and security [24], availability [25], quality analysis (quality transition graph provided) [2], management and analysis [24], dashboard capability [25], robot management functions (scheduling, load balancing, monitoring) [2], robot operation status aggregation function [25], performance management [25], exception management [25], centralised deployment management, scheduling [25], maintainability [26], self-learning capability [25], multi-tenancy [8] |
| | 10 | Automation process development and convenience | Excel and SAP (ERP solution) API support [2], command library [25], security enhancement site response [26], security character recognition [2], security [26], bot development [24], bot design and development [24], atypical GUI-based program automation (X-Internet, Active X, Flash) [2], performance ability under standardised GUI environment [26], usability [25], visual creation tools [25], full/partial automation capabilities [25], website automation essential security enhancements (HOMETAX, GOV24, Court, e-car) [2], convenient and intuitive creation (direct programming, flowchart, etc.) [2], component sharing [25], test/debug control methods [25], development convenience [26], RPA program service distribution/operation model conformity [8], application functions [26], workflow [25], process recording and reproduction [25], customer’s existing business performance procedures (manual, automation), steps/tools required to automate from RPA suppliers |
| | 16 | Collaboration and expansion of AI technology | AI/ML optimisation level, process mining and scalability for cognitive automation [25], OCR (printed), OCR (written), relatedness to other technologies |
| 3. Technical architecture | 12 | Security management | Compliance with legal systems such as personal information protection [8], account and personal identification management [8], data encryption/protection [8], application security [8], risk/security evaluation [8], authentication [8], process traceability [8] |
| | 13 | Architecture | On-premise/cloud [8], virtualisation support using VM/container technology, availability/disaster recovery capabilities [8], permission [8], network capacity [8], performance management capabilities [8], dual configuration availability, collaboration structure with customer internal systems, cloud architecture deployment standards (customer versus supplier) |
| 4. Operation and management systems | 14 | Automation process operation systems | Script code shape and change management, operational management, automation scheduling [8], automation interruption accident response [8], bot management and operational data policy, supplier–customer technical support system, operational model, operational product policy, operational standard |
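Read operationally, Table 12 is a three-level checklist that a customer organisation can turn into a scorecard for comparing candidate RPA solutions. The fragment below is a minimal sketch of one way to do so; the dictionary keys mirror the final categories and evaluation items, while the weights and per-item scores are hypothetical placeholders an adopting organisation would supply itself, since this study validates the criteria rather than any particular weighting scheme.

```python
# Hierarchy of the final validated breakdown structure (categories -> items),
# mirroring Table 12. Evaluation criteria (level III) would sit beneath each item.
FINAL_ITEMS = {
    "1. Customer deployment strategy": [
        "Economic validity", "Capabilities of solution suppliers",
        "Technology policy conformity", "Security policy conformity"],
    "2. Development and operability": [
        "Robot management and operability",
        "Automation process development and convenience",
        "Collaboration and expansion of AI technology"],
    "3. Technical architecture": ["Security management", "Architecture"],
    "4. Operation and management systems": ["Automation process operation systems"],
}

def score_candidate(item_scores, item_weights=None):
    """Weighted sum of per-item scores for one candidate RPA solution.

    `item_scores` maps each evaluation item in FINAL_ITEMS to a score, e.g. on
    the 7-point scale used in the Delphi rounds. Equal weights are assumed
    (an illustrative choice, not part of the validated instrument) when no
    weights are supplied.
    """
    items = [item for group in FINAL_ITEMS.values() for item in group]
    weights = item_weights or {item: 1.0 / len(items) for item in items}
    return sum(weights[item] * item_scores[item] for item in items)

# Usage (hypothetical scores): score each candidate on all ten items and
# compare, e.g. score_candidate({"Economic validity": 6, ...}).
```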
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
