A Model for the Definition, Prioritization and Optimization of Indicators
Abstract
1. Introduction
2. Background and Related Works
- Product: It is often described as having the following main characteristics: novelty, value and surprisingness.
- (a)
- Novelty: A creative product must be new and original. Based on the three levels of creativity, novelty can be determined as levels P, S or H. P-level ideas seem new to the individual creator. S-level ideas result from a confluence of individual effort and the collective cultures of professional domains and social groups; therefore, S-level ideas are recognized as new and original to both the professional and the social group(s) involved. H-level ideas seem original to everyone [23].
- (b)
- Value: An innovative product must also be useful, i.e., it must be feasible and effective in solving a problem. Nguyen and Shanks [23] described value through suitability, including the correctness and appropriateness of the creative product to its context of use.
- (c)
- Surprisingness: Surprise is often associated with creative products. Nguyen and Shanks [23] described surprise as an unusual and unexpected impact that can either shock or surprise a person.
- Process: The creative process can be defined as an internal process of exploring and transforming conceptual spaces in an individual mind.
- Domain: The role of the domain is strongly recognized in the systemic view of creativity. First, the domain provides a symbolic system and body of knowledge of a subject. Second, the value and novelty of a creative product must be defined within a specific domain and the state of the art of that domain [23].
- People: The common personal characteristics of creative individuals can be categorized as follows: Traits (original, articulate and fluent in generating ideas), cognitive skills (metaphorical thinking, problem sensitivity and cognitive flexibility) and problem-solving styles (holistic approach to thinking, logical thinking approach and experimentation) [23].
- Context: Creative products are usually the result of teamwork, done collaboratively. The main factors that influence team creativity can be classified as cognitive or social. Cognitive factors include analysis and synthesis skills, procedural and organizational skills and political knowledge. Individual social factors include each team member’s education, work experience and background culture. Collaborative social factors include group dynamics and conflicts [23].
2.1. Requirement Indicators
2.2. Performance Indicators
2.3. Risk Indicators
2.4. Test Indicators
2.5. Service Operation Indicators
2.6. Deployment Indicators
2.7. Applying Indicator Management in Organizations
3. Materials and Methods
- Plan the case study: Identify a relevant situation for conducting a case study; compare research methods; understand the benefits of using a case study; address concerns about using a case study; and finally, decide whether a case study is appropriate, as shown in Figure 2.
- Design the case study: Identify the cases and establish the logic of your case study; define the case(s) to be studied; develop the theory, propositions and related questions to guide the case study and generalize its findings; identify the design of the case study (single or multiple cases, holistic or embedded); and test the project based on pre-defined criteria to maintain the quality of a case study (Figure 2).
- Prepare to collect evidence from case studies: Improve skills to conduct the case study; train for a specific case study; develop the case study protocol; together with the general strategy, take into account some analytical techniques and address opposite explanations and interpretations (Figure 2).
- Collect evidence from case studies: Make data available from different perspectives; check the promising patterns, ideas and concepts; and develop a general analytical strategy (Figure 2).
- Analyze the evidence from the case study: Start with the definition of the questions (for example, the questions in your case study protocol) and not with the data. Focus on a small question first and then identify the evidence that addresses it. Draw a tentative conclusion based on the weight of the evidence. Consider how you should display the evidence so that interested parties can verify your assessment (Figure 2).
- Share the case studies: Define the audience and medium of communication; develop textual and visual materials; display enough evidence for readers to reach their own conclusions; and review and rewrite the report until it is adequate and well structured (Figure 2).
4. Proposed Model
- Select Indicators in Literature: Activity responsible for defining, based on the literature review, the list of indicators that establishes the scope of the organization's indicators to be evaluated.
- Interaction with Stakeholders: not applicable.
- Select Organization Indicators: Activity responsible for carrying out a survey of all indicators defined and monitored by the organization, within the scope of work previously established.
- Interaction with Stakeholders: sending emails requesting a list of all indicators; for each indicator, the name, origin and documentation must be supplied.
- Analyze Indicators in Common: Activity responsible for cross-referencing the indicators from the literature review with the indicators coming from the organization, selecting the indicators in common and analyzing the feasibility of using each one as an input to the indicator prioritization and optimization model.
- Interaction with Stakeholders: not applicable.
- Validate the List of Indicators with the Management Area: After defining the list of indicators, this activity will be responsible for validating it with senior management and business managers, whose processes will be impacted by the optimization of the prioritized indicators.
- Interaction with Stakeholders: sending emails to all stakeholders, requesting acceptance of the resulting list of indicators.
- Communicate to Stakeholders the Impossibility of Executing the Model: If any stakeholder involved does not validate the list of indicators, this activity will be responsible for communicating to the other stakeholders involved the impossibility of continuing the process of prioritizing indicators. In this case, it will be necessary to select new indicators for the organization to be used in the process of prioritizing and optimizing indicators.
- Interaction with Stakeholders: sending an email informing stakeholders that it is impossible to proceed with the execution of the model and the need to select new indicators.
- Interviewing the End User: Activity responsible for conducting the interview with the customer to understand their main difficulties in using the organization's product or service and their future expectations (Figure 5).
- Interaction with Stakeholders: meeting with stakeholders.
- Define a Persona: Activity responsible for establishing and creating a profile that reflects the end user who benefits from the optimization of the prioritized indicators. Some information must be defined for the persona, such as (Figure 5): biography; personal data; greatest challenges and frustrations; goals and objectives; responsibilities at work; day-to-day tools; how their work is measured; preferred means of communication; and personality analysis (DISC: dominance, influence, steadiness and compliance).
- Interaction with Stakeholders: not applicable.
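The persona fields listed above can be sketched as a simple data structure. This is only an illustration: all field names, the sample values and the `dominant_disc_trait` helper are assumptions made for this example, not part of the proposed model.

```python
from dataclasses import dataclass

# Illustrative persona structure; field names are assumptions
# derived from the information listed for the persona.
@dataclass
class Persona:
    biography: str
    personal_data: dict
    challenges_and_frustrations: list
    goals_and_objectives: list
    work_responsibilities: list
    daily_tools: list
    work_measured_by: str
    preferred_communication: str
    disc_profile: dict  # dominance, influence, steadiness, compliance (0-100)

    def dominant_disc_trait(self) -> str:
        # Return the strongest DISC dimension for this persona.
        return max(self.disc_profile, key=self.disc_profile.get)

p = Persona(
    biography="IT operations analyst",
    personal_data={"age": 34},
    challenges_and_frustrations=["too many unprioritized indicators"],
    goals_and_objectives=["act only on indicators that matter"],
    work_responsibilities=["monitor service-level indicators"],
    daily_tools=["dashboard", "spreadsheets"],
    work_measured_by="indicators meeting their goals",
    preferred_communication="email",
    disc_profile={"dominance": 40, "influence": 55,
                  "steadiness": 80, "compliance": 70},
)
print(p.dominant_disc_trait())  # steadiness
```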
- Analyze Persona Problems: Based on the persona's definitions and the interviews carried out with the end user, this activity will be responsible for analyzing the main persona problems, which need to be solved through the optimization of indicators (Figure 5).
- Interaction with Stakeholders: meeting to define and analyze problems.
- Establishing Long-Term Objectives: Activity responsible for establishing long-term objectives, as they direct the strategic vision of the organization’s future (Figure 5).
- Interaction with Stakeholders: meeting with stakeholders.
- Define Sprint Objectives: Activity responsible for defining the current sprint's objectives and deliverables, as the objectives indicate what should be reached by the end of the sprint, and the deliverables show what should be produced by the end of the sprint (Figure 5).
- Interaction with Stakeholders: design thinking workshop.
- Define the Indicator’s Objective: Activity responsible for clearly defining the indicator’s objective. This is the first step in defining an indicator, as it is through the objective that the purpose of creating the indicator becomes clear (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
- Define the Area Responsible for the Indicator: Activity responsible for defining the area responsible for managing the indicator, from its creation to its monitoring (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
- Define Indicator Data Sources: Activity responsible for clearly defining which data sources will be used by the indicator. An example of a data source could be: a database, spreadsheet, extraction via ETL, etc. (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
- Define the Indicator Calculation Form: Activity responsible for defining the calculation to be used to score each indicator that makes up the list of indicators (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
- Define the Indicator Collection Periodicity: Activity responsible for establishing what the indicator collection periodicity will be (every hour, once a day, every 12 hours, etc.) (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
- Define the Indicator’s Goal: Activity responsible for establishing the indicator’s goals, to be defined by the responsible manager. When defining goals, it is necessary to establish criteria to define what a good, medium or bad goal is (Figure 6).
- Interaction with Stakeholders: design thinking workshop.
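The indicator attributes defined in the activities above (objective, responsible area, data source, calculation, collection periodicity and goal) can be sketched together. The class name, the 0.8 threshold for a "medium" result and the sample values are assumptions for this illustration; the model leaves the actual criteria to the responsible manager.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of an indicator definition; names and
# thresholds are illustrative, not prescribed by the model.
@dataclass
class Indicator:
    name: str
    objective: str
    responsible_area: str
    data_source: str            # e.g., database, spreadsheet, ETL extraction
    calculation: Callable[[list], float]
    periodicity_hours: int      # e.g., 1, 12 or 24 hours
    goal: float                 # target value set by the responsible manager

    def goal_status(self, measured: float) -> str:
        # Example criteria for a good / medium / bad result.
        if measured >= self.goal:
            return "good"
        if measured >= 0.8 * self.goal:
            return "medium"
        return "bad"

availability = Indicator(
    name="Service availability",
    objective="Measure monthly uptime of the service",
    responsible_area="IT infrastructure",
    data_source="monitoring database",
    calculation=lambda samples: sum(samples) / len(samples),
    periodicity_hours=24,
    goal=99.0,
)
print(availability.goal_status(availability.calculation([99.5, 98.9, 99.2])))
```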
- Classify All Indicators in Cynefin Domains: Activity responsible for classifying all indicators that make up the list of prioritized indicators according to the domains established in Cynefin (obvious, complicated, complex and chaotic) (Figure 7);
- Interaction with Stakeholders: design thinking workshop.
- Classify All Indicators According to the Execution of Design Thinking: Activity responsible for classifying all indicators that make up the list of prioritized indicators according to the design thinking process (Figure 7);
- Interaction with Stakeholders: design thinking workshop.
- Apply the Calculation to Score each Indicator: Activity responsible for applying the calculation used to score each indicator that makes up the list of indicators. The calculation should combine the scoring of the indicators from both the Cynefin point of view and the design thinking point of view (Figure 7);
- Interaction with Stakeholders: not applicable.
- Define the Priority Indicators List: Activity responsible for defining a ranking of all indicators that make up the list of all prioritized indicators, based on the calculation performed in the previous activity (Figure 7);
- Interaction with Stakeholders: not applicable.
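The scoring and ranking activities above can be sketched as follows. The numeric weights per Cynefin domain, the design thinking scores and the simple sum used to combine the two points of view are assumptions made for this example; the model itself only requires that both perspectives feed the score.

```python
# Illustrative weights per Cynefin domain (assumed for this sketch).
CYNEFIN_SCORE = {"obvious": 1, "complicated": 2, "complex": 3, "chaotic": 4}

def combined_score(cynefin_domain: str, design_thinking_score: int) -> int:
    # Combine both points of view; a simple sum is assumed here.
    return CYNEFIN_SCORE[cynefin_domain] + design_thinking_score

# Hypothetical indicators: (name, Cynefin domain, design thinking score).
indicators = [
    ("Incident backlog", "chaotic", 3),
    ("Deployment frequency", "complicated", 2),
    ("Requirements churn", "complex", 4),
]

# Rank all indicators from highest to lowest combined score.
ranking = sorted(
    indicators,
    key=lambda item: combined_score(item[1], item[2]),
    reverse=True,
)
for name, domain, dt in ranking:
    print(name, combined_score(domain, dt))
```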
- Validate the List of Indicators with Participants: Activity responsible for validating the list of prioritized indicators, based on the executed model. If the list is not successfully validated, the participants must identify the activity that failed so that the process can run again from that activity (Figure 8).
- Interaction with Stakeholders: at the end of the design thinking workshop.
- Validate the List of Indicators with Senior Management: Based on the model executed, this activity will be responsible for validating the list of prioritized indicators with the organization's top management, from a strategic and business point of view. If the list is not successfully validated, top management must identify the activity that failed so that the process can be executed again from that activity (Figure 8);
- Interaction with Stakeholders: validation carried out through a meeting or by email.
- Identify the Weaknesses of the Prioritized Indicators: Activity responsible for identifying the weaknesses measured by the indicators that were prioritized. Such a weakness can correspond to a process, a procedure, a project, etc. (Figure 9).
- Interaction with Stakeholders: meeting with everyone involved in the prioritized indicator.
- Identify Improvement Actions: Activity responsible for establishing the set of improvement actions that will solve the problems identified by the given indicator (Figure 9).
- Interaction with Stakeholders: design thinking workshop.
- Define the Scope of Improvement Actions: Activity responsible for delimiting the scope of improvement actions up to the limit measured by the indicator (Figure 9).
- Interaction with Stakeholders: design thinking workshop.
- Identify Those Responsible for Each Action: Activity responsible for identifying who will be responsible for making each of the actions in the proposed action plan viable (Figure 9).
- Interaction with Stakeholders: design thinking workshop.
- Define the Period for Improvement Actions: Activity responsible for establishing the maximum period for the execution of each improvement action in the action plan (Figure 9).
- Interaction with Stakeholders: design thinking workshop.
- Formalize the Action Plan or Project: Activity responsible for formally initiating the execution of the action plan (Figure 9).
- Interaction with Stakeholders: the area responsible for improving the indicator will define the best way to interact with stakeholders.
Model Assumptions and Constraints
5. Model Execution
5.1. Execution of the Validate Indicators Step
5.2. Execution of the Apply Model Step
- Empathy Map: The empathy map is a tool used to map the user’s perception of the product or service to be offered. The purpose of this tool is to put oneself in the end user’s shoes and identify how they interact with the product or service they are consuming [67]. This tool was used to guide participants in the definition and classification of problems in the CSD matrix.
- CSD Matrix (Certainties, Suppositions and Doubts): The CSD matrix is a tool used at the beginning of the design thinking process, which works from three questions: What is known about the needs of the end user? What are the hypotheses, i.e., what is supposed to be known? What are the doubts, and what questions could be asked? With the CSD matrix, you define exactly where to focus and concentrate your efforts to propose a solution to a given problem [68].
Obvious domain:
- The problem measures a totally restricted, predictable and repetitive behavior;
- Validation checklists help the problem to be solved;
- An indicator below the target identifies problems that are easily overcome;
- The cause and effect relations are very clear and always repeat;
- The solutions are easy and simple to pass on.
Complicated domain:
- The problem measures well-known, orderly and predictable practices, but a specialist is needed to solve it;
- The restrictions are evident and applicable;
- The cause and effect relationships are evident, but the solution is open to analysis;
- Solution analysis involves an unrestricted set of processes;
- The solutions are not easy or simple to pass on.
Complex domain:
- The problem measures a system that partially restricts the behavior, although the behavior modifies the restrictions;
- Cause and effect relationships are variable; the effects are not always repeated;
- An indicator below the target obliges senior management to provide real-time feedback;
- The analysis of the solution is usually modularized;
- If the problem is contained, it is good for innovation.
Chaotic domain:
- The problem measures a random and unrestricted behavior that is difficult to create or sustain;
- There is no clear cause and effect relationship for the identified problem;
- An indicator below the target generates a crisis if it is not optimized quickly;
- If the problem is contained, it is good for the organization’s operational resilience;
- The problem is not easy to reproduce, and the relationship between the system and the agents cannot be determined.
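The checklist characteristics above can be sketched as a small decision function that maps answers about the indicator's problem to a Cynefin domain. The boolean questions chosen and the decision order are assumptions made for this illustration, not the paper's exact classification procedure.

```python
# Minimal sketch: classify a problem into a Cynefin domain from three
# assumed yes/no questions derived from the checklists above.
def classify_cynefin(cause_effect_clear: bool,
                     needs_specialist: bool,
                     cause_effect_variable: bool) -> str:
    if cause_effect_clear and not needs_specialist:
        return "obvious"        # clear, repeatable cause and effect
    if cause_effect_clear:
        return "complicated"    # relations are evident, but a specialist is needed
    if cause_effect_variable:
        return "complex"        # effects are not always repeated
    return "chaotic"            # no discernible cause and effect

print(classify_cynefin(True, False, False))   # obvious
print(classify_cynefin(True, True, False))    # complicated
print(classify_cynefin(False, False, True))   # complex
print(classify_cynefin(False, False, False))  # chaotic
```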
5.3. Execution of the Validate Prioritized Indicators Step
- Communicate to all stakeholders about the completion of the previous steps and the need to validate the list of indicators that have been prioritized.
- Validate the list of indicators prioritized by all stakeholders. If any interested party disagrees with the result, it is necessary to understand the reasons for which the disagreements occurred, and if applicable, review the model in question and perform the previous steps again.
- Formalize for senior management the list of prioritized indicators, as the optimization of these indicators is part of the organization’s strategic planning.
5.4. Execution of the Optimize the Prioritized Indicators Step
5.4.1. Design Thinking Workshop
- Personas Analysis: A persona is a fictional user designed to represent the typical user, whom the team figuratively speaks for during the design process. The objective of the method is to develop solutions that meet the needs of these individuals. In short, a persona is a profile that represents most of the personality and characteristics of the end user of the solution to be proposed [64].
- Blueprint: It is a tool used to map the interactions between the user and the service provider, helping to standardize these interactions and find the points of failure of this interaction, in order to create more attractive value propositions for the end user [65].
- User Journey Map: It is a graphical representation of the user’s relationship steps with a product or service, which describes the steps taken before, during and after use [66].
5.4.2. Project
- Situation: The most appropriate option when the organization already has more detailed knowledge of the problem to be addressed and the problem needs a more elaborate, planned (long-term) solution [73].
- Execution: Develop a project plan, consisting of: definition of scope, schedule, communication plan, risk management and budget.
- Result: Improvement in the object measured by the optimized indicator.
5.4.3. Action plan
- Situation: The most appropriate option when the organization already has more detailed knowledge of the problem to be addressed and the problem does not need a more elaborate or long-range plan; simpler planning addresses the problem resolution (short and medium term).
- Execution: Definition of a set of actions necessary to optimize the indicator, along with the deadline and the person responsible for executing each action.
- Result: Improvement in the object measured by the optimized indicator.
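The action plan structure described above, a set of actions each with a deadline and a responsible person, can be sketched as follows. The field names, the sample actions and the `overdue` helper are assumptions for this illustration.

```python
from dataclasses import dataclass
from datetime import date

# Sketch of an action plan entry: description, responsible and deadline.
@dataclass
class Action:
    description: str
    responsible: str
    deadline: date
    done: bool = False

# Hypothetical plan for optimizing one prioritized indicator.
action_plan = [
    Action("Review data source of the indicator", "Infrastructure lead", date(2022, 3, 1)),
    Action("Adjust collection periodicity", "Governance analyst", date(2022, 3, 15)),
]

def overdue(actions: list, today: date) -> list:
    # Actions past their deadline and not yet completed.
    return [a.description for a in actions if not a.done and a.deadline < today]

print(overdue(action_plan, date(2022, 3, 10)))
```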
5.5. Execution of the Proposed Model
- Systems and Solutions Development
- Context: Area responsible for activities related to: development, measurement by function points, tests and management of external systems.
- Quantity of Indicators: 57.
- IT infrastructure
- Context: Area responsible for activities related to: configuration, change, deployment, database, connectivity, service desk and network.
- Quantity of Indicators: 60.
- IT Governance
- Context: Area responsible for activities related to: IT planning, IT internal controls, IT process management, IT contract planning, data administration, IT architecture and IT service level agreement management.
- Quantity of Indicators: 37.
The results of executing the validate indicators step are shown below. Following the criteria defined in the proposed model, the following results were found for Company A:
- Total Indicators: 154
- Goal Indicators: 99
- Indicators Below the Goal: 26
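From the counts reported for Company A, the share of indicators below the goal follows directly (the remaining indicators are not broken down in the text):

```python
# Counts reported for Company A in the validate indicators step.
total = 154
meeting_goal = 99
below_goal = 26

# Share of indicators below the goal.
below_goal_share = 100 * below_goal / total
print(f"{below_goal_share:.1f}% of indicators are below the goal")
```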
5.6. Results and Discussion
6. Threats to Validity
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Turner, R. How Does Governance Influence Decision Making on Projects and in Project-Based Organizations? Proj. Manag. J. 2020, 51, 670–684. [Google Scholar] [CrossRef]
- Chatterjee, S.; Ghosh, S.K.; Chaudhuri, R. Knowledge management in improving business process: An interpretative framework for successful implementation of AI-CRM-KM system in organizations. Bus. Process. Manag. J. 2020, 26, 1261–1281. [Google Scholar] [CrossRef]
- Jeong, J.; Kim, N. Does sentiment help requirement engineering: Exploring sentiments in user comments to discover informative comments. Autom. Softw. Eng. 2021, 28, 18. [Google Scholar] [CrossRef]
- Ferrari, A.; Esuli, A. An NLP approach for cross-domain ambiguity detection in requirements engineering. Autom. Softw. Eng. 2019, 26, 559–598. [Google Scholar] [CrossRef]
- Kamalrudin, M.; Hosking, J.G.; Grundy, J. MaramaAIC: Tool support for consistency management and validation of requirements. Autom. Softw. Eng. 2017, 24, 1–45. [Google Scholar] [CrossRef]
- Sommerville, I.; Fowler, M.; Beck, K.; Brant, J.; Opdyke, W.; Roberts, D. Edition: Software Engineering. Instructor. 2019. Available online: http://www.cse.yorku.ca/~zmjiang/teaching/eecs4314/EECS4314_CourseOutline.pdf (accessed on 5 February 2022).
- Wanner, J.; Hofmann, A.; Fischer, M.; Imgrund, F.; Janiesch, C.; Geyer-Klingeberg, J. Process Selection in RPA Projects—Towards a Quantifiable Method of Decision Making. In Proceedings of the International Conference on Information Systems (ICIS), Munich, Germany, 15–18 December 2019. [Google Scholar]
- Kucukaltan, B.; Irani, Z.; Aktas, E. A decision support model for identification and prioritization of key performance indicators in the logistics industry. Comput. Hum. Behav. 2016, 65, 346–358. [Google Scholar] [CrossRef] [Green Version]
- Lockwood, T.; Papke, E. Innovation by Design: How Any Organization Can Leverage Design Thinking to Produce Change, Drive New Ideas, and Deliver Meaningful Solutions; Career Press: Wayne, NJ, USA, 2017; pp. 1–224. [Google Scholar]
- Lucassen, G.; Dalpiaz, F.; van der Werf, J.M.E.M.; Brinkkemper, S. Improving agile requirements: The Quality User Story framework and tool. Requir. Eng. 2016, 21, 383–403. [Google Scholar] [CrossRef] [Green Version]
- Ciriello, R.F.; Richter, A.; Schwabe, G. When Prototyping Meets Storytelling: Practices and Malpractices in Innovating Software Firms. In Proceedings of the 2017 IEEE/ACM 39th International Conference on Software Engineering: Software Engineering in Practice Track (ICSE-SEIP), Buenos Aires, Argentina, 20–28 May 2017; Volume 37, pp. 163–172. [Google Scholar]
- Kirlangiç, G.; Obaid, M.; Yantaç, A.E. Storytelling before or after Prototyping with a Toolkit for Designing Classroom Robots; OZCHI; ACM: Sydney, NSW, Australia, 2020; pp. 582–593. [Google Scholar]
- Hotomski, S. Supporting Requirements and Acceptance Tests Alignment during Software Evolution. Ph.D. Thesis, University of Zurich, Zurich, Switzerland, 2019. [Google Scholar]
- Mayer, S.; Haskamp, T.; de Paula, D. Measuring what Counts: An Exploratory Study about the Key Challenges of Measuring Design Thinking Activities in Digital Innovation Units. In Proceedings of the HICSS 54th Hawaii International Conference on System Sciences, Kauai, HI, USA, 5 January 2021; pp. 1–10. [Google Scholar]
- Almeida, F.V.; Canedo, E.D.; da Costa, R.P. Definition of Indicators in the Execution of Educational Projects with Design Thinking Using the Systematic Literature Review. In Proceedings of the IEEE Frontiers in Education Conference—FIE, Covington, KY, USA, 16–19 October 2019; IEEE: Cincinnati, OH, USA, 2019; pp. 1–9. [Google Scholar]
- Souza, A.F.; Ferreira, B.; Valentim, N.M.C.; Correa, L.; Marczak, S.; Conte, T. Supporting the teaching of design thinking techniques for requirements elicitation through a recommendation tool. IET Softw. 2020, 14, 693–701. [Google Scholar] [CrossRef]
- Ferreira, V.G.; Canedo, E.D. Using design sprint as a facilitator in active learning for students in the requirements engineering course: An experience report. In Proceedings of the 34th ACM/SIGAPP Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; pp. 1852–1859. [Google Scholar]
- Henreaux, E.; Noutcha, M.; Phan-Ngoc, T.; Kieffer, S. Design Sprints Integrating Agile and Design Thinking: A Case Study in the Automotive Industry. In AHFE (12); Springer: New York, NY, USA, 2021; Volume 270, pp. 189–195. [Google Scholar]
- Ferreira, V.G.; Canedo, E.D. Design sprint in classroom: Exploring new active learning tools for project-based learning approach. J. Ambient Intell. Humaniz. Comput. 2020, 11, 1191–1212. [Google Scholar] [CrossRef]
- Shalbafan, S.; Leigh, E.; Pollack, J.; Sankaran, S. Decision-making in project portfolio management: Using the Cynefin framework to understand the impact of complexity. In Proceedings of the International Research Network on Organizing by Projects, Boston, MA, USA, 11–14 June 2018. [Google Scholar]
- Fierro, D.; Putino, S.; Tirone, L. The Cynefin Framework and the Technical Leadership: How to Handle the Complexity. In CIISE; CEUR-WS.org: Naples, Italy, 2017; Volume 2010, pp. 72–81. [Google Scholar]
- Sirisawat, P.; Hasachoo, N.; Kaewket, T. Investigation and Prioritization of Performance Indicators for Inventory Management in the University Hospital. In Proceedings of the 2019 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Macao, China, 15–18 December 2019; pp. 691–695. [Google Scholar]
- Nguyen, L.; Shanks, G. A Framework for Understanding Creativity in Requirements Engineering. Inf. Softw. Technol. 2009, 51, 655–662. [Google Scholar] [CrossRef]
- Hickey, A.M.; Davis, A.M. A Unified Model of Requirements Elicitation. J. Manag. Inf. Syst. 2015, 20, 65–84. [Google Scholar]
- Inayat, I.; Salim, S.S.; Marczak, S.; Daneva, M.; Shamshirband, S. A systematic literature review on agile requirements engineering practices and challenges. Comput. Hum. Behav. 2015, 51, 915–929. [Google Scholar] [CrossRef]
- Robinson, W.; Vlas, R. Requirements evolution and project success: An analysis of SourceForge projects. In Proceedings of the 21st Americas Conference on Information Systems (AMCIS), Puerto Rico, 13–15 August 2015; pp. 1–12. Available online: https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1383&context=amcis2015 (accessed on 5 February 2022).
- Vlas, R.; Robinson, W. Extending and Applying a Rule-Based Natural Language Toolkit for Open Source Requirements Discovery and Classification. In Proceedings of the Open Source Systems (OSS’11), Salvador, Brazil, 6–7 October 2011. [Google Scholar]
- Vlas, R.; Robinson, W. A pattern-based method for requirements discovery and classification in open-source software development projects. J. Manag. Inf. Syst. (JMIS) 2012, 28, 11–38. [Google Scholar] [CrossRef] [Green Version]
- Vlas, R.; Robinson, W.N. Applying a Rule-Based Natural Language Classifier to Open Source Requirements: A Demonstration of Theory Exploration. In Proceedings of the 2013 46th Hawaii International Conference on System Sciences, Wailea, HI, USA, 7–10 January 2013; pp. 3158–3167. [Google Scholar]
- Sánchez, M.A. Integrating sustainability issues into project management. J. Clean. Prod. 2015, 96, 319–330. [Google Scholar] [CrossRef]
- Chopra, M.; Gupta, V. Linking knowledge management practices to organizational performance using the balanced scorecard approach. Kybernetes 2020, 49, 88–115. [Google Scholar] [CrossRef]
- Vad, T.; Stec, K.; Larsen, L.B.; Nellemann, L.J.; Czapla, J.J. Development of a framework for UX KPIs in Industry—A case study: ABSTRACT. In Proceedings of the OzCHI ’20: 32nd Australian Conference on Human-Computer-Interaction, Sydney, NSW, Australia, 2–4 December 2020; pp. 141–147. [Google Scholar]
- Kaplan, R.S.; Norton, D.P. The Balanced Scorecard: Measures that Drive Performance; Harvard Business Review; Harvard University: Boston, MA, USA, 1992. [Google Scholar]
- Vandaele, N.J.; Decouttere, C.J. Sustainable R&D portfolio assessment. Decis. Support Syst. 2013, 54, 1521–1532. [Google Scholar]
- Egilmez, G.; Kucukvar, M.; Tatari, O. Sustainability assessment of US manufacturing sectors: An economic input output-based frontier approach. J. Clean. Prod. 2013, 53, 91–102. [Google Scholar] [CrossRef]
- Rai, A.K.; Agrawal, S.; Khaliq, M. Identification of agile software risk indicators and evaluation of agile software development project risk occurrence probability. In Proceedings of the 7th International Conference on Engineering Technology, Science and Management Innovation (ICETSMI-2017), Delhi, India, 11 June 2017; pp. 489–494. [Google Scholar]
- Agrawal, R.; Singh, D.; Sharma, A. Prioritizing and optimizing risk factors in agile software development. In Proceedings of the 2016 Ninth International Conference on Contemporary Computing (IC3), Noida, India, 11–13 August 2016; pp. 1–7. [Google Scholar]
- Marzagão, D.S.L.; Carvalho, M.M. Critical success factors for Six Sigma projects. Int. J. Proj. Manag. 2016, 34, 1505–1518. [Google Scholar] [CrossRef]
- Chow, T.; Cao, D.B. A survey study of critical success factors in agile software projects. J. Syst. Softw. 2008, 81, 961–971. [Google Scholar] [CrossRef]
- Mahnic, V.; Zabkar, N. Using COBIT indicators for measuring scrum-based software development. Wseas Trans. Comput. 2008, 7, 1605–1617. [Google Scholar]
- Kremljak, Z.; Kafol, C. Types of risk in a system engineering environment and software tools for risk analysis. Procedia Eng. 2014, 69, 177–183. [Google Scholar] [CrossRef] [Green Version]
- Juhnke, K.; Tichy, M.; Houdek, F. Quality Indicators for Automotive Test Case Specifications. In Proceedings of the Workshops of the German Software Engineering Conference 2018 (SE 2018), Ulm, Germany, 6 March 2018; pp. 96–100. [Google Scholar]
- Lima, D.S.; Cerdeiral, C.T.; Santos, G. Indicadores de Medição de Testes em um Contexto ágil Usando o Template ASM.br. In Proceedings of the XXI Iberoamerican Conference on Software Engineering, Bogota, Colombia, 23–27 April 2018; Genero, M., Kalinowski, M., Molina, J.G., Pino, F., Conte, T., Marín, B., Brito, I., Giachetti, G., Eds.; Curran Associates: New York, NY, USA, 2018; pp. 395–408. [Google Scholar]
- Paredes-Gualtor, J.; Moscoso-Zea, O.; Luján-Mora, S. The role of enterprise architecture as a management tool. In Proceedings of the 2018 International Conference on Information Systems and Computer Science (INCISCOS), Quito, Ecuador, 13–15 November 2018; pp. 306–311. [Google Scholar]
- Granulo, A.; Tanovic, A. The advantage of using SWOT analysis for companies with implemented ITIL framework processes. In Proceedings of the 2020 43rd International Convention on Information, Communication and Electronic Technology (MIPRO), Opatija, Croatia, 28 September–2 October 2020; pp. 1656–1661. [Google Scholar]
- Moeller, R.R. Executive’s Guide to IT Governance: Improving Systems Processes with Service Management, COBIT, and ITIL; John Wiley & Sons: Hoboken, NJ, USA, 2013; Volume 637. [Google Scholar]
- Kusumasari, T.F.; Fauzi, R. Design Guidelines and Process of Metadata Management Based on Data Management Body of Knowledge. In Proceedings of the 2021 7th International Conference on Information Management (ICIM), London, UK, 27–29 March 2021; pp. 87–91. [Google Scholar]
- Rose, K.H. A Guide to the Project Management Body of Knowledge (PMBOK® Guide)—Fifth Edition. Proj. Manag. J. 2013, 44, e1. [Google Scholar] [CrossRef]
- Mateo, J.R.S.C.; Diaz, E.; Carral, L.; Fraguela, J.A.; Iglesias, G. Complexity and Project Management: Challenges, Opportunities, and Future Research. Complexity 2019, 2019, 6979721:1–6979721:2. [Google Scholar]
- Kozak, M.; Beaman, J. Relationship between customer satisfaction and loyalty. Tour. Anal. 2006, 11, 397–409. [Google Scholar] [CrossRef]
- Alemanni, M.; Alessia, G.; Tornincasa, S.; Vezzetti, E. Key performance indicators for PLM benefits evaluation: The Alcatel Alenia Space case study. Comput. Ind. 2008, 59, 833–841. [Google Scholar] [CrossRef] [Green Version]
- Jetter, J.; Eimecke, J.; Rese, A. Augmented reality tools for industrial applications: What are potential key performance indicators and who benefits? Comput. Hum. Behav. 2018, 87, 18–33. [Google Scholar] [CrossRef]
- Niemann, L.; Hoppe, T.; Coenen, F. On the benefits of using process indicators in local sustainability monitoring: Lessons from a Dutch municipal ranking (1999–2014). Environ. Policy Gov. 2017, 27, 28–44. [Google Scholar] [CrossRef] [Green Version]
- De Almeida Filgueiras, A.; de Souza Barros, L.P.; Gomes, J.S. O processo de implantação do Balanced Scorecard em uma empresa estatal brasileira: O caso Petrobras. REGE Rev. Gestão 2010, 17, 45–57. [Google Scholar]
- Wieringa, R.J. Design Science Methodology for Information Systems and Software Engineering; Springer: Berlin, Germany, 2014; Volume 1, pp. 1–337. [Google Scholar]
- Dalpiaz, F.; Gieske, P.; Sturm, A. On deriving conceptual models from user requirements: An empirical study. Inf. Softw. Technol. 2021, 131, 106484. [Google Scholar] [CrossRef]
- Sjøberg, D.I.; Dybå, T.; Anda, B.C.; Hannay, J.E. Building theories in software engineering. In Guide to Advanced Empirical Software Engineering; Springer: Berlin, Germany, 2008; pp. 312–336. [Google Scholar] [CrossRef]
- Yin, R.K. Case Study Research and Applications: Design and Methods; Sage Publications: Thousand Oaks, CA, USA, 2017; pp. 1–352. ISBN 150633615. [Google Scholar]
- Siponen, M.T.; Soliman, W.; Holtkamp, P. Research Perspectives: Reconsidering the Role of Research Method Guidelines for Interpretive, Mixed Methods, and Design Science Research. J. Assoc. Inf. Syst. 2021, 22, 1. [Google Scholar]
- Venable, J.R.; Pries-Heje, J.; Baskerville, R. Choosing a Design Science Research Methodology. In Proceedings of the 28th Australasian Conference on Information Systems, University of Tasmania, Hobart, Australia, 24–26 May 2017; pp. 67–102. [Google Scholar]
- Adenso-Díaz, B.; Lozano, S.; Gutiérrez, E.; Calzada, L.; Garcia, S. Assessing individual performance based on the efficiency of projects. Comput. Ind. Eng. 2017, 107, 280–288. [Google Scholar] [CrossRef]
- Martins, H.F.; de Oliveira Junior, A.C.; Canedo, E.D.; Kosloski, R.A.D.; Paldês, R.Á.; Oliveira, E.C. Design Thinking: Challenges for Software Requirements Elicitation. Information 2019, 10, 371. [Google Scholar] [CrossRef] [Green Version]
- Castiblanco Jimenez, I.A.; Mauro, S.; Napoli, D.; Marcolin, F.; Vezzetti, E.; Rojas Torres, M.C.; Specchia, S.; Moos, S. Design Thinking as a Framework for the Design of a Sustainable Waste Sterilization System: The Case of Piedmont Region, Italy. Electronics 2021, 10, 2665. [Google Scholar] [CrossRef]
- Tonkinwise, C. A taste for practices: Unrepressing style in design thinking. Des. Stud. 2011, 32, 533–545. [Google Scholar] [CrossRef]
- Penzenstadler, B.; Betz, S.; Venters, C.C.; Chitchyan, R.; Porras, J.; Seyff, N.; Duboc, L.; Becker, C. Blueprint and Evaluation Instruments for a Course on Software Engineering for Sustainability. arXiv 2018, arXiv:1802.02517. [Google Scholar]
- Fehér, P.; Varga, K. The Value of Customer Journey Mapping and Analysis in Design Thinking Projects. In International Conference on Business Process Management; Springer: Berlin, Germany, 2019; pp. 333–336. [Google Scholar]
- Neubauer, D.; Paepcke-Hjeltness, V.; Evans, P.; Barnhart, B.; Finseth, T. Experiencing Technology Enabled Empathy Mapping. Des. J. 2017, 20, S4683–S4689. [Google Scholar] [CrossRef]
- Parizi, R.; da Silva, M.M.; Couto, I.; Trindade, K.; Plautz, M.; Marczak, S.; Conte, T.; Candello, H. Design Thinking in Software Requirements: What Techniques to Use? A Proposal for a Recommendation Tool. In Proceedings of the XXIII Iberoamerican Conference on Software Engineering, CIbSE, Curitiba, Brazil, 9–13 November 2020; pp. 320–333. [Google Scholar]
- Gray, B. The Cynefin framework: Applying an understanding of complexity to medicine. J. Prim. Health Care 2017, 9, 258–261. [Google Scholar] [CrossRef]
- Vogel, J.; Schuir, J.; Koßmann, C.; Thomas, O.; Teuteberg, F.; Hamborg, K. Let us do Design Thinking Virtually: Design and Evaluation of a Virtual Reality Application for Collaborative Prototyping. In Proceedings of the 28th European Conference on Information Systems—Liberty, Equality, and Fraternity in a Digitizing World, ECIS, Marrakech, Morocco, 15–17 June 2021. [Google Scholar]
- Thoring, K.; Müller, R.M. Understanding design thinking: A process model based on method engineering. In Proceedings of the DS 69: E&PDE 2011, the 13th International Conference on Engineering and Product Design Education, London, UK, 8–9 September 2011; pp. 493–498. [Google Scholar]
- Bayona, S.; Bustamante, J.; Saboya, N. PMBOK as a Reference Model for Academic Research Management. In Proceedings of the WorldCIST’18, Naples, Italy, 27–29 March 2018; Springer: Cham, Switzerland, 2018; pp. 863–876. [Google Scholar] [CrossRef]
- Martinsuo, M.; Klakegg, O.J.; van Marrewijk, A. Delivering value in projects and project-based business. Int. J. Proj. Manag. 2019, 37, 631–635. [Google Scholar] [CrossRef]
- Rivoir, A.; Landinelli, J. ICT-mediated Citizen Engagement—Case Study: Open Government National Action Plan in Uruguay. In Proceedings of the 10th International Conference on Theory and Practice of Electronic Governance, ICEGOV, New Delhi, India, 7–9 March 2017; pp. 214–217. [Google Scholar]
- Ribeiro, R.; Casanova, D.; Teixeira, M.; Wirth, A.L.; Gomes, H.M.; Borges, A.P.; Enembreck, F. Generating action plans for poultry management using artificial neural networks. Comput. Electron. Agric. 2019, 161, 131–140. [Google Scholar] [CrossRef]
Creativity Elements | Description | Implications for RE | RE-Related Creativity Research |
---|---|---|---|
1. Product | Novelty, value and surprise. | How can novelty, surprise and value be defined and determined in RE? | Integrate creativity techniques to facilitate the discovery of new and useful ideas and requirements in RE. |
2. Process | Inspirationalist, structuralist and situationist. | These three views are not mutually exclusive. An integration of views is needed to support different styles and creative thinking processes in RE. | Evaluating creativity techniques in the RE process. |
3. Domain | A debate between general and domain-specific views of creativity. | RE involves several domains. Research needs to clarify general and domain-specific aspects of creativity in RE. Education in RE needs to address different levels of creativity, both in general and in specific domains, with appropriate educational structures. | Educational frameworks have been proposed to support constructivist and experiential learning for creativity in the RE domains and the business problem field. |
4. People | A list of common personal characteristics identified and examined. | The need to identify common personal characteristics (traits, cognitive skills, and problem-solving approaches) possessed by creative systems analysts. | Individual factors have been identified through empirical studies in an educational setting. |
5. Context | S-level creativity and social processes in producing, evaluating, and adopting creative products. | The need to understand and support the collaborative creative team process in RE. | Organizational factors at different levels that influence creativity were identified through a focus group and a case study in an educational setting. |
Assumptions | Restrictions |
---|---|
The result of the “Validate Indicators” step must be at least two indicators. | All areas involved in improving the indicator to be prioritized should be aware that they must act directly in the activities that make up the model, when requested.
The application of Cynefin and Design Thinking must be sponsored by senior management. | Everyone involved in the “Apply the Model” step must have a full understanding of the process, project or service measured by the indicator to be prioritized.
Top management should validate the list of indicators, both in the “Validate Indicators” step and in the “Validate Prioritized Indicators” step. | Senior management should provide the necessary resources to optimize the indicator to be prioritized.
The area responsible for the process, project or service measured by the prioritized indicator, should be directly involved in the indicator optimization process. | All areas involved in improving the indicator to be prioritized should be aware that they must act directly in the activities that make up the model, when requested. |
In the “Apply the Model” step, the end user must be the end customer of the business product or service to be offered or the business manager responsible for the product or service in question. | The end user must participate and be available to the people responsible for optimizing the indicator, whenever requested. |
Position | Indicator | Score
---|---|---|
1st | Average Demand Service Time | 50
2nd | Time Performance Index | 48
3rd | Average Incident Response Time | 41
4th | Percentage of Corrections in Deployments | 36
5th | Service Availability (CHANNELS) | 33
6th | Percentage of Demands Tested | 32
7th | Service Availability (SPB) | 32
8th | Monitoring Budget Execution (Expenditure and Investment) | 28
9th | Indicator to Measure the Expected Execution of POTI Actions | 24
10th | Percentage of Unavailability and Failures | 26
11th | Percentage of Overtime Consumption per Period | 22
12th | Percentage of PDTI Executed up to the Current Period | 20
13th | Percentage of Untested Demands | 20
14th | Percentage of Managers’ Satisfaction | 19
15th | Satisfaction Response Percentage | 17
16th | Project Completion Percentage | 16
17th | Percentage of Non-Conformities in Treatment | 12
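Once each indicator has received a score, the ranking above follows mechanically. The sketch below (a Python illustration, not part of the original model) sorts a few of the table's entries by score, highest first:

```python
# Minimal sketch of the ranking step, using the top entries of the
# prioritization table as sample data (scores taken from the table above).
indicators = {
    "Average Demand Service Time": 50,
    "Time Performance Index": 48,
    "Average Incident Response Time": 41,
    "Percentage of Corrections in Deployments": 36,
}

# Sort by score in descending order and assign positions.
ranking = sorted(indicators.items(), key=lambda item: item[1], reverse=True)
for position, (name, score) in enumerate(ranking, start=1):
    print(f"{position}. {name}: {score}")
```

Ties (such as the two indicators scored 32 in the full table) would need an explicit tie-breaking rule, which the model resolves during the validation steps.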
ID | Question |
---|---|
Design Thinking | |
Q1 | Does preliminary immersion occur, in which the problem is understood through research both in the field (exploratory research) and in local and global references (desk research)?
Q2 | In the analysis and synthesis phase, can tools such as insight cards and affinity diagrams be used?
Q3 | (Ideation)—In the ideation phase, are brainstorming and co-creation sessions held with the public and professionals in the area to generate and capture ideas?
Q4 | (Prototyping)—In the prototyping phase, in which abstract ideas gain formal and material content, does the prototype represent the captured reality and enable the validation of all apprehended content?
Cynefin | |
Q5 | (Obvious)—In this domain, was everything that is predictable and repetitive represented?
Q6 | (Complicated)—Does this domain require extensive analysis and technical expertise?
Q7 | (Complex)—Is empiricism the basis of this domain?
Q8 | (Chaotic)—In a situation like this, no patterns can be identified and action must be taken immediately to restore order. Is it necessary to be quick and decisive?
Design Sprint | |
Q9 | Design Sprint is an excellent tool for solving challenges collaboratively, but must these challenges be well scoped so that their resolution fits within the available time?
Q10 | Given its short duration, Design Sprint is also not recommended when the subject is totally unknown. In that case, is it better to take more time to generate the necessary knowledge?
Suggested model | |
Q11 | Did the use of Design Thinking help prioritize and optimize indicators? |
Q12 | Did the use of Cynefin help prioritize and optimize indicators? |
Q13 | Did the use of Design Sprint help prioritize and optimize indicators? |
Q14 | Was the proposed model adequate to prioritize and optimize indicators? |
ID | Question | Yes | No |
---|---|---|---|
Design Thinking | |||
1 | Do you know the concept of Design Thinking? | 67% | 33%
2 | Do you have any practical experience with Design Thinking? | 20% | 80%
Cynefin | | |
1 | Do you know the concept of Cynefin? | 33% | 67%
2 | Do you have any practical experience with Cynefin? | 7% | 93%
Design Sprint | | |
1 | Do you know the concept of Design Sprint? | 40% | 60%
2 | Do you have any practical experience with Design Sprint? | 13% | 87%
ID | Specific Objective | Result Analysis |
---|---|---|
1 | Conduct a literature review to identify works that define and prioritize requirements, performance, risk and test indicators using decision-making and user-oriented approaches, such as Design Thinking, Design Sprint and Cynefin, among others. | The literature review was carried out, as presented in the theoretical background. In summary, no existing model for the prioritization and optimization of IT indicators was found.
2 | Analyze the most relevant approaches for implementing a new model to define and prioritize requirements, performance, risk and test indicators with a focus on the user, efficiently and within a context related to agile software development. | After the literature review, the approaches adopted in the proposed model were Design Thinking, Design Sprint and Cynefin.
3 | Propose an indicator prioritization model to identify which indicators, if improved, will have the greatest impact on business efficiency. | The proposed model is presented in Section 4.
4 | Verify the effectiveness and efficiency of the proposed model by comparing the results of an organization's service deliveries before and after the indicators are prioritized and improved. | This comparison cannot yet verify the model's effectiveness, since a minimum period of time must elapse until enough data is generated for the comparison to be carried out successfully. Instead, the model's efficiency and effectiveness were assessed through stakeholder feedback collected at the end of the third stage of the model.
5 | Perform a simulation to project the results of service deliveries if other indicators had been prioritized for improvement. If the improvement in products and services is perceived to be greater when improving the indicators prioritized by the model than in the simulation with other indicators, then the model is valid and the choice of improved indicators was appropriate. | It has not yet been possible to perform this simulation, since it involves many variables and its complexity grows considerably as more indicators are ranked. This simulation is still under study and is proposed as future work.
6 | If necessary, adjust the proposed model, incorporating improvements from the observations and findings made during its validation. | After the case study, feedback was collected and the necessary adjustments were made during the case study itself.
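The complexity noted for objective 5 can be made concrete: the number of distinct prioritization orderings of n indicators is n!, so the simulation's search space grows factorially as more indicators are ranked. This illustration is an assumption added here, not an analysis from the case study:

```python
import math

# Factorial growth of the number of possible prioritization orderings.
# With the 17 indicators from the ranking table, there are already
# hundreds of trillions of distinct orderings to consider.
for n in (5, 10, 17):
    print(f"{n} indicators -> {math.factorial(n)} possible orderings")
```

This is why exhaustively simulating alternative prioritizations is impractical and why the authors defer the simulation to future work.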
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Almeida, F.V.; Canedo, E.D.; de Oliveira Albuquerque, R.; de Deus, F.E.G.; Sandoval Orozco, A.L.; García Villalba, L.J. A Model for the Definition, Prioritization and Optimization of Indicators. Electronics 2022, 11, 967. https://doi.org/10.3390/electronics11060967