Article

A Few Critical Human Factors for Developing Sustainable Autonomous Driving Technology

by José Fernando Sabando Cárdenas 1, Jong Gyu Shin 2 and Sang Ho Kim 2,*
1 HR Business Partner, Holcim Ecuador S.A., Guayaquil 090150, Ecuador
2 School of Industrial Engineering, Kumoh National Institute of Technology, Gumi 39177, Korea
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(7), 3030; https://doi.org/10.3390/su12073030
Submission received: 27 February 2020 / Revised: 31 March 2020 / Accepted: 8 April 2020 / Published: 9 April 2020

Abstract: The purpose of this study is to develop a framework that can identify critical human factors (HFs) that can generate human errors and, consequently, accidents in autonomous driving level 3 situations. Although much emphasis has been placed on developing hardware and software components for self-driving cars, interactions between a human driver and an autonomous car have not been examined. Because user acceptance and trust are essential for the further and sustainable development of autonomous driving technology, considering factors that will influence user satisfaction is crucial. As autonomous driving is a new field of research, a literature review in other established fields was performed to draw out these probable HFs. Herein, interrelationship matrices were deployed to identify critical HFs and analyze the associations between these HFs and their impact on performance. Age, focus, multitasking capabilities, intelligence, and learning speed are selected as the most critical HFs in autonomous driving technology. Considering these factors in designing interactions between drivers and automated driving systems will enhance users’ acceptance of the technology and its sustainability by securing good usability and user experiences.

1. Introduction

Artificial intelligence (AI) technology is profoundly changing our daily lives, including the way humans drive vehicles. Autonomous driving technology started to emerge in the form of partial automation based on advanced driving assistance systems (ADAS) and has since been evolving toward full automation through the intermediate stages of partial and high automation [1]. This technology has been continuously developed to assist drivers, reduce their cognitive workload, and provide more pleasant driving experiences. According to Fagnant and Kockelman [2], self-driving cars will reduce crashes by 90%. Their basic assumption for this hypothesis is that 90% of recent traffic accidents involve human errors. Therefore, the human role in transportation should change from driver to passenger, who should thus stay out of the control loop to avoid making errors.
The aforementioned assumption seems reasonable; however, other issues must be considered to make this technology acceptable and thus sustainable. As user acceptance and trust are crucial for the further and sustainable development of any new technology [3], considering human factors (HFs) that will affect user satisfaction is essential for autonomous driving technology. Insufficient consideration of driver factors and individual differences can degrade the performance of autonomous driving [4]. A standard classification of autonomous driving defines six levels of automation (see Section 2.1), and level 3 is the most advanced one at present. At this level, human drivers still need to monitor the environment and directly intervene in the actions of self-driving cars. When an unexpected situation arises that an autonomous car cannot handle by itself, driving control needs to be transferred to the human driver (handover control) [5]. This handover can be abrupt due to a system’s sudden failure; hence, the driver needs to be kept in the loop and informed about the ongoing driving scenario [6]. At this level of automation, emphasis is placed on the proper design of interactions between the autonomous driving system and the human driver. One of the most critical goals of human factors engineers involved in autonomous driving technology is to design a new interface for autonomous cars. Such an interface needs to handle the new kinds of interactions that come with the implementation of this technology. Currently, its most critical function is dealing with the handover control and making it as smooth and effective as possible. The megatrend of the recent device market is personalization and customization [7], needs that can be met by offering different interactions for the various preferences of users. The first stage of a proper interaction design is to understand the users [8].
In the take-over task, the critical factors that influence driver performance must be drawn out through theories and experiments. These factors need to focus on the human side of this human–machine system; i.e., the different characteristics of humans that affect driving performance need to be identified. Besides common factors, such as age and gender, other factors must be pointed out, which is the aim of this research. Several factors can influence human performance, and a way to prioritize them by criticality for the task at hand is needed. The purpose of this research is to propose a method to identify critical HFs and the connections between the most critical factors and the human errors that can affect driver performance when taking over control in level 3 of autonomous driving.

2. Theoretical Framework

To set the conceptual basis for this article, three main concepts are introduced in this section. First, the driving automation levels are defined according to the Society of Automotive Engineers (SAE). Then, the types of human errors in autonomous driving and how these errors can yield differences in take-over performance are explained. Finally, quality function deployment (QFD), the key analysis technique of this research, is introduced.

2.1. Driving Automation Levels

A standard classification of autonomous driving defines six levels of automation, classified as follows, according to SAE (2016) [1]:
  • Level 0—No Automation: Zero autonomy; the driver performs all driving tasks.
  • Level 1—Driver Assistant: The vehicle is controlled by the driver; however, some driving assist features may be included in the vehicle design.
  • Level 2—Partial Automation: The vehicle has combined automated functions, such as acceleration and steering; however, the driver must remain engaged with the driving tasks and monitor the environment at all times.
  • Level 3—Conditional Automation: The driver is a necessity but is not required to monitor the environment. The driver must be ready to take control of the vehicle at all times with notice. This level is the focus of this study.
  • Level 4—High Automation: The vehicle is capable of performing all driving functions under certain conditions. The driver may have the option to control the vehicle.
  • Level 5—Full Automation: The vehicle is capable of performing all driving functions under all conditions. The driver may have the option to control the vehicle.

2.2. Human Error in Autonomous Driving

Because level 3 autonomous driving still requires the intervention of a human driver from time to time, minimizing risks by considering drivers’ deficiencies and capabilities from the perspective of human–computer interaction is imperative. Therefore, the types of human errors that may happen in the course of a control take-over are defined in accordance with the cognitive steps of the information processing model.
  • Perception: Missed the take-over request (ToR) signal for some reason, such as sleeping or inattention due to involvement in distracting activities other than monitoring the environment.
  • Recognition: Perceived the ToR signal but failed to recognize what it means and/or be aware of the situation.
  • Response Selection (Decision Making): Understood the overall situation but failed to select the proper response.
  • Response Execution: Selected the proper response but failed to execute it as intended (a slip).
Of note, the driver needs to take over control as soon as possible within the safe time bound. Moreover, the driver needs to take over control as smoothly as possible because an abrupt response can yield a sudden change of movement in the longitudinal and lateral directions, which increases the risk of accidents during the take-over. The control take-over may fail, or its performance may at least be degraded, if any of the above human errors occurs.
According to related research [9], the degraded performance during a control take-over can be measured in terms of the take-over time (ToT) and take-over quality (ToQ). ToT is defined as the time it takes for the driver to completely receive control after the ToR and is evaluated through variables such as the take-over completion time. ToQ refers to the change in driving conditions that appears after a take-over and is evaluated through the standard deviation of lateral position, deceleration/acceleration, and steering angle. In this study, we postulate that the ToT or ToQ will differ depending on the chance of making these human errors. Once again, the goal of this study is to find critical HFs that may yield differences in these chances of making errors.
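As an illustration, both measures can be computed from a driving log. The following sketch is hypothetical: the log format, the variable names, and the use of lateral-position standard deviation as a ToQ proxy are illustrative assumptions, not the instrumentation of the cited study.

```python
import statistics

def take_over_metrics(log, tor_time):
    """Compute take-over time (ToT) and a take-over quality (ToQ) proxy.

    `log` is a list of (t, lateral_pos, in_control) samples; this format
    is an assumption made for the sake of the example.
    """
    # ToT: delay from the ToR until the driver has fully taken over control
    takeover_t = next(t for t, _, in_control in log if t >= tor_time and in_control)
    tot = takeover_t - tor_time

    # ToQ proxy: stability of the lateral position after the take-over
    lateral_after = [pos for t, pos, in_control in log if in_control]
    toq = statistics.stdev(lateral_after)  # lower std dev = smoother take-over
    return tot, toq

# ToR issued at t = 1.0 s; the driver takes control at t = 2.0 s
log = [(0.0, 0.0, False), (1.0, 0.1, False),
       (2.0, 0.2, True), (3.0, 0.1, True), (4.0, 0.3, True)]
tot, toq = take_over_metrics(log, tor_time=1.0)  # tot = 1.0 s
```

In a real simulator study, the log would come from the driving simulator’s telemetry, and ToQ would combine several signals rather than a single standard deviation.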

2.3. Quality Function Deployment (QFD)

Several companies have been using QFD to assure that customer requirements are properly deployed into the process of creating a new product and to improve the product development process [10,11].
A main component of the QFD is the interrelationship matrix, which relates the components of two different dimensions using a four-point scale: no relationship, weak relationship, medium relationship, and strong relationship. In addition, the relative importance or weight for each “requirement” should be included. This matrix is the main tool used in the proposed framework; it makes the process of connecting performance with the HFs a matter of successive relationship deployments.
In design, QFD can be used as shown in Figure 1 [12]. The goal is to find the relationship between customer requirements and the internal properties of the product; however, identifying such a relationship in one step is not simple. In this application, the deployment starts by linking customer requirements and the external properties of the product; a team of experts translates the client’s requests into technical parameters that can be externally measured. Once the important external properties are found, the next deployment is made, and the link between the external and internal properties is identified.
In addition to being a tool for drawing out relationships between factors, QFD can be used as a prioritization tool, as it allows the more important factors to be accounted for by their relative weight.
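The successive deployment can be sketched as a chain of weighted propagations: the weights of one dimension, multiplied through an interrelationship matrix, yield the weights of the next dimension. In the following minimal example, all weights, matrix entries, and item counts are hypothetical; only the 0/1/3/9 scale comes from the method itself.

```python
def deploy(weights, matrix):
    """One QFD deployment step: propagate importance weights through an
    interrelationship matrix (rows = upstream items, cols = downstream items).
    Matrix entries use the 0/1/3/9 relationship scale."""
    n_cols = len(matrix[0])
    return [sum(w * row[j] for w, row in zip(weights, matrix))
            for j in range(n_cols)]

# Hypothetical two-step deployment (all numbers illustrative):
req_weights = [0.5, 0.3, 0.2]             # 3 customer requirements
req_to_ext  = [[9, 1], [3, 3], [0, 9]]    # requirements -> 2 external properties
ext_to_int  = [[9, 0, 3], [1, 9, 3]]      # external -> 3 internal properties

ext_weights = deploy(req_weights, req_to_ext)   # importance of external properties
int_weights = deploy(ext_weights, ext_to_int)   # importance of internal properties
```

Chaining `deploy` in this way is exactly the “successive relationship deployment” used later in Section 4.3, with HF categories and individual HFs in place of product properties.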

3. Method

Human factors in autonomous driving is a new field of research, and take-overs in autonomous driving (human–AI interactions) have not been researched in depth. Limited relevant literature is available currently; however, there are some fields where automation has been implemented and operators have been relegated to the task of monitoring and taking control in emergencies or specific scenarios. Therefore, alternative sources of information were taken into consideration to obtain the critical HFs that will affect performance in control take-overs.
A literature review was performed on three main sources:
  • Autonomous driving (DRI) and human interactions: to get a proper understanding of the current stage of development and research in the field of self-driving cars, including, in general terms, how a take-over is made.
  • Nuclear power plant (NPP) and aviation (AVI) human factors: the interactions between operators and these automated systems are, to a certain extent, similar to those between a driver and an autonomous car.
  • Human factors engineering: to get an understanding of probable human errors and categorize the possible problems in terms of human factors.
The proposed framework is summarized in Figure 2. It starts with a scenario analysis to identify the human activities or tasks required in a take-over. In an ideal scenario, these tasks will be perfectly performed, and the performance will be influenced only by the driver’s skills. However, human errors can happen. Tasks, human errors, and performance are influenced by HFs, and the most critical factors need to be determined first to design an interface that supports them. Critical HFs are found through the relationships between accident data attributed to human errors, the take-over scenarios, and their interrelationships with the HFs.
Considering the previous literature, probable and critical HFs that can affect the performance of take-overs were drawn out. The initial set of HFs was classified to reach a reduced category of HFs that are compared against the tasks and performance.
Because it is hard to identify a direct relationship between HFs and the tasks required for a take-over, intermediate steps were performed. Interrelationship matrices were used to bridge this gap. Successive iterations of the matrix deployment were performed until the critical HFs affecting the performance of each take-over subtask were derived.
The method is summarized in Table 1.

4. Results

The first part of this section describes how to divide the driver’s control take-over task into subtasks based on the scenario analysis and how to define those subtasks from a cognitive engineering point of view. The second part summarizes the HFs, i.e., driver characteristics, that may affect the acceptance of autonomous driving technology, with references. These HFs are categorized with respect to their sources and features for a further analysis, which estimates the importance of each category and the factors within it. The last part describes the process and results of selecting a few critical HFs based on their importance as estimated by the successive deployment of the interrelationship matrices.

4.1. Scenario Analysis in a Take-Over

Initially, a work domain analysis on take-overs was performed to gain a proper understanding of the subtasks that need to be performed by drivers. Walch et al. [13] proposed a generic handover process from the system’s side. While control is on the system side, an alert must be given to re-engage the driver in the driving task. After the driver’s attention is gained, information should be delivered so that the driver understands what is going on and can take proper actions. However, drivers may or may not understand the ongoing situation, mostly depending on their own experience and the quality of the information provided by the interface. Once this step is completed, and if the driver is able to take control, a proper maneuver must be performed to avoid danger and/or continue driving safely and comfortably.
Take-over tasks are analyzed here in terms of information processing theory. The focus is on the actions performed by the driver, not by the system. With the scenario analysis, the subtasks (to-do list) required for a take-over were separated, and the identification of causes of human errors became easier [14]. The results are presented in Figure 3. The third subtask in the figure, situation awareness, represents the cognitive process of gathering information to recognize the situation before making a decision to respond. Although it is also a continuous process that includes perception and recognition, it is conceived as a single subtask because it should be separated from the perception and recognition of the ToR.
As shown in Figure 3, most of the subtasks are under cognitive workload rather than a physical one. Clearly, the critical HFs must be highly associated with the cognitive capability to prevent making mistakes.

4.2. Human Factors (HFs) in Control Take-Overs

A thorough review of the available literature on HFs was performed, focused mainly on cars [15,16,17,18,19,20,21,22,23,24,25,26,27,28,29], planes [30,31,32,33,34,35,36,37,38,39,40,41,42], and NPP operators [43,44,45,46,47,48,49]. The process started with drawing out all possible HFs and listing them while avoiding redundancy and duplication of information. The HFs comprised anything that can potentially affect the performance of the take-over task. For example, vision and hearing capabilities were considered critical as these are commonly used as modalities for delivering warning signals.
Table 2 summarizes the HFs drawn out from the available literature along with their categories and sources.
In addition to the HFs found in previous studies, some additional factors were introduced because they can plausibly influence human performance in these particular task conditions. These HFs are residency, car ownership, family, and mobile and computer use. Residency was selected because people from different cities often have particular driving styles based on their traditions and sense of urgency. Car ownership was proposed because drivers are usually more careful when the car is not theirs, especially if it is owned by a company or a friend. Family was included because people who have dependents tend to be more responsible, and this factor can influence the performance of a control take-over. Lastly, mobile and computer use was added to identify people who embrace technology, such as early adopters.
Once the group of HFs was reduced to 36 items, the next step, i.e., categorization, was performed. However, for the purpose of analyzing the human errors, the grouping should be independent between categories and highly correlated within each category. The organization of HFs resulted in nine independent HF categories. These categories were simpler to handle when assigning relative weights in the interrelationship matrices. In addition, the levels for each HF were defined, and future experiments can be run in the related field with these levels. Through these experiments, the effect of HFs on the take-over performance can be analyzed.
The following table provides the HFs considered most influential on the take-over performance according to the proposed framework, and the column indicates whether the HFs were obtained from the literature on AVI, NPPs, or driving.
These HFs were analyzed against the most important human errors that can happen in a handover control process to prioritize the factors that can yield a higher performance influence over the take-over task.

4.3. Interrelationship Matrices

This subsection describes the estimation method of the level of associations between HFs and the cognitive subtasks using interrelation matrices. It also describes the estimation method of the importance of each HF based on the results from the interrelation matrices successively deployed for the HF category and the factors within the category. In the last part, a few critical HFs for the take-over control interaction are identified by the Pareto analysis based on the estimated importance.

4.3.1. Interrelationship between Take-Over Subtasks and Human Factor Category

The first step is to identify the relationships between the take-over subtasks and probable human errors by analyzing accident data.
The subtasks were weighted according to the percentages of car accident causes reported by the National Highway Traffic Safety Administration (NHTSA) of the United States, as shown in Table 3 [50]. The causes were mapped to the tasks found in the scenario analysis and summarized in Table 3 as follows: recognition error accounts for inattention, distractions, and inadequate surveillance, and maps to task 1 (Perceive Signal). Decision error was divided into task 3 (Situation Awareness) and task 4 (Decide Action). Performance error was divided into task 5 (Avoid Danger) and task 6 (Continue Safe Driving). Finally, the remaining percentage was assigned to task 2 (Recognize Warning).
The strength of the relationships between the take-over subtasks and HF category was assigned according to affinity and definition.
The following four-point scale was used:
  • Strong relationship: 9
  • Medium relationship: 3
  • Weak relationship: 1
  • No relationship: 0
For this interrelationship, the weights were assigned as follows: demography is more related to subtasks 1 and 2 and has a relatively higher weight (9). Cognitive workload is highly related and critical for subtasks 1, 2, 5, and 6 and has a higher weight than the other HF categories.
The results of the interrelationship analysis are presented in Table 4. The importance per column is calculated by multiplying each of the relationship values by its importance percentage indicator: (example of demography importance calculation: (9 × 0.41) + (3 × 0.33) + (3 × 0.11) + (9 × 0.07) = 5.64).
The weighted total importance per row is represented by the sum of the values obtained from the columns. Clearly, the most important HF categories influencing the take-over subtasks are cognitive workload and demography. Together, these categories account for approximately 40% of the importance.
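The column computation described above amounts to a weighted sum of the relationship values. A minimal sketch reproducing the demography example from the text (the 9/3/3/9 relationship strengths and the subtask weights are taken directly from that example):

```python
# Relationship strengths (0/1/3/9 scale) of the demography category with
# four take-over subtasks, and the subtask weights from the accident data
relations = [9, 3, 3, 9]
subtask_weights = [0.41, 0.33, 0.11, 0.07]

importance = sum(r * w for r, w in zip(relations, subtask_weights))
# 9*0.41 + 3*0.33 + 3*0.11 + 9*0.07 = 5.64
```

Repeating this sum for every HF category row yields the importance column of Table 4.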

4.3.2. Interrelationship between Take-Over Subtasks and Human Factors (HFs)

The previous subsection identified the HF categories that are more important and worth further analysis. In this subsection, one more deployment step of an interrelationship matrix was performed. The purpose of this deployment was to identify the more critical HFs that belong to the demography and cognitive categories and relate them to probable human errors in the take-over subtasks. The results are summarized in Table 5.
In the Pareto analysis (Figure 4), the most critical HFs are as follows: age, focus, multitasking ability, IQ, and learning speed.
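A Pareto-style selection like the one in Figure 4 can be sketched as follows. The importance scores and the 60% cumulative cutoff below are hypothetical assumptions for illustration, not the paper’s actual values.

```python
def pareto_critical(importances, cutoff=0.6):
    """Return the HFs that together account for at least `cutoff` of the
    total importance, in descending order of score."""
    total = sum(importances.values())
    ranked = sorted(importances.items(), key=lambda kv: kv[1], reverse=True)
    selected, cum = [], 0.0
    for name, score in ranked:
        selected.append(name)
        cum += score
        if cum / total >= cutoff:
            break
    return selected

# Hypothetical importance scores from the second matrix deployment
hf_scores = {"age": 5.6, "focus": 5.1, "multitasking": 4.8,
             "IQ": 4.2, "learning speed": 3.9, "gender": 1.2, "residency": 0.8}
critical = pareto_critical(hf_scores)  # -> ['age', 'focus', 'multitasking']
```

With the real Table 5 scores, the same cumulative-share rule would carve out the five factors named above.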

5. Discussion

Autonomous cars will most likely bring a great number of benefits to society once they are completely implemented. However, self-driving cars are still far from being completely independent of human drivers. The system’s dependency on the human component has encouraged human factors engineers to prioritize designing methods to enhance the usability and user experience of the system during interactions. According to Nielsen [51], usability is a quality measure of the user’s experience when interacting with a system and a component of usefulness, which in turn is part of the practical acceptability of the system.
Unlike machines, whose parameters can be tweaked for problem-solving, people cannot be easily changed and thus need to be aided by a properly designed interaction process. In level 3 of autonomous driving, the driver needs to return to the loop at the system’s request for taking over control. Therefore, autonomous vehicles should be designed to provide proper interactions in accordance with their specified context of use [52]. The context of use heavily depends on the user’s characteristics, i.e., HFs. A standardized interaction procedure can hardly accommodate the differences in HFs. The control take-over task will be safer and more pleasant if customized and personalized interactions can be given. Such assumptions can be confirmed through experimental research; for example, whether providing an earlier ToR warning to an elderly person or a driver with low concentration or divided-attention ability results in better ToQ can be examined.
To design a proper solution, it is essential to understand the problem. Experimental research models need to be designed and tested to understand the effects of HFs on the acceptability of autonomous driving technology and thereby secure its sustainability. In this sense, identifying a few critical HFs that should be considered first as independent variables in these research models is meaningful work. These critical HFs need to be identified by a sound and logical process. The first step must be to establish a pool of HFs related to autonomous driving technology. Previous research by Son and Park [9] shared the same idea; however, their contribution is limited in that they considered only a handful of probable HFs. This limitation is attributed to the immaturity of autonomous-driving HF research. In this study, more HFs were gathered from other sources, such as the literature on NPPs and AVI. As shown in Table 2, several HFs can influence the expected level of operator performance in autonomous driving. Importantly, these factors are controlled to some extent by the organization when it comes to NPP operators and AVI pilots. The levels of the factors are limited because these operators are carefully screened, and only those most fit for the task and conditions are selected. They are trained and deployed to work on site after a careful induction process and supervised on-the-job training. However, anyone can pick up and drive a car; hence, the interface should compensate for a driver’s lack of fitness or condition. Essentially, this technology aims to especially aid those who require it the most in terms of safety. Therefore, work domain analysis and human factors engineering knowledge were also used to identify the initial HFs and then categorize them.
The next step is to establish a criterion to estimate the importance of each HF based on its possible effects on making errors or on degraded performance during a control take-over. In this study, this problem was resolved by a successive deployment of interrelationship matrices between the HFs and the subtasks. It is postulated that the importance of each HF can be estimated by summing its effect on the workloads perceived while performing the subtasks. Interrelationship matrices were adopted because they have been successfully used to find critical design parameters with respect to user requirements based on their associations. As the results showed that most of the subtasks impose a cognitive rather than physical workload during the take-over, the interrelationships between the HFs and the subtasks were determined from a cognitive point of view. To assign each subtask’s weight of contribution to making errors, critical accident data were analyzed by cause of accident in terms of the cognitive process. A concern was raised about taking the percentages of cognitive causes from the database because those accidents happened while driving ordinary automobiles. However, the percentages were used without adjustment for the following reasons: when the system issues a take-over warning, the automobile needs to return to the manual driving mode; it is the driver’s responsibility to be aware of the situation and respond according to his or her decision under the circumstances; thus, the cognitive workload and motor behaviors that the driver must go through are not significantly different.
One weak point of this research is that the values assigned in the interrelationship matrices were determined by the authors themselves. Usually, this tool is deployed by a team of experts who can weigh in and discuss the value assigned to each cell. Thus, the results are tentative and may differ slightly depending on the team using the proposed framework. Interrelationship matrices can be applied in different environments, not just autonomous driving, to find the connections between HFs and human errors. By following the steps suggested in this study, a proper team of experts can validate or correct the specific values assigned in the different deployments of the matrices based on their own research model. The goal of this research is to contribute to building a sound theoretical research model for further experimental research and to propose a framework for identifying critical HFs in level 3 of autonomous driving technology based on literature reviews.
To validate the criticality of the identified HFs, experimental research should be designed. The goal of further research must be to show the differences in take-over performance according to the HFs and to find a way to customize the interactions to bridge the performance gaps. The dependent variables, such as the performance of a control take-over, can be ToT, ToQ, or others, and can be measured in a simulated environment. The critical HFs, i.e., the independent variables, need to be included in an experimental research model; their respective levels are suggested in detail in Table 6. The levels for each HF are given to serve as a basis for further experimental designs.
For age, the first level (age < 20) covers inexperienced drivers. In the second level (20 ≤ age < 40), drivers have relative experience and are more adaptable to new technologies. In the third level (40 ≤ age < 60), physical capabilities can, in some cases, start to decrease, and the adaptation to new technologies is stronger. In the last level (age ≥ 60), a strong focus should be placed because it is a vulnerable group.
Focus and multitasking can be measured with some of the tests already designed for this, such as eye tracking, mouse tracking, and electroencephalogram. The goal will be to divide the drivers into groups that feel confident and have the skills to focus and/or multitask.
Intelligence, measured by IQ, can be divided using a well-known scale of relative intelligence. Learning speed can be tested with a simple experiment, and the subjects can be divided according to the average responses.
The hypothesis of whether these critical HFs can explain the majority of take-over errors should be tested. If they cannot, the next critical HFs can be examined to sophisticate the model.

6. Conclusions

The identification of HFs that can influence take-over performance can be a complex process. However, the framework proposed in this paper can be applied to identify critical HFs in take-overs under level 3 of autonomous driving. Nonetheless, the framework should be applied by a team of experts in the different fields of autonomous driving and human factors engineering to obtain more accurate and reliable results. The proposed method can draw out a few critical HFs; however, experts’ input is critical to ensuring the validity of the results.
The critical HFs that were derived based on the tentative weights are as follows:
  • Age,
  • Focus capabilities,
  • Multitasking capabilities,
  • IQ,
  • Learning speed.
The HFs considered critical for take-over tasks are mostly cognitive.
Further studies should be performed to validate the specific weights assigned by the authors with a team of experts in autonomous driving. With the validated HFs, an experimental design similar to the one proposed in this study must be set up, with the objective of analyzing the changes in performance by changing the levels of these factors.

Author Contributions

Conceptualization, S.H.K.; Formal analysis, J.F.S.C.; Funding acquisition, S.H.K.; Investigation, J.F.S.C.; Methodology, S.H.K.; Supervision, S.H.K.; Validation, J.G.S.; Visualization, J.G.S.; Writing—original draft, J.F.S.C.; Writing—review & editing, J.G.S. and S.H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Kumoh National Institute of Technology: 2016-104-159.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. SAE International. Automated Driving—Levels of Driving Automation Are Defined in New SAE International Standard J3016; SAE International: Warrendale, PA, USA, 2014; Archived from the original on 15 January 2019. [Google Scholar]
  2. Fagnant, D.J.; Kockelman, K. Preparing a Nation for Autonomous Vehicles: Opportunities, Barriers and Policy Recommendations. Transp. Res. Part A Policy Pract. 2016, 77, 167–181. [Google Scholar] [CrossRef]
  3. Taherdoost, H. Importance of Technology Acceptance Assessment for Successful Implementation and Development of New Technologies. Glob. J. Eng. Sci. 2019, 1. [Google Scholar] [CrossRef] [Green Version]
  4. Morgan, P.; Alford, C.; Parkhurst, G. Handover Issues in Autonomous Driving: A Literature Review; University of the West of England: Bristol, UK, 2016. [Google Scholar]
  5. Endsley, M.R. From Here to Autonomy: Lessons Learned from Human–Automation Research. Hum. Factors 2017, 59, 5–27. [Google Scholar] [CrossRef] [PubMed]
  6. Vlakveld, W. Transition of Control in Highly Automated Vehicles; SWOV Institute for Road Safety Research: Leidschendam, The Netherlands, 2015; R-2015-22. [Google Scholar]
  7. Wearable AI Market by Product (Smart Watch, Ear Wear, Eye Wear), Operation (On-Device AI, Cloud-Based AI), Component (Processor, Connectivity IC, Sensors), Application (Consumer Electronics, Enterprise, Healthcare), and Geography—Global Forecast to 2023. Available online: https://www.marketsandmarkets.com/Market-Reports/wearable-ai-market-168051207.html (accessed on 21 December 2019).
  8. Preece, J.; Rogers, Y.; Sharp, H. Interaction Design: Beyond Human-Computer Interaction, 4th ed.; John Wiley & Sons Ltd.: Chichester, UK, 2015. [Google Scholar]
  9. Son, J.W.; Park, M.O. Situation Awareness and Transitions in Highly Automated Driving: A Framework and Mini-Review. J. Ergon. 2017, 7, 1–6. [Google Scholar] [CrossRef]
  10. Akao, Y.; Mazur, G.H. The Leading Edge in QFD: Past, Present and Future. Int. J. Qual. Reliab. Manag. 2003, 20, 20–35. [Google Scholar] [CrossRef]
  11. Tapke, J.; Muller, A.; Johnson, G.; Sieck, J. House of Quality—Steps in Understanding the House of Quality; Iowa State University: Ames, IA, USA, 2009; Available online: https://vardeman.public.iastate.edu/IE361/f01mini/johnson.pdf (accessed on 19 December 2019).
  12. Youssef, C.; Waldele, M.; Herbert, B. QFD—A link between Customer Requirements and Product Properties. Proceedings of ICED 2007, the 16th International Conference on Engineering Design, Paris, France, 28–31 August 2007. [Google Scholar]
  13. Walch, M.; Mühl, K.; Baumann, M.; Weber, M. Autonomous Driving: Investigating the Feasibility of Bimodal Take-Over Requests. Int. J. Mob. Hum. Comput. Interact. (IJMHCI) 2017, 9, 58–74. [Google Scholar] [CrossRef] [Green Version]
  14. Dalijono, T.; Castro, J.; Löwe, K.; Löher, H.J. Reducing Human Error by Improvement of Design and Organization. Process Saf. Environ. Prot. 2006, 84, 191–199. [Google Scholar] [CrossRef]
  15. Kaur, K.; Rampersad, G. Trust in driverless cars: Investigating key factors influencing the adoption of driverless cars. J. Eng. Technol. Manag. 2018, 48, 87–96. [Google Scholar] [CrossRef]
  16. Loeb, H.; Belwadi, A.; Maheshwari, J.; Shaikh, S. Age and gender differences in emergency takeover from automated to manual driving on simulator. Traffic Inj. Prev. 2019, 1–3. [Google Scholar] [CrossRef]
  17. Sportillo, D.; Paljic, A.; Ojeda, L. On-road evaluation of autonomous driving training. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019; pp. 182–190. [Google Scholar]
  18. Lundqvist, L.M.; Eriksson, L. Age, cognitive load, and multimodal effects on driver response to directional warning. Appl. Ergon. 2019, 76, 147–154. [Google Scholar] [CrossRef]
  19. Zhang, Y.; Sun, P.; Yin, Y.; Lin, L.; Wang, X. Human-like autonomous vehicle speed control by deep reinforcement learning with double Q-learning. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 1251–1256. [Google Scholar]
  20. Aghaei, A.S.; Donmez, B.; Liu, C.C.; He, D.; Liu, G.; Plataniotis, K.N.; Chen, H.Y.W.; Sojoudi, Z. Smart Driver Monitoring: When Signal Processing Meets Human Factors: In the Driver’s Seat. IEEE Signal Process. Mag. 2016, 33, 35–48. [Google Scholar] [CrossRef]
  21. Arakawa, T. Trial verification of human reliance on autonomous vehicles from the viewpoint of human factors. Int. J. Innov. Comput. Inf. Control 2018, 14, 491–501. [Google Scholar]
  22. Yoo, H.W.; Druml, N.; Brunner, D.; Schwarzl, C.; Thurner, T.; Hennecke, M.; Schitter, G. MEMS-based lidar for autonomous driving. e & i Elektrotechnik und Informationstechnik 2018, 135, 408–415. [Google Scholar]
  23. Clark, J.R.; Stanton, N.A.; Revell, K.M. Conditionally and highly automated vehicle handover: A study exploring vocal communication between two drivers. Transp. Res. Part F Traffic Psychol. Behav. 2019, 65, 699–715. [Google Scholar] [CrossRef] [Green Version]
  24. Salmon, P.M.; Lenné, M.G.; Stanton, N.A.; Jenkins, D.P.; Walker, G.H. Managing Error on the Open Road: The Contribution of Human Error Models and Methods. Saf. Sci. 2010, 48, 1225–1235. [Google Scholar] [CrossRef]
  25. Li, X.; Schroeter, R.; Rakotonirainy, A.; Kuo, J.; Lenné, M.G. Effects of different non-driving-related-task display modes on drivers’ eye-movement patterns during take-over in an automated vehicle. Transp. Res. Part F Traffic Psychol. Behav. 2020, 70, 135–148. [Google Scholar] [CrossRef]
  26. Dixit, V.V.; Chand, S.; Nair, D.J. Autonomous Vehicles: Disengagements, Accidents and Reaction Times. PLoS ONE 2016, 11, E0168054. [Google Scholar] [CrossRef] [Green Version]
  27. Blanco, M.; Atwood, J.; Vasquez, H.M. Human Factors Evaluation of Level 2 and Level 3 Automated Driving Concepts; Virginia Tech Transportation Institute: Blacksburg, VA, USA, 2015. [Google Scholar]
  28. Cho, Y.; Park, J.; Park, S.; Jung, E.S. Technology Acceptance Modeling Based on User Experience for Autonomous Vehicles. J. Ergon. Soc. Korea 2017, 36, 87–108. [Google Scholar]
  29. Li, L.; Ota, K.; Dong, M. Humanlike driving: Empirical decision-making system for autonomous vehicles. IEEE Trans. Veh. Technol. 2018, 67, 6814–6823. [Google Scholar] [CrossRef] [Green Version]
  30. Bazargan, M.; Guzhva, V.S. Impact of gender, age and experience of pilots on general aviation accidents. Accid. Anal. Prev. 2011, 43, 962–970. [Google Scholar] [CrossRef]
  31. Gupta, I.; Kalra, P.; Chawla, P.; Singh, J. Evaluation of Pilot’s Seat Design of Civil Aircraft for Indian Anthropometric Data by using Delmia Human Software. Procedia Manuf. 2018, 26, 70–75. [Google Scholar] [CrossRef]
  32. Portal: OGHFA-Skybrary Aviation. Available online: https://www.skybrary.aero/index.php/Portal:OGHFA (accessed on 19 March 2019).
  33. Yang, C.; Yin, T.; Zhao, W.; Huang, D.; Fu, S. Human factors quantification via boundary identification of flight performance margin. Chin. J. Aeronaut. 2014, 27, 977–985. [Google Scholar] [CrossRef] [Green Version]
  34. Brezonakova, A. Pilot burnout as a human factor limitation. Transp. Res. Procedia 2017, 28, 11–15. [Google Scholar] [CrossRef]
  35. Campbell, J.S.; Castaneda, M.; Pulos, S. Meta-Analysis of Personality Assessments as Predictors of Military Aviation Training Success. Int. J. Aviat. Psychol. 2009, 20, 92–109. [Google Scholar] [CrossRef]
  36. Ion, D.C. Human Factors in Aviation: Crew Management. In Proceedings of the International Conference of Scientific Paper AFASES 2011, Brasov, Romania, 26–28 May 2011. [Google Scholar]
  37. Brown, J.P. The effect of automation on human factors in aviation. J. Instrum. Autom. Syst. 2016, 3, 31–46. [Google Scholar] [CrossRef]
  38. Mohrmann, F.; Stoop, J. Airmanship 2.0: Innovating aviation human factors forensics to necessarily proactive role. In Proceedings of the Future Safety: Has the Past Become Irrelevant? The Hague, The Netherlands, 3–5 September 2019. [Google Scholar]
  39. Holland Cook, C.C. Alcohol and aviation. Addiction 1997, 92, 539–555. [Google Scholar] [CrossRef]
  40. Hebbar, P.A.; Pashilkar, A.A. Analysing human pilot control behaviour for an aircraft handling qualities task. In Proceedings of the 7th Symposium on Applied Aerodynamics and Design of Aerospace Vehicles, Trivandrum, Kerala, India, 17–20 December 2015. [Google Scholar]
  41. Kappenberger, C.; Stepniczka, I. HMIAC-Survey on Human-Machine Interaction in Aircraft Cockpits. In Proceedings of the 28th Congress of the International Council of the Aeronautical Sciences, Brisbane, Australia, 23–28 September 2012. [Google Scholar]
  42. Hunter, D.R. Measurement of hazardous attitudes among pilots. Int. J. Aviat. Psychol. 2005, 15, 23–43. [Google Scholar] [CrossRef]
  43. Hwang, S.L.; Liang, S.F.M.; Liu, T.Y.Y.; Yang, Y.J.; Chen, P.Y.; Chuang, C.F. Evaluation of Human Factors in Interface Design in Main Control Rooms. Nucl. Eng. Des. 2009, 239, 3069–3075. [Google Scholar] [CrossRef]
  44. Carvalho, P.V.; dos Santos, I.L.; Gomes, J.O.; Borges, M.R.; Guerlain, S. Human Factors Approach for Evaluation and Redesign of Human–System Interfaces of a Nuclear Power Plant Simulator. Displays 2008, 29, 273–284. [Google Scholar] [CrossRef]
  45. Han, S.H.; Yang, H.; Im, D.G. Designing a Human–Computer Interface for a Process Control Room: A Case Study of a Steel Manufacturing Company. Int. J. Ind. Ergon. 2007, 37, 383–393. [Google Scholar] [CrossRef]
  46. Guttromson, R.T.; Schur, A.; Greitzer, F.L.; Paget, M.L. Human Factors for Situation Assessment in Power Grid Operations; Pacific Northwest National Laboratory (PNNL): Richland, WA, USA, 2007. [Google Scholar]
  47. Carvalho, P.V.; dos Santos, I.L.; Vidal, M.C. Safety Implications of Cultural and Cognitive Issues in Nuclear Power Plant Operation. Appl. Ergon. 2006, 37, 211–223. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. DeVita-Cochrane, C.C. Personality Factors and Nuclear Power Plant Operators: Initial License Success. Ph.D. Thesis, Walden University, Minneapolis, MN, USA, 2015. [Google Scholar]
  49. Guide, I.S. Recruitment, Qualification and Training of Personnel for Nuclear Power Plants; International Atomic Energy Agency Standard Series No. NS-G-2.8; IAEA: Vienna, Austria, 2002. [Google Scholar]
  50. Singh, S. Critical Reasons for Crashed Investigated in The National Motor Vehicle Crash Causation Survey; DOT HS 812 115; NHTSA’s National Center for Statistics and Analysis: Washington, DC, USA, 2015. [Google Scholar]
  51. Nielsen, J. Usability Engineering, 1st ed.; Morgan Kaufmann: Burlington, NJ, USA, 1994. [Google Scholar]
  52. ISO 9241-11. Ergonomics Requirements for Office Work with Visual Display Terminals (VDTs)-Part 11: Guidance on Usability. Available online: https://www.iso.org/standard/16883.html (accessed on 6 October 2017).
Figure 1. Quality function deployment.
Figure 2. Proposed framework for identifying critical human factors (HFs) in a take-over.
Figure 3. Scenario analysis in a take-over situation.
Figure 4. Pareto chart: Take-over subtasks and HFs.
Table 1. Research procedure.
| Step # | Tools | Output |
|---|---|---|
| 1 | Take-over scenario analysis | Required subtasks |
| 2 | Literature review of HFs | HFs and their categories |
| 3 | Interrelationship analysis | Relation and importance: Take-over subtasks and HF category (Phase 1) |
| 4 | Interrelationship analysis | Relation and importance: Take-over subtasks and HFs (Phase 2) |
Table 2. Potential HFs in control take-overs.
| HF Category | AVI | NPP | DRI | Human Factor | Levels | Reference |
|---|---|---|---|---|---|---|
| 1. Demography | * | | * | 1. Gender | Male; Female | Kaur and Rampersad, 2018 [15]; Loeb et al., 2019 [16]; Bazargan and Guzhva, 2011 [30] |
| | * | * | * | 2. Age | Age < 20; 20 ≤ Age < 40; 40 ≤ Age < 60; Age ≥ 60 | Kaur and Rampersad, 2018 [15]; Loeb et al., 2019 [16]; Sportillo et al., 2019 [17]; Lundqvist and Eriksson, 2019 [18]; Zhang et al., 2018 [19]; Bazargan and Guzhva, 2011 [30]; Hwang et al., 2009 [43] |
| | * | | | 3. Height | Taller than average; Average; Shorter than average | Gupta et al., 2018 [31] |
| | * | | | 4. Weight | Normal; Overweight; Obese L1, L2; Obese L3 | Gupta et al., 2018 [31] |
| | | | | 5. Residency | Urban; Suburban; Rural | |
| 2. Physical capabilities | * | * | * | 6. Vision | Normal level; Low level; Chronic level | Aghaei et al., 2016 [20]; Arakawa, 2018 [21]; Yoo et al., 2018 [22]; OGHFA, 2017 [32]; Yang et al., 2014 [33]; Hwang et al., 2009 [43]; Carvalho et al., 2008 [44]; Han et al., 2007 [45] |
| | * | * | * | 7. Hearing | | Clark et al., 2019 [23]; OGHFA, 2017 [32]; Hwang et al., 2009 [43]; Han et al., 2007 [45] |
| | | | * | 8. Cardiovascular | | Aghaei et al., 2016 [20] |
| | | | * | 9. Pulmonary | | Aghaei et al., 2016 [20] |
| | * | * | | 10. Flexibility | | Yang et al., 2014 [33]; Carvalho et al., 2008 [44]; Pacific Northwest National Laboratory, 2007 [46] |
| | * | | * | 11. Coordination | | Salmon et al., 2010 [24]; Yang et al., 2014 [33] |
| 3. Health | * | | | 12. Chronic/temporal diseases | Yes Chronic; Yes Temporal; No | Brezonakova, 2017 [34]; Campbell et al., 2009 [35]; Ion, 2011 [36] |
| | * | | | 13. Healthy lifestyle | Hydrated; Regular exercise | Brezonakova, 2017 [34]; Ion, 2011 [36] |
| | * | | * | 14. Emotionally stable | Stress; Depression; Anxiety | Aghaei et al., 2016 [20]; Ion, 2011 [36]; Brown, 2016 [37]; Mohrmann and Stoop, 2019 [38] |
| | * | | * | 15. Amount of sleep | Between 6 and 10 h; Less than 6 h; More than 10 h | Aghaei et al., 2016 [20]; Arakawa, 2018 [21]; OGHFA, 2017 [32]; Brezonakova, 2017 [34] |
| | * | | | 16. Diet | Fasting religious beliefs; Fasting medical check; Supervised; Unsupervised; Disorders; Eating time | Ion, 2011 [36] |
| | * | | | 17. Drugs | Depressant or hallucinogen; Performance enhancer; Painkiller; Stimulant | Holland Cook, 1997 [39] |
| 4. Cognitive | * | * | * | 18. Focus | Easy to divert; Ease of boredom | Arakawa, 2018 [21]; Salmon et al., 2010 [24]; Li et al., 2020 [25]; OGHFA, 2017 [32]; Campbell et al., 2009 [35]; Carvalho et al., 2006 [47] |
| | | * | * | 19. Multitasking | Multitasker; Not able to multitask | Aghaei et al., 2016 [20]; Li et al., 2020 [25]; Carvalho et al., 2006 [47] |
| | * | * | | 20. Intelligence quotient (IQ) | Superior; Average; Lower | Mohrmann and Stoop, 2019 [38]; Hwang et al., 2009 [43]; Han et al., 2007 [45] |
| | | | * | 21. Learning speed | Superior; Average; Lower | Salmon et al., 2010 [24]; Dixit et al., 2016 [26]; Virginia Tech Transportation Institute, 2017 [27] |
| | | * | * | 22. Education level | University graduate level; University pre-graduate level; High school level; Lower than high school | Kaur and Rampersad, 2018 [15]; Hwang et al., 2009 [43] |
| 5. Experience | * | * | * | 23. License type | Professional; Not professional | Kaur and Rampersad, 2018 [15]; Hebbar and Pashilkar, 2015 [40]; DeVita-Cochrane, 2015 [48] |
| | * | * | | 24. Driving experience | Beginner; Experienced | Zhang et al., 2018 [19]; Ion, 2011 [36]; Bazargan and Guzhva, 2011 [30]; DeVita-Cochrane, 2015 [48] |
| | * | | * | 25. ADAS exposure years | No exposure; Less than 1 year; More than 1 year | Kaur and Rampersad, 2018 [15]; Cho et al., 2017 [28]; Kappenberger and Stepniczka, 2012 [41] |
| | * | | * | 26. Autonomous driving interactions | | Dixit et al., 2016 [26]; Kappenberger and Stepniczka, 2012 [41] |
| 6. Behavior | * | * | | 27. Personality | Neuroticism; Extraversion; Openness to experience; Agreeableness; Conscientiousness | Ion, 2011 [36]; Hebbar and Pashilkar, 2015 [40]; Hunter, 2005 [42]; DeVita-Cochrane, 2015 [48] |
| | * | | * | 28. Driving style | Aggressive; Defensive | Salmon et al., 2010 [24]; Li et al., 2018 [29]; Hunter, 2005 [42] |
| | | | * | 29. Car ownership | Own car; Rented car; Company car | Kaur and Rampersad, 2018 [15]; Salmon et al., 2010 [24] |
| 7. Cultural influence | | | | 30. Family | Civil status; Dependents | |
| | * | * | | 31. Collectivism | Individualism; Groupism | Campbell et al., 2009 [35]; Ion, 2011 [36]; International Atomic Energy Agency, 2002 [49] |
| 8. Work | | * | | 32. Job position | Operative; Administrative; Supervision; Management; Strategic | International Atomic Energy Agency, 2002 [49] |
| | | * | * | 33. Working shift | Day; Night; Rotative | Salmon et al., 2010 [24]; Carvalho et al., 2006 [47] |
| 9. Tech trust | | | | 34. Smartphone user | Average hours use; Higher average hours use | |
| | | | | 35. Computer user | | |
| | * | | * | 36. Trust in ADAS | High confidence; No confidence | Dixit et al., 2016 [26]; Cho et al., 2017 [28]; Kappenberger and Stepniczka, 2012 [41] |

An asterisk marks the domain of the supporting literature: AVI (aviation), NPP (nuclear power plant), DRI (driving).
Table 3. The National Highway Traffic Safety Administration (NHTSA)’s critical reasons for vehicle accidents.
| Critical Reason | Number | Percentage * ± 95% Conf. Limits | Take-Over Subtasks |
|---|---|---|---|
| Recognition error | 845,000 | 41% ± 2.2% | 2. Recognize Warning |
| Decision error | 684,000 | 33% ± 3.7% | 3. Situation Awareness; 4. Decide Action |
| Performance error | 210,000 | 11% ± 2.7% | 5. Avoid Danger; 6. Stabilize Driving Mode |
| Non-performance error (e.g., sleep) | 145,000 | 7% ± 1.0% | 1. Perceive Warning Signal |
| Others | 162,000 | 8% ± 1.9% | |
| Total | 2,046,000 | 100% | |

Estimates are based on the 94% of NMVCCS crashes attributed to the driver. * Percentages are based on unrounded estimated frequencies. (Data source: NMVCCS 2005–2007.)
Table 4. Interrelationship matrix: Take-over subtasks and HF category.
| Take-Over Subtasks | Demography | Physical Capabilities | Health | Cognitive | Experience | Behavior | Cultural Influence | Work | Tech Trust | Weight | Critical Reason |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2. Recognize Warning | 9 | 9 | 3 | 9 | 9 | 3 | 1 | 1 | 3 | 0.41 | Recognition error |
| 3. Situation Awareness; 4. Decide Action | 3 | 1 | 9 | 3 | 3 | 1 | 1 | 3 | 3 | 0.33 | Decision error |
| 5. Avoid Danger; 6. Stabilize Driving Mode | 3 | 1 | 1 | 9 | 3 | 3 | 3 | 3 | 1 | 0.11 | Performance error |
| 1. Perceive Warning Signal | 9 | 1 | 3 | 9 | 3 | 1 | 1 | 1 | 1 | 0.07 | Non-performance error |
| Importance | 5.64 | 4.2 | 4.52 | 6.3 | 5.22 | 1.96 | 1.14 | 1.18 | 2.4 | | |
| Ratio (%) | 17.32 | 12.90 | 13.88 | 19.35 | 16.03 | 6.02 | 3.50 | 3.62 | 7.37 | | |
| Priority | 2 | 5 | 4 | 1 | 3 | 7 | 9 | 8 | 6 | | |
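The importance row of the matrix above is a weighted sum: each HF category's relation strength to a subtask (9 = strong, 3 = moderate, 1 = weak) is multiplied by that subtask's critical-reason weight and summed. A minimal sketch, not from the paper's own code, using the values printed in Table 4:

```python
# Recompute the Phase 1 importance scores of Table 4 (variable names are ours).
categories = ["Demography", "Physical capabilities", "Health", "Cognitive",
              "Experience", "Behavior", "Cultural influence", "Work", "Tech trust"]
weights = [0.41, 0.33, 0.11, 0.07]  # recognition, decision, performance, non-performance
relations = [                        # rows follow the subtask order of Table 4
    [9, 9, 3, 9, 9, 3, 1, 1, 3],
    [3, 1, 9, 3, 3, 1, 1, 3, 3],
    [3, 1, 1, 9, 3, 3, 3, 3, 1],
    [9, 1, 3, 9, 3, 1, 1, 1, 1],
]
importance = [round(sum(w * row[j] for w, row in zip(weights, relations)), 2)
              for j in range(len(categories))]
# Reproduces e.g. Demography = 5.64 and Cognitive = 6.30; note that the
# printed value for Work (1.18) recomputes to 1.80 from the printed relations.
print(dict(zip(categories, importance)))
```

Ranking the importance scores in descending order yields the priority row; the highest-ranked categories (Cognitive, Demography, Experience) feed the Phase 2 matrix.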
Table 5. Interrelation matrix: Take-over subtasks and HFs.
| Take-Over Subtasks | Gender | Age | Height | Weight | Residency | Focus | Multitasking | IQ | Learning Speed | Education Level | Weight | Critical Reason |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2. Recognize Warning | 1 | 9 | 0 | 0 | 1 | 9 | 9 | 3 | 3 | 3 | 0.41 | Recognition error |
| 3. Situation Awareness; 4. Decide Action | 1 | 9 | 1 | 1 | 3 | 9 | 3 | 9 | 9 | 3 | 0.33 | Decision error |
| 5. Avoid Danger; 6. Stabilize Driving Mode | 1 | 9 | 0 | 0 | 3 | 9 | 9 | 9 | 3 | 1 | 0.11 | Performance error |
| 1. Perceive Warning Signal | 1 | 9 | 0 | 0 | 1 | 9 | 9 | 3 | 3 | 3 | 0.07 | Non-performance error |
| Importance | 0.92 | 8.28 | 0.33 | 0.33 | 1.80 | 8.28 | 7.62 | 5.4 | 4.74 | 2.54 | | |
| Ratio (%) | 2.29 | 20.58 | 0.82 | 0.82 | 4.47 | 20.58 | 18.94 | 13.42 | 11.78 | 6.31 | | |
| Priority | 8 | 1 | 9 | 9 | 7 | 1 | 3 | 4 | 5 | 6 | | |
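The five critical HFs fall out of a simple Pareto screen over the Phase 2 ratios, as visualized in Figure 4. A sketch under the assumption that selection is by cumulative ratio (names and values from Table 5; the screening rule itself is our illustration):

```python
# Pareto screening of the Phase 2 importance ratios (% values from Table 5).
ratios = {
    "Gender": 2.29, "Age": 20.58, "Height": 0.82, "Weight": 0.82,
    "Residency": 4.47, "Focus": 20.58, "Multitasking": 18.94,
    "IQ": 13.42, "Learning speed": 11.78, "Education level": 6.31,
}
ranked = sorted(ratios, key=ratios.get, reverse=True)
top_five = ranked[:5]
coverage = round(sum(ratios[hf] for hf in top_five), 2)
print(top_five, coverage)  # the five critical HFs cover 85.3% of the total importance
```

That the top five factors account for roughly 85% of the weighted importance is what justifies restricting the experimental design to Age, Focus, Multitasking, IQ, and Learning speed.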
Table 6. Independent variables.
| Human Factor | Levels |
|---|---|
| Age | Age < 20; 20 ≤ Age < 40; 40 ≤ Age < 60; Age ≥ 60 |
| Focus | Easy to divert; Ease of boredom |
| Multitasking | Multitasker; Not able to multitask |
| IQ | Superior (> 110); Average (90–109); Lower (< 90) |
| Learning speed | Superior; Average; Lower |
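Crossing every level of the five independent variables in Table 6 gives the candidate cells of a full-factorial take-over experiment. A minimal sketch (the paper does not prescribe a factorial design; this only enumerates the combinations implied by the table):

```python
# Enumerate the full-factorial design implied by Table 6's factor levels.
from itertools import product

levels = {
    "Age": ["< 20", "20-39", "40-59", ">= 60"],
    "Focus": ["Easy to divert", "Ease of boredom"],
    "Multitasking": ["Multitasker", "Not able to multitask"],
    "IQ": ["Superior (> 110)", "Average (90-109)", "Lower (< 90)"],
    "Learning speed": ["Superior", "Average", "Lower"],
}
design = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(design))  # 4 * 2 * 2 * 3 * 3 = 144 experimental conditions
```

With 144 cells, a fractional design or treating some factors as covariates would likely be needed in practice to keep the participant count manageable.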

Share and Cite

MDPI and ACS Style

Cárdenas, J.F.S.; Shin, J.G.; Kim, S.H. A Few Critical Human Factors for Developing Sustainable Autonomous Driving Technology. Sustainability 2020, 12, 3030. https://doi.org/10.3390/su12073030


