*Article* **A Change of Paradigm for the Design and Reliability Testing of Touch-Based Cabin Controls on the Seats of Self-Driving Cars**

**Tiago Custódio 1,2,†, Cristiano Alves 1,2,†, Pedro Silva 3,\*, Jorge Silva 3, Carlos Rodrigues 3, Rui Lourenço 3, Rui Pessoa 3, Fernando Moreira 3, Ricardo Marques 3, Gonçalo Tomé 3 and Gabriel Falcao 1,2**


**Abstract:** The current design paradigm of car cabin components assumes seats aligned with the driving direction. All passengers are aligned with the driver who, until recently, was the only occupant in charge of controlling the vehicle. The new paradigm of self-driving cars eliminates several of those requirements, releasing the driver from control duties and creating new opportunities for entertaining the passengers during the trip. This creates the need for control functionalities that are closer to each user, namely on the seat. This work proposes the use of low-cost capacitive touch sensors for controlling car functions: multimedia controls, seat orientation, door windows, and others. In the current work, we have reached a functional proof of concept, as shown for several cabin functionalities. The proposed concept can be adopted by current car manufacturers without changing the automobile construction pipeline. It is flexible and can accommodate a variety of new functionalities, mostly software-based, added by the manufacturer or customized by the end-user. Moreover, the newly proposed technology uses a smaller number of plastic parts to produce the component, which implies savings in production cost and energy, while increasing the life cycle of the component.

**Keywords:** cabin control functions; touch sensor; touch button; capacitive sensor; injected plastic; plastic button; embedded lighting; reliability testing; automotive industry; self-driving car

#### **1. Introduction**

Recent advances in self-driving cars are expected to translate into a significant number of new vehicles circulating under this new paradigm in the coming years [1,2]. Many works state that by 2050 self-driving cars will dominate, which creates new opportunities but also new challenges [3]. In this context, the human driver will likely be released from driving functions, offloading that responsibility to the machine and artificial intelligence algorithms [4,5]. Once relieved from driving duties [6], the driver will benefit from new services inside the car cabin, including those related to multimedia and communications such as games, work meetings, movies, music and Internet browsing, to name a few. Furthermore, the clear separation between passengers imposed today by the transmission tunnel on the cockpit's floor may no longer exist. In this context, a brand differentiator may well be the interior design of the car cabin, namely the seat arrangement and orientation, as well as the floor of the car cabin [7,8].

Therefore, the driver's seat, as well as the seats of the other passengers, no longer needs to be aligned with the moving direction. Moreover, the seats should be able to rotate on their axes instead of simply sliding back and forth for leg-room adjustment as they do today.

**Citation:** Custódio, T.; Alves, C.; Silva, P.; Silva, J.; Rodrigues, C.; Lourenço, R.; Pessoa, R.; Moreira, F.; Marques, R.; Tomé, G.; et al. A Change of Paradigm for the Design and Reliability Testing of Touch-Based Cabin Controls on the Seats of Self-Driving Cars. *Electronics* **2022**, *11*, 21. https://doi.org/10.3390/electronics11010021

Academic Editors: Calin Iclodean, Bogdan Ovidiu Varga and Felix Pfister

Received: 13 November 2021 Accepted: 13 December 2021 Published: 22 December 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https:// creativecommons.org/licenses/by/ 4.0/).

To serve this purpose, the passengers will no longer be aligned with the driving direction, possibly turning to face each other. This creates the need to add control buttons for operating general functionalities from their seats, as opposed to having these control buttons on the doors and central panel.

The development of this technology will imply redesigning the seats to incorporate these controls. Today's standard for implementing functional controls in the automotive industry is to build buttons composed of mechanical devices comprising several plastic polymer parts [9]. The current standard presents several disadvantages compared to robust alternatives that can provide the same functionality within a single plastic part.

Touch-based technologies [10] can pave the way towards the integration of sensors coupled to a single plastic part attached to the seat. In this case, the touch sensors lying below the plastic part sense the passenger's interaction to control a given functionality. This is important as it reduces assembly time and increases the Mean Time Between Failures (MTBF). Moreover, it allows the same part design to be incorporated into many models of the brand, since the different functionalities are controlled by software.

In this work, we developed the proposed technology and designed a new car seat that incorporates a viable design for controlling the opening and closing of car side windows and seat positioning. We were able to develop and test the technology on a real car used in the market and also on a new seat with an innovative design able to be incorporated in self-driving cars.

This article presents the following contributions:


This work is also part of the 'Collective Efficiency Strategies', totally aligned with the 'Mobinov—Automobile Cluster Association', which identified as one of its main goals "to contribute to making Portugal a reference in research, innovation, design, development, manufacture and testing of products and services of the automotive industry" and "strengthen the competitiveness of a fundamental sector of the economy, promoting an increase in exportation" [9].

This article is structured as follows. Section 2 describes the materials and methods used, analyzing in depth the capacitive touch sensor technology selected for the current context, and validating its use over thousands of cycles of operation. Section 3 addresses implementation and results. It analyzes the design of the newly developed car seat and the methods used to integrate injectable plastic with capacitive touch sensors in it. The discussion and future research directions in Section 4 close the article.

#### **2. Materials and Methods**

Distinct sensing technologies, such as inductive, infrared, ultrasound, resistive and capacitive ones, were analyzed and tested in depth [9], and their characteristics compared. The choice of the most appropriate technology for this context fell upon the capacitive touch sensors depicted in Figure 1a,b. The comparison criteria included ease of use, reliability, cost, ease of integration and design footprint [11,12].

**Figure 1.** Touch-based technology. (**a**) General illustration that resembles the use of mobile phones and (**b**) technical detail of the touch surface.

#### *2.1. Touch Sensors for Activating Functions in Polymers*

Capacitive touch sensing is a low-cost and low-complexity technology [13] ubiquitous in today's devices. Thus, in a way, there is already a precedent for how people expect these interfaces to work and how to interact with them [14]. Several capacitive sensors (four examples are depicted in Figure 2) were tested with plastic samples of varying thickness and composition, as exhaustively described in Subsection 3.1 of [9]. The sensors tested offered two different designs and two different channel counts. The experiments were successful, with the touch detection rate nearly reaching 100% for every sensor tested, as presented in Table 1.

After validating the technology and method, in order to reduce the cost and the design footprint, it was decided to design the sensors directly on the Printed Circuit Board (PCB) (see Figure 3). These capacitive sensors work based on a method known as self-capacitance [15], where each sensor is read using a single input of the system. Self-capacitance touch sensors use a single sensor electrode to measure the apparent capacitance between the electrode and the ground of the touch sensor. This method offers good immunity to noise induced by neighbouring sensors and circuitry.
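The read-out principle can be sketched in a short simulation; the function name, the synthetic counts and the threshold below are invented for illustration and are not the PTC's actual algorithm:

```python
def detect_touch(raw_counts, threshold=50, baseline_alpha=0.01):
    """Classify each raw capacitance sample as touch / no-touch.

    The baseline tracks slow environmental drift with an exponential
    moving average; a touch is declared when a sample exceeds the
    baseline by more than `threshold` counts. All values are
    illustrative, not taken from the actual PTC hardware.
    """
    baseline = float(raw_counts[0])
    events = []
    for count in raw_counts:
        touched = (count - baseline) > threshold
        events.append(touched)
        if not touched:
            # update the baseline only when idle, so a held touch
            # is not absorbed into the reference value
            baseline += baseline_alpha * (count - baseline)
    return events

# synthetic trace: idle near 1000 counts, a touch raises it to ~1100
trace = [1000, 1002, 999, 1001, 1100, 1105, 1098, 1001, 1000]
print(detect_touch(trace))
```

Updating the baseline only while idle is what gives the method its immunity to slow drift without "learning away" a sustained touch.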

To build a proof-of-concept design using this sensor, a microcontroller is required for reading sensing data and acting accordingly. A microcontroller consists of an embedded processor with auxiliary peripherals, I/O and electronics to interface with external sensors and other hardware. The microcontroller necessary for this prototype should ease the job of integrating all the technology by having peripherals to interact with capacitive sensors and potentially interact with an automobile's Electronic Control Unit (ECU) [16], as depicted in Figure 4.

**Figure 2.** The figure depicts the capacitive sensors tested for the current design, from (**a**–**d**). The electrical properties and datasheets for each sensor can be found in Table 1. The effectiveness tests performed on these sensors are described in Table 2.

**Table 1.** Electrical characteristics of the tested sensors in Figure 2.



**Table 2.** Effectiveness tests for sensors 1–4 depicted in Figure 2.


**Figure 3.** First prototype of a touch sensor with self-capacitance technology. The resulting touch sensor design is shown in (**a**). The bottom PCB is shown in (**b**). The PCB assembled on a plastic part is depicted in (**c**).

With these criteria in mind, the ATSAMC21J18A microcontroller from Microchip was adopted. It features a 32-bit processor architecture, a processor speed of 48 MHz, 32 KB of RAM, 256 KB of non-volatile (Flash) storage memory and 52 general-purpose input/output (GPIO) pins. It also features two important peripherals needed to ease the integration: a Peripheral Touch Controller (PTC) and a Controller Area Network (CAN) controller [17]. The PTC peripheral makes the process of sampling and validating the capacitive touch sensors more reliable and robust, as all of these steps are performed automatically and periodically by the hardware itself. The CAN peripheral [18] was planned as a viable means of integrating this prototype with the automobile's ECU, since it implements a standard communication bus used by the automobile industry. The microcontroller development board and the developed prototype are depicted in Figures 4 and 5, respectively.
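The planned CAN integration can be sketched at the message level. The identifiers and byte layout below are entirely hypothetical, chosen only to illustrate how a detected touch could be translated into a command frame for the vehicle bus:

```python
import struct

# hypothetical 11-bit CAN identifiers for the seat control unit
CAN_ID_WINDOW = 0x120
CAN_ID_SEAT = 0x121

def encode_window_command(window_index, direction):
    """Pack a (purely illustrative) window command payload:
    one byte for the window index and one signed byte for the
    direction (+1 = open, -1 = close, 0 = stop)."""
    if direction not in (-1, 0, 1):
        raise ValueError("direction must be -1, 0 or +1")
    return CAN_ID_WINDOW, struct.pack("bb", window_index, direction)

can_id, payload = encode_window_command(0, 1)
print(hex(can_id), payload.hex())
```

In the real system the frame layout would be dictated by the vehicle manufacturer's CAN database, not chosen freely as here.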

**Figure 4.** PCB and plastic part connected to the sensors and the ATSAMC21J18A microcontroller development board.

**Figure 5.** PCB with sensors based on a mutual-capacitance design assembled on a plastic part.

After prototyping this first version and validating the integration method, further design changes were implemented to reduce cost and footprint even more. Whereas the "Self-Capacitance" [15] design explained previously requires one GPIO per sensor, the number of GPIOs needed can be further reduced with a design called "Mutual-Capacitance" [19]. In this alternative approach, the sensors' electrical connections are arranged in a matrix-like array, requiring only M + N GPIOs on the microcontroller, M being the number of rows and N the number of columns.

Compared with the previous version of the prototype, where 19 GPIOs were needed to read the 19 sensors using the "Self-Capacitance" design, in this case only 10 GPIOs are required using a configuration of 3 rows and 7 columns. This design improvement allowed a further cost reduction by making it viable to select a cheaper but still sufficiently powerful microcontroller, at the cost of a slightly more complex PCB design (with negligible effect on final cost). The microcontroller chosen was the ATTINY3217 [20], which saves nearly 71% of the cost compared to the previously chosen microcontroller.
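The pin-count trade-off between the two wiring designs can be sketched numerically; the helper below is illustrative only, but it reproduces the figures in the text (a 3 × 7 mutual-capacitance grid offers 21 sensor sites read with only 10 pins, versus one pin per sensor in the self-capacitance design):

```python
def gpio_pins_needed(rows, cols, design):
    """Pins required to read a rows x cols sensor grid under each wiring design."""
    if design == "self":       # one dedicated pin per sensor
        return rows * cols
    if design == "mutual":     # shared row-drive and column-sense lines
        return rows + cols
    raise ValueError(design)

def scan_matrix(rows, cols):
    """Enumerate the (row, col) pairs a matrix scan visits, row by row."""
    return [(r, c) for r in range(rows) for c in range(cols)]

# the prototype's 3 x 7 arrangement offers 21 sensor sites (19 used)
# and needs only 3 + 7 = 10 GPIOs instead of one pin per sensor
print(gpio_pins_needed(3, 7, "self"), gpio_pins_needed(3, 7, "mutual"))
```

The saving grows quickly with sensor count: a 4 × 8 grid would read 32 sites with 12 pins.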

#### *2.2. Touch Surface Backlighting*

Another challenge tackled was the need for adequate backlighting on the touch surface. Backlighting can be used to guide the user to the touch control, to illuminate pictograms depicting the control function, or both.

Integrating backlighting poses a technical challenge: the light needs to be concentric with the touch sensor but cannot interfere with the touch sensing, neither by constraining the surface area available for the sensor nor by inducing noise on the sensor line. The method employed consisted of drilling a hole in the middle of the sensor, large enough to allow proper dispersion of the light on the control pictograms. The LEDs were assembled on the underside of the PCB, shining upwards as suggested in Figure 6.

**Figure 6.** Illustration of how to couple an LED backlight to a capacitive touch sensor. The hole through which the LED scatters light has a diameter of 3 mm.

This approach allows the LED not to interfere electrically with the sensor while properly illuminating the control surface.

After carefully iterating over several drilling diameters, proper dispersion was achieved at 3 mm, as depicted in Figure 7, and this solution was employed in the final prototype. The backlighting effect is illustrated in the intermediate prototype shown in Figure 8.

**Figure 7.** PCB with LEDs shining from behind the sensors and the measuring of the light's hotspot and dispersion diameters.

**Figure 8.** Prototype with backlight testing of two different LED colors.

#### *2.3. Validation and Reliability*

The testing of such a piece of hardware can be performed manually or automated with machinery. Testing manually is a time-intensive task and, albeit closer to the real usage of the equipment, requires at least one person to perform the tests by hand and keep track of the success and failure rates. Furthermore, there is no way to ensure that the tester performs every test in the same way, with the same motions and adequate finger pressure. Automating the testing with machinery allows faster tests and a greater degree of accuracy in the motions. In addition, performing tests faster allows a greater number of test samples to be acquired in the same time-frame, making the whole validation more reliable. Therefore, whenever possible, tests should be automated. An adequate machine for this task is a 3-axis Computer Numerical Control (CNC) system [21]. These machines have three axes of independent movement and are ubiquitous in several industries (ranging from 3D printers to assembly lines). They work by having a coupled computer follow a precise script telling the machine which movements to perform. The adopted setup uses an industrial 3-axis CNC machine [21] with a capacitive "finger" probe (please see Figure 9c) attached to the gripper as depicted in Figure 9d.

To automate the testing procedure, we made use of available digital input lines on the CNC controller board, so that the equipment under test could signal the CNC whether a given probe touch was detected. All of this was orchestrated using a G-Code script, G-Code being the standard scripting language for programming the movements of a CNC machine. The source code and flowchart developed are illustrated below and in Figure 10.

The algorithm employed in our G-Code script can be summarized in a few simple steps:

1. Move the probe to the start position above the sensor;
2. Wait for any previous touch to be released, or time out;
3. Lower the probe to the touch position;
4. Wait for the touch signal from the equipment under test, or time out;
5. Log the outcome, incrementing the success or miss counter accordingly;
6. Repeat for 10,000 iterations and log the final statistics.

**Figure 9.** The figure depicts the testing environment specifically designed and developed from scratch to validate the touch-based prototype. In (**a**) the structure of the CNC is shown. (**b**) Depicts the CNC controller board with digital inputs. (**c**) Illustrates the "finger" probes that emulate a human touch. The gripper in (**d**) was custom-designed and 3D-printed to allow attachment of the probe to the moving part of the CNC structure.

**Figure 10.** Flowchart of the CNC test script. The G-Code source code is illustrated next.

```
#<loop_count> = 0
#<timeout> = 0
#<timeout_limit> = 50 ;tenths of second
#<successes> = 0
#<misses> = 0
#<t_start> = datetime[]
(logopen,Teste_Touch_Tranca.log)
G00 X338 Y188 Z-430
o100 repeat [10000] ;repeat ten thousand times
    #<s> = [datetime[] - #<t_start>]
    (print,#<s>, Iteration #<loop_count,0>, Successes: #<successes,0>, Misses: #<misses,0>)
    (log,#<s>, Iteration #<loop_count,0>, Successes: #<successes,0>, Misses: #<misses,0>)
    G00 Z-430 ;move to start position
    o200 while [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 1]] ;wait for previous touch release
     G04 P0.1
     #<timeout> = [#<timeout> + 1]
    o200 endwhile
    o300 if [#<timeout> EQ #<timeout_limit>]
     (print,Touch release timeout)
     (log,Touch release timeout)
    o300 endif
    #<timeout> = 0
    G00 Z-440 ;move to touch position
    o400 while [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 0]] ;wait for touch signal or timeout
     G04 P0.1
     #<timeout> = [#<timeout> + 1]
    o400 endwhile
    o500 if [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 1]] ;check if touch was detected before timeout
     #<timeout> = [#<timeout> / 10]
     (print,Touch signal received after #<timeout,1>s)
     (log,Touch signal received after #<timeout,1>s)
     #<successes> = [#<successes> + 1] ;increment touch success counter
    o500 else
     (print,Touch signal timeout)
     (log,Touch signal timeout)
     #<misses> = [#<misses> + 1] ;increment touch miss counter
    o500 endif
    #<timeout> = 0
    #<loop_count> = [#<loop_count> + 1] ;increment iteration counter
o100 endrepeat
#<t_duration> = [datetime[] - #<t_start>]
G00 Z-420
(print,Took #<t_duration> seconds doing #<loop_count,0> iterations)
(print,#<successes,0> touches detected and #<misses,0> touches missed)
(log,Took #<t_duration> seconds doing #<loop_count,0> iterations)
(log,#<successes,0> touches detected and #<misses,0> touches missed)
(logclose)
```
The complete definition of the testing methodology allowed each of the 19 slider and on-off sensors on the PCB to be tested and validated in the CNC structure developed, illustrated in Figure 11. Each test consisted of 10,000 iterations, running for 2 to 3 h. This allowed all the sensors to be tested within a week.

All tests reported more than 99% reliability, the worst performing sensor achieving 99.45% reliability (see Figure 12 and Table 3) and the best performing one achieving 100% (please refer to Figure 13 and Table 4).
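The reliability percentages follow directly from the success and miss counters logged by the G-Code script; a small helper (its name is ours, not from the test scripts) reproduces the reported worst and best cases:

```python
def reliability(successes, misses):
    """Fraction of touch attempts detected, as a percentage."""
    total = successes + misses
    return 100.0 * successes / total

# worst observed sensor: 55 misses in 10,000 touches; best: no misses
print(reliability(9945, 55), reliability(10000, 0))
```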

**Figure 11.** (**a**) Testing of a slider-type sensor. (**b**) Testing an on–off sensor.

**Figure 12.** Sensor presenting the worst performance after 10,000 iterations.



**Figure 13.** Sensor presenting the best performance after 10,000 iterations.

Figure 12 is a good example of an observed phenomenon where errors tend to show up later during the tests. This might be explained by the fact that the integration system calibrates the capacitive sensor baseline over time and, because each of the 10,000 iterations is performed as quickly as possible (1.1 s in the worst case), the integration system might not have enough time to reliably calibrate back to the previous baseline, resulting in this error accumulating over time. In a real usage scenario, no touch sensor or any other button is used thousands of times during a single trip, nor as frequently as in these tests. Considering the strict test conditions to which these sensors were submitted, and that the tests show reliability values above 99%, it can be concluded that they are appropriate and reliable.

Another interesting aspect is that the worst performing sensors are positioned, within the PCB layout, near noise sources such as the power supply circuitry and communication data lines. Therefore, future designs should take care to properly isolate the sensor lines from such circuitry, either by employing ground planes, by changing the routing of the data and sensor lines, or simply by changing the layout of the components. Sensors positioned far from these noise sources show close to no errors, as shown by the sensors portrayed in Figures 13–15 and the corresponding Tables 4–6.

**Table 4.** Statistical data regarding the best performing sensor.



**Table 5.** Statistical data regarding the sensor in Figure 14.

**Figure 14.** Sensor presenting a typical performance after 10,000 iterations.

**Figure 15.** Sensor presenting another typical performance after 10,000 iterations.


**Table 6.** Statistical data regarding the sensor in Figure 15.

Finally, one aspect of the technology that deserves consideration is what happens if a sensor goes out of control. We can assume two possible cases: (1) the sensor starts sending undesired control commands, or (2) the sensor freezes and stops working. For both cases, mitigation approaches must be foreseen. First, if there is a malfunction of the sensing mechanism, it may need repair assistance, as happens with current systems and cars. However, if the system systematically (or sporadically) assumes an erratic behavior after a certain number of utilizations, a solution that incorporates a periodic reset of the electrical reference values of the touch system can easily be implemented. In any case, it must be noted that in the thousands of tests conducted with the CNC-based testing system developed, we never experienced a situation where a sensor went out of control. These tests, described in this manuscript, indicate from zero to a maximum of 55 failures per 10,000 touches performed, which means the system is robust and reliable (less than 0.55% of failures in the worst case, and zero in the best one). Moreover, the newly proposed touch-based system will require less maintenance and be more robust to the presence of water and humidity (e.g., resulting from rain coming through an open door window) inside the car cabin and near the control buttons, a problem that has been known and described by manufacturers for decades.
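The periodic reset of the reference values mentioned above can be sketched as a thin wrapper around a sensor channel. This is a hypothetical illustration (class name, counts and thresholds are ours, not the production firmware): after a fixed number of reads, the stored baseline is re-acquired from the idle sensor, discarding any drift accumulated in between.

```python
class TouchChannel:
    """Hypothetical sensor channel with a periodically reset baseline."""

    def __init__(self, recalibrate_every=1000, threshold=50):
        self.recalibrate_every = recalibrate_every
        self.threshold = threshold
        self.baseline = None
        self.reads = 0

    def read(self, raw_count):
        if self.baseline is None:
            self.baseline = raw_count        # initial reference
        touched = (raw_count - self.baseline) > self.threshold
        self.reads += 1
        if self.reads % self.recalibrate_every == 0 and not touched:
            # periodic reset: re-anchor the reference to the idle level,
            # preventing the accumulated-error effect seen in the tests
            self.baseline = raw_count
        return touched

ch = TouchChannel(recalibrate_every=3)
# idle counts drift upwards; the third read re-anchors the baseline,
# so the later drifted idle value is still classified as no-touch
print([ch.read(v) for v in [1000, 1020, 1040, 1080, 1200]])
```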

#### *2.4. Comparing New and Current Paradigms*

The integration tests were performed on a mass-produced automobile (please see Section 4 of [9]). The original vehicle's button responsible for controlling the doors' windows and side mirrors comprises 26 discrete components, including parts made of plastic, rubber and metal, and PCBs with circuitry, as shown in Figure 16. Each of these components has its own production and/or assembly line which, all together, contribute to a complex sourcing and assembly process that produces a part with many moving components. Moreover, each component's fabrication process needs to comply with physical tolerances, which may lead to faults. Furthermore, hand assembly is often required and is thus not an error-free process. All this contributes to a part that may often result in waste during fabrication and whose reduced quality standard, due to the accumulated tolerances, may lead to a short MTBF, producing high maintenance and replacement costs.

The developed prototype has no moving parts, as it is made up of a polymer plastic cover and two PCBs. Both the polymer plastic component and the PCBs with their respective circuitry have a fully automated and mature fabrication process that does not require human intervention. Thus, there is little room for tolerance error propagation along the process. This results in a larger MTBF which, in turn, represents lower maintenance and replacement costs.
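The MTBF argument can be made concrete under the usual constant-failure-rate assumption for components in series: failure rates (the reciprocals of the MTBFs) add, so the combined MTBF is the reciprocal of the summed rates. The component counts mirror the text, but the per-component MTBF values below are purely illustrative, not measured:

```python
def series_mtbf(component_mtbfs):
    """Combined MTBF of components in series under constant failure
    rates: the rates (1/MTBF) add, and the result is their reciprocal."""
    return 1.0 / sum(1.0 / m for m in component_mtbfs)

# illustrative only: equal 1e6 h per-component MTBFs
mechanical = series_mtbf([1e6] * 26)   # conventional 26-part button assembly
touch = series_mtbf([1e6] * 3)         # plastic cover + two PCBs
print(round(mechanical), round(touch))
```

Even with identical per-component reliability, the 3-part assembly lasts roughly 26/3 times longer between failures than the 26-part one.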

**Figure 16.** Example of the number of components and complexity of the (**a**) side mirror's and (**b**) windows' control buttons.

Of particular interest is the fact that this part is fully programmable and can communicate with the automobile using industry standard protocols. Therefore, it is suitable to be mass-produced and adapted to several automobiles or even different applications, or environments such as aircraft seats, working space seats, etc.

#### **3. Implementation and Results**

The first prototype was developed to run a proof of concept with a mass-produced automobile from 2018 [9]. However, the monitor-supported prototype seat (depicted in Figure 17b) was also built to house the SPaC part so that it could be presented to customers and used to disseminate the new technology. This seat was conceived, developed and built by the company AlmaDesign [22].

In Figure 17a, we can see the prototype seat with the integrated SPaC part. The location of the SPaC part in the autonomous vehicle environment of the future will allow the passenger to access the different commands for controlling functions regardless of the position of the seat inside the vehicle. In the current case study, these functions are: positioning of the seat, opening and closing of the windows, and control of the rear-view mirrors. Should it become necessary to change or add functions, the developed technology has the flexibility to allow further development and scaled integration.

This implementation led to the development of a touch system with two PCBs interconnected by a flat cable, as the curved geometry of the plastic part, dictated by the aesthetics conceived by the design team, did not allow the electronic system to be implemented on a single PCB. This is depicted in more detail in Figure 5. A flexible PCB could adapt to the complex contours of the control surface; however, it would raise costs significantly, and this possibility was consequently discarded.

Figure 17a,b illustrates the seat developed in the context of this change of paradigm, capable of meeting futuristic vehicles' requisites. The new touch-based technology adapts to the design of novel seats that integrate car cabin functionalities. In fact, as self-driving cars assume more relevance in a global context, it is expected that passengers will no longer have to travel aligned with the driving direction. This implies that they may not be able to reach all parts of the cabin to control some functionalities.

**Figure 17.** (**a**) Button-controlled seating prototype developed for controlling car cabin functionalities. (**b**) Demo screen developed for emulating the car's windows and seat positioning functionalities.

Therefore, the current buttons were designed for the seat to incorporate the car functionalities; they can also be applied to other parts of the vehicle. The incorporated functionalities can control the seat positioning and movement, as well as the window closest to that passenger. Figure 18a,b details the controls for window opening and rear mirror positioning. Rear seats, which in many cases may not move, can include advanced multimedia controls that are typically accessible only to front-seat users.

Another aspect to consider is that functionalities unknown at this point may have to be included in the near future to control novel features of self-driving vehicles that do not exist in current automobiles. The proposed technology makes room for a variety of human–machine interfaces, including the use of other types of sensors which have never been tried before on a single vehicle. These should allow the passengers of self-driving vehicles to better enjoy the travelling experience towards the destination.

**Figure 18.** (**a**) Front and (**b**) lateral details of the developed car seats for controlling functionalities inside the cabin.

Since the prototype seat was developed with aesthetic concerns in mind, a demo screen was developed to emulate the operation of the previously described car functionalities (shown in Figure 17b). This allows the seat to be demonstrated at conferences, fairs or to any other interested parties while proving the touch-based control concept for the car seat. The demo screen shows animations depicting some of the functionalities idealized for the seat control panel: seat rotation (as illustrated in Figure 19), seat height adjustment, seat front and back sliding, seat reclination, and opening and closing of the front and back door windows.

**Figure 19.** Driver's seat rotation animation as shown on the demo screen.

#### **4. Discussion**

The current paradigm of car seats aligned with the driving direction is about to change [23]. This may imply that passengers no longer have access to the usual controls on the front panel or in doors inside the car cabin [24], thus creating the need to incorporate new control functionalities near the seats' surfaces.

#### *4.1. Conclusions of This Study*

In this paper, we address this change of paradigm by proposing new low-cost and easy-to-implement touch-based models for controlling car cabin functionalities. The sensors are based on self-capacitance technology coupled to a low-cost microcontroller that connects to the automobile's ECU [16]. To this end, we developed the necessary electronics and performed thousands of tests on real injected plastic components to assess reliability and robustness. We verified that, over the thousands of tests performed on each touch-based sensor, the maximum failure rate was below 0.55%, under a usage pattern far more demanding than real-life utilization in a vehicle: in conventional use, no sensor is used thousands of times consecutively without a reset. This is even more significant as these sensors can be integrated into car seats to give the passenger control over functionalities that would otherwise no longer be accessible when the seat changes position, a possibility that may become a reality once the passenger is freed from driving duties.

The main **contributions** of this paper can be summarized below:


#### *4.2. Future Research Directions*

There is an active discussion [25] about the attention that touch-based technology requires from the driver and passengers. It is not clear whether the feedback obtained by pressing a touch sensor is enough to give the user a sense of action completed.

Regarding future research directions, there is ample debate that haptic feedback [26] may have to be incorporated into touch-based controls [27]. The haptic effect caused by the pressure of a mechanical button needs to be emulated for the new vehicle's control interfaces to act more naturally and give the user the perception that the action has been launched.

Another aspect with indirect implications for this paper regards the ongoing evolution of deep neural networks and the challenges still to be faced [28] before self-driving cars can be massively adopted. In particular, several aspects of autonomous vehicles still have to undergo strict assessment concerning functional safety [29]. Furthermore, the use of deep learning for predicting decisions [30] based on data captured from cameras [31], and their association with other metrics such as positioning, the velocity of the car, traffic, or the presence of pedestrians nearby, will have to be validated over many more thousands of kilometers to come.

**Author Contributions:** Conceptualization, T.C., C.A., P.S., J.S., C.R., R.L., R.P., F.M., R.M., G.T. and G.F.; methodology, C.A., T.C., P.S., J.S., G.T. and G.F.; software, C.A. and T.C.; validation, C.A., T.C. and G.F.; formal analysis, P.S. and G.F.; investigation, C.A., T.C., P.S., J.S. and G.F.; resources, P.S., G.T. and G.F.; data curation, P.S. and J.S.; writing—original draft preparation, C.A., T.C. and G.F.; writing—review and editing, P.S., C.R., R.L., F.M. and G.F.; visualization, J.S., R.P., R.M.; supervision, P.S. and G.F.; project administration, P.S.; funding acquisition, P.S. and G.T. All authors have read and agreed to the published version of the manuscript.

**Funding:** This work has been financially supported by project SPaC (POCI-01-0247-FEDER-038379), co-financed by the European Community Fund FEDER through POCI—Programa Operacional Competitividade e Internacionalização. It has also been financially supported by Instituto de Telecomunicações and Fundação para a Ciência e a Tecnologia under grants UIDB/50008/2020 and UIDP/50008/2020.

**Acknowledgments:** The authors would like to acknowledge the support provided by CJR Motors and Leiribéria.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **Abbreviations**

The following abbreviations are used in this manuscript:

| Abbreviation | Definition |
|---|---|
| CAN | Controller Area Network |
| CNC | Computer Numerical Control |
| ECU | Electronic Control Unit |
| GPIO | General-Purpose Input/Output |
| LED | Light-Emitting Diode |
| MTBF | Mean Time Between Failures |
| PCB | Printed Circuit Board |
| PTC | Peripheral Touch Controller |

#### **References**

