Review

Data and Energy Impacts of Intelligent Transportation—A Review

by
Kaushik Rajashekara
* and
Sharon Koppera
Department of Electrical and Computer Engineering, University of Houston, Houston, TX 77004, USA
*
Author to whom correspondence should be addressed.
World Electr. Veh. J. 2024, 15(6), 262; https://doi.org/10.3390/wevj15060262
Submission received: 22 March 2024 / Revised: 13 June 2024 / Accepted: 13 June 2024 / Published: 17 June 2024

Abstract

The deployment of intelligent transportation is still in its early stages, and there are many challenges that need to be addressed before it can be widely adopted. Autonomous vehicles are a class of intelligent transportation that is rapidly developing, and they are being deployed in selected cities. A combination of advanced sensors, machine learning algorithms, and artificial intelligence is used in these vehicles to perceive their environment, navigate, and make the right decisions. These vehicles leverage extensive data sourced from various sensors and computers integrated into the vehicle. Hence, massive computational power is required to process the information from the various built-in sensors in milliseconds to make the right decision. The power required by the sensors and the additional computational power increase the energy consumption and, hence, could reduce the range of an autonomous electric vehicle relative to a standard electric car and lead to additional emissions. A number of review papers have highlighted the environmental benefits of autonomous vehicles, focusing on aspects like optimized driving, improved route selection, fewer stops, and platooning. However, these reviews often overlook the significant energy demands of the hardware systems, such as sensors, computers, and cameras, necessary for full autonomy, which can decrease the driving range of electric autonomous vehicles. Additionally, previous studies have not thoroughly examined the data processing requirements in these vehicles. This paper provides a more detailed review of the volume of data and the energy usage of the various sensors and computers integral to autonomous features in electric vehicles. It also discusses the effects of these factors on vehicle range and emissions. Furthermore, the paper explores advanced technologies currently being developed by various industries to enhance processing speeds and reduce energy consumption in autonomous vehicles.

1. Introduction

Autonomous vehicles (AVs), also referred to as self-driving cars, are a rapidly developing category of intelligent transportation. These vehicles operate without human intervention, utilizing advanced technologies such as communication systems, sensors, data analytics, machine learning, and artificial intelligence. This integration enhances their efficiency, safety, and decision-making capabilities in navigation. The adoption of AVs holds the potential to revolutionize transportation by improving safety, reducing traffic congestion, and expanding mobility for those unable to drive. These vehicles are also expected to yield economic, social, and environmental advantages. The Society of Automotive Engineers (SAE) defines six levels of driving automation, ranging from Level 0 (fully manual) to Level 5 (fully autonomous), as shown in Table 1 [1,2,3,4]. The U.S. Department of Transportation has endorsed these classifications. At Levels 0–2, the human driver monitors the driving environment, whereas at Levels 3–5, the automated system monitors the driving environment.
The automated vehicles currently on the market are mostly Level 1 and Level 2. Mercedes-Benz offers Level 3 autonomous driving in its S-Class and EQS models. A few other companies, including Audi, BMW, and Tesla, are planning to market Level 3 vehicles soon. Level 4 vehicles are currently being deployed for testing in a few cities in limited areas. Level 5 remains a longer-term goal.
Connected vehicles, shown in Figure 1, are those equipped with wireless connectivity that can communicate with the internal and external systems supporting the interactions of V2S (vehicle-to-sensor onboard), V2V (vehicle-to-vehicle), and V2R (vehicle-to-road infrastructure) [5,6]. These connected vehicles are the building blocks of the Internet of Vehicles (IoV), which enables the evolution to the next generation of intelligent transportation systems (ITSs). Connected vehicles continuously collect and transmit various types of data, such as speed, real-time location, acceleration, state of charge in electric vehicles (EVs), and diagnostics, from one vehicle to another. These data are essential for analyzing vehicle performance and road conditions, helping drivers make better decisions and supporting traffic management. V2V communication enables vehicles to wirelessly exchange information about their location, speed, and heading. Real-time traffic information, safety messages, traffic signal messages, eco-speed limits, eco-routes, parking information, etc., can be exchanged between vehicles. Using this information, vehicles can anticipate potential collisions or traffic issues and adjust their routes or speeds accordingly. The data from connected vehicles can also be used by traffic management systems to optimize traffic flow and reduce congestion. This connectivity between vehicles enhances overall efficiency and results in more efficient use of the powertrain in electric, fuel cell, and hybrid vehicles. Deploying connected vehicle capability in autonomous vehicles results in “Connected Automated Vehicles (CAVs)”, as shown in Figure 2. Because of the use of artificial intelligence (AI) in all these vehicles, they are also called intelligent connected autonomous vehicles (ICAVs) [5,6,7].
Autonomous vehicles, including connected vehicles, use a number of sensors, such as radar (radio detection and ranging), LiDAR (light detection and ranging), cameras, and ultrasonic sensors, to collect real-time data about the vehicle’s surroundings. The collected sensor data are processed by machine learning algorithms and AI systems to enable the vehicle to recognize and interpret objects, pedestrians, other vehicles, and road conditions. These vehicles also have advanced computation and control systems for interpreting the data to make driving decisions, such as steering, acceleration, and braking, all without human intervention. The primary uses of these data, which come from the vehicle’s various IoT devices, include the following: (i) perception and action (receiving information, planning, and responding based on the collected data), (ii) precise mapping of the surroundings, (iii) identifying speed, range (in EVs), and distance using cameras and LiDAR (or radars), and (iv) communication with other vehicles and sharing information [5,6,7,8].
In AVs and CAVs, information from the various sensors has to be processed and analyzed very quickly, within a few milliseconds. This requires massive computational power to be delivered efficiently. The additional energy demands of the sensors and of the extra computational power increase the auxiliary load in the vehicle, thus reducing the range of an autonomous electric vehicle relative to a standard electric vehicle. This extra power drawn from the vehicle battery could also lead to additional emissions, depending on the source of power for charging the EV batteries.
Several review papers have been published in the literature that examine different facets of autonomous vehicles. A comprehensive review of the impacts of connected and autonomous vehicles on urban transportation and the environment is presented in [9]. This paper reviews the material presented in a number of papers on the short-, medium-, and long-term effects of AVs on urban transportation and the environment. The paper also mentions that there is a consensus that AVs would help in reducing vehicle ownership, traffic delay and congestion, travel costs, traffic crashes, parking demand, long-term energy consumption, etc. It has also been noted that AVs could significantly impact urban transportation systems and human mobility by enhancing accessibility, increasing mobility, boosting the number of vehicle miles traveled, and, thus, augmenting revenue generation for commercial operators. The environmental impacts of autonomous vehicles are reviewed in [10] by considering their effect on land, water, noise pollution, and air pollution. The energy consumption due to factors such as the type of propulsion, vehicle design, platooning, route choice, congestion reduction, distance travelled, shared mobility, etc. is also briefly mentioned. A comprehensive review of the earlier studies related to the implications of AVs for safety, public behavior, land use, society and environment, public health, and accessibility for senior citizens, as well as the miscellaneous benefits of autonomous vehicles, is presented in [11]. Advanced driver-assistance system (ADAS) technologies and automated driving system (ADS) safety and regulation considerations are reviewed in [12]. A brief review of the impacts of AVs and CAVs on traffic flow in terms of safety, delays, vehicle density on the roads, and speed is also presented in that paper. The paper concludes that AVs and CAVs have promising benefits compared to Level 1 to Level 3 vehicles. The review is limited to the advancements in AV and CAV technologies made up to the year 2019. In [13], a literature review of the impacts of AVs on greenhouse gas emissions addressed both the positive and negative impacts reported in the literature for two categories of AVs, namely partial automation and full automation. The paper concludes that the effects of autonomous vehicles on greenhouse gas emissions largely depend on ongoing technological advancements, market responses, and regulatory measures, which makes it difficult to definitively predict the overall benefits that autonomous vehicles are anticipated to bring to transportation systems in reducing greenhouse gas emissions. There are a few other review papers [14,15,16,17] that explore the environmental impacts of AVs and CAVs. These studies examine the energy and environmental effects influenced by various factors, including vehicle size, route selection, market penetration, platooning, congestion alleviation, distance travelled, and shared mobility. Additionally, some of these papers delve into strategies for maximizing environmental benefits by analyzing deployment locations, driving distances, and other relevant aspects.
None of the reviewed papers focus on the energy implications associated with the hardware components in AVs, such as cameras, sensors, and computers, although these are briefly mentioned. They also do not consider the necessary processing power and the energy demands resulting from the extensive data that need to be processed in AVs and CAVs. Instead, these studies primarily focus on factors like driving routes, penetration levels, platooning, congestion reduction, safety considerations, and similar topics. This paper provides a comprehensive review of the volume of data and the energy utilization of the various sensors and computers in autonomous electric vehicles. It examines how these factors influence the vehicle’s range and emissions. Furthermore, the paper highlights recent technological advancements by different industries to minimize power consumption and processing times in autonomous vehicles.

2. Data and Power Consumption by the Main Components of Autonomous Vehicles

The main components of autonomous vehicles are varied; each is discussed in the subsections below.

2.1. Sensors

AVs, including CAVs, are equipped with various sensors like LiDAR, radar, cameras, and ultrasonic sensors. These sensors gather real-time data concerning both the vehicle’s functionality and its environment [18]. They are crucial for providing perception, which is key to acquiring accurate and current environmental information. Essentially acting as the vehicle’s eyes and ears, these sensors supply essential data that allow other systems to make informed decisions and navigate safely. High-resolution cameras, in particular, are adept at capturing visual information about the road ahead, recognizing traffic signs, pedestrians, and other vehicles, and are particularly effective at identifying lane markings and traffic signals, and even at interpreting human gestures. This capability is vital for enhancing ADAS, which improves vehicle safety. As the driver’s involvement in tasks such as steering and braking decreases, especially at Level 3 and above, the ADAS becomes more complex and expensive. Figure 3 shows the typical locations of the various sensors [8]. The ADAS system, including the global navigation satellite system (GNSS), is shown in Figure 4, which illustrates the function of each sensor and its location in autonomous vehicles [18]. Table 2, generated from the information in Figure 4, summarizes the different sensor applications around the vehicle.
The estimates of the data generated by each type of sensor are given in Table 3 [19]. The various camera models used in AVs and their power consumption are given in Table 4 [20,21,22,23]. The LiDAR creates precise 3D maps of the surroundings, even in low-visibility conditions like fog or darkness. A few examples of LiDAR models used in autonomous vehicles and their power usage are given in Table 5 [24,25,26,27,28].
Radar sensors are deployed for lane change assistance, blind spot detection, collision mitigation, parking assistance, etc. The amount of data that a radar in an autonomous vehicle generates depends on a number of factors, including the type of radar sensor, the environment in which it is being used, and the desired resolution of the point cloud. For example, the Continental ARS540 radar sensor can generate up to 320,000 points per second. The top radar manufacturers for autonomous vehicles and their power consumption are given in Table 6 [29,30,31,32,33].
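To illustrate how a point rate translates into a data rate, the short sketch below assumes a hypothetical payload of 8 bytes per radar point (e.g., range, azimuth, elevation, and Doppler at 16-bit resolution each); the actual per-point payload of the ARS540 is not specified here, so the result is only an order-of-magnitude estimate.

```python
# Rough conversion from a radar point rate to a data rate.
# Assumption (not from the cited datasheets): ~8 bytes per point,
# e.g., range, azimuth, elevation, and Doppler at 16 bits each.
POINTS_PER_SECOND = 320_000   # figure quoted above for the ARS540
BYTES_PER_POINT = 8           # hypothetical payload size

bits_per_second = POINTS_PER_SECOND * BYTES_PER_POINT * 8
print(f"~{bits_per_second / 1e6:.1f} Mbit/s")   # ~20.5 Mbit/s
```

The result, on the order of tens of Mbit/s, is broadly consistent with the upper end of the radar data rates listed in Table 3, as would be expected for a high-resolution imaging radar.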
Ultrasonic sensors are used for detecting nearby objects like curbs and walls. They are also used for monitoring the immediate surroundings of the vehicle, to measure distance to obstacles, and for parking assistance systems. A few of the top ultrasonic sensor manufacturers for autonomous vehicles and their power consumption are shown in Table 7 [34,35,36,37].
In addition to the sensors, global positioning system (GPS)/global navigation satellite system (GNSS) receivers also consume electric power. These systems provide the vehicle’s position on the Earth. Knowing the exact position of the vehicle on the map is crucial for any autonomous vehicle. A few examples of GPS/GNSS receivers used in autonomous vehicles and their power consumption are given in Table 8 [38,39,40,41].
The power consumption of autonomous vehicles is due to the following factors:
  • Sensors such as LiDAR, cameras, etc.;
  • Processing units such as onboard computers and processors, which process the large amounts of data collected by the sensors and run complex algorithms for perception, decision-making, and control;
  • Communication systems, which communicate with other vehicles, infrastructure, and cloud-based services for real-time updates, mapping data, and coordination;
  • Actuators that are used for control, steering systems, and braking, the power consumption of which depends on the vehicle’s size, weight, and specific requirements;
  • Infotainment systems, climate control, and other comfort features.

2.2. Computation, Data, Machine Learning, and AI

As the degree of automation moves up the ladder (L1 to L5) of AVs, the complexity of the overall control, management, and associated tasks grows exponentially. Hence, a massive processing system, equivalent to a powerful electronic brain, is needed, assisted by a comprehensive set of sensors tasked with collecting large quantities of data of diverse types. The data must cover static and dynamic objects, highlighting visible and hidden obstacles surrounding the vehicle in motion. In addition, geographic coordinates to localize the vehicle and identify its environmental conditions and characteristics are also required. The collected sensor data are processed, using massive computational power with low latency, by machine learning algorithms and AI systems. These vehicles collect real-time data about the weather, road conditions, pedestrians, street signs, and other vehicles, which are combined with information about the vehicle and the intelligence needed to make instant driving decisions, thus generating up to a terabyte of data per hour for each vehicle. These technologies enable the vehicle to recognize and interpret objects, pedestrians, other vehicles, and road conditions. CAVs use large amounts of these data from image recognition systems, along with machine learning and neural networks, to make the vehicle drive autonomously. If we include an average person’s video, chat, and other internet use, the average driven vehicle can produce between 500 TB and 6000 TB of data in just one year. Present-day vehicles, with just cameras and radar, generate about 12 gigabytes of data every minute. This is even higher for a self-driving vehicle with additional sensors like LiDARs, cameras, etc., creating up to 1 terabyte of data each hour. The approximate data from different sensors are given in Table 9 [42]. This table summarizes the data rates for the different sensors used in autonomous vehicles and highlights the total daily data usage; autonomous vehicles generate about 4000 GB of data each day. The large amount of data that comes from the various built-in IoT devices represents a barrier to the quick adoption of autonomous vehicles. By considering factors such as road conditions, traffic congestion, and energy consumption models, big data aids autonomous vehicles in minimizing idle time, avoiding unnecessary detours, and reducing overall energy consumption by selecting the most efficient routes.
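As a rough cross-check of the per-hour and per-day figures above, the sketch below aggregates sensor data rates from Table 3 over an assumed sensor suite; the sensor counts are illustrative and do not describe any specific vehicle.

```python
# Illustrative aggregation of per-sensor data rates (Mbit/s, from Table 3)
# into a per-hour data volume. Sensor counts are assumed, not vehicle-specific.
rates_mbit_s = {"camera": 500, "lidar": 60, "radar": 5, "ultrasonic": 0.01}
counts       = {"camera": 4,   "lidar": 2,  "radar": 6, "ultrasonic": 12}

total_mbit_s = sum(rates_mbit_s[s] * counts[s] for s in rates_mbit_s)
tb_per_hour = total_mbit_s * 3600 / 8 / 1e6   # Mbit -> MB -> TB
print(f"{total_mbit_s:.0f} Mbit/s -> about {tb_per_hour:.1f} TB per hour of driving")
```

With these modest assumptions the total is already close to 1 TB per hour; with more cameras, or cameras running at the upper end of the Table 3 range, the volume grows several-fold, which is consistent with the roughly 4000 GB per day cited above.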
As the levels of autonomy of the vehicles increase from L1 to L5, the computational requirements increase exponentially, as can be seen from Table 10. These can span from hundreds of gigaFLOPS (floating point operations per second) at L1 to tens of TFLOPS (teraFLOPS) at L2 to hundreds of TFLOPS at L3. At L4 and L5, the required processing power reaches 1000 TFLOPS or more [43].
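The compute requirements in Table 10 can be translated into an approximate electrical load if an efficiency figure (TOPS per watt) is assumed for the processing platform. The sketch below uses an assumed efficiency of 2 TOPS/W, which is purely illustrative; real platforms vary widely with architecture and numeric precision.

```python
# Illustrative mapping from required compute (TOPS, per Table 10) to an
# electrical load in watts. The efficiency figure is an assumption.
required_tops = {"L2+": 30, "L3": 60, "L4": 100, "L5": 1000}
ASSUMED_TOPS_PER_WATT = 2.0   # hypothetical SoC efficiency

for level, tops in required_tops.items():
    watts = tops / ASSUMED_TOPS_PER_WATT
    print(f"{level}: {tops} TOPS -> roughly {watts:.0f} W of compute load")
```

Under this assumption, L5-class compute alone would draw several hundred watts, which is in line with the system-level power figures discussed in Section 3.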

2.3. Connectivity

Many autonomous vehicles communicate with each other and with infrastructure units like traffic lights and road signs. Communications and connectivity enable vehicles to exchange critical information with one another and their surroundings, which enhances the overall efficiency and safety of autonomous driving. The various types of vehicle communication that may exist in autonomous vehicles, some of which were mentioned above, are shown in Figure 5. These are vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), vehicle-to-vehicle (V2V), vehicle-to-grid (V2G), vehicle-to-device (V2D), and vehicle-to-pedestrian (V2P) systems.
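As a purely illustrative sketch, and not a standardized message format such as the SAE J2735 basic safety message, the kind of state information exchanged over the V2V links described above can be represented as a compact record like the following:

```python
# Minimal sketch of a V2V state message; field names are illustrative
# and do not follow any particular V2X standard.
from dataclasses import dataclass, asdict
import json, time

@dataclass
class V2VMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    timestamp: float

msg = V2VMessage("veh-042", 29.7199, -95.3422, 13.4, 87.0, time.time())
payload = json.dumps(asdict(msg)).encode()   # serialized for broadcast
print(len(payload), "bytes per message")
```

Even when broadcast many times per second, such state updates amount to only a few kilobytes per second per vehicle, which is small compared with the onboard sensor data rates in Table 3.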

2.4. Control Systems

There are several controllers in an autonomous vehicle that interpret the data to make the right driving decisions. These decisions include steering, acceleration, and braking, all without human intervention. A recent automotive innovation employs convolutional neural networks (CNNs) to translate the visual data captured by a front-facing camera directly into steering commands for a self-driving vehicle. This comprehensive approach allows the system to autonomously learn steering behavior using minimal human-provided training data. It can effectively navigate various road conditions, including those without clear lane markings, both on local streets and highways. Furthermore, the system shows effectiveness in settings with minimal visual indicators, such as parking lots or unpaved roads.
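A minimal sketch of this camera-to-steering mapping is given below, loosely patterned after published end-to-end driving networks [46]; the layer sizes, input resolution, and single-output regression head are illustrative choices, not the production architecture of any vehicle.

```python
# Illustrative end-to-end steering network: a camera frame in, a steering
# angle out. Layer sizes are loosely inspired by published end-to-end
# driving work [46]; this is a sketch, not a production model.
import torch
import torch.nn as nn

class SteeringCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3), nn.ReLU(),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(100), nn.ReLU(),   # infers the flattened size at first use
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 1),                # predicted steering angle
        )

    def forward(self, x):
        return self.regressor(self.features(x))

frame = torch.randn(1, 3, 66, 200)           # one low-resolution camera frame
print(SteeringCNN()(frame).shape)            # torch.Size([1, 1])
```

Training such a network amounts to regressing recorded human steering angles against the corresponding camera frames, which is why the approach needs relatively little hand-labeled data.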

3. Power Consumption Problems

AVs consume a significant amount of power to run the computers that carry out all the calculations needed to process and analyze the large amounts of data from the onboard sensors and other signals in order to make driving decisions. All the data from the various units need to be merged, sorted, and turned into a comprehensive representation, i.e., a robot-friendly picture of the world, with instructions on how to move through it (fusion). This takes a large amount of computing power, which raises concerns about environmental impact, range limitations, and even the feasibility of widespread adoption. The sensors and computer processing in autonomous vehicles create additional electrical loads, which add to the regular electrical loads (such as lighting, radio, etc.) of the vehicle, thus reducing the range of an automated electric vehicle in comparison to a standard electric vehicle. In addition, some of the sensors located on the body of the vehicle create additional drag and thus contribute to the power consumption. Furthermore, a fully automated vehicle must be fail-safe, from sensors to vehicle control, which adds further electrical loads due to the required redundancies in hardware throughout the vehicle.
Sensors, including LiDARs, consume varying amounts of power depending on their design and functionality. Some sensors have low power requirements, while others require substantial power. It is crucial for sensor manufacturers to optimize their designs for energy efficiency to minimize the impact on the vehicle’s power consumption. For partial autonomy, the minimum number of sensors required could be four to eight of various types. For full autonomy, more than 15 sensors are generally used. In the case of Tesla, there are 20 sensors (8 camera sensors plus 12 ultrasonic sensors for Level 3 or below), with no LiDAR or radar [45,46,47]. L4 vehicles could employ 15–30 cameras, 5–20 radars, and 5–7 LiDARs per vehicle. Together, these sensors can account for a significant share of the vehicle’s total power consumption.
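Using per-unit power figures representative of Tables 4–7 and sensor counts within the ranges quoted above, a rough power budget for the sensor suite of a hypothetical L4 configuration can be sketched as follows; neither the counts nor the per-unit values describe a specific production vehicle.

```python
# Rough sensor-suite power budget for a hypothetical L4 configuration.
# Per-unit powers are representative of Tables 4-7; counts fall within
# the ranges quoted in the text. All values are assumptions.
suite = {
    # sensor: (count, watts per unit)
    "camera":     (20, 2.0),
    "radar":      (10, 5.0),
    "lidar":      (6, 18.0),
    "ultrasonic": (12, 1.0),
}

total_w = 0.0
for name, (count, watts) in suite.items():
    subtotal = count * watts
    total_w += subtotal
    print(f"{name:10s}: {count:2d} x {watts:4.1f} W = {subtotal:5.0f} W")
print(f"total sensor load ~ {total_w:.0f} W")
```

The resulting total of roughly 200 W covers the sensors alone; the onboard computing discussed below typically adds several hundred watts more.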
Research conducted by the University of Michigan and Ford Motor Company examines the environmental footprint, or life-cycle emissions, of autonomous vehicle technologies [47]. The study reported that the hardware for autonomy adds approximately 3 to 4% to the energy usage during vehicle operation, inclusive of the effects from increased drag and weight. Of this 3 to 4%, the majority of the energy consumption is from computer processing power (about 40%), 15% is from added weight, and 10% is from increased drag, among other factors [47,48,49]. These sources of added energy consumption in the Ford Fusion’s autonomy system are shown in Figure 6 [47,48,49].
The power consumption in autonomous vehicles greatly depends on the sensor types and the level of technological development. It can range from an estimated 500 watts for a Tesla to 2.5 kilowatts for an experimental autonomous vehicle with LiDAR and onboard computers [48]. As an example, the Waymo Driver self-driving system, which uses a variety of sensors including LiDAR, cameras, and radar, consumes approximately 1 kW of power. Production versions can range from 200 W to 600 W or even higher, depending on the type of vehicle. This consumption is significant as a percentage of the power required during cruising, particularly at the low speeds typical of urban driving. The power required for autonomy is in addition to the power required for the existing electrical systems in an automobile, which can be in the range of 1.5 to 3 kW, depending on the size of the vehicle.
The primary extra electrical power demand in an automated vehicle is from its on-board computer, which is a vital factor in advancing to Levels 3 to 5 in automation, and it must also be fail-safe. Additional loads, such as laptops or entertainment systems, typically consume between 50 and 100 watts. NVIDIA’s AI computing solution for robotaxis, known as DRIVE, consumes 800 watts of power.

4. Energy Impacts of Autonomous Vehicles

Autonomous driving requires significant data flow and computational power, which in turn reduces the energy available for propulsion in EVs or increases fuel consumption in internal-combustion-based vehicles, leading to additional pollution. The energy impacts of autonomous vehicles can vary greatly based on the following two factors: the degree of automation, i.e., whether partial or full automation is used; and the proportion of shared versus personal autonomous vehicles. It is important to note that the energy impact of intelligent transportation can also vary depending on the specific implementation and context. The subsystems of CAVs (for example, Level 4) could increase vehicle primary energy consumption and greenhouse gas emissions by 3–20%, due to increases in power consumption, weight, drag, and data transmission.
The energy impact of converting different commercially available electric vehicles into autonomous electric vehicles (L4 and L5) is reported in a study presented in [50]. Based on the estimation of the power consumed by different vehicles due to the conversion, the study reported that the energy consumed by the autonomy systems represents about 10% of the total energy stored in a fully charged battery for highway driving, 20% for combined driving, and 30% for city driving. Hence, activating complete autonomous driving at Levels 4 and 5 in a current electric vehicle would decrease its range by approximately 10% on highways, 30% in a city, and 20% in combined driving conditions. The larger decline in cities arises primarily because city driving involves lower speeds and longer durations at an essentially constant autonomy power draw, so more energy is consumed per unit distance. Additionally, city driving requires more computational processing due to the higher density of vehicles, pedestrians, bicycles, and other unpredictable situations.
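The range penalties quoted above can be reproduced with a simple energy balance in which the autonomy load is treated as a constant power draw spread over the distance travelled. The consumption and battery figures below are assumptions chosen to be representative of a mid-size EV, not values taken from [50].

```python
# Illustrative range-impact estimate: the autonomy load becomes extra
# energy per km. All input figures are assumptions for a generic EV.
BATTERY_KWH = 75.0
AUTONOMY_LOAD_KW = 1.5          # assumed sensors + computing load

scenarios = {                   # name: (average speed km/h, consumption kWh/km)
    "highway": (100, 0.18),
    "city":    (30,  0.15),
}

for name, (speed_kmh, kwh_per_km) in scenarios.items():
    extra_kwh_per_km = AUTONOMY_LOAD_KW / speed_kmh   # constant load over distance
    base_range = BATTERY_KWH / kwh_per_km
    av_range = BATTERY_KWH / (kwh_per_km + extra_kwh_per_km)
    loss_pct = 100 * (1 - av_range / base_range)
    print(f"{name}: {base_range:.0f} km -> {av_range:.0f} km ({loss_pct:.0f}% loss)")
```

With these assumed numbers the loss is roughly 8% on the highway and 25% in the city, illustrating why the same autonomy load penalizes low-speed urban driving far more than highway driving.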
A study presented in [51,52] finds that the emissions from the sensors and computers onboard autonomous vehicles have the potential to be comparable to the emissions from all data centers (as of 2020). The emissions from one billion AVs driving 1 h/day, with an average computer power of 840 W, would be equivalent to the emissions from all data centers. The same paper also notes that 100 million AVs driving 1 h/day and consuming a computational power of 3.1 kW would result in emissions equivalent to those of all of Canada in one year (2020). The paper also recommends targets for computer power and rates of hardware energy efficiency improvement for various scenarios.
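The scale of these fleet-level figures can be sanity-checked with simple arithmetic. The sketch below uses an assumed average grid carbon intensity of about 0.4 kg CO2 per kWh, which is not a value taken from [51,52].

```python
# Order-of-magnitude check on fleet-level computing energy and emissions.
# The grid carbon intensity is an assumed average, not from the cited work.
FLEET_SIZE = 1_000_000_000     # one billion AVs, as in the cited scenario
HOURS_PER_DAY = 1
COMPUTER_POWER_KW = 0.840      # 840 W per vehicle
GRID_KGCO2_PER_KWH = 0.4       # assumed average grid carbon intensity

kwh_per_year = FLEET_SIZE * HOURS_PER_DAY * COMPUTER_POWER_KW * 365
mt_co2_per_year = kwh_per_year * GRID_KGCO2_PER_KWH / 1e9   # million tonnes

print(f"~{kwh_per_year / 1e9:.0f} TWh/yr of onboard computing energy")
print(f"~{mt_co2_per_year:.0f} million tonnes CO2/yr (assumed grid mix)")
```

The roughly 300 TWh per year of computing energy implied by this scenario is of the same order as published estimates of global data center electricity use around 2020, which is the comparison drawn in [51,52].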
It is important to note that the energy impact of AVs can vary, depending on the specific implementation and context. Efficient algorithms and hardware accelerators can be employed to reduce the computational load and the associated power consumption. This also involves integrating multiple sensors and their functionalities to eliminate redundancies, improve data fusion, and reduce overall power requirements. Efforts are underway to enhance the energy efficiency of sensors, including LiDARs, by exploring new technologies, miniaturization, and power management strategies. The overall power consumption is optimized by using efficient components, intelligent power management systems, and advanced algorithms to maximize energy efficiency and extend the vehicle’s range or operational time. Continuous advancements in sensor design and integration, along with optimization techniques, aim to strike a balance between the energy impact of sensors and their crucial role in enabling advanced driver assistance systems, autonomous driving, and intelligent transportation applications. However, when the potential operational effects of CAVs are included (e.g., eco-driving, platooning, and intersection connectivity), the net result is up to a 9% reduction in energy and greenhouse gas (GHG) emissions in the base case (mid-size BEV, Ford Fusion) [47,48,49].

5. Industry Attempts to Reduce Power Consumption and Future Strategies

The industry has recognized the substantial power demands of autonomous driving systems, and is addressing the issue by advancing chip design and technology. For example, Nvidia has introduced a processor specifically engineered for self-driving cars, named Jetson Xavier [53]. This processor has an eight-core CPU, a 512-core GPU, a deep learning accelerator, computer vision accelerators, and 8 K video processors. Nvidia claims it to be the most sophisticated system-on-a-chip (SoC) ever developed. Xavier achieves greater efficiency by performing more tasks with less power, delivering 30 trillion operations per second while consuming only 30 watts of power. Utilizing a configuration of two Xavier chips and an additional two GPUs, the system can process 320 trillion operations per second while maintaining power usage at a level of about 500 watts. Additionally, Nvidia’s newer Orin system-on-a-chip can perform up to 254 trillion operations per second [53,54].
A few of the available system-on-a-chip technologies and their functionalities, architecture, performance parameters, power consumption, memory, interconnectivity, software used, and target applications are listed in Table 11. In August 2019, Tesla announced a 260 square millimeter piece of silicon, with 6 billion transistors, that the company claims offers 21 times the performance of the Nvidia chips it was previously using [55,56]. This chip is capable of 36 trillion operations per second at a power of 72 W, allowing the company to maintain the same range from each car without increasing the cost. Since this chip is dedicated to use in Tesla cars, it can achieve better performance with lower power consumption. Tesla uses two such chips for full self-driving (FSD) operation.
Tesla unveiled the new D1 chip, a cutting-edge AI-specific chip capable of achieving 362 TFLOPs with a power consumption of about 70 W. The chip features 354 training nodes that form a network of functional units, interconnected to enable substantial computational capabilities. Each functional unit is equipped with a quad-core, 64-bit Instruction Set Architecture (ISA) CPU, specifically designed for tasks such as transpositions, compilations, broadcasts, and link traversal. This CPU utilizes a superscalar architecture, including 4-wide scalar and 2-wide vector pipelines.
In 2023, Tesla began production of Dojo, a supercomputer designed specifically for processing and recognizing computer vision video data [57]. Dojo is used for intensive training of Tesla’s machine learning models to enhance its full self-driving advanced driver-assistance system. Dojo aims to efficiently process the vast amounts of video data captured in real-world driving scenarios from over four million Tesla vehicles. Dojo has over an exaflop (a million teraflops) of computing power and incorporates the developed D1 chip [53,54,55,56,57,58,59,60].
Intel is developing low-power chips optimized for self-driving cars, and Tesla has developed its own chip for Autopilot. Mobileye, a subsidiary of Intel, is developing the EyeQ5, a system-on-a-chip (SoC) designed for autonomous vehicles [61]. It is the fifth generation of the EyeQ family of SoCs from Mobileye. The EyeQ5 is designed to be scalable and can support a variety of autonomous driving systems, from Level 2 to Level 5, making it a versatile solution for a wide range of autonomous vehicle manufacturers. It delivers 24 trillion operations per second, supports up to 20 high-resolution cameras, and has a power consumption of about 25 W. It uses deep learning accelerators and dedicated hardware for computer vision algorithms, making it a powerful and efficient SoC specifically designed to meet the demands of autonomous driving.
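Using only the performance and power figures quoted in this section, the relative efficiency of these platforms can be compared in TOPS per watt. Note that vendors report performance at different numeric precisions and under different conditions, so the comparison below is indicative only.

```python
# Indicative TOPS-per-watt comparison using the figures quoted in the text.
# Vendors use different precisions and test conditions; treat as rough.
chips = {                      # name: (TOPS, watts)
    "Nvidia Xavier":  (30, 30),
    "Tesla FSD chip": (36, 72),
    "Mobileye EyeQ5": (24, 25),
}

for name, (tops, watts) in chips.items():
    print(f"{name:15s}: {tops / watts:.2f} TOPS/W")
```

Efficiency in this sense, rather than peak throughput alone, is what ultimately determines how much autonomy capability can be delivered within a vehicle's electrical and thermal budget.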

6. Conclusions

Intelligent transportation systems can have both positive and negative impacts on the energy consumption of vehicles. The power required by sensors and additional computational power may decrease the range of AVs compared to standard electric cars, thereby increasing energy consumption and potentially leading to additional emissions. The subsystems of Level 4 and Level 5 CAVs could increase vehicles’ primary energy use and greenhouse gas emissions by 3–20%, due to their higher power consumption, added weight, increased drag, and the need for more data transmission, which also has an environmental impact.
On the positive side, AVs contribute to traffic flow optimization by dynamically adjusting signal timings, managing lane usage, and providing real-time traffic information to drivers. Intelligent systems recommend the most efficient routes by considering real-time traffic data, including congestion, road closures, and accidents. By guiding vehicles through less congested routes, travel distances are reduced, and vehicles can maintain a more consistent speed, leading to smoother and more fuel-efficient travel. AVs also offer eco-driving assistance, providing feedback on driving behaviors such as excessive acceleration or braking. By encouraging smoother driving patterns and reducing unnecessary acceleration and braking, these systems help lower overall energy consumption. By monitoring and analyzing data from numerous sensors and systems, AVs can detect anomalies or patterns indicative of potential failures. This proactive approach helps to avoid sudden breakdowns, reduces downtime, and optimizes maintenance schedules, thus efficiently managing energy consumption and preventing unexpected issues that may lead to energy wastage. Given the number of benefits that could be achieved by using AVs, a small effect on the range may not be a major concern. It is expected that the energy savings attributable to the improved driving performance and enhanced traffic flows enabled by CAVs will outweigh the additional energy required for data processing and transmission. With advances in computational processing capability and in the technology of sensors, cameras, and other devices, combined with the operational advantages of CAVs such as eco-driving, platooning, and intersection connectivity, a net decrease in energy usage and greenhouse gas emissions is possible. By 2050, the efficiency improvement in connected and autonomous vehicles could be as high as 4 to 10 percent. In addition, these vehicles give drivers a safe, reliable, and informed experience behind the wheel.
Autonomous electric vehicles will soon be the standard to be followed in the automotive industry. Companies like Tesla, GM, Volkswagen, BMW, and Waymo represent a selection of manufacturers and tech companies that are committed to producing all-electric autonomous vehicles in the coming years. Mercedes-Benz has already started selling Level 3 autonomous cars in the United States.

Author Contributions

Conceptualization, K.R. and S.K.; methodology, K.R.; software, S.K.; validation, K.R. and S.K.; formal analysis, K.R.; investigation, K.R.; resources, S.K.; data curation, S.K.; writing—original draft preparation, K.R.; writing—review and editing, K.R.; visualization, S.K.; supervision, K.R.; project administration, K.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Dickert, C. The Drive for a Fully Autonomous Car. 2023. Available online: https://www.visualcapitalist.com/the-drive-for-a-fully-autonomous-car (accessed on 6 March 2024).
  2. Synopsys. What Is an Autonomous Car? Available online: https://www.synopsys.com/automotive/what-is-autonomous-car.html (accessed on 6 March 2024).
  3. Center for Sustainable Systems, University of Michigan. Autonomous Vehicles Factsheet. Pub. No. CSS16-18. 2023. Available online: https://css.umich.edu/factsheets/autonomous-vehicles-factsheet (accessed on 6 March 2020).
  4. Society of Automotive Engineers, SAE Levels of Driving Automation™ Refined for Clarity and International Audience. 2021. Available online: https://www.sae.org/blog/sae-j3016-update (accessed on 6 March 2024).
  5. Tahir, M.N.; Leviäkangas, P.; Katz, M. Connected Vehicles: V2V and V2I Road Weather and Traffic Communication Using Cellular Technologies. Sensors 2022, 22, 1142. [Google Scholar] [CrossRef] [PubMed]
  6. “Connected and Automated Vehicles”, State of Wisconsin Department of Transportation. Available online: https://wisconsindot.gov/Pages/projects/multimodal/cav.aspx (accessed on 6 March 2024).
  7. Rajashekara, K. Why Choose Electric Connected and Autonomous Vehicles, and Energy Impacts. In Proceedings of the 2019 World Intelligent Connected Vehicles Conference, Beijing, China, 22–24 October 2019. [Google Scholar]
  8. Vargas, J.; Alsweiss, S.; Toker, O.; Razdan, R.; Santos, J. An Overview of Autonomous Vehicles Sensors and Their Vulnerability to Weather Conditions. Sensors 2021, 21, 5397. [Google Scholar] [CrossRef] [PubMed]
  9. Rahman, M.; Thill, J.-C. Impacts of connected and autonomous vehicles on urban transportation and environment: A comprehensive review. Sustain. Cities Soc. 2023, 96, 104649. [Google Scholar] [CrossRef]
  10. Silva, Ó.; Cordera, R.; González-González, E.; Nogués, S. Environmental impacts of autonomous vehicles: A review of the scientific literature. Sci. Total Environ. 2022, 830, 154615. [Google Scholar] [CrossRef] [PubMed]
  11. Othman, K. Exploring the implications of autonomous vehicles: A comprehensive review. Innov. Infrastruct. Solut. 2022, 7, 165. [Google Scholar] [CrossRef]
  12. Ahmed, H.U.; Huang, Y.; Lu, P.; Bridgelall, R. Technology Developments and Impacts of Connected and Autonomous Vehicles: An Overview. Smart Cities 2022, 5, 382–404. [Google Scholar] [CrossRef]
  13. Massar, M.; Reza, I.; Rahman, S.M.; Abdullah, S.M.H.; Jamal, A.; Al-Ismail, F.S. Impacts of Autonomous Vehicles on Greenhouse Gas Emissions—Positive or Negative? Int. J. Environ. Res. Public Health 2021, 18, 5567. [Google Scholar] [CrossRef] [PubMed]
  14. Silva, Ó.; Cordera, R.; González-González, E.; Nogués, S. Connected & autonomous vehicles—Environmental impacts—A review. Sci. Total Environ. 2020, 712, 135237. [Google Scholar] [CrossRef] [PubMed]
  15. Lee, J.; Kockelman, K.M. Energy Implications of Self-Driving Vehicles. In Proceedings of the 98th Annual Meeting of the Transportation Research Board, Washington, DC, USA, 13–17 January 2019. [Google Scholar]
  16. Taiebat, M.; Brown, A.L.; Safford, H.R.; Qu, S.; Xu, M. A Review on Energy, Environmental, and Sustainability Implications of Connected and Automated Vehicles. Environ. Sci. Technol. 2018, 52, 11449–11465. [Google Scholar] [CrossRef] [PubMed]
  17. Ross, C.; Kuhathakurta, S. Autonomous Vehicles and Energy Impacts: A Scenario Analysis. In Proceedings of the World Engineers Summit—Applied Energy Symposium & Forum: Low Carbon Cities & Urban Energy Joint Conference, WES-CUE 2017, Singapore, 19–21 July 2017. [Google Scholar]
  18. Elsagheer Mohamed, S.A.; Alshalfan, K.A.; Al-Hagery, M.A.; Ben Othman, M.T. Safe Driving Distance and Speed for Collision Avoidance in Connected Vehicles. Sensors 2022, 22, 7051. [Google Scholar] [CrossRef] [PubMed]
  19. Wright, S. Autonomous Cars Generate More Than 300 TB of Data per Year. 2021. Available online: https://www.tuxera.com/blog/autonomous-cars-300-tb-of-data-per-year/ (accessed on 6 March 2024).
  20. Data Brief. Available online: https://www.st.com/en/solutions-reference-designs/image-sensing-solutions.html#tools-software (accessed on 6 March 2024).
  21. Data Brief. Available online: https://www.sony-semicon.com/files/62/flyer_security/IMX307LQD_LQR_Flyer.pdf (accessed on 6 March 2024).
  22. Data Brief. Available online: https://pdf1.alldatasheet.com/datasheet-pdf/download/312432/OMNIVISION/OV10620.htm (accessed on 6 March 2024).
  23. Data Brief. Available online: https://www.onsemi.com/products/sensors/imagesensors/AR0231AT (accessed on 6 March 2024).
  24. Data Sheet. Available online: https://autonomoustuff.com/-/media/Images/Hexagon%20/Hexagon%20Core/autonomousstuff/pdf/velodyne-alpha-prime-datasheet.ashx (accessed on 6 March 2024).
  25. Data Sheet. Available online: https://hexagondownloads.blob.core.windows.net/public/AutonomouStuff/wp-content/uploads/2019/05/ibeo_LUX_datasheet_whitelabel.pdf (accessed on 6 March 2024).
  26. Data Sheet. Available online: https://www.generationrobots.com/media/Robosense-RS-LiDAR-M1-solid-state-datasheet.pdf (accessed on 6 March 2024).
  27. Data Sheet. Available online: https://www.oxts.com/wp-content/uploads/2021/01/Hesai-Pandar40P_Brochure.pdf (accessed on 6 March 2024).
  28. Data Sheet. Available online: https://www.livoxtech.com/tele-15/specs (accessed on 7 March 2024).
  29. Data Sheet. Available online: https://epan-utbm.github.io/utbm_robocar_dataset/docs/140119_FS_ARS-308-21_EN_HS.pdf (accessed on 7 March 2024).
  30. Radars Comparison Chart. Available online: https://autonomoustuff.com/radar-chart (accessed on 6 March 2024).
  31. Data Sheet. Available online: https://www.yumpu.com/en/document/read/32290786/datasheet-long-range-radar-sensor-pdf-15364-kb-bosch- (accessed on 6 March 2024).
  32. Data Sheet. Available online: https://hexagondownloads.blob.core.windows.net/public/AutonomouStuff/wp-content/uploads/2020/06/radar-comp-chart.pdf (accessed on 6 March 2024).
  33. Data Sheet. Available online: https://autonomoustuff.com/products/aptiv-esr-2-5-24v (accessed on 6 March 2024).
  34. Data Sheet. Available online: https://www.aircraftspruce.com/catalog/pdf/servicebulletin.pdf (accessed on 21 February 2024).
  35. Data Sheet. Available online: https://www.nxp.com/products/no-longer-manufactured/3-axis-digital-gyroscope:FXAS21002C (accessed on 21 February 2024).
  36. Data Sheet. Available online: https://www.st.com/resource/en/data_brief/vl6180x.pdf (accessed on 21 February 2024).
  37. Data Sheet. Available online: https://www.te.com/content/dam/te-com/documents/sensors/global/te-sensor-solutions-catalog.pdf (accessed on 6 March 2024).
  38. Data Sheet. Available online: https://www.digikey.ch/de/products/detail/u-blox/LEA-6T-0/9817982 (accessed on 6 March 2024).
  39. Data Sheet. Available online: https://info.intech.trimble.com/bd992-datasheet (accessed on 6 March 2024).
  40. Data Sheet. Available online: https://novatel.com/products/receivers/gnss-gps-receiver-boards/oem729 (accessed on 6 March 2024).
  41. Data Sheet. Available online: https://www.septentrio.com/en/products/gps/gnss-boards/asterx-m3-pro-plus (accessed on 6 March 2024).
  42. Intellias. How Big Data in Autonomous Vehicles Defines the Future. 2023. Available online: https://intellias.com/how-big-data-in-autonomous-vehicles-defines-the-future/ (accessed on 6 March 2024).
  43. EEweb. The Next Steps to Consider as Vehicles Move toward Autonomy. 2022. Available online: https://www.eeweb.com/the-next-steps-to-consider-as-vehicles-move-toward-autonomy/?utm_source=newsletter&utm_campaign=link&utm_medium=PowerElectronicsNewsWeekly-20220803 (accessed on 6 March 2024).
  44. Lee, D.; Camacho, D.; Jung, J.J. Smart Mobility with Big Data: Approaches, Applications, and Challenges. Appl. Sci. 2023, 13, 7244. [Google Scholar] [CrossRef]
  45. Koon, J. How Many Sensors for Autonomous Driving. Semiconductor Engineering. 2023. Available online: https://semiengineering.com/how-many-sensors-for-autonomous-driving/ (accessed on 6 March 2024).
  46. Bojarski, M.; Firner, B.; Flepp, B.; Jackel, L.; Muller, U.; Zieba, K.; Del Testa, D. “End-to-End Deep Learning for Self-Driving Cars” Nvidia Developer. 2016. Available online: https://developer.nvidia.com/blog/deep-learning-self-driving-cars/ (accessed on 7 March 2024).
  47. Mitchell, O. Self-Driving Cars Have Power Consumption Problems. The Robot Report. 2018. Available online: https://www.therobotreport.com/self-driving-cars-power-consumption/ (accessed on 18 March 2024).
  48. Available online: https://www.designnews.com/electric-vehicles/how-autonomous-driving-affects-heat-loads-and-component-sizing-in-electric-vehicles (accessed on 10 April 2024).
  49. Gawron, J.H.; Keoleian, G.A.; De Kleine, R.D.; Wallington, T.J.; Kim, H.C. Life Cycle Assessment of Connected and Automated Vehicles: Sensing and Computing Subsystem and Vehicle Level Effects. Environ. Sci. Technol. 2018, 52, 3249–3256. [Google Scholar] [CrossRef] [PubMed]
  50. Teraki. Autonomous Cars’ Big Problem: The Energy Consumption of Edge Processing Reduces a Car’s Mileage with Up to 30%. 2019, Medium. Available online: https://medium.com/@teraki/energy-consumption-required-by-edge-computing-reduces-a-autonomous-cars-mileage-with-up-to-30-46b6764ea1b7 (accessed on 6 March 2024).
  51. Sudhakar, S.; Sze, V.; Karaman, S. Data Centers on Wheels: Emissions from Computing Onboard Autonomous Vehicles. IEEE Micro 2023, 43, 29–39. [Google Scholar] [CrossRef]
  52. Sudhakar, S.; TEDxBoston. Data Centers on Wheels: The Carbon Footprint of Self-Driving Cars. 2022. Available online: https://www.ted.com/talks/soumya_sudhakar_data_centers_on_wheels_the_carbon_footprint_of_self_driving_cars (accessed on 6 March 2024).
  53. Karumbunathan, L.S. NVIDIA Jetson AGX Orin Series Technical Brief v1.2. Available online: https://www.nvidia.com/content/dam/en-zz/Solutions/gtcf21/jetson-orin/nvidia-jetson-agx-orin-technical-brief.pdf (accessed on 6 March 2024).
  54. Global ADAS/AD Chip Industry Research Report 2022: In Addition to Computing Power, Self-Developed Core IP Is the Focus of Competition for Major SoC Vendors, Research and Markets, 2022, Global Newswire. Available online: https://www.globenewswire.com/en/news-release/2022/05/26/2450958/28124/en/Global-ADAS-AD-Chip-Industry-Research-Report-2022-In-addition-to-Computing-Power-Self-Developed-Core-IP-is-the-Focus-of-Competition-for-Major-SoC-Vendors.html (accessed on 6 March 2024).
  55. Dominguez, D. Tesla Introduces D1 Dojo Chip to Train AI Models. 2021. Available online: https://www.infoq.com/news/2021/09/tesla-dojo-ai-models/ (accessed on 18 March 2024).
  56. Morgan, T.P. Inside Tesla’s Innovative and Homegrown “Dojo” AI Supercomputer. 2022. Available online: https://www.nextplatform.com/2022/08/23/inside-teslas-innovative-and-homegrown-dojo-ai-supercomputer/ (accessed on 6 March 2024).
  57. Kulkarni, A. Tesla Reveals Whitepaper on Dojo Supercomputer. 2021. Available online: https://analyticsdrift.com/tesla-reveals-whitepaper-on-dojo-supercomputer/ (accessed on 6 March 2024).
  58. Tesla Hardware 3 (Full Self-Driving Computer) Detailed, Autopilot Review. Available online: https://www.autopilotreview.com/tesla-custom-ai-chips-hardware-3/ (accessed on 20 March 2024).
  59. Pogla, M. Tesla’s AI Supercomputer a Game Changer in Autonomous Driving. 2023. Available online: https://autogpt.net/teslas-ai-supercomputer-a-game-changer-in-autonomous-driving/ (accessed on 6 March 2024).
  60. The System-on-Chip for Automotive Applications. Mobile Eye. Available online: https://www.mobileye.com/technology/eyeq-chip/ (accessed on 6 March 2024).
  61. Driving the Autonomous Vehicle Evolution. Available online: https://www.mobileye.com/ (accessed on 6 March 2024).
Figure 1. A typical connected vehicle system [5].
Figure 2. Connected automated vehicles.
Figure 3. Location of the sensors and autonomous vehicle technologies [8].
Figure 4. ADAS 360° vision sensors and applications [18].
Figure 5. Various types of communications that may exist in connected autonomous vehicles [44].
Figure 6. Sources of added energy consumption from Ford’s Fusion’s autonomy system.
Table 1. Autonomous vehicle level classifications, as defined by SAE.
Level | Description | Explanation
0 | No Automation | Manual control. The driver performs all the tasks such as steering, acceleration, braking, etc. The driver is responsible for all aspects of dynamic driving.
1 | Driver Assistance | The vehicle features a single automated system (for example, it monitors speed through cruise control). The driver must remain engaged and monitor the environment at all times.
2 | Partial Automation | The vehicle features multiple automated systems (e.g., cruise control and lane-keeping). The driver must remain engaged and monitor the environment at all times.
3 | Conditional Automation | The vehicle can perform most driving tasks, but human override is still required. Environmental detection capabilities are present. The driver must be ready to take control when requested.
4 | High Automation | The vehicle performs all driving tasks under specific circumstances. Human override is still an option, but the system can handle the majority of situations independently. Mostly limited to certain speeds and certain geographical locations.
5 | Full Automation | The vehicle performs all driving tasks under all conditions. Zero human attention or interaction is required. The system is capable of managing all driving scenarios without any human intervention.
Table 2. Different sensors used in ADAS, based on Figure 4.
Application | Sensor Type
Surround view | Camera
Park assistance | Camera
Blind spot detection | Radar/LiDAR
Rear collision warning | Radar/LiDAR
Cross traffic alert | Radar/LiDAR
Emergency braking | Radar/LiDAR
Pedestrian detection | Radar/LiDAR
Collision avoidance | Radar/LiDAR
Traffic sign recognition | Camera
Adaptive cruise control | Radar/LiDAR
Lane departure warning | Camera
Table 3. Estimates of the data generated by each type of sensor.
Sensor | Average Data Generated
Radar | 0.1–15 Mbit/s/sensor
LiDAR | 20–100 Mbit/s/sensor
Camera | 500–3500 Mbit/s/sensor
Ultrasonic | 0.01 Mbit/s/sensor
Vehicle motion, GNSS | 0.1 Mbit/s/sensor
Table 4. Various camera models used in autonomous vehicles and their power consumption.
Model | Power Consumption
Sony IMX307 | 1.5 W
OmniVision OV10A20 | 1 W
ON Semiconductor AR0231 | 2 W
STMicroelectronics D55G1 | 3 W
Table 5. Examples of LiDAR models and their power usage.
Model | Power Consumption
Velodyne Alpha Prime | 22 W
Ouster OS0 | 14–20 W
RoboSense RS-LiDAR-M1 | 18 W
Hesai Pandar40P | 18 W
Livox Tele-15 | 12 W
Table 6. Radar manufacturers for autonomous vehicles and their power consumption.
Model | Power Consumption
Continental ARS 408-2 | 6.6 W
Bosch LRR3 | 4.0 W
Aptiv SRR2 | 6.0 W
Aptiv MRR | 4.5 W
smartmicro UMRR-0A Type 29 | 3.7 W
Table 7. Ultrasonic sensor manufacturers and their power consumption.
Model | Power Consumption
Continental USR2-3P | 0.5 W
NXP Semiconductors FXAS21002 | 0.7 W
STMicroelectronics VL6180X | 1 W
TE Connectivity SENSONICS USI-60 | 2 W
Table 8. Examples of GPS/GNSS receivers used in autonomous vehicles and their power consumption.
Model | Power Consumption
u-blox LEA-6T | 0.5 W
Trimble BD992 | 0.7 W
NovAtel OEM729 | 0.9 W
AsteRx-m3 Pro+ | 1.8 W
Table 9. Data usage by autonomous vehicles [42].
Sensor | Data Rate (KB per Second)
Cameras | 20–40 KB
Radar | 10–100 KB
Sonar | 10–100 KB
GPS | 50 KB
LiDAR | 10–70 KB
Table 10. Computational requirements at different autonomy levels.
Level | Description | TOPS | Key Feature | User State
L1–L2+ | Semi-autonomous, partially automated driving | 0–30 | People-oriented, simple system control functions | FEET-OFF
L3 (30–60 TFLOPS) | Conditional autonomous driving | 30–60 | System-oriented, with driver intervention | HANDS-OFF
L4 (>100 TFLOPS) | High autonomous driving | >100 | System-oriented, optional manual override | EYES-OFF
L5 (1000 TFLOPS?) | Full autonomous driving | About 1000 | Fully automated, driver not required | MIND-OFF
TFLOPS: trillion floating point operations per second.
Table 11. Various system-on-a-chip technologies available and their functionalities, architecture, performance parameters, power consumption, memory, interconnectivity, software used, and target applications [53,54,55,56,57,58,59,60].
Feature | Nvidia Orin SoC | Tesla Dojo Chip | D1 Chip | Mobileye EyeQ5
Focus | Autonomous driving, ADAS, robotics | AI training and inference | High-performance computing | Autonomous driving, ADAS
Architecture | Arm Cortex-A78AE CPUs, NVIDIA Ampere GPUs | Custom Arm CPUs, custom AI accelerators | Arm Neoverse V1 CPUs, Imagination BXT-4 GPUs | Arm Cortex-A76AE CPUs, Arm Mali-G78 GPUs
Performance | Up to 275 TOPS (INT8) | Up to 1.3 exaFLOPS (FP16) | Up to 360 TOPS (BFLOAT16) | Up to 24 TOPS (INT8)
Power consumption | 59 W (typical) | Power capability varies | 70 W (typical) | 25 W (typical)
Memory | LPDDR5, HBM2e | HBM3 | LPDDR5 | LPDDR4X
Interconnectivity | PCIe Gen4, NVLink | NVSwitch | PCIe Gen4, CXL | PCIe Gen4, Ethernet
Software | Nvidia DRIVE Orin Platform (https://www.nvidia.com/en-us/self-driving-cars/in-vehicle-computing/) | TensorFlow (https://www.tensorflow.org/), PyTorch (https://pytorch.org/), Triton Inference Server (https://www.nvidia.com/en-us/ai-data-science/products/triton-inference-server/) | Open-source software stack (https://www.openstack.org/) | Mobileye EyeQ software suite (https://www.mobileye.com/technology/eyeq-chip/)
Target applications | Level 2+ ADAS, Level 3–5 autonomous driving, robotics | High Performance Computing (HPC), AI training, large language models, scientific computing | HPC, AI inference, edge computing | ADAS, Level 2+ and Level 3 autonomous driving
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
