Overview of Sensor Applications in Intelligent Driving

Research Article
Open access


Shuaikun Yan 1*
  • 1 School of Electrical Engineering, Sichuan University, Chengdu, China    
  • *corresponding author shuaikunyan2002@outlook.com
Published on 26 November 2024 | https://doi.org/10.54254/2755-2721/80/2024CH0086
ACE Vol.80
ISSN (Print): 2755-273X
ISSN (Online): 2755-2721
ISBN (Print): 978-1-83558-561-0
ISBN (Online): 978-1-83558-562-7

Abstract

With the rapid development of autonomous driving technology, sensors have become the core components of intelligent driving systems, and their selection and combination are crucial to system safety and efficiency. This paper reviews the applications of different types of sensors in intelligent driving systems, including cameras, LIDAR, millimeter-wave radar, and infrared sensors. By analyzing the best sensor choices for urban traffic, highways, night driving, and off-road driving, this paper explores the effectiveness of sensor combinations, drawing on relevant literature from the past five years.

Keywords:

Sensor, intelligent driving, fusion sensor.

Yan, S. (2024). Overview of Sensor Applications in Intelligent Driving. Applied and Computational Engineering, 80, 196-202.

1. Introduction

The progress of autonomous driving technology relies on the effective combination of multiple sensors to achieve comprehensive environmental perception. According to the standards of the Society of Automotive Engineers International (SAE), the classification of autonomous driving technology ranges from L0 (completely manual) to L5 (completely automatic), and the comprehensive application of sensors is particularly crucial in L2 and above.

The requirements for sensors differ across driving scenarios. Take urban traffic as an example: there are not only pedestrians, cyclists, and other vehicles but also complex traffic facilities such as traffic lights and signs, so the combined use of multiple sensors is especially significant in this environment. LIDAR can provide high-precision 3D environmental modeling, while cameras can identify traffic signs, signal lights, and the dynamic behavior of other vehicles. Millimeter-wave radar performs well under adverse weather conditions, ensuring system stability in rain and snow, and infrared sensors provide good night-vision capability.

In the scenario of highway driving, the main task of the autonomous driving system is to maintain lane driving and safely overtake. In this case, the ability for long-distance detection is particularly critical, so high-performance radars and cameras become the preferred choices. Radars can detect distant obstacles and the speed and distance of other vehicles in this environment, thereby providing the necessary reaction time for the autonomous driving system. Additionally, cameras also play a significant role in identifying traffic signs and lane lines.

Night driving is equally challenging. In conditions of insufficient light, the effectiveness of traditional cameras is greatly reduced, while infrared sensors and LIDAR can provide clearer perception capabilities in the night environment. Infrared sensors can detect heat sources, such as pedestrians and animals, thereby enhancing safety. LIDAR relies on the reflection of laser beams and can form detailed environmental images in the dark, providing necessary navigation information for autonomous driving vehicles.

In the scenario of off-road driving, the complexity of the environment increases, with rugged terrain and various natural obstacles. In this case, a combination of sensors that can adapt to multiple terrains and environmental changes must be selected. LIDAR and high-resolution cameras can provide accurate terrain data in this scenario and assist the autonomous driving system in path planning.

The selection and integration of sensors are of paramount importance for achieving efficient and reliable autonomous driving. This article will explore the optimal sensor choices in urban traffic, highway driving, night driving, and off-road driving.

2. Sensor types

In intelligent driving systems, sensors are an essential part of environmental perception: they provide detailed information about the surroundings, underpinning safety and autonomous decision-making. The commonly used types include cameras, LIDAR, millimeter-wave radar, and infrared sensors. Each sensor has a different working principle and application scenario, with distinct advantages and limitations in different environments.

2.1. Camera

The camera is one of the most commonly used sensors in intelligent driving vehicles. It provides visual information to the vehicle by capturing images, much like the human eye. Cameras are good at identifying objects, lane lines, traffic lights, and road signs, and can generate high-resolution images suitable for recognizing static objects. However, cameras perform poorly in low-light environments (such as at night or in fog) and lack direct distance perception.

2.2. LIDAR

LIDAR emits laser pulses and measures their return time to generate high-precision 3D maps of the vehicle's surroundings, excelling at spatial modeling and object-contour recognition. Its advantage is that it provides accurate distance information, but it is expensive and its performance degrades in bad weather (such as rain and fog).
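The ranging principle is a simple time-of-flight calculation: the pulse travels to the target and back at the speed of light, so the distance is half the round-trip time multiplied by c. A minimal sketch (the pulse timing used below is an illustrative value, not from any particular sensor):

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_s: float) -> float:
    """Distance to a target from a LIDAR pulse's round-trip time."""
    # The pulse covers the distance twice: out to the target and back.
    return C * round_trip_s / 2.0

# A return delay of ~667 ns corresponds to a target roughly 100 m away.
print(round(lidar_range(667e-9), 1))
```

The short flight times involved (nanoseconds per meter) are one reason LIDAR requires precise, and therefore costly, timing electronics.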

2.3. Millimeter-wave radar

By transmitting electromagnetic waves and receiving the reflected signals, millimeter-wave radar provides information about the distance, speed, and angle of objects, and is especially suitable for detecting distant objects at high driving speeds. Compared with LIDAR, millimeter-wave radar is more stable in bad weather, has strong penetration, and can gather environmental information in rain, fog, and similar conditions.
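The speed measurement rests on the Doppler effect: a radially moving target shifts the frequency of the reflected wave, and the radial speed follows as v = f_d · λ / 2, with the factor of two accounting for the two-way path. A small sketch, assuming a 77 GHz automotive radar and a hypothetical measured shift:

```python
C = 299_792_458.0  # speed of light, m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial speed of a target from the measured Doppler shift."""
    wavelength = C / carrier_hz                 # ~3.9 mm at 77 GHz
    return doppler_shift_hz * wavelength / 2.0  # factor 2: two-way path

# A ~15.4 kHz shift at 77 GHz corresponds to roughly 30 m/s (~108 km/h).
print(round(radial_speed(15.4e3), 1))
```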

2.4. Infrared sensor

Infrared sensors perceive objects by detecting the heat they emit, which is especially useful at night or in low-light environments. They outperform cameras in dark conditions, but cannot provide accurate object-shape information and are therefore often used in combination with other sensors.

3. Sensor application selection in different scenarios

3.1. Urban traffic

In the urban traffic environment, the challenges facing a vehicle come not only from other vehicles but also from complex, diverse dynamic elements such as pedestrians, cyclists, and traffic signals. Cameras are therefore predominant in urban traffic: they provide high-definition visual information for identifying and classifying complex traffic scenarios, which is crucial for accurate autonomous-driving decisions. The camera's weakness, however, is its vulnerability to lighting conditions, with particularly poor performance in insufficient light. Wei et al. propose that the long-range, all-weather characteristics of millimeter-wave radar make it well suited to dynamic object detection in urban traffic, while the camera contributes high-resolution image information for identifying traffic signals, pedestrians, and vehicles; this multi-sensor fusion can effectively compensate for the limitations of a single sensor, especially in busy urban traffic [1]. To make up for the camera's deficiency, millimeter-wave radar is usually paired with it in complex urban traffic. The radar provides all-weather object detection: on congested roads it can accurately measure the distance and relative speed of surrounding vehicles and pedestrians, and its penetration capability delivers stable performance in rain, fog, and other low-visibility situations. The combination of cameras and millimeter-wave radar thus provides stable and efficient environmental perception for autonomous driving.

In some advanced autonomous driving systems, LIDAR acts as an enhancing sensor that provides high-precision three-dimensional environmental perception. This is particularly important in complex urban traffic: in narrow streets, building-dense areas, or crowded intersections, LIDAR's 3D modeling can accurately detect objects around the vehicle and help the system better understand its environment [2]. Although LIDAR is expensive, its performance in dense, dynamic traffic cannot be fully replaced by other sensors, especially when high-precision distance measurements are required. Moreover, Abdu et al. argue that although LIDAR provides highly accurate three-dimensional maps, its cost confines it mainly to scenes requiring high-precision perception, while the camera-plus-millimeter-wave-radar solution excels at balancing cost and performance [3]. The camera and millimeter-wave radar combination is therefore the most common and economical choice in urban traffic, with LIDAR as a supplement where high precision is needed.

In urban traffic, multi-sensor fusion lets the autonomous driving system draw information from different sensor types to improve overall perception. Cameras provide high-resolution visual information, millimeter-wave radar contributes all-weather long-range detection, and LIDAR raises perceptual accuracy through three-dimensional modeling. This combination improves system robustness, ensuring that autonomous vehicles can cope with complex, changeable urban environments and reduce the risk of accidents.

3.2. Highways

Highways are characterized by high speeds and long travel distances, so sensors need long-range detection capability and must quickly and accurately determine the relative speed of obstacles and the vehicle ahead. In this setting, millimeter-wave radar is considered the optimal option: it provides a detection range of several hundred meters and accurately measures target speed, which is crucial at highway velocities. It also maintains good performance in severe weather (such as heavy rain or fog), which is why it is widely used in highway scenarios.
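Why a several-hundred-meter range matters can be shown with a back-of-the-envelope stopping-distance calculation: the vehicle keeps moving during the system's reaction time and then decelerates. The reaction time and deceleration below are illustrative assumptions, not figures from the cited literature:

```python
def stopping_distance(speed_mps: float, reaction_s: float = 1.5,
                      decel_mps2: float = 6.0) -> float:
    """Distance covered during the reaction interval plus braking to a halt."""
    reaction_dist = speed_mps * reaction_s            # constant speed
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)  # v^2 / (2a)
    return reaction_dist + braking_dist

# At 120 km/h (~33.3 m/s) the vehicle needs on the order of 140 m to stop,
# comfortably within a radar's detection range of several hundred meters.
print(round(stopping_distance(120 / 3.6), 1))
```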

In the highway scene, the sensors need to have long-range detection capabilities and rapid response capabilities. Abdu, F.J. et al. proposed that millimeter-wave radar is a core sensor on highways because of its long-distance detection and speed measurement capabilities when driving at high speeds [3]. In addition, cameras are used to detect lane lines, signs and traffic signals to ensure that vehicles are driving in the right lane.

Although the millimeter-wave radar is the main sensor on the highway, it has a low angular resolution, making it difficult to accurately identify the contour of objects. At this point, the 3D modeling capability of the LIDAR plays a complementary role. Especially when the vehicle requires high-speed obstacle avoidance, the LIDAR can provide accurate object distance and profile information. At the same time, the camera can identify important visual information such as traffic signs and lane lines to ensure that the vehicle can accurately drive along the right lane at high speeds.

At the same time, Wei et al. point out that by combining millimeter-wave radar and LIDAR, the autonomous driving system can achieve more accurate environmental perception at high speed; in high-speed lane-change and obstacle-avoidance scenarios in particular, LIDAR provides very high spatial accuracy [1].

In the highway scene, sensor fusion technology can combine the long-range detection of millimeter-wave radar, the accurate modeling of LIDAR and the visual recognition ability of the camera to ensure that the autonomous driving system has a full range of environmental perception ability at high speeds. For example, millimeter-wave radar can detect distant vehicles and obstacles, while LIDAR can accurately measure their distance and profile information, and cameras can provide road identification information to help vehicles achieve adaptive cruise and lane keeping.

3.3. Foggy weather

Fog has a significant impact on visible-light sensors such as cameras and on LIDAR, as it sharply reduces the penetration of visible light and laser beams. Millimeter-wave radar shows clear advantages under such conditions: its electromagnetic waves penetrate fog, so it can still accurately detect vehicles and obstacles ahead in low visibility. Its long-range detection capability is particularly important in fog, giving the vehicle enough reaction time to slow down or change lanes in advance and avoid collisions. Abdu et al. emphasize that millimeter-wave radar maintains stable performance in low-visibility weather thanks to its fog-penetrating characteristics [3], which makes it the preferred sensor in bad weather.

Although millimeter-wave radar has strong detection capability in fog, it has shortcomings in identifying heat-emitting targets such as pedestrians and animals. Infrared sensors make up for this deficiency, especially in extremely low-visibility environments, by sensing the heat of objects: their thermal imaging helps autonomous vehicles identify heat sources ahead and avoid collisions. Such sensors are particularly suited to combination with millimeter-wave radar for more comprehensive environmental sensing. Zhou et al. likewise show that infrared sensors can identify pedestrians and animals in very low-visibility environments such as fog by detecting the heat of objects [4], and the combined millimeter-wave radar and infrared solution provides the vehicle with more comprehensive sensing to ensure safe driving in extreme weather [3].

In foggy weather, it is difficult for a single sensor to guarantee sufficient sensing accuracy. By combining millimeter-wave radar, infrared sensors, and LIDAR, the autonomous driving system can maintain strong perception in low-visibility environments: millimeter-wave radar detects obstacles at long range, infrared sensors identify heat-emitting targets, and LIDAR provides close-range object detection. This multi-sensor fusion ensures the safety of autonomous vehicles in extreme weather.

3.4. Night driving

During night driving, traditional cameras perform poorly due to the lack of light, whereas infrared sensors can detect the heat of objects even in darkness. Their thermal imaging makes infrared sensors an ideal tool for night perception, especially in environments with no street lighting or extremely poor light.

When driving at night, camera performance drops dramatically. Abdu et al. show that infrared sensors can identify pedestrians, animals, and other targets ahead in darkness by detecting heat sources, making them essential for night driving [3]. However, although infrared sensors provide thermal-imaging information, they lack precise environmental-contour data.

Since the infrared sensor cannot provide detailed environmental-contour information on its own, the combination of cameras and millimeter-wave radar has become the mainstream solution for night driving. Where light is available, for example on street-lit highways, cameras can still capture high-resolution images that clearly show road signs and lane lines, while millimeter-wave radar handles long-distance obstacle detection in the dark to keep the vehicle safe.

Wei, Z. et al. suggest that the camera should be combined with millimeter wave radar, which can provide visual information under the condition of the light source, while millimeter wave radar will continue to be responsible for long-distance detection and obstacle avoidance to ensure the safety of night driving [1].

During night driving, the multi-sensor fusion technology combining infrared sensors, cameras and millimeter-wave radar can provide a full range of environmental sensing capabilities for autonomous vehicles. Infrared sensors capture heat sources, and the camera provides visual information, while millimeter-wave radar ensures long-distance detection. By integrating the data from the three sensors, the autonomous driving system can have a similar perception ability at night as during the day, thus improving the safety and perception of night driving [5].

3.5. Off-road driving

In off-road driving, the environment is complex and changeable and the terrain diverse. Zhou et al. point out that LIDAR's high-precision 3D modeling capability is very important for identifying complex terrain [5]. In addition, millimeter-wave radar can be used for long-distance target detection, while infrared sensors help identify wild animals at night or in low light. The combination of LIDAR, millimeter-wave radar, and infrared sensors is therefore the best choice for off-road driving.
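The scenario-specific recommendations of Sections 3.1-3.5 can be condensed into a simple lookup. The mapping below is an illustrative summary of this review's discussion, not a normative specification; real systems would weight sensors rather than pick fixed lists:

```python
# Scenario -> recommended sensor suite, summarizing Sections 3.1-3.5
# (primary sensor listed first).
SENSOR_SUITES = {
    "urban":    ["camera", "mm-wave radar", "lidar"],
    "highway":  ["mm-wave radar", "camera", "lidar"],
    "fog":      ["mm-wave radar", "infrared", "lidar"],
    "night":    ["infrared", "camera", "mm-wave radar"],
    "off-road": ["lidar", "mm-wave radar", "infrared"],
}

def recommended_suite(scenario: str) -> list[str]:
    """Return the sensor combination suggested for a driving scenario."""
    try:
        return SENSOR_SUITES[scenario]
    except KeyError:
        raise ValueError(f"unknown scenario: {scenario!r}") from None

print(recommended_suite("fog"))  # ['mm-wave radar', 'infrared', 'lidar']
```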

4. The broad prospect of millimeter-wave radar in fusion sensing

4.1. Characteristics of millimeter-wave radar

In current civil autonomous driving, the camera is the most widely used sensor, but in extreme weather, relying solely on cameras for perception is not reliable. Compared with the camera, the detection performance of millimeter-wave radar is far less affected by extreme weather [6]. Millimeter-wave radar can measure not only range but also radial velocity by exploiting the Doppler effect of signals reflected from moving objects [7]. However, it cannot provide the contour information of a target and has difficulty distinguishing relatively stationary targets, so millimeter-wave radar alone cannot accomplish the autonomous driving task. In short, the detection capabilities of visual sensors and millimeter-wave radar complement each other: detection algorithms based on radar-vision fusion can significantly improve the perception of autonomous vehicles, helping them meet the challenge of accurate target detection in complex scenarios.
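A common entry point to such fusion is detection-level association: radar returns (carrying range and radial speed) are matched to camera detections (carrying class labels), so each fused object has both kinematic and semantic information. The toy sketch below assumes both sets of detections have already been projected into a common image plane; all names and the matching threshold are illustrative, not from any cited system:

```python
from dataclasses import dataclass

@dataclass
class RadarDet:
    x: float          # detection projected to an image x-coordinate
    range_m: float    # range measured by the radar
    speed_mps: float  # radial speed (negative = approaching)

@dataclass
class CameraDet:
    x: float          # bounding-box centre x-coordinate
    label: str        # object class from the vision pipeline

def fuse(radar, camera, max_gap=30.0):
    """Greedy nearest-neighbour association of radar and camera detections."""
    fused = []
    for r in radar:
        best = min(camera, key=lambda c: abs(c.x - r.x), default=None)
        if best is not None and abs(best.x - r.x) <= max_gap:
            # Fused object: camera supplies the class, radar the kinematics.
            fused.append((best.label, r.range_m, r.speed_mps))
    return fused

print(fuse([RadarDet(x=310, range_m=42.0, speed_mps=-3.1)],
           [CameraDet(x=300, label="pedestrian")]))
```

Production systems replace the greedy nearest-neighbour step with probabilistic data association or learned fusion networks, but the division of labour between the two sensors is the same.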

4.2. The advantages of millimeter wave over LIDAR

Millimeter-wave radar has significant advantages over LIDAR in intelligent driving systems, primarily in terms of weather adaptability, cost-effectiveness, anti-interference capability, and detection performance. Millimeter-wave radar outperforms LIDAR in detection capability under adverse weather conditions such as rain, fog, and snow, ensuring stable operation in complex environments. For instance, in a comparative study of various sensors' performance under harsh weather, millimeter-wave radar maintained a better detection range than LIDAR, ensuring the safety of vehicle operation.

The manufacturing cost of millimeter-wave radar typically ranges in the hundreds of dollars, while LIDAR costs can reach several thousand to tens of thousands of dollars, making millimeter-wave radar highly suitable for large-scale applications. From a production cost perspective, adopting a combination of millimeter-wave radar and cameras in intelligent driving systems can significantly reduce overall hardware costs, accelerating the technology's adoption.

Additionally, millimeter-wave radar has strong resistance to electromagnetic interference, making it capable of stable operation in electrically noisy urban environments. Its performance in dynamic target tracking is often evaluated as more reliable, providing more effective information support in high-speed driving scenarios.

Currently, intelligent driving systems commonly integrate millimeter-wave radar with cameras instead of LIDAR, mainly due to millimeter-wave radar’s advantages in cost, environmental adaptability, and real-time performance. Firstly, the lower manufacturing costs of millimeter-wave radar help reduce the financial burden of the entire autonomous driving system, making it suitable for large-scale applications. Secondly, millimeter-wave radar performs well in adverse weather conditions, ensuring stable detection capabilities and safe vehicle operation in various environments, while the precision and range of LIDAR often suffer under such conditions. Furthermore, millimeter-wave radar offers enhanced resistance to interference, making it suitable for complex urban settings.

Therefore, mainstream intelligent automobile manufacturers are choosing to combine millimeter-wave radar with cameras, leveraging the stable performance of millimeter-wave radar in various weather conditions along with the advantage of cameras in recognizing traffic signs and lane markings, to achieve a more comprehensive environmental perception.

4.3. The broad use of millimeter-wave and visual fusion

At present, the vast majority of car manufacturers choose a fusion sensing scheme that pairs cameras with either millimeter-wave radar or LIDAR. Fusion with millimeter-wave radar is the mainstream trend in autonomous vehicles because the camera and the radar have complementary characteristics, and the radar offers long-range detection, low cost, and the ability to detect dynamic targets. Thanks to these advantages, vehicles that fuse millimeter-wave radar with visual sensing gain in both perception capability and safety. Compared with LIDAR, millimeter-wave radar copes with bad weather at lower cost and is cheaper to deploy, so the overall cost remains modest.

Radar is the best sensor for measuring distance and radial speed. It has all-weather capability, and notably still works properly at night. However, radar cannot distinguish colors and is poor at classifying targets. The camera has good color perception and classification ability, and its angular resolution is respectable [8]; cameras are limited, however, in estimating speed and distance [9]. Moreover, image processing relies on substantial onboard computing power, whereas millimeter-wave radar data requires comparatively little processing, so maximizing the use of radar sensing data can significantly reduce the computational resources needed.

Radar and cameras thus have several complementary characteristics, making radar-vision fusion perception beneficial for obstacle detection in autonomous vehicles. Both millimeter-wave radar and LIDAR, when combined with visual data, enhance perception accuracy and target-detection capability, with each fusion approach capitalizing on its sensor's strengths. Future research combining all three technologies (millimeter-wave radar, LIDAR, and vision) could yield even greater advances in the field.

Table 1. Autonomous driving sensor solutions of major manufacturers [10-15].

Company          Sensor Configuration
Tesla            8 cameras, 12 ultrasonic radars, MM-wave radar
Baidu            LIDAR, MM-wave radar, camera
Xpeng            6 cameras, 2 MM-wave radars, 12 ultrasonic radars
Audi             6 cameras, 5 MM-wave radars, 12 ultrasonic radars, LIDAR
NIO              LIDAR, 11 cameras, 5 MM-wave radars, 12 ultrasonic radars
Mercedes-Benz    4 panoramic cameras, LIDAR, MM-wave radar

5. Conclusion

This paper has analyzed the application of different types of sensors in intelligent driving and discussed the best sensor combinations, and how they are deployed, in urban traffic, highways, night driving, and off-road driving. The selection and combination of sensors are of great significance for improving the safety and efficiency of intelligent driving systems. In the future, with continued technological progress, sensor fusion and algorithm optimization will provide ever stronger support for the development of intelligent driving systems.


References

[1]. Wei, Z.; Zhang, F.; Chang, S.; Liu, Y.; Wu, H.; Feng, Z. MmWave Radar and Vision Fusion for Object Detection in Autonomous Driving: A Review. Sensors 2022, 22, 2542.

[2]. Zhou, Y.; Liu, L.; Zhao, H.; López-Benítez, M.; Yu, L.; Yue, Y. Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges. Sensors 2022, 22, 4208.

[3]. Abdu, F.J.; Zhang, Y.; Fu, M.; Li, Y.; Deng, Z. Application of Deep Learning on Millimeter wave Radar Signals: A Review. Sensors 2021, 21, 1951.

[4]. Zhou, T.; Yang, M.; Jiang, K.; Wong, H.; Yang, D. MMW Radar-Based Technologies in Autonomous Driving: A Review. Sensors 2020, 20, 7283.

[5]. Zhou, Y.; Liu, L.; Zhao, H.; López-Benítez, M.; Yu, L.; Yue, Y. Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges. Sensors 2022, 22, 4208.

[6]. Zhang, R.; Cao, S. Real-time human motion behavior detection via CNN using mmWave radar. IEEE Sens. Lett. 2019, 3, 1–4.

[7]. Nagasaku, T.; Kogo, K.; Shinoda, H. 77 GHz Low-Cost Single-Chip Radar Sensor for Automotive Ground Speed Detection. In Proceedings of the IEEE Compound Semiconductor Integrated Circuits Symposium, Monterey, CA, USA, 12–15 October 2008; pp. 1–4.

[8]. Cho, M. A Study on the Obstacle Recognition for Autonomous Driving RC Car Using LIDAR and Thermal Infrared Camera. In Proceedings of the Eleventh International Conference on Ubiquitous and Future Networks (ICUFN), Zagreb, Croatia, 2–5 July 2019; pp. 544–546.

[9]. Alland, S.; Stark, W.; Ali, M.; Hegde, M. Interference in Automotive Radar Systems: Characteristics, Mitigation Techniques, and Current and Future Research. IEEE Signal Process. Mag. 2019, 36, 45–59.

[10]. Tesla. Future of Driving. Available online: https://www.tesla.com/autopilot (accessed on 13 July 2021).

[11]. Apollo. Robotaxi Autonomous Driving Solution. Available online: https://apollo.auto/robotaxi/index.html (accessed on 23 July 2021).

[12]. XPENG. XPILOT Driving. Available online: https://www.xiaopeng.com/p7.html?fromto=gqad004 (accessed on 23 July 2021).

[13]. Audi. Audi AI Traffic Jam Pilot. Available online: https://www.audi-technology-portal.de/en/electrics-electronics/driverassistant-systems/audi-a8-audi-ai-traffic-jam-pilot (accessed on 23 July 2021).

[14]. NIO. NIO Autonomous Driving. Available online: https://www.nio.cn/nad (accessed on 23 July 2021).

[15]. Daimler. Drive Pilot. Available online: https://www.daimler.com/innovation/case/autonomous/drive-pilot-2.html (accessed on 23 July 2021).



Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of CONF-MLA Workshop: Mastering the Art of GANs: Unleashing Creativity with Generative Adversarial Networks

ISBN: 978-1-83558-561-0 (Print) / 978-1-83558-562-7 (Online)
Editor: Mustafa ISTANBULLU, Marwan Omar
Conference website: https://2024.confmla.org/
Conference date: 21 November 2024
Series: Applied and Computational Engineering
Volume number: Vol.80
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).
