1. Introduction
With the rapid development of science and technology, ranging technology has become an indispensable part of fields such as industrial automation, autonomous driving, and smart devices. Accuracy is therefore critical for distance measurement, as these devices now place higher demands on safety and reliability. Among the various technologies, ultrasonic, laser, infrared, and millimeter wave sensors are the most commonly used. Each is suited to particular situations, reflecting different advantages and disadvantages in practical applications.
In recent years, research on various types of sensors has made significant progress worldwide. Ultrasonic sensors are widely used in short-distance detection because of their low cost and simple structure; for example, the HC-SR04 and AJ-SR04M ultrasonic sensors can achieve 99.95% and 99.99% accuracy, respectively, at 15 to 25 cm [1]. Laser sensors have become the first choice for autonomous driving, industrial mapping, and robot navigation owing to their high precision and fast response; for example, ToF LiDAR can achieve 5 cm accuracy at 200 meters [2]. Infrared sensors excel in close-range ranging and obstacle avoidance due to their small size and low power consumption; for instance, the VL53L1X can detect up to 3.6 meters with ±3% accuracy in a 4.9 × 2.5 × 1.56 mm package with a power consumption of about 20 mW [3]. Millimeter wave radars have shown excellent anti-interference ability in harsh environments: the milliMap system can achieve <0.2 m error and ∼90% accuracy in object classification, even in smoke [11]. However, as sensors are deployed in increasingly complicated environments, a single sensor often cannot meet the needs of high-precision ranging; multi-sensor fusion technology has therefore been highly valued and has gradually become a research hotspot.
Although multi-sensor fusion technology has great potential, achieving highly accurate distance measurement in extreme environments remains challenging. For example, in high-humidity, high-interference environments, the signal attenuation and error accumulation problems of different sensors have not been well solved. Optimizing the sensor fusion solution, as this study aims to do, is therefore of great research significance.
This study evaluates the performance and limitations of the four sensor types in different complex environments and introduces optimization methods for multi-sensor fusion by comparing and analyzing their working principles, characteristics, and practical applications. It aims to provide a basic, comprehensive reference for a preliminary understanding of these sensors and for sensor selection and application.
2. Principle and application of each single sensor
This article introduces four types of distance sensors: laser, infrared, ultrasonic, and millimetre wave. Cameras are also covered, as they have become an essential part of sensor fusion. The Time of Flight (ToF) principle is explained first, since it underpins the operation of most ultrasonic sensors and a significant number of laser-based sensors, followed by camera-based ranging, which is widely used in sensor fusion technology.
2.1. The ToF principle and cameras
The ToF principle measures the time a signal takes to travel to a target and back. The distance d is calculated as d = (v × t) / 2, where v is the propagation speed of the signal (the speed of sound for ultrasonic waves, or the speed of light for laser and radar pulses) and t is the measured round-trip time [4].
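As a quick illustration of this relation, the following minimal Python sketch converts a measured round-trip time into a distance; the echo times and propagation speeds are illustrative values, not measurements from the cited studies.

```python
# Minimal sketch of the ToF relation d = (v * t) / 2.
# Example values below are illustrative, not from the cited studies.

def tof_distance(round_trip_time_s: float, wave_speed_m_s: float) -> float:
    """Distance = propagation speed * round-trip time / 2."""
    return wave_speed_m_s * round_trip_time_s / 2.0

# Ultrasonic pulse in air (speed of sound ~343 m/s at 20 degrees C):
print(tof_distance(2.9e-3, 343.0))    # ~0.50 m
# Laser pulse (speed of light ~3.0e8 m/s):
print(tof_distance(1.33e-6, 3.0e8))   # ~200 m
```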
In recent years, cameras have also become important tools for distance measurement in many applications. There are three most common camera-based distance sensing technologies: stereo vision, structured light, and ToF [5]. Stereo vision uses two cameras to simulate human eyes. By comparing the images from both cameras, the system can calculate the depth of objects. It works well in environments with enough texture and is used in robotics, self-driving cars, and industrial systems [5].
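For stereo vision specifically, depth is commonly recovered from the pixel disparity between the two images via Z = f × B / d, where f is the focal length in pixels, B the camera baseline, and d the disparity. The sketch below illustrates this relation with assumed values; it is not tied to any particular camera in [5].

```python
# Hedged sketch of stereo depth-from-disparity: Z = f * B / d.
# Focal length, baseline, and disparity are assumptions for illustration.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from its disparity between the left and right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

print(stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=35.0))  # ~2.4 m
```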
Structured light works by projecting a known light pattern, such as dots or lines, onto an object. The camera captures how the pattern deforms, and the 3D shape is then computed from this deformation using triangulation. It is very accurate at short distances and is used in face recognition and 3D scanning [5]. ToF cameras, whose principle has already been introduced, can achieve fast, wide-area depth sensing and are used in smartphones, indoor navigation, and warehouse systems [5].
Each type of camera has its strengths and weaknesses in terms of cost, speed, accuracy, and tolerance of lighting conditions; which one to choose depends on the situation and the application's needs.
2.2. Ultrasonic sensors
As mentioned above, almost all ultrasonic sensors are based on the Time-of-Flight (ToF) principle. They are widely used with microcontrollers, a rapidly developing technology, and can be easily integrated with platforms such as Arduino [1].
Ultrasonic sensors have the advantages of small size, low price, and high accuracy in close-range distance measurement. Their structure is simple, and they are widely used in applications that require distance measurement; for example, the HC-SR04 and AJ-SR04M can achieve 99.95% and 99.99% accuracy, respectively, at 15 to 25 cm [1]. In actual applications, however, the error can be much larger: the JSN-SR04T and HC-SR04 have shown error rates of 1.28% and 2.48%, respectively [6]. Although water-resistant ultrasonic sensors better withstand environmental factors such as rainwater and dust, they are still easily disturbed by noise, and their effective range is usually limited to between 2 cm and 5 m [1].
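Since ultrasonic ranging depends on the speed of sound, which varies with air temperature (approximately v = 331.3 + 0.606 × T m/s for T in degrees Celsius), a practical reading routine often applies a temperature correction. The sketch below converts an HC-SR04-style echo pulse width to distance; the pulse width and temperature are illustrative values.

```python
# Hedged sketch: converting an HC-SR04-style echo pulse width to distance,
# with the speed of sound corrected for air temperature.
# v = 331.3 + 0.606 * T (m/s, T in degrees Celsius) is a standard approximation.

def speed_of_sound_m_s(temp_c: float) -> float:
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance_cm(echo_pulse_us: float, temp_c: float = 20.0) -> float:
    v = speed_of_sound_m_s(temp_c)                # m/s
    return v * (echo_pulse_us * 1e-6) / 2 * 100   # halve the round trip, m -> cm

print(ultrasonic_distance_cm(1166))  # ~20 cm at 20 degrees C
```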
Applications of ultrasonic sensors include vehicle assistance, air flow velocity measurement in pipes, pool water level control, alarm systems, and other areas where distance measurement is required [1, 7, 8].
2.3. Laser sensors (LiDAR)
Laser sensors, or LiDAR (Light Detection and Ranging), can be based on three principles: ToF, which has already been explained; phase shift; and triangulation. In the phase-shift method, a continuously modulated wave is emitted, and the phase difference between the emitted and received signals is compared to calculate the distance, which gives high precision. Triangulation uses geometry to measure distance by detecting the angle change of the reflected laser beam [9].
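To make the phase-shift method concrete: the distance follows from d = c × Δφ / (4π f_mod), where f_mod is the modulation frequency and Δφ the measured phase difference, and the unambiguous range is c / (2 f_mod). The sketch below evaluates both relations with an assumed 10 MHz modulation frequency; these values are illustrative, not taken from [9].

```python
import math

# Hedged sketch of phase-shift ranging:
#   d = c * delta_phi / (4 * pi * f_mod)
# with unambiguous range c / (2 * f_mod). The modulation frequency and
# phase value are assumptions for illustration.

C = 299_792_458.0  # speed of light, m/s

def phase_shift_distance(delta_phi_rad: float, f_mod_hz: float) -> float:
    return C * delta_phi_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    return C / (2 * f_mod_hz)

f_mod = 10e6  # 10 MHz modulation
print(unambiguous_range(f_mod))                  # ~15 m
print(phase_shift_distance(math.pi / 2, f_mod))  # ~3.75 m
```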
The advantages of laser sensors are high accuracy, fast response, and low sensitivity to ambient light. For example, ToF LiDAR can achieve 5 cm accuracy at 200 meters, making it suitable for robot navigation and SLAM mapping [2]. However, there are still limitations: high-end devices like the Velodyne VLP-16 cost over $8000, and performance degrades in fog, rain, or on low-reflectivity surfaces [2, 9].
Laser sensors are widely used in industrial automation, construction, forestry, and smart transportation for distance measurement, object detection, and environmental monitoring [2, 9].
2.4. Infrared sensors
The term "infrared sensor" often refers to sensors that detect thermal radiation emitted by objects; the type introduced in this section, however, is used for distance measurement and is usually called an infrared distance sensor or IR (infrared) rangefinder. Based on their ranging principles, infrared distance sensors fall into two main types: IR LEDs with triangulation, and VCSEL lasers with ToF technology.
IR systems based on triangulation usually use IR LEDs and photodiodes to estimate the position of an object from the angle of the received light. One such system achieves a 50 Hz update rate, better than the typical 10-40 Hz of many commercial laser sensors [10]. It showed an average position error of 6.55 cm and an angular error of around 0.51°, which is acceptable for indoor localisation. The system's advantages are that it is static and low-cost, but it is sensitive to signal reflections from walls and performs best near the centre of the beacon layout [10].
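The underlying geometry can be illustrated with a generic two-beacon triangulation relation: if two reference points a baseline B apart each measure the bearing angle to a target, the target's perpendicular distance from the baseline is h = B·tan(α)·tan(β) / (tan(α) + tan(β)). This is a simplified sketch of the principle, not the specific algorithm of the system in [10].

```python
import math

# Generic two-beacon triangulation, a simplified sketch of the principle
# (not the specific algorithm of the cited system). Baseline and angles
# are illustrative values.

def triangulated_distance(baseline_m: float, alpha_deg: float, beta_deg: float) -> float:
    """Perpendicular distance of the target from the baseline."""
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    return baseline_m * ta * tb / (ta + tb)

print(triangulated_distance(2.0, 60.0, 70.0))  # ~2.12 m
```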
ToF-based infrared sensors use a different light source, a VCSEL (Vertical-Cavity Surface-Emitting Laser). For instance, the VL53L0X and VL53L1X use a VCSEL emitting 940 nm IR light for measurement. The VL53L1X can detect up to 3.6 meters with ±3% accuracy and is not easily affected by the color or reflectivity of the target. It is compact, measuring only 4.9 × 2.5 × 1.56 mm, and has a low power consumption of about 20 mW [3]. However, ToF sensors are easily affected by ambient light and perform worse on certain materials, such as glass or acrylic. Moreover, when multiple targets are present, they return a weighted average distance, which reduces accuracy [3].
In conclusion, triangulation systems are suitable for indoor positioning in wide spaces, while ToF sensors are more effective for short-range, touchless applications like smart buttons or interactive displays.
2.5. Millimetre wave sensors
Unlike the other types of sensors, millimeter wave (mmWave) radar works mainly on the FMCW (Frequency Modulated Continuous Wave) principle, measuring distance and speed by analyzing the frequency difference between the transmitted and reflected signals. It also estimates direction using the Angle of Arrival (AoA), calculated from phase differences across the antenna array.
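These two relations can be sketched as follows: the range follows from the beat frequency as R = c·f_b·T_c / (2B), where B is the chirp bandwidth and T_c the chirp duration, and the AoA from the phase difference between two antennas as θ = arcsin(λ·Δφ / (2π·d)). The chirp parameters and antenna spacing below are illustrative assumptions, not those of any specific radar.

```python
import math

# Hedged sketch of the two FMCW relations:
#   range: R = c * f_beat * T_chirp / (2 * B)
#   AoA:   theta = arcsin(lambda * delta_phi / (2 * pi * d))
# Chirp parameters and antenna spacing are illustrative assumptions.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    return C * beat_hz * chirp_s / (2 * bandwidth_hz)

def aoa_deg(delta_phi_rad: float, wavelength_m: float, spacing_m: float) -> float:
    return math.degrees(math.asin(wavelength_m * delta_phi_rad / (2 * math.pi * spacing_m)))

# 77 GHz radar with a 4 GHz chirp swept over 40 microseconds:
print(fmcw_range_m(2e6, 4e9, 40e-6))       # ~3.0 m
lam = C / 77e9
print(aoa_deg(math.pi / 4, lam, lam / 2))  # ~14.5 degrees
```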
The most significant advantage of mmWave radar is that it works well in low-visibility environments such as smoke or darkness, where optical sensors often perform poorly. The milliMap system, which uses a low-cost radar, can achieve <0.2 m error and ∼90% accuracy in object classification, even in smoke. It is also cheap ($299), light (<30 g), and low-power (2 W) compared with lidar systems such as the VLP-16, which are larger and more expensive [11].
However, it has its weaknesses as well. The point cloud is sparse, with only ∼100 points per scan. It also suffers from multi-path noise, with up to 75% ghost points, especially indoors. The angular resolution is limited to 15° horizontally, which makes it hard to detect small or close objects. Therefore, complex algorithms like GANs are needed to improve mapping [11].
mmWave radar is ideal for robot navigation and emergency rescue, where it can build reliable maps and recognize walls, doors, glass, and elevators in challenging environments [11].
3. Sensor fusion
As mentioned, no single type of sensor can meet every need across varied environments; each sensor has its own strengths and weaknesses. To overcome these limitations, sensor fusion combines information from different kinds of sensors to improve accuracy, robustness, and adaptability. This method is now widely used in areas such as autonomous driving, robotics, and smart surveillance systems. The following subsections introduce sensor fusion, focusing on the sensor types discussed above.
3.1. Common types of sensor fusion
This section focuses on sensor fusion involving the sensors that have been introduced. These sensors are often used together in modern perception systems.
3.1.1. Millimeter-wave radar and camera fusion
Millimeter-wave radar measures reliably under poor visibility conditions such as fog, rain, or dust. Cameras provide rich color and texture details but are sensitive to lighting and weather. When used together, the system becomes much more reliable: Wei et al. found that combining these two types of sensors can increase obstacle detection accuracy in bad weather conditions [12].
3.1.2. Millimeter-wave radar and LiDAR fusion
LiDAR gives a clear 3D view of the surroundings, but its performance degrades in harsh environments. Millimeter-wave radar helps by adding speed and motion data, so the two offer complementary advantages. According to Yan, this combination reduces detection errors, making it useful for tracking moving objects such as cars or pedestrians [13]. However, fusion is not always beneficial; it can have problems with tall objects like trucks [14].
3.1.3. Camera and LiDAR fusion
This is a popular method in autonomous driving. LiDAR measures exact distances to objects, while cameras can recognize traffic signs, road markings, and colors; together, they provide a full understanding of the scene. Zhang et al. reported that this kind of fusion can reach very high accuracy in 3D object detection tasks, which is critical for safe and reliable self-driving [15].
3.1.4. Ultrasonic sensor and camera fusion
Ultrasonic sensors are low-cost and good at detecting nearby objects, but they cannot recognize what the object is. When combined with a camera, the system gains visual details. Lee et al. showed that this fusion achieved high accuracy in real-time object detection with a lightweight embedded system, making it suitable for parking assistance and close-range obstacle detection [16].
3.1.5. Fusion of infrared distance sensor and ultrasonic sensor
Infrared distance sensors are fast and accurate at short ranges but sensitive to surface reflectivity. In contrast, ultrasonic sensors are not easily affected by material properties but are less precise. When fused, the two complement each other: experiments show a reduction in measurement error, making the fused estimate preferable at distances below 0.5 m [16]. This combination is widely used in mobile robotics for reliable obstacle avoidance [16].
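One simple way to realise this complementarity is an inverse-variance weighted average, which favours whichever sensor is more precise at the current range. The sketch below is a minimal illustration of this idea; the readings and variance figures are assumptions, not values reported in the cited work.

```python
# Minimal sketch of inverse-variance weighted fusion of two range readings.
# Variances are illustrative assumptions (IR tighter at short range).

def fuse(readings_m, variances_m2):
    weights = [1.0 / v for v in variances_m2]
    return sum(w * r for w, r in zip(weights, readings_m)) / sum(weights)

ir_reading, us_reading = 0.42, 0.45   # metres
ir_var, us_var = 0.0004, 0.0025       # m^2
print(fuse([ir_reading, us_reading], [ir_var, us_var]))  # ~0.424 m
```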
3.1.6. Fusion of infrared distance sensor and LiDAR
Infrared rangefinders are lightweight and ideal for near-field detection, while LiDAR provides high-accuracy long-range mapping; combining them enhances multi-scale perception. The PX4 documentation notes that infrared ToF sensors are faster and more compact than ultrasonic or laser-based systems, making them suitable complements to LiDAR in tight environments [17].
3.2. Advantages of sensor fusion
Sensor fusion has several clear benefits compared with a single type of sensor. The first is better accuracy and reliability: by using sensors with different strengths, the system can detect more objects with fewer errors. For example, if a camera has trouble seeing in fog, a radar can still detect obstacles. This cooperation makes the system more dependable and able to cope with different conditions.
Fusion also offers more information for sound decisions, giving the system a fuller picture: it can know how far away something is from the LiDAR, how fast it is moving from the radar, and what it looks like from the camera, which helps the system make safer and more reliable decisions.
Another benefit is a good balance between speed and cost. Some fusion systems can run in real time on affordable hardware; for instance, the system designed by Lee et al. used simple sensors and still achieved high accuracy [16]. This means fusion can be used even in low-cost applications without losing performance.
3.3. Applications of sensor fusion
Finally, some applications of sensor fusion are introduced, namely autonomous driving, robotics, and surveillance and security. Self-driving cars need to recognize lanes, pedestrians, traffic signs, and obstacles at all times; sensor fusion allows these vehicles to “see” better and react more safely in complex traffic environments.
Mobile robots use fusion to build maps, avoid obstacles, and navigate by themselves; combining vision, distance, and motion data helps them make better choices in real time. Fusion can also be used in intelligent security systems: through LiDAR ranging and camera-based face recognition, such systems can accurately track targets even at night or in backlit conditions. In close-range situations like parking, ultrasonic sensors and cameras work together to detect nearby objects and guide the driver or vehicle safely.
4. Conclusion
This paper has analysed the working principles, performance characteristics, and practical performance of several mainstream ranging sensors, namely laser, infrared, ultrasonic, and millimetre wave types, in different application scenarios. Cameras are also discussed, as they play an indispensable role in assisting these sensors. Each type of sensor has its own advantages in ranging accuracy, interference resistance, environmental adaptability, and cost: some are suitable for high-precision short-range measurement, while others work stably even in bad weather. However, each also has limitations, such as susceptibility to noise, light, or surface materials, which can cause measurement errors or failures. It is therefore almost impossible to handle so many kinds of tasks in complex environments with a single type of sensor.
To solve this problem, sensor fusion technology has emerged. By complementing and integrating data from different types of sensors, it not only makes sensing systems more adaptable to various environments but also increases the accuracy of target recognition and supports better decisions. This technology is now widely used in fields such as autonomous driving, robot navigation, and security monitoring, showing strong potential and good development prospects.
In the future, with the continuous development of artificial intelligence and edge computing, sensor fusion will improve further, becoming more responsive, more intelligent, and cheaper, which makes it well suited to autonomous driving, robotics, and similar automation. At the same time, challenges remain, such as optimising algorithms and systems to achieve better data fusion. Sensor fusion is therefore not only a trend in future technological development but also an important direction for continued exploration in research and engineering applications.
References
[1]. Sze, E., Hindarto, D., Wirayasa, I. K. A., & Haryono, H. (2022). Performance Comparison of Ultrasonic Sensor Accuracy in Measuring Distance. Sinkron: Jurnal dan Penelitian Teknik Informatika, 6(4), 2556-2562.
[2]. Yang, T., Li, Y., Zhao, C., Yao, D., Chen, G., Sun, L., ... & Yan, Z. (2022). 3D ToF LiDAR in mobile robotics: A review. arXiv preprint arXiv:2202.11025.
[3]. Durão, M. I. C. S. (2022). Analysis of a ToF Sensor for Applications in Touchless Interfaces (Master's thesis, Universidade do Porto (Portugal)).
[4]. Brand, T. (2021). Time of flight system for distance measurement and object detection. [Online]. Available: https://www.analog.com/en/technicalarticles/tof-system-for-distance-measurement-and-object-detection.html
[5]. Vzense Technology. (n.d.). Comprehensive guide to depth-sensing 3D cameras. Vzense. https://www.vzense.com/depth-sensing-3d-camera-guide
[6]. Purwanto, H., Riyadi, M., Astuti, D. W. W., & Kusuma, I. W. A. W. (2019). Komparasi sensor ultrasonik HC-SR04 dan JSN-SR04T untuk aplikasi sistem deteksi ketinggian air [Comparison of HC-SR04 and JSN-SR04T ultrasonic sensors for water level detection system applications]. Simetris: Jurnal Teknik Mesin, Elektro dan Ilmu Komputer, 10(2), 717-724.
[7]. Suastika, K. G., Nawir, M., & Yunus, P. (2014). Sensor ultrasonik sebagai alat pengukur [Ultrasonic sensor as a measurement device of air flow velocity in the pipe]. Jurnal Pendidikan Fisika Indonesia (Indonesian Journal of Physics Education), 10(1), 163–172.
[8]. Rojikin, I., & Gata, W. (2019). Pemanfaatan sensor suhu DHT-22, ultrasonik HC-SR04 untuk mengendalikan kolam dengan notifikasi email [Utilization of the DHT-22 temperature sensor and HC-SR04 ultrasonic sensor to control a pool with email notification]. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 3(3), 544-551.
[9]. Pu, Y. (2024). An analysis of laser distance measuring by different laser rangefinders. Theoretical and Natural Science, 38(1), 235–239.
[10]. Ciężkowski, M., & Kociszewski, R. (2024). Fast 50 Hz updated static infrared positioning system based on triangulation method. Sensors, 24(5), 1389.
[11]. Lu, C. X., Rosa, S., Zhao, P., Wang, B., Chen, C., Stankovic, J. A., ... & Markham, A. (2020, June). See through smoke: robust indoor mapping with low-cost mmwave radar. In Proceedings of the 18th International Conference on Mobile Systems, Applications, and Services (pp. 14-27).
[12]. Wei, Z., Zhang, F., Chang, S., Liu, Y., Wu, H., & Feng, Z. (2022). Mmwave radar and vision fusion for object detection in autonomous driving: A review. Sensors, 22(7), 2542.
[13]. Yan, S. (2024, November 26). Overview of sensor applications in intelligent driving. Applied and Computational Engineering, 80, 196–202.
[14]. Wang, Y., Deng, J., Li, Y., Hu, J., Liu, C., Zhang, Y., Ji, J., Ouyang, W., & Zhang, Y. (2023). Bi-LRFusion: Bi-directional LiDAR-radar fusion for 3D dynamic object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[15]. Aniobi, A. (2024). Sensor Fusion for Real‐Time Object Detection and Spatial Positioning in Unmanned Vehicles Using YOLOv8 and ESP32‐Cam.
[16]. Nguyen, V.-Q., Kyun, S.-B., & Han, S.-H. (2010, February 4–6). Robust real-time control of autonomous mobile robot by using ultrasonic and infrared sensors. In Proceedings of the Fifteenth International Symposium on Artificial Life and Robotics (AROB ’10) (pp. 561–564). ALife Robotics Corporation Ltd.
[17]. PX4 Dev Team. (n.d.). Distance sensors (rangefinders). PX4 User Guide. https://docs.px4.io/main/en/sensor/rangefinders.html
[18]. Wang, H., Liu, J., Dong, H., & Shao, Z. (2025). A Survey of the Multi-Sensor Fusion Object Detection Task in Autonomous Driving. Sensors, 25(9), 2794.