Edgars Lielamurs, Ibrahim Sayed, Andrejs Cvetkovs, Rihards Novickis, Anatolijs Zencovs, Maksis Celitans, Andis Bizuns, George Dimitrakopoulos, Jochen Koszescha, Kaspars Ozols. A Distributed Time-of-Flight Sensor System for Autonomous Vehicles: Architecture, Sensor Fusion, and Spiking Neural Network Perception. Electronics, 14(7), 1375, MDPI, 2025.

Bibtex citation:
@article{17815_2025,
author = {Edgars Lielamurs and Ibrahim Sayed and Andrejs Cvetkovs and Rihards Novickis and Anatolijs Zencovs and Maksis Celitans and Andis Bizuns and George Dimitrakopoulos and Jochen Koszescha and Kaspars Ozols},
title = {A Distributed Time-of-Flight Sensor System for Autonomous Vehicles: Architecture, Sensor Fusion, and Spiking Neural Network Perception},
journal = {Electronics},
volume = {14},
number = {7},
publisher = {MDPI},
year = {2025}
}

Abstract: Mechanically scanning LiDAR imaging sensors are widely used in applications ranging from basic safety assistance to high-level automated driving, offering excellent spatial resolution and full surround-view coverage in most scenarios. However, their complex optomechanical structure introduces limitations, notably restricted mounting options and blind zones, especially on elongated vehicles. To mitigate these challenges, we propose a distributed Time-of-Flight (ToF) sensor system with a flexible hardware–software architecture designed for multi-sensor synchronous triggering and fusion. We formalize the sensor triggering and interference mitigation scheme as well as the data aggregation and fusion procedures, and highlight the challenges of achieving accurate global registration with current state-of-the-art methods. The resulting surround-view visual information is then applied to Spiking Neural Network (SNN)-based object detection and probabilistic occupancy grid mapping (OGM) for enhanced environmental awareness. The proposed system is demonstrated on a test vehicle, covering blind zones at ranges of 0.5–6 m with a scalable and reconfigurable sensor mounting setup. Using seven ToF sensors, we achieve a 10 Hz synchronized frame rate, with a 360° point cloud registration and fusion latency below 40 ms. We collected real-world driving data to evaluate the system, achieving 65% mean Average Precision (mAP) in object detection with our SNN. Overall, this work presents a replacement for, or addition to, LiDAR in future high-level automation tasks, offering improved coverage and system integration.
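To make the synchronized triggering and interference mitigation mentioned in the abstract concrete, the short Python sketch below computes per-sensor trigger offsets under a simple time-division assumption: seven sensors share one 10 Hz frame (100 ms period), so each sensor gets an exclusive ~14.3 ms illumination slot and no two emitters are active at once. The function name and the time-division scheme itself are illustrative assumptions for this page, not the paper's formalized triggering procedure.

# Illustrative sketch (assumption, not the paper's method): per-sensor trigger
# offsets for a time-division interference-mitigation scheme. Seven ToF sensors
# share a 10 Hz frame, giving each an exclusive ~14.3 ms slot per 100 ms period.

def trigger_offsets(num_sensors: int = 7, frame_rate_hz: float = 10.0) -> list[float]:
    """Return each sensor's trigger offset, in seconds, within one frame period."""
    frame_period_s = 1.0 / frame_rate_hz      # 100 ms at 10 Hz
    slot_s = frame_period_s / num_sensors     # ~14.3 ms per sensor
    return [i * slot_s for i in range(num_sensors)]

if __name__ == "__main__":
    for sensor_id, offset_s in enumerate(trigger_offsets()):
        print(f"sensor {sensor_id}: trigger at +{offset_s * 1e3:.1f} ms")

Staggering exposures this way trades per-sensor integration time for freedom from mutual illumination; assigning distinct modulation frequencies to neighboring sensors is a common alternative for continuous-wave ToF cameras.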

URL: https://www.mdpi.com/2079-9292/14/7/1375

Quartile: Q2
