Edgars Lielāmurs, Kaspars Ozols. Spatio-temporal Object Detection with Deep Spiking CNNs Using Time-of-Flight Data. 2024 19th Biennial Baltic Electronics Conference (BEC), IEEE, 2024.

Bibtex citation:
@inproceedings{16868_2024,
author = {Edgars Lielāmurs and Kaspars Ozols},
title = {Spatio-temporal Object Detection with Deep Spiking CNNs Using Time-of-Flight Data},
booktitle = {2024 19th Biennial Baltic Electronics Conference (BEC)},
publisher = {IEEE},
year = {2024}
}

Abstract: Perceiving the surrounding environment with vision sensors, including LiDAR and direct time-of-flight (dToF), is a common task across several domains, such as autonomous vehicles, industrial automation and robotics. As sensors and perception algorithms continue to evolve, their applications are likely to expand further and put more strain on computing resources, necessitating more efficient processing. Neuromorphic computing is a promising solution for efficiently handling sparse event streams, aligning well with the characteristics of sparse LiDAR sensory data. Moreover, the advent of dedicated neuromorphic vision processors and efficient spike backpropagation training through surrogate gradient (SG) is further inspiring the development of Spiking Neural Networks (SNNs). Thus, in this work, we take advantage of sparse LiDAR sensor point cloud data by formalizing a temporal spike encoding method and implementing a 3D object detection convolutional SNN. We conducted comprehensive experiments on the KITTI automotive dataset, showing that the proposed model outperforms closely related spiking neural network solutions and approaches the performance of conventional state-of-the-art solutions. More importantly, achieving a mean sparsity of 55.73% underlines the potential of using SNNs for a more efficient way of processing time-of-flight data.
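The abstract mentions a temporal spike encoding of ToF/LiDAR data feeding a spiking CNN. As a rough illustration only, the sketch below shows one common way such an encoding can look: a latency (time-to-first-spike) scheme where closer returns fire earlier. The function name `latency_encode`, the `num_steps` and `max_range` parameters, and the depth normalization are illustrative assumptions, not the encoding formalized in the paper.

```python
# Minimal sketch of a latency (time-to-first-spike) encoding for ToF depth
# values, assuming depths normalized by a fixed max range and mapped onto T
# discrete time steps; closer returns spike earlier, missing returns stay silent.
# Illustrative assumption only, not the paper's exact encoding scheme.
import numpy as np

def latency_encode(depth: np.ndarray, num_steps: int = 32,
                   max_range: float = 80.0) -> np.ndarray:
    """Map a dense depth map (H, W) to a binary spike tensor (T, H, W)."""
    valid = depth > 0                                  # zero depth = no return, no spike
    norm = np.clip(depth / max_range, 0.0, 1.0)
    spike_step = np.round(norm * (num_steps - 1)).astype(int)
    spikes = np.zeros((num_steps, *depth.shape), dtype=np.uint8)
    h_idx, w_idx = np.nonzero(valid)
    spikes[spike_step[valid], h_idx, w_idx] = 1        # one spike per valid pixel
    return spikes

# Example: a 4x4 depth patch (meters) with two missing returns (zeros).
depth = np.array([[ 5.0, 10.0,  0.0, 20.0],
                  [40.0, 60.0, 80.0,  0.0],
                  [ 1.0,  2.0,  3.0,  4.0],
                  [15.0, 25.0, 35.0, 45.0]])
spk = latency_encode(depth, num_steps=16)
print(spk.shape, int(spk.sum()))                       # (16, 4, 4), 14 spikes
```

Such a tensor is naturally sparse (at most one spike per pixel across all time steps), which is consistent with the efficiency argument made in the abstract.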

Quartile: Q1
