Single-photon avalanche diodes (SPADs) are powerful sensors for 3D light detection and ranging (LiDAR) in low-light scenarios due to their single-photon sensitivity. However, accurately retrieving ranging information from noisy time-of-arrival (ToA) point clouds remains a challenge. This paper proposes a photon-efficient, non-fusion neural network architecture that directly reconstructs high-fidelity depth images from ToA data without relying on other guiding images. In addition, the network is compressed via a low-bit quantization scheme, making it suitable for implementation on embedded hardware platforms. The proposed quantized network achieves higher reconstruction accuracy with fewer parameters than previously reported networks.
- single-photon avalanche diode
- 3D light detection and ranging
- time of arrival
- depth imaging
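The abstract does not specify the details of the low-bit quantization scheme. As an illustration only, the sketch below shows one common approach such schemes build on: symmetric uniform ("fake") quantization of a weight tensor to signed k-bit integer levels, with a single per-tensor scale. The function name and parameters are hypothetical, not from the paper.

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Illustrative symmetric uniform quantizer (not the paper's scheme):
    map weights to signed `bits`-bit integer codes with one per-tensor
    scale, then dequantize to simulate the quantized network in float."""
    qmax = 2 ** (bits - 1) - 1           # e.g. 7 representable magnitudes for 4-bit signed
    scale = np.max(np.abs(w)) / qmax     # largest weight magnitude maps to qmax
    if scale == 0:
        return w.copy()                  # all-zero tensor: nothing to quantize
    q = np.clip(np.round(w / scale), -qmax, qmax)  # integer codes
    return q * scale                     # dequantized ("fake-quantized") weights

# Example: quantize random weights; rounding error is bounded by scale / 2.
rng = np.random.default_rng(0)
w = rng.standard_normal((3, 3)).astype(np.float32)
w_q = quantize_weights(w, bits=4)
print(np.max(np.abs(w - w_q)))
```

Restricting weights to a few integer levels like this is what lets the compressed network trade a small accuracy loss for a large reduction in memory and arithmetic cost on embedded hardware.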