
Low-Cost SPAD Sensing for Non-Line-Of-Sight Tracking, Material Classification and Depth Imaging


Authors:

CLARA CALLENBERG, University of Bonn, Germany
ZHENG SHI, Princeton University, USA
FELIX HEIDE, Princeton University, USA
MATTHIAS B. HULLIN, University of Bonn, Germany

Link: https://light.princeton.edu/publication/cheapspad/

1. Background

  • Time-correlated imaging, i.e. the recording of the optical response of a scene to transient illumination, makes the temporal dimension of light transport accessible for analysis, a feature not available in pure intensity imaging
  • Time-correlated optical measurements have established themselves as a valuable source of information
  • The approaches available for recording time-correlated measurements are rich and varied, but most require bulky and expensive hardware and are too fragile to be used outside of lab settings
  • A notable exception is the emerging technology of single-photon avalanche diodes (SPADs)
2. Related Work

  • Single-photon avalanche diodes
  • Time-of-flight (ToF), transient and depth imaging
  • Non-line-of-sight (NLOS) tracking
  • Material classification

3. Main Work


  • Propose to use an off-the-shelf sensor evaluation kit as a low-cost alternative to high-end SPAD sensors, and equip the board with custom firmware to output raw photon-count histograms
  • Introduce hardware add-ons such as collimating optics and galvanometer scanners to meet the needs of a selection of key applications for time-resolved imaging. Further propose reconstruction pipelines based on inverse filtering, deep learning, and other computational sensing paradigms that are capable of handling the low-resolution time-tagged measurements produced by our system
  • Validate the proposed platform for some of the most iconic application modes of time-resolved imaging, namely non-line-of-sight object tracking, material classification, and depth imaging
  • Propose cost-neutral feature additions to the sensor hardware that would greatly improve its interfacing with external hardware and its suitability as a general-purpose sensing platform for time-resolved light transport

3.1. Low-Cost SPAD System

  • VL53L1X time-of-flight sensor module by STMicroelectronics
  • The 12-pin package, priced around USD 3 in large volumes, has a footprint of 15 mm² and integrates a 940 nm light source and a 16×16 SPAD array sensor with a field of view of 27°, imaged by a miniature lens
  • Use additional optics, including lenses and galvanometer scanners, for increased flexibility of the system

3.2. Material Classification

  • When placing the sensor right onto the surface of a material, the infrared light from the VL53L1X light source penetrates the material, is scattered inside, and part of it is reflected back to the SPAD sensor
  • Depending on the structure of the material, the signal measured by the sensor can vary temporally and spatially
  • By training a neural network, the characteristics of different materials can be learned, so that materials can later be distinguished simply by holding the sensor against an object
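As a toy illustration of this idea (not the paper's actual network, data, or sensor interface), one could distinguish materials by comparing normalized photon-count histograms against per-material templates. Here the neural network is replaced by a simple nearest-template rule on synthetic histograms, where each material is assumed to delay and broaden the returned pulse differently:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: each material produces a characteristic
# temporal response (photon-count histogram over time bins).
N_BINS = 24

def synth_histogram(peak_bin, spread, n_photons=2000):
    """Simulate a photon-count histogram for a material whose internal
    scattering delays (peak_bin) and broadens (spread) the returned pulse."""
    times = rng.normal(peak_bin, spread, size=n_photons)
    hist, _ = np.histogram(times, bins=N_BINS, range=(0, N_BINS))
    return hist / hist.sum()  # normalize away total intensity

# Hypothetical materials with assumed (peak, spread) parameters.
materials = {"foam": (8.0, 3.0), "wood": (6.0, 1.5), "metal": (5.0, 0.6)}

# "Training": average several measurements per material into a template.
templates = {
    name: np.mean([synth_histogram(p, s) for _ in range(20)], axis=0)
    for name, (p, s) in materials.items()
}

def classify(hist):
    """Assign the material whose template is closest in L2 distance."""
    return min(templates, key=lambda m: np.linalg.norm(hist - templates[m]))

query = synth_histogram(*materials["wood"])
print(classify(query))
```

The separation here comes entirely from the temporal shape of the histogram, not its total intensity, which mirrors why time-resolved measurements carry information that pure intensity imaging does not.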

3.3. Tracking Objects “Around the Corner”


  • VL53L1X can be used to track an object “around the corner” by illuminating a wall facing the hidden area and recording the echoing light signal that is reflected from the target object
  • Train a neural network to recognize the target position from the SPAD data of four measurements on the wall
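A minimal sketch of this pipeline, with a toy forward model standing in for the real wall geometry and a linear least-squares regressor standing in for the paper's neural network (all geometry, bin counts, and spot positions below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N_BINS = 32  # time bins per histogram (assumed)

def simulate_measurement(pos):
    """Toy forward model: a hidden object at 2-D position `pos` produces,
    for each of four laser spots on the wall, an echo whose arrival time
    grows with the spot-to-object distance. Stands in for real SPAD data."""
    spots = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    hists = []
    for s in spots:
        delay = 8.0 * np.linalg.norm(pos - s)      # extra flight time in bins
        t = rng.normal(4.0 + delay, 1.0, size=500)
        h, _ = np.histogram(t, bins=N_BINS, range=(0, N_BINS))
        hists.append(h / 500.0)
    return np.concatenate(hists)                   # feature vector, 4 * 32 values

# Build (histograms -> position) training pairs and fit a linear regressor.
train_pos = rng.uniform(0, 1, size=(300, 2))
X = np.stack([simulate_measurement(p) for p in train_pos])
W, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], train_pos, rcond=None)

def track(measurement):
    """Predict the hidden object's (x, y) position from four histograms."""
    return np.append(measurement, 1.0) @ W

print(track(simulate_measurement(np.array([0.3, 0.7]))))
```

The key point the sketch shares with the paper is the data layout: four time-resolved measurements taken at different wall positions are concatenated into one feature vector from which the hidden position is regressed.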

3.4. Depth Imaging


  • The VL53L1X can yield a spatially resolved transient image by scanning all possible 4×4 ROIs on the 16×16 pixel sensor, which yields a 13×13 pixel measurement
  • Use additional lenses and galvanometer scanners to avoid the substantial blur due to the overlapping ROIs and the poor optical quality of the imaging lens
  • Depth maps are calculated in two different ways:
    • Calculate a given pixel’s depth as the count-weighted mean of its captured histogram. This yields smooth depth gradients and sub-bin accuracy in the depth estimate
    • Compute the depth by fitting Gaussian functions to the histogram of each pixel, which yields sharper and more reliable results at the cost of a longer runtime
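The two per-pixel depth estimators can be sketched as follows. This is a simplified illustration on a synthetic histogram: the bin-to-depth scale `BIN_WIDTH_M` is an assumed placeholder, and the Gaussian fit is approximated by a closed-form parabola on log-counts around the peak (exact for a Gaussian pulse) rather than an iterative fit:

```python
import numpy as np

BIN_WIDTH_M = 0.075  # assumed depth per histogram bin, not from the paper

def depth_weighted_mean(hist):
    """Depth as the count-weighted mean bin index: cheap, gives smooth
    gradients and sub-bin accuracy, but is biased by background counts."""
    bins = np.arange(len(hist))
    return (hist @ bins) / hist.sum() * BIN_WIDTH_M

def depth_gaussian_fit(hist):
    """Fit a Gaussian to the return pulse via a parabola on the log-counts
    of the three bins around the maximum: sharper and more robust to
    background than the plain mean, at some extra cost."""
    k = int(np.argmax(hist))
    k = min(max(k, 1), len(hist) - 2)              # keep a full 3-bin window
    la, lb, lc = np.log(hist[k - 1:k + 2] + 1e-12)
    offset = 0.5 * (la - lc) / (la - 2 * lb + lc)  # sub-bin peak position
    return (k + offset) * BIN_WIDTH_M

# Synthetic pulse centered between bins 10 and 11, plus flat background.
x = np.arange(32)
hist = 400.0 * np.exp(-0.5 * ((x - 10.4) / 1.2) ** 2) + 5.0

print(depth_weighted_mean(hist), depth_gaussian_fit(hist))
```

On this example the flat background pulls the weighted mean toward the histogram center, while the peak fit stays on the true pulse position, which is the trade-off the two methods describe.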


Written by Wenbo Chen (CG Student)