A Large Scale Event-based Detection Dataset for Automotive

In this study, Prophesee introduces the first very large detection dataset for event cameras. The dataset comprises more than 39 hours of automotive recordings acquired with a 304×240 GEN1 sensor. It covers open roads and highly diverse driving scenarios, spanning urban, highway, suburban, and countryside scenes.

Learning to Detect Objects with a 1 Megapixel Event Camera

Our model outperforms feed-forward event-based architectures by a large margin. Moreover, our method does not require any reconstruction of intensity images from events, showing that training directly on raw events is possible, and is both more efficient and more accurate than passing through an intermediate intensity image.
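To make "training directly on raw events" concrete, a common preprocessing step is to accumulate the asynchronous event stream into a dense per-polarity count grid that a network can consume. The sketch below is an illustration of that general idea, not the paper's actual input representation; the `(t, x, y, p)` event tuples and the tiny grid size are assumptions.

```python
# Sketch: accumulate raw events into a 2-channel (per-polarity) histogram
# that a feed-forward detector could consume. The (t, x, y, p) event format
# and the tiny 4x3 grid are illustrative assumptions, not the paper's spec.

def events_to_histogram(events, width, height):
    """Bin events into a [2][height][width] count grid, one channel per polarity."""
    hist = [[[0] * width for _ in range(height)] for _ in range(2)]
    for t, x, y, p in events:
        hist[p][y][x] += 1
    return hist

events = [
    (0.001, 0, 0, 1),  # ON event at pixel (0, 0)
    (0.002, 0, 0, 1),  # second ON event at the same pixel
    (0.003, 3, 2, 0),  # OFF event at pixel (3, 2)
]
hist = events_to_histogram(events, width=4, height=3)
print(hist[1][0][0])  # -> 2 (two ON events at (0, 0))
print(hist[0][2][3])  # -> 1 (one OFF event at (3, 2))
```

No intensity image is ever formed: the counts go straight into the network, which is what avoids the reconstruction step the abstract mentions.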

Photonic Neuromorphic Accelerators for Event-Based Imaging Flow Cytometry

In this work, we present experimental results from a high-speed, label-free imaging cytometry system that seamlessly merges the high capture rate and data sparsity of an event-based CMOS camera with lightweight photonic neuromorphic processing. The results confirm that neuromorphic sensing and neuromorphic computing can be efficiently merged into a unified bio-inspired system, offering a holistic enhancement for emerging bio-imaging applications.

Re-interpreting the Step-Response Probability Curve to Extract Fundamental Physical Parameters of Event-Based Vision Sensors

In this work, we detail the method for generating accurate S-curves by applying an appropriate stimulus and sensor configuration to decouple 2nd-order effects from the parameter being studied. We use an EVS pixel simulation to demonstrate how noise and other physical constraints can introduce measurement error, and we develop two techniques robust enough to obtain accurate estimates.
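The "S-curve" here is the step-response probability curve: the probability that a pixel emits an event versus stimulus contrast, which rises sigmoidally, with the 50% point estimating the pixel's contrast threshold and the slope reflecting noise. As a minimal sketch of that re-interpretation, the code below fits a logistic curve to synthetic per-contrast probabilities by brute-force least squares; the data, grids, and fitting procedure are assumptions for illustration, not the paper's techniques.

```python
import math

# Sketch: fit a logistic "S-curve" to per-contrast event probabilities and
# read off the contrast threshold (the 50% point) and the slope parameter.
# The synthetic data and grid-search fit are illustrative assumptions.

def logistic(c, theta, sigma):
    """Probability that a pixel emits an event for a contrast step c."""
    return 1.0 / (1.0 + math.exp(-(c - theta) / sigma))

# Synthetic measurements from a pixel with true threshold 0.20, spread 0.03.
contrasts = [0.05 * i for i in range(1, 9)]            # 0.05 .. 0.40
probs = [logistic(c, 0.20, 0.03) for c in contrasts]

def fit_s_curve(contrasts, probs):
    """Brute-force least-squares fit over (threshold, spread) candidates."""
    best = None
    for theta in [0.01 * k for k in range(1, 41)]:
        for sigma in [0.005 * k for k in range(1, 21)]:
            err = sum((logistic(c, theta, sigma) - p) ** 2
                      for c, p in zip(contrasts, probs))
            if best is None or err < best[0]:
                best = (err, theta, sigma)
    return best[1], best[2]

theta_hat, sigma_hat = fit_s_curve(contrasts, probs)
print(round(theta_hat, 2), round(sigma_hat, 3))  # -> 0.2 0.03
```

In a real measurement the probabilities come from repeated step stimuli per contrast level, and, as the abstract notes, 2nd-order effects must be decoupled before such a fit yields the underlying physical parameter.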

Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging

We introduce a sensor fusion framework to combine single-photon avalanche diodes (SPADs) with event cameras to improve the reconstruction of high-speed, low-light scenes while reducing the high bandwidth cost associated with using every SPAD frame. Our evaluation, on both synthetic and real sensor data, demonstrates significant enhancements (> 5 dB PSNR) in reconstructing low-light scenes at high temporal resolution (100 kHz) compared to conventional cameras. Event-SPAD fusion shows great promise for real-world applications, such as robotics or medical imaging.
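The "> 5 dB PSNR" figure refers to peak signal-to-noise ratio between a reconstructed frame and ground truth. As a minimal sketch of how that metric is computed, assuming intensities normalized to [0, 1] (the tiny 2×2 "images" below are made-up values, not sensor data):

```python
import math

# Sketch: PSNR, the metric behind the "> 5 dB" improvement quoted above.
# The two tiny 2x2 "images" are illustrative assumptions, not sensor data.

def psnr(reference, estimate, peak=1.0):
    """Peak signal-to-noise ratio in dB between two equal-size images."""
    n = len(reference) * len(reference[0])
    mse = sum((r - e) ** 2
              for row_r, row_e in zip(reference, estimate)
              for r, e in zip(row_r, row_e)) / n
    return math.inf if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

gt = [[0.0, 0.5], [0.5, 1.0]]      # ground-truth intensities in [0, 1]
recon = [[0.1, 0.5], [0.5, 0.9]]   # reconstruction with small errors
print(round(psnr(gt, recon), 2))   # MSE = 0.005 -> 10*log10(1/0.005) ~ 23.01
```

A 5 dB gain on this log scale corresponds to roughly a 3.2× reduction in mean squared error, which is why it is considered a significant enhancement.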