This work builds an event-based structured light (SL) system consisting of a laser point projector and an event camera, and devises a spatio-temporal coding strategy that encodes depth in both domains from a single shot.
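As context for how an event-based SL system can recover depth, the sketch below shows the generic triangulation idea only: the event timestamp is mapped to the projector's scan position (temporal code) and combined with the event's pixel column (spatial code) to triangulate depth. It is not the paper's actual coding strategy; the rectified geometry, linear column sweep, and all constants (FOCAL_PX, BASELINE_M, SCAN_COLS_PER_S) are illustrative assumptions.

```python
import numpy as np

# Assumed setup: rectified camera/projector pair, projector sweeps one
# column of the scene per time step. Constants are illustrative only.
FOCAL_PX = 700.0        # camera focal length in pixels (assumed)
BASELINE_M = 0.10       # camera-projector baseline in metres (assumed)
SCAN_COLS_PER_S = 1e5   # projector scan speed in columns per second (assumed)

def depth_from_events(events):
    """events: (N, 3) array of (x, y, t), t in seconds since scan start.
    Returns a per-event depth estimate in metres."""
    x_cam = events[:, 0]
    t = events[:, 2]
    # Temporal code: the timestamp reveals which projector column was
    # illuminating the scene when the event fired.
    x_proj = t * SCAN_COLS_PER_S
    disparity = x_cam - x_proj
    # Standard triangulation for a rectified pair: Z = f * b / disparity.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = FOCAL_PX * BASELINE_M / disparity
    return np.where(disparity > 0, z, np.nan)

# Toy usage: three events at increasing timestamps.
demo = np.array([[320.0, 240.0, 2.9e-3],
                 [400.0, 100.0, 3.5e-3],
                 [510.0, 300.0, 4.8e-3]])
print(depth_from_events(demo))
```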
This paper presents a new compact vision sensor consisting of two fisheye event cameras mounted back-to-back, which offers a full 360-degree view of the surrounding environment. We describe the optical design and projection model of the novel stereo camera, called SFERA, and its practical calibration from the incoming stream of events.
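For readers unfamiliar with fisheye projection models, the following is a minimal sketch of the common equidistant model (r = f * theta); SFERA's actual projection model and calibration procedure may differ, and all intrinsics below are assumed.

```python
import numpy as np

# Equidistant fisheye model: image radius grows linearly with the angle
# from the optical axis. Intrinsics are assumed, not SFERA's.
F_PX = 320.0            # focal length in pixels (assumed)
CX, CY = 640.0, 640.0   # principal point (assumed)

def project_equidistant(points_3d):
    """points_3d: (N, 3) array in the camera frame; returns (N, 2) pixels."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    theta = np.arctan2(np.hypot(x, y), z)   # angle from the optical axis
    phi = np.arctan2(y, x)                  # azimuth around the axis
    r = F_PX * theta                        # equidistant mapping
    return np.stack([CX + r * np.cos(phi), CY + r * np.sin(phi)], axis=1)

# A point 90 degrees off-axis still lands inside the image, which is why
# a back-to-back fisheye pair can cover the full sphere.
print(project_equidistant(np.array([[1.0, 0.0, 0.0]])))
```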
This paper replaces the traditional frame camera with an event camera, a novel sensor whose sampling frequency matches that of the mmWave radar in the ground platform setup, and introduces mmE-Loc, a high-precision, low-latency ground localization system designed for drone landings.
This paper proposes a novel, computationally efficient regularizer to mitigate event collapse in the Contrast Maximization (CMax) framework. From a theoretical point of view, the regularizer is grounded in geometric principles of motion field deformation, measuring the area rate of change along point trajectories.
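The sketch below illustrates the general idea only, under simplifying assumptions: events are warped by a candidate radial (zoom-like) motion with rate h, the variance of the Image of Warped Events (IWE) serves as the CMax contrast, and an |log det J| penalty on the warp's area change stands in for the paper's area-rate regularizer. The warp model, penalty form, sensor size, and weight LAMBDA are all illustrative, not the paper's exact formulation.

```python
import numpy as np

H, W = 180, 240          # sensor resolution (assumed)
LAMBDA = 10.0            # regularization weight (assumed)
CENTER = np.array([W / 2.0, H / 2.0])

def objective(h, events, t_ref=0.0):
    """events: (N, 3) array of (x, y, t). Returns contrast minus penalty."""
    xy, t = events[:, :2], events[:, 2]
    scale = 1.0 + h * (t_ref - t)              # per-event radial scaling
    warped = CENTER + scale[:, None] * (xy - CENTER)
    # Image of Warped Events (IWE): accumulate warped events into a histogram.
    iwe, _, _ = np.histogram2d(warped[:, 1], warped[:, 0],
                               bins=(H, W), range=((0, H), (0, W)))
    contrast = iwe.var()                       # sharper IWE -> higher variance
    # Area change of the warp: |log det J|, zero when areas are preserved.
    # A collapse-inducing h shrinks all areas toward CENTER and is penalized.
    area_penalty = np.mean(np.abs(np.log(np.maximum(scale ** 2, 1e-12))))
    return contrast - LAMBDA * area_penalty

# Toy usage: random events, comparing a mild motion hypothesis with a
# collapse-inducing one.
rng = np.random.default_rng(0)
ev = np.column_stack([rng.uniform(0, W, 1000),
                      rng.uniform(0, H, 1000),
                      rng.uniform(0, 0.03, 1000)])
print(objective(0.5, ev), objective(30.0, ev))
```

Without the penalty term, the unregularized contrast can favor the collapsing warp, since piling all events onto a few pixels maximizes variance; the area term counteracts exactly that failure mode.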
Event cameras offer high temporal resolution and efficiency but remain underutilized in static traffic monitoring. We present eTraM, a first-of-its-kind event-based dataset with 10 hours of traffic data, 2M annotations, and eight participant classes. Evaluated with RVT, RED, and YOLOv8, eTraM highlights the potential of event cameras for real-world applications.