This paper proposes a novel method to calibrate the extrinsic parameters between an event camera and a LiDAR without the need for a calibration board or other equipment. Our approach exploits the fact that when an event camera is in motion, changes in reflectivity and geometric edges in the environment trigger numerous events; the same edges are also observable in the LiDAR point cloud.
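To make the edge-alignment idea concrete, the sketch below projects LiDAR edge points into the event frame under candidate extrinsics and scores them against an event-derived edge map. The function names, the pinhole intrinsics `K`, and the derivative-free optimizer are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation as R

def edge_alignment_cost(x, lidar_edge_pts, event_edge_map, K):
    """Negative sum of event edge strength at projected LiDAR edge pixels."""
    rot, t = R.from_rotvec(x[:3]).as_matrix(), x[3:6]
    pts_cam = lidar_edge_pts @ rot.T + t       # LiDAR frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]     # keep points in front of the camera
    uv = pts_cam @ K.T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)  # pinhole projection to pixels
    h, w = event_edge_map.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    uv = uv[ok]
    return -event_edge_map[uv[:, 1], uv[:, 0]].sum()

def calibrate_extrinsics(lidar_edge_pts, event_edge_map, K, x0=None):
    """Refine a 6-DoF guess x = [rotation vector, translation] by maximizing
    the overlap between projected LiDAR edges and event edges."""
    x0 = np.zeros(6) if x0 is None else x0
    res = minimize(edge_alignment_cost, x0,
                   args=(lidar_edge_pts, event_edge_map, K),
                   method="Nelder-Mead")
    return R.from_rotvec(res.x[:3]).as_matrix(), res.x[3:6]
```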
In this paper, we propose a prototype event-based stereo pipeline for 3D reconstruction and tracking of a moving camera. The 3D reconstruction module relies on DSI ("disparity space image") fusion, while the tracking module uses time surfaces as anisotropic distance fields to estimate the camera pose.
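For reference, a time surface is a per-pixel map of (decayed) latest event timestamps; the minimal sketch below, with an assumed exponential decay constant `tau`, illustrates the data structure the tracking module reportedly treats as an anisotropic distance field.

```python
import numpy as np

def time_surface(events, shape, t_ref, tau=0.03):
    """events: iterable of (x, y, t), assumed time-ordered; returns values in
    [0, 1], largest at pixels whose most recent event is closest to t_ref,
    i.e. near currently active edges (pixels with no events map to 0)."""
    last_t = np.full(shape, -np.inf)
    for x, y, t in events:
        last_t[int(y), int(x)] = t         # keep only the latest timestamp
    return np.exp((last_t - t_ref) / tau)  # exponential temporal decay
```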
This work builds an event-based structured light (SL) system that consists of a laser point projector and an event camera, and devises a spatio-temporal coding strategy that encodes depth in both the spatial and temporal domains from a single shot.
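One way such single-shot decoding can work, sketched below under strong simplifying assumptions (a rectified setup, a projector sweeping columns linearly over time, baseline `b` and focal length `f`), is that an event's timestamp recovers the projector ray while its pixel gives the camera ray, and depth follows by triangulation. This is an illustration of the principle, not the paper's coding scheme.

```python
def decode_depth(event_x, event_t, scan_start, scan_rate, b, f, cx):
    """Rectified camera-projector pair; the projector sweeps columns at
    scan_rate pixels/second, so the event timestamp acts as a temporal
    code for the projector column that fired it."""
    proj_col = (event_t - scan_start) * scan_rate  # temporal code -> projector column
    disparity = proj_col - (event_x - cx)          # projector vs. camera column
    return b * f / disparity                       # standard stereo triangulation
```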
This paper presents a new compact vision sensor consisting of two fisheye event cameras mounted back-to-back, which offers a full 360-degree view of the surrounding environment. We describe the optical design and projection model of the novel stereo camera, called SFERA, as well as its practical calibration from the incoming stream of events.
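A minimal, hypothetical sketch of such a 360-degree projection model: a 3D point is routed to the front or rear lens by its sign along the optical axis and then mapped with an equidistant fisheye model. The intrinsics `(f, cx, cy)` and the equidistant model itself are assumptions, since SFERA's actual optics may differ.

```python
import numpy as np

def project_back_to_back_fisheye(p, f, cx, cy):
    """p: 3D point in the rig frame; returns (camera index, u, v).
    Rear-lens axis flips are simplified away for brevity."""
    cam = 0 if p[2] >= 0 else 1                # front lens sees +z, rear lens sees -z
    z = abs(p[2])                              # depth along that lens's optical axis
    r = np.hypot(p[0], p[1])
    theta = np.arctan2(r, z)                   # angle from the optical axis
    scale = (f * theta / r) if r > 0 else 0.0  # equidistant model: rho = f * theta
    return cam, cx + scale * p[0], cy + scale * p[1]
```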
This paper replaces the traditional frame camera with an event camera, a novel sensor whose sampling rate harmonizes with that of the mmWave radar in the ground platform setup, and introduces mmE-Loc, a high-precision, low-latency ground localization system designed for drone landings.
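The rate-matching idea can be pictured as slicing the event stream into windows centered on radar chirp frames, so both modalities yield measurements at shared timestamps ready for joint filtering; the window length and indexing scheme below are illustrative assumptions, not mmE-Loc's actual fusion logic.

```python
import numpy as np

def slice_events_at_radar_frames(event_ts, radar_frame_ts, half_window):
    """event_ts: sorted event timestamps; returns, per radar frame, the index
    range of events within +/- half_window of that frame, so each fused
    measurement uses temporally consistent data from both sensors."""
    slices = []
    for tf in radar_frame_ts:
        lo = np.searchsorted(event_ts, tf - half_window)
        hi = np.searchsorted(event_ts, tf + half_window)
        slices.append((lo, hi))
    return slices
```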