Stereo Event-based Particle Tracking Velocimetry for 3D Fluid Flow Reconstruction

First, we track particles in the two event sequences to estimate their 2D velocities in both image streams. A stereo-matching step is then performed to retrieve their 3D positions. These intermediate outputs are incorporated into an optimization framework that also includes physically plausible regularizers, in order to recover the 3D velocity field.
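The stereo-matching step that lifts matched 2D particle positions to 3D can be illustrated with standard linear (DLT) triangulation. This is a minimal sketch, not the paper's actual implementation; the projection matrices, camera intrinsics, and particle coordinates below are hypothetical stand-ins for a calibrated event-camera pair.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched particle.

    P1, P2 : 3x4 camera projection matrices (hypothetical calibration).
    x1, x2 : matched 2D pixel coordinates in the left/right event streams.
    Returns the particle's 3D position.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The 3D point is the right null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy rectified cameras with a 0.1 m baseline along x (illustrative values).
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Project a known 3D particle into both views, then recover it.
Xw = np.array([0.2, -0.1, 2.0, 1.0])
x1 = (P1 @ Xw)[:2] / (P1 @ Xw)[2]
x2 = (P2 @ Xw)[:2] / (P2 @ Xw)[2]
print(triangulate(P1, P2, x1, x2))  # ≈ [0.2, -0.1, 2.0]
```

Applying this per matched track over time yields the 3D particle trajectories that the optimization stage then regularizes into a velocity field.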

Fast Image Reconstruction with an Event Camera

Previous works rely on hand-crafted spatial and temporal smoothing techniques to reconstruct images from events. We propose a novel neural network architecture for video reconstruction from events that is smaller (38k vs. 10M parameters) and faster (10ms vs. 30ms) than the state of the art, with minimal impact on performance.

TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset

We provide ground-truth poses from a motion capture system at 120Hz during the beginning and end of each sequence, which can be used for trajectory evaluation. TUM-VIE includes challenging sequences where state-of-the-art visual SLAM algorithms either fail or exhibit large drift.

Event-based Visual Odometry on Non-Holonomic Ground Vehicles

As demonstrated on both simulated and real data, our algorithm achieves accurate and robust estimates of the vehicle’s instantaneous rotational velocity, and thus results comparable to the delta rotations obtained by frame-based sensors under normal conditions. It furthermore significantly outperforms more traditional alternatives in challenging illumination scenarios.