Fast Image Reconstruction with an Event Camera

Previous works rely on hand-crafted spatial and temporal smoothing techniques to reconstruct images from events. We propose a novel neural network architecture for video reconstruction from events that is smaller (38k vs. 10M parameters) and faster (10 ms vs. 30 ms) than the state of the art, with minimal impact on performance.
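The abstract does not spell out the network's input representation. A common preprocessing step for event-based reconstruction networks (an assumption here, not necessarily this paper's exact pipeline) is to bin the asynchronous event stream into a spatio-temporal voxel grid before feeding it to the network. A minimal sketch:

```python
def voxel_grid(events, width, height, num_bins):
    """Bin events (x, y, t, polarity) into a spatio-temporal voxel grid.

    Each event's signed polarity is split linearly between the two nearest
    temporal bins (bilinear interpolation in time), a common representation
    for event-based neural networks.
    """
    grid = [[[0.0] * width for _ in range(height)] for _ in range(num_bins)]
    t0, t1 = events[0][2], events[-1][2]
    span = max(t1 - t0, 1e-9)          # avoid division by zero
    for x, y, t, p in events:
        tn = (t - t0) / span * (num_bins - 1)  # fractional bin coordinate
        lo = int(tn)
        hi = min(lo + 1, num_bins - 1)
        w_hi = tn - lo                  # weight of the later bin
        grid[lo][y][x] += p * (1.0 - w_hi)
        grid[hi][y][x] += p * w_hi
    return grid
```

Each of the `num_bins` slices then acts as one input channel to the reconstruction network.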

TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset

We provide ground-truth poses from a motion-capture system at 120 Hz during the beginning and end of each sequence, which can be used for trajectory evaluation. TUM-VIE includes challenging sequences where state-of-the-art visual SLAM algorithms either fail or accumulate large drift.
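Trajectory evaluation against such motion-capture ground truth is typically done with the absolute trajectory error (ATE). A minimal sketch of the RMSE variant, assuming the estimated and ground-truth positions have already been time-associated and aligned (the dataset itself does not prescribe this exact metric):

```python
import math

def ate_rmse(gt_positions, est_positions):
    """Root-mean-square absolute trajectory error between time-aligned
    ground-truth and estimated 3D positions (after any rigid alignment)."""
    assert len(gt_positions) == len(est_positions) and gt_positions
    sq_sum = 0.0
    for (gx, gy, gz), (ex, ey, ez) in zip(gt_positions, est_positions):
        sq_sum += (gx - ex) ** 2 + (gy - ey) ** 2 + (gz - ez) ** 2
    return math.sqrt(sq_sum / len(gt_positions))
```

Because mocap is only available at the start and end of each sequence, ATE computed on those segments directly exposes the drift accumulated in between.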

Event-based Visual Odometry on Non-Holonomic Ground Vehicles

As demonstrated on both simulated and real data, our algorithm achieves accurate and robust estimates of the vehicle's instantaneous rotational velocity, yielding results comparable to the delta rotations obtained by frame-based sensors under normal conditions. Furthermore, we significantly outperform more traditional alternatives in challenging illumination scenarios.
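To compare instantaneous rotational-velocity estimates against the delta rotations of a frame-based sensor, the rates must be integrated over each frame interval. A minimal sketch for the planar (yaw-only) case typical of ground vehicles, using the trapezoidal rule (an illustrative choice, not necessarily the paper's exact evaluation procedure):

```python
def delta_rotation(omega_samples, dt):
    """Integrate sampled instantaneous yaw rates (rad/s), spaced dt seconds
    apart, over a frame interval to obtain the delta rotation angle (rad).
    Uses the trapezoidal rule between consecutive samples.
    """
    total = 0.0
    for w0, w1 in zip(omega_samples, omega_samples[1:]):
        total += 0.5 * (w0 + w1) * dt
    return total
```

The resulting angle can then be compared directly to the relative rotation between two consecutive camera frames.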

Table tennis ball spin estimation with an event camera

Event cameras suffer far less from motion blur thanks to their high temporal resolution. Moreover, the sparse nature of the event stream avoids the communication-bandwidth limitations that many frame-based cameras face. To the best of our knowledge, we present the first method for table tennis spin estimation using an event camera. We use ordinal time surfaces to track the ball and then isolate the events generated by the logo on the ball.
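A time surface stores, per pixel, the timestamp of the most recent event; the ordinal variant replaces raw timestamps by their rank, so the surface depends only on event order rather than absolute time. A minimal sketch under that reading (the paper's exact formulation may differ):

```python
def ordinal_time_surface(events, width, height):
    """Build a per-pixel latest-event-timestamp map, then replace each
    timestamp by its ordinal rank (oldest -> 1); pixels with no events
    get rank 0.
    """
    latest = [[None] * width for _ in range(height)]
    for x, y, t in events:          # later events overwrite earlier ones
        latest[y][x] = t
    # Rank the distinct timestamps that remain on the surface.
    stamps = sorted({t for row in latest for t in row if t is not None})
    rank = {t: i + 1 for i, t in enumerate(stamps)}
    return [[rank[t] if t is not None else 0 for t in row] for row in latest]
```

Ranking makes the representation invariant to the event rate, which is useful when a fast-moving ball produces bursts of events of varying density.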