A Point-image fusion network for event-based frame interpolation

Temporal information in event streams plays a critical role in event-based video frame interpolation, as it provides temporal context cues complementary to images. Most previous event-based methods first transform the unstructured event data into structured formats through voxelisation, and then employ advanced CNNs to extract temporal information.
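As a rough illustration of the voxelisation step mentioned above, the sketch below bins a stream of (timestamp, x, y, polarity) events into a fixed number of temporal channels, spreading each event's polarity over the two nearest bins. This is a common structured representation for event data, not the specific pipeline of the paper; the function name and event layout are assumptions.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate (t, x, y, polarity) events into a temporal voxel grid.

    Each event's polarity is distributed over the two nearest temporal
    bins (bilinear interpolation in time), yielding a dense tensor a CNN
    can consume. Layout and naming here are illustrative, not the
    paper's actual implementation.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    # Normalise timestamps to the range [0, num_bins - 1].
    t_norm = (num_bins - 1) * (t - t.min()) / max(t.max() - t.min(), 1e-9)
    left = np.floor(t_norm).astype(int)
    right = np.minimum(left + 1, num_bins - 1)
    w_right = t_norm - left
    # Scatter-add each event's polarity into its two neighbouring bins.
    np.add.at(voxel, (left, y, x), p * (1.0 - w_right))
    np.add.at(voxel, (right, y, x), p * w_right)
    return voxel
```

The resulting `(num_bins, height, width)` tensor preserves coarse timing within the window while remaining compatible with standard convolutional backbones.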

eWand: A calibration framework for wide baseline event-based camera systems

To overcome calibration limitations, we propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern. Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.