A Point-image fusion network for event-based frame interpolation

Temporal information in event streams plays a critical role in event-based video frame interpolation, as it provides temporal context cues complementary to images. Most previous event-based methods first transform the unstructured event data into structured data formats through voxelisation, and then employ advanced CNNs to extract temporal information.
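As an illustration of the voxelisation step mentioned above, here is a minimal sketch (not the paper's implementation) that bins a stream of (x, y, t, polarity) events into a fixed number of temporal channels with bilinear weighting along the time axis; the function name, array shapes, and the `num_bins` parameter are assumptions.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events (x, y, t, polarity) into a (num_bins, H, W) voxel grid.

    Each event is spread over its two nearest temporal bins with bilinear
    weights, a common way to preserve sub-bin timing information.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3]

    # Normalise timestamps to the range [0, num_bins - 1].
    t_norm = (num_bins - 1) * (t - t.min()) / max(t.max() - t.min(), 1e-9)
    left = np.floor(t_norm).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t_norm - left
    w_left = 1.0 - w_right

    # Map polarity {0, 1} to {-1, +1} so ON and OFF events cancel rather than add.
    pol = np.where(p > 0, 1.0, -1.0)

    np.add.at(voxel, (left, y, x), w_left * pol)
    np.add.at(voxel, (right, y, x), w_right * pol)
    return voxel

# Example: a 240x180 sensor with five temporal bins.
# events = np.array([[10, 20, 0.000, 1], [11, 20, 0.004, 0]])  # x, y, t[s], polarity
# grid = events_to_voxel_grid(events, num_bins=5, height=180, width=240)
```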

eWand: A calibration framework for wide baseline event-based camera systems

To overcome calibration limitations, we propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern. Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.
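The abstract does not describe how the blinking LEDs are localised in the event stream; purely as a hedged illustration, the sketch below flags pixels whose event rate is consistent with an assumed blink frequency, which is one simple way an event camera could isolate such active markers. The function name, `blink_hz`, window length, and tolerance are all hypothetical, not values from eWand.

```python
import numpy as np

def candidate_led_pixels(events, height, width, blink_hz=500.0,
                         window_s=0.1, tol=0.3):
    """Flag pixels whose event count over a short window is consistent with
    a marker blinking at roughly `blink_hz` (hypothetical detection step).

    events: array of (x, y, t, polarity) rows with t in seconds.
    Returns a boolean (H, W) mask of candidate LED locations.
    """
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]

    # Keep only events inside the analysis window.
    keep = t < t.min() + window_s
    counts = np.zeros((height, width), dtype=np.int64)
    np.add.at(counts, (y[keep], x[keep]), 1)

    # A pixel facing a blinking LED should fire roughly twice per period
    # (one ON and one OFF event), i.e. about 2 * blink_hz * window_s events.
    expected = 2.0 * blink_hz * window_s
    return np.abs(counts - expected) <= tol * expected
```

Grouping the resulting mask into connected components would then yield one centroid per sphere, which a standard multi-camera extrinsic calibration could consume.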

Concept Study for Dynamic Vision Sensor Based Insect Monitoring

In this concept study, the processing steps required for DVS-based insect monitoring are discussed, and suggestions for suitable processing methods are given. On the basis of a small dataset, a clustering and filtering-based labeling approach is proposed, which is a promising option for the preparation of larger DVS insect monitoring datasets.
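The study does not name a specific clustering algorithm; the following is a minimal sketch of how a spatiotemporal clustering plus size-filtering step could produce candidate insect clusters for labeling, using DBSCAN as an assumed stand-in. The parameter values (`eps`, `min_samples`, the time scaling, the size limits) are illustrative only.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_insect_events(events, time_scale=1e-3, eps=5.0,
                          min_samples=20, max_cluster_size=5000):
    """Group events into spatiotemporal clusters and discard clusters that
    are too small (likely noise) or too large (e.g. moving vegetation).

    events: array of (x, y, t) rows, with t in microseconds.
    Returns a list of event-index arrays, one per retained cluster.
    """
    # Scale time so that one DBSCAN 'eps' covers a comparable neighbourhood
    # in space (pixels) and time (here, 1 ms per pixel-equivalent unit).
    pts = np.column_stack([events[:, 0], events[:, 1],
                           events[:, 2] * time_scale])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)

    clusters = []
    for label in np.unique(labels):
        if label == -1:            # DBSCAN's noise label
            continue
        idx = np.flatnonzero(labels == label)
        if min_samples <= idx.size <= max_cluster_size:
            clusters.append(idx)
    return clusters
```

Each retained cluster could then be shown to an annotator, or matched across time windows, as a candidate insect detection for building larger labeled datasets.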

EventLFM: Event Camera integrated Fourier Light Field Microscopy for Ultrafast 3D imaging

We introduce EventLFM, a straightforward and cost-effective system that overcomes these challenges by integrating an event camera with Fourier light field microscopy (LFM), a state-of-the-art single-shot 3D wide-field imaging technique. We further develop a simple and robust event-driven LFM reconstruction algorithm that can reliably reconstruct 3D dynamics from the unique spatiotemporal measurements captured by EventLFM.