Observations of Naturally Occurring Lightning with Event-Based Vision Sensors

This paper demonstrates the effectiveness of the Event-Based Vision Sensor in lightning research by presenting data collected during a full lightning storm and by giving examples of how event-based data can be used to interpret various lightning features. We conclude that the Event-Based Vision Sensor has the potential to improve upon conventional high-speed imaging due to its lower cost, smaller data output, and ease of deployment, ultimately establishing it as an excellent complementary tool for lightning observation.
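
To illustrate how such event data can be inspected, the sketch below bins a raw event stream into fixed-interval count images, a common way to visualize fast transients such as lightning strokes. The column layout and bin width are assumptions for illustration, not the authors' processing pipeline.

```python
# A minimal sketch (not the authors' pipeline): accumulating a raw event
# stream into fixed-interval count images. The column layout (x, y, t,
# polarity) and the bin width are assumptions for illustration.
import numpy as np

def events_to_frames(events: np.ndarray, width: int, height: int,
                     bin_us: float = 100.0) -> np.ndarray:
    """Bin events (columns: x, y, t in microseconds, polarity in {-1, +1})
    into signed count images of duration `bin_us` each."""
    t0, t1 = events[:, 2].min(), events[:, 2].max()
    n_bins = int(np.ceil((t1 - t0) / bin_us)) + 1
    frames = np.zeros((n_bins, height, width), dtype=np.int32)
    b = ((events[:, 2] - t0) / bin_us).astype(int)
    x, y = events[:, 0].astype(int), events[:, 1].astype(int)
    p = events[:, 3].astype(int)
    # Accumulate signed polarities; a lightning stroke appears as a dense
    # cluster of events concentrated in one or a few bins.
    np.add.at(frames, (b, y, x), p)
    return frames
```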

EventPS: Real-Time Photometric Stereo Using an Event Camera

EventPS seamlessly integrates with both optimization-based and deep-learning-based photometric stereo techniques to offer a robust solution for non-Lambertian surfaces. Extensive experiments validate the effectiveness and efficiency of EventPS compared to frame-based counterparts. Our algorithm runs at over 30 fps in real-world scenarios, unleashing the potential of EventPS in time-sensitive and high-speed downstream applications.
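
One way to see why events suit photometric stereo: under a Lambertian model with a moving light, an event fired between light directions L_i and L_j fixes the radiance ratio r = (n . L_j) / (n . L_i), so the albedo cancels and the normal n must be orthogonal to L_j - r * L_i. The sketch below recovers the normal from the null space of such stacked constraints; it illustrates that algebra under stated assumptions, not the paper's exact solver.

```python
# A minimal sketch of the radiance-ratio algebra under a Lambertian
# assumption; illustrative only, not the paper's exact algorithm.
import numpy as np

def normal_from_ratios(light_pairs: np.ndarray, ratios: np.ndarray) -> np.ndarray:
    """light_pairs: (k, 2, 3) pairs of unit light directions; ratios: (k,)
    observed radiance ratios r = (n . L_j) / (n . L_i). Returns a unit normal."""
    # Each constraint row L_j - r * L_i must be orthogonal to the normal,
    # so the normal spans the (approximate) null space of the stacked rows.
    A = light_pairs[:, 1, :] - ratios[:, None] * light_pairs[:, 0, :]
    _, _, Vt = np.linalg.svd(A)
    n = Vt[-1]                       # right singular vector of smallest value
    return n if n[2] > 0 else -n     # orient toward the camera (+z)
```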

Time Lens++: Event-Based Frame Interpolation With Parametric Non-Linear Flow and Multi-Scale Fusion

In this work, we introduce multi-scale feature-level fusion and one-shot computation of non-linear inter-frame motion from events and images, which can be efficiently sampled for image warping. We also collect the first large-scale dataset of events and frames, consisting of more than 100 challenging scenes with depth variations, captured with a new experimental setup based on a beamsplitter.
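
A short sketch of what one-shot parametric non-linear flow buys: the motion is predicted once as per-pixel polynomial coefficients and can then be sampled cheaply at any latent timestamp for warping. The cubic parameterization and tensor shapes below are assumptions for illustration, not the paper's exact prediction head.

```python
# A hedged sketch of sampling a one-shot non-linear motion model; the cubic
# per-pixel parameterization and tensor shapes are illustrative assumptions.
import numpy as np

def sample_flow(coeffs: np.ndarray, tau: float) -> np.ndarray:
    """coeffs: (3, H, W, 2) per-pixel coefficients of a cubic motion curve
    d(tau) = a*tau + b*tau**2 + c*tau**3 (zero displacement at tau = 0).
    Returns the (H, W, 2) displacement field at latent time tau in [0, 1]."""
    a, b, c = coeffs
    return a * tau + b * tau ** 2 + c * tau ** 3

# The coefficients are predicted once per frame pair; warping to any number
# of intermediate timestamps then costs only this cheap evaluation.
flow_mid = sample_flow(np.zeros((3, 4, 4, 2), dtype=np.float32), tau=0.5)
```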

eTraM: Event-Based Traffic Monitoring Dataset

eTraM offers 10 hours of data from different traffic scenarios under various lighting and weather conditions, providing a comprehensive overview of real-world situations. With 2M bounding box annotations, it covers eight distinct classes of traffic participants, ranging from vehicles to pedestrians and micro-mobility.
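
As a toy example of working with annotations at this scale, the sketch below filters bounding boxes by class. The structured-array field names are assumptions for illustration; consult the dataset release for the actual schema and loaders.

```python
# A hedged sketch of filtering detection labels by class. The field names
# (t, x, y, w, h, class_id) are assumptions, not eTraM's actual schema.
import numpy as np

def boxes_for_class(labels: np.ndarray, class_id: int) -> np.ndarray:
    """Return only the bounding boxes of one traffic-participant class."""
    return labels[labels["class_id"] == class_id]

# Synthetic labels standing in for real annotations:
dtype = [("t", "<u8"), ("x", "<f4"), ("y", "<f4"),
         ("w", "<f4"), ("h", "<f4"), ("class_id", "<u1")]
labels = np.zeros(4, dtype=dtype)
labels["class_id"] = [0, 1, 1, 3]
print(len(boxes_for_class(labels, 1)))  # -> 2
```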

Time Lens: Event-Based Video Frame Interpolation

In this work, we introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and warping-based interpolation approaches. We extensively evaluate our method on three synthetic and two real benchmarks, where we show an improvement of up to 5.21 dB in PSNR over state-of-the-art frame-based and event-based methods. We also release a new large-scale dataset of highly dynamic scenes, aimed at pushing the limits of existing methods.
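
For reference, the PSNR metric quoted above can be computed as follows; this is a minimal sketch assuming images normalized to [0, 1].

```python
# A minimal reference implementation of PSNR, assuming images in [0, 1].
import numpy as np

def psnr(pred: np.ndarray, target: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio in decibels; higher is better."""
    mse = float(np.mean((pred - target) ** 2))
    return float("inf") if mse == 0.0 else 10.0 * np.log10(peak ** 2 / mse)
```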