eTraM: Event-based Traffic Monitoring for Resource-Efficient Detection and Tracking Across Varied Lighting Conditions

This study proposes an innovative approach leveraging neuromorphic sensor technology to enhance traffic monitoring efficiency while maintaining robust performance under challenging conditions. The quantitative evaluation of the ability of event-based models to generalize to nighttime and unseen scenes further substantiates the compelling potential of leveraging event cameras for traffic monitoring, opening new avenues for research and application.
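Event cameras emit an asynchronous stream of per-pixel brightness changes rather than full frames. A common preprocessing step for event-based detection models is to accumulate that stream into a polarity histogram "frame". The sketch below illustrates this generic representation with synthetic events; it is not eTraM's actual pipeline, and the function name and event layout are assumptions for illustration.

```python
import numpy as np

def events_to_histogram(events, height, width):
    """Accumulate events into a 2-channel (neg/pos polarity) count image.

    events: array of rows (x, y, t, p) with polarity p in {0, 1}.
    This is a generic event representation, not eTraM's own encoding.
    """
    hist = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    p = events[:, 3].astype(int)
    np.add.at(hist, (p, y, x), 1.0)  # scatter-add one count per event
    return hist

# Three synthetic events: two positive at (x=1, y=2), one negative at (0, 0)
evts = np.array([[1, 2, 0.001, 1],
                 [1, 2, 0.002, 1],
                 [0, 0, 0.003, 0]])
h = events_to_histogram(evts, height=4, width=4)
```

Such histograms let standard frame-based detectors consume event data while preserving the sensor's sparsity: pixels with no activity stay zero.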

Feasibility study of in‑line particle image velocimetry

This article describes recent tests and developments of imaging and evaluation techniques for particle image velocimetry (PIV) that exploit the forward scattering of tracer particles by placing the camera in-line with the illuminating light source, such as a laser or a light emitting diode. This study highlights the most promising approaches of the various recording configurations and evaluation techniques.

Complementing Event Streams and RGB Frames for Hand Mesh Reconstruction

In this paper, we propose EvRGBHand – the first approach to 3D hand mesh reconstruction in which an event camera and an RGB camera compensate for each other. By fusing the two data modalities across the time, space, and information dimensions, EvRGBHand can tackle overexposure and motion blur in RGB-based hand mesh reconstruction (HMR), as well as foreground scarcity and background overflow in event-based HMR.

Observations of Naturally Occurring Lightning with Event-Based Vision Sensors

This paper demonstrates the effectiveness of the Event-Based Vision Sensor in lightning research by presenting data collected during a full lightning storm and provides examples of how event-based data can be used to interpret various lightning features. We conclude that the Event-Based Vision Sensor has the potential to improve on conventional high-speed imagery thanks to its lower cost, reduced data output, and ease of deployment, ultimately establishing it as an excellent complementary tool for lightning observation.

EventPS: Real-Time Photometric Stereo Using an Event Camera

EventPS seamlessly integrates with both optimization-based and deep-learning-based photometric stereo techniques to offer a robust solution for non-Lambertian surfaces. Extensive experiments validate the effectiveness and efficiency of EventPS compared to frame-based counterparts. Our algorithm runs at over 30 fps in real-world scenarios, unleashing the potential of EventPS in time-sensitive and high-speed downstream applications.
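For context, classic frame-based photometric stereo recovers per-pixel surface normals from intensities observed under several known light directions by solving a linear least-squares problem. The sketch below shows that Lambertian baseline on synthetic data; it is the conventional frame-based formulation EventPS improves upon, not the event-camera algorithm itself, and all arrays here are synthetic assumptions.

```python
import numpy as np

# Lambertian photometric stereo baseline (frame-based), synthetic data.
rng = np.random.default_rng(0)

K = 6                                            # number of light directions
L = rng.normal(size=(K, 3))
L /= np.linalg.norm(L, axis=1, keepdims=True)    # unit light directions

# Ground-truth normals and albedo for a tiny 16-pixel "image"
n_true = rng.normal(size=(16, 3))
n_true /= np.linalg.norm(n_true, axis=1, keepdims=True)
albedo = 0.8

# Lambertian image formation, I = albedo * (L . n); shadows ignored
# to keep the system linear for this illustration.
I = albedo * (n_true @ L.T)                      # shape (16, K)

# Per-pixel least squares: solve L @ g = I for g = albedo * n
g, *_ = np.linalg.lstsq(L, I.T, rcond=None)      # g: (3, 16)
albedo_est = np.linalg.norm(g, axis=0)           # recovered albedo per pixel
n_est = (g / albedo_est).T                       # recovered unit normals
```

With noise-free Lambertian data and at least three non-coplanar lights, the least-squares solve recovers normals and albedo exactly; event-based variants such as EventPS instead work from brightness-change events, which is what enables the reported real-time, over-30-fps operation.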