Optimization of Event Camera Bias Settings for a Neuromorphic Driver Monitoring System

This research is the first to investigate the impact of bias modifications on the output of an event-based driver monitoring system (DMS) and to propose an approach for evaluating and comparing DMS performance. The study examines how pixel-bias alteration affects the DMS features of face tracking, blink counting, head pose estimation, and gaze estimation. The results indicate that the DMS’s functioning is enhanced by proper bias tuning guided by the proposed metrics.
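
As an illustration of the kind of evaluation the paper describes, the sketch below sweeps two contrast-threshold biases and scores a driver-monitoring pipeline on the listed features. The bias names, candidate values, metric names, weights, and the stubbed DMS pipeline are all assumptions for illustration, not the paper's actual setup.

```python
"""Minimal sketch of a bias-sweep evaluation for an event-camera DMS.
The sensor SDK calls and the DMS pipeline are not specified in the abstract,
so they are replaced by deterministic dummy stubs so the sweep logic runs."""
from itertools import product
import random

# Contrast-threshold biases typically exposed by event cameras; the names and
# candidate values below are illustrative assumptions, not a specific SDK.
BIAS_DIFF_ON = [20, 40, 60]
BIAS_DIFF_OFF = [20, 40, 60]

def run_dms_with_biases(diff_on: int, diff_off: int) -> dict:
    """Stand-in for: apply biases, record a clip, run the DMS pipeline."""
    random.seed(diff_on * 100 + diff_off)  # deterministic dummy scores
    return {
        "face_tracking_iou": random.uniform(0.6, 0.9),
        "blink_recall": random.uniform(0.5, 0.95),
        "head_pose_accuracy": random.uniform(0.6, 0.9),
        "gaze_accuracy": random.uniform(0.5, 0.85),
    }

def dms_score(metrics: dict) -> float:
    """Combine per-feature metrics into one comparable score (weights are arbitrary)."""
    return (0.4 * metrics["face_tracking_iou"]
            + 0.2 * metrics["blink_recall"]
            + 0.2 * metrics["head_pose_accuracy"]
            + 0.2 * metrics["gaze_accuracy"])

best = max(product(BIAS_DIFF_ON, BIAS_DIFF_OFF),
           key=lambda b: dms_score(run_dms_with_biases(*b)))
print("Best (bias_diff_on, bias_diff_off):", best)
```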

eTraM: Event-based Traffic Monitoring for Resource-Efficient Detection and Tracking Across Varied Lighting Conditions

This study proposes an innovative approach that leverages neuromorphic sensor technology to enhance traffic monitoring efficiency while maintaining robust performance under difficult conditions. The quantitative evaluation of the ability of event-based models to generalize to nighttime and unseen scenes further substantiates the compelling potential of event cameras for traffic monitoring, opening new avenues for research and application.
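
For context, event-based detectors typically consume a frame-like representation built from the raw event stream. The sketch below shows one common option, a per-polarity event histogram; the event-array layout and the sensor resolution are assumptions, and eTraM's actual models and representations are not reproduced here.

```python
"""Illustrative sketch: accumulating raw events (x, y, polarity) into a
two-channel histogram that an off-the-shelf detector could consume."""
import numpy as np

HEIGHT, WIDTH = 720, 1280  # assumed sensor resolution

def events_to_histogram(x, y, p, height=HEIGHT, width=WIDTH):
    """Build a 2-channel histogram: one channel per polarity (ON/OFF)."""
    hist = np.zeros((2, height, width), dtype=np.float32)
    np.add.at(hist[0], (y[p == 1], x[p == 1]), 1.0)  # count ON events
    np.add.at(hist[1], (y[p == 0], x[p == 0]), 1.0)  # count OFF events
    return hist

# Dummy event batch standing in for a real recording
rng = np.random.default_rng(0)
n = 10_000
x = rng.integers(0, WIDTH, n)
y = rng.integers(0, HEIGHT, n)
p = rng.integers(0, 2, n)
frame = events_to_histogram(x, y, p)
print(frame.shape, frame.sum())  # (2, 720, 1280), 10000.0
```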

Feasibility study of in‑line particle image velocimetry

This article describes recent tests and developments of imaging and evaluation techniques for particle image velocimetry (PIV) that exploit the forward scattering of tracer particles by placing the camera in-line with the illuminating light source, such as a laser or a light-emitting diode. The study highlights the most promising of the various recording configurations and evaluation techniques.
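
For readers unfamiliar with PIV evaluation, the core step is cross-correlating interrogation windows from two exposures and taking the correlation peak as the particle-image displacement. The sketch below shows this generic step only; it does not cover the in-line, forward-scattering specifics of the article.

```python
"""Sketch of the standard cross-correlation step in PIV evaluation: the
displacement of tracer particles between two exposures is taken as the
location of the cross-correlation peak of corresponding windows."""
import numpy as np

def window_displacement(win_a: np.ndarray, win_b: np.ndarray):
    """Return the integer (dy, dx) displacement maximizing the cross-correlation."""
    fa = np.fft.rfft2(win_a - win_a.mean())
    fb = np.fft.rfft2(win_b - win_b.mean())
    corr = np.fft.irfft2(fa.conj() * fb, s=win_a.shape)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above N/2 to negative displacements
    dy = peak[0] if peak[0] <= win_a.shape[0] // 2 else peak[0] - win_a.shape[0]
    dx = peak[1] if peak[1] <= win_a.shape[1] // 2 else peak[1] - win_a.shape[1]
    return dy, dx

# Synthetic test: shift a random particle image by (3, -2) pixels
rng = np.random.default_rng(1)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, -2), axis=(0, 1))
print(window_displacement(a, b))  # expected (3, -2)
```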

Complementing Event Streams and RGB Frames for Hand Mesh Reconstruction

In this paper, we propose EvRGBHand – the first approach to 3D hand mesh reconstruction (HMR) in which an event camera and an RGB camera compensate for each other. By fusing the two modalities of data across the time, space, and information dimensions, EvRGBHand can tackle the overexposure and motion-blur issues of RGB-based HMR as well as the foreground scarcity and background overflow issues of event-based HMR.
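
As a rough illustration of two-modality fusion (not EvRGBHand's actual architecture), the sketch below gates event and RGB feature maps per pixel so that one modality can compensate where the other is unreliable. The channel counts, layer sizes, and gating scheme are assumptions chosen for brevity.

```python
"""Generic feature-level fusion of an event branch and an RGB branch;
an illustrative pattern only, not the architecture proposed in the paper."""
import torch
import torch.nn as nn

class TwoBranchFusion(nn.Module):
    def __init__(self, event_channels=2, rgb_channels=3, feat=64):
        super().__init__()
        self.event_enc = nn.Sequential(nn.Conv2d(event_channels, feat, 3, padding=1), nn.ReLU())
        self.rgb_enc = nn.Sequential(nn.Conv2d(rgb_channels, feat, 3, padding=1), nn.ReLU())
        # Per-pixel gate deciding how much to trust each modality
        # (e.g. down-weight RGB under overexposure, events when motion is scarce).
        self.gate = nn.Conv2d(2 * feat, 1, 1)

    def forward(self, event_frame, rgb_frame):
        fe = self.event_enc(event_frame)
        fr = self.rgb_enc(rgb_frame)
        w = torch.sigmoid(self.gate(torch.cat([fe, fr], dim=1)))
        return w * fe + (1 - w) * fr  # fused feature map

# Dummy inputs: a 2-channel event histogram and an RGB frame of the same size
fused = TwoBranchFusion()(torch.rand(1, 2, 128, 128), torch.rand(1, 3, 128, 128))
print(fused.shape)  # torch.Size([1, 64, 128, 128])
```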

Observations of Naturally Occurring Lightning with Event-Based Vision Sensors

This paper demonstrates the effectiveness of the Event-Based Vision Sensor in lightning research by presenting data collected during a full lightning storm and provides examples of how event-based data can be used to interpret various lightning features. We conclude that the Event-Based Vision Sensor has the potential to improve upon high-speed imagery thanks to its lower cost, reduced data output, and ease of deployment, ultimately establishing it as an excellent complementary tool for lightning observation.