Event-based Background-Oriented Schlieren

This paper presents a novel technique for perceiving air convection using events and frames, and provides the first theoretical analysis connecting event data and schlieren. We formulate the problem as a variational optimization that combines the linearized event generation model with a physically motivated parameterization estimating the temporal derivative of the air density.
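
As a rough sketch of how these pieces can connect (standard relations from the event-vision and background-oriented schlieren literature; the paper's exact parameterization may differ), the linearized event generation model ties accumulated brightness changes to apparent motion, and the Gladstone-Dale relation ties the refractive index to the air density:

    \Delta L(\mathbf{x}, t) \approx -\nabla L(\mathbf{x}, t) \cdot \mathbf{v}(\mathbf{x}, t)\,\Delta t,
    \qquad
    n(\mathbf{x}, t) - 1 = G\,\rho(\mathbf{x}, t)

Here \Delta L is the brightness increment signalled by events, \mathbf{v} the apparent motion of the background pattern induced by refraction, n the refractive index, \rho the air density, and G the Gladstone-Dale constant. A variational objective can then seek the density derivative field most consistent with the observed events.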

On-orbit optical detection of lethal non-trackable debris

Resident space objects in the size range of 0.1 mm–3 cm are not currently trackable but carry enough kinetic energy to have lethal consequences for spacecraft. Assessing this small orbital debris, which potentially poses a risk to most space missions, requires combining a large sensor area with long observation times.

G2N2: Lightweight event stream classification with GRU graph neural networks

We benchmark our model against other event-graph and convolutional neural network based approaches on the challenging DVS-Lip dataset (spoken word classification). We find that our method not only outperforms state-of-the-art approaches of similar model size, but also reduces the number of operations per second by 81% relative to the convolutional models.

Live Demonstration: Integrating Event Based Hand Tracking Into TouchFree Interactions

To explore the potential of event cameras, Ultraleap have developed a prototype stereo camera using two Prophesee IMX636ES sensors. To go from raw events to hand positions, the event data is first aggregated into event frames; these frames are then consumed by a hand-tracking model that outputs 28 joint positions for each hand with respect to the camera.
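
A minimal sketch of the aggregation step, assuming a simple (x, y, polarity, timestamp) event layout and a hypothetical model interface (not Ultraleap's actual pipeline):

    import numpy as np

    def events_to_frame(events, height, width):
        # Accumulate signed event counts into a 2D frame:
        # +1 for ON (positive-polarity) events, -1 for OFF events.
        frame = np.zeros((height, width), dtype=np.float32)
        for x, y, polarity, _t in events:
            frame[y, x] += 1.0 if polarity else -1.0
        return frame

    # A hand-tracking model (hypothetical interface) then consumes the
    # frame and regresses 28 joints per hand in camera coordinates:
    # joints = model(events_to_frame(batch, 720, 1280))  # (hands, 28, 3)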

X-Maps: Direct Depth Lookup for Event-Based Structured Light Systems

We present a new approach to direct depth estimation for Spatial Augmented Reality (SAR) applications using event cameras. These dynamic vision sensors pair naturally with laser projectors for depth estimation in a structured-light approach. Our key contribution is the conversion of the projector time map into a rectified X-map that captures x-axis correspondences for incoming events, enabling direct disparity lookup without any additional search.
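
A rough sketch of the lookup idea (assumed array layout and quantized time bins; not the authors' implementation): inverting the rectified time map once yields an X-map, after which each event's disparity is a constant-time table read.

    import numpy as np

    def build_x_map(time_map, num_time_bins):
        # time_map[y, x_proj]: quantized time bin at which the scanning
        # laser projector illuminates column x_proj on (rectified) row y.
        # The X-map answers the inverse query: which projector column is
        # lit on row y at time bin t?
        height, width = time_map.shape
        x_map = np.zeros((height, num_time_bins), dtype=np.int32)
        for y in range(height):
            for x_proj in range(width):
                x_map[y, time_map[y, x_proj]] = x_proj
        return x_map

    def disparity(x_map, ev_x, ev_y, ev_t_bin):
        # Direct O(1) lookup: no correspondence search over the pattern.
        return ev_x - x_map[ev_y, ev_t_bin]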