Event-Based Visual Teach-and-Repeat via Fast Fourier-Domain Cross-Correlation

In this paper, an event-camera-based visual teach-and-repeat system is presented, enabling robots to autonomously follow previously demonstrated paths by comparing current sensory input with recorded trajectories. Conventional frame-based cameras limit responsiveness due to fixed frame rates, introducing latency into the control loop. The method uses a frequency-domain cross-correlation framework, turning event-frame matching into fast Fourier-space operations that run at update rates exceeding 300 Hz. By combining binary event frames with image compression, localization accuracy is maintained while computational speed is increased. Experiments with a Prophesee EVK4 HD event camera mounted on an AgileX Scout Mini demonstrate successful navigation over more than 4000 m, achieving Absolute Trajectory Errors (ATEs) below 24 cm with high-frequency control updates.
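
A minimal sketch of the core operation, assuming events are first accumulated into binary frames; the NumPy-based FFT correlation below illustrates the general technique, not the authors' implementation (frame sizes, names, and the synthetic data are assumptions):

```python
import numpy as np

def fft_cross_correlate(teach_frame, repeat_frame):
    """Estimate the pixel offset between two binary event frames via
    frequency-domain cross-correlation (illustrative sketch, not the
    paper's exact pipeline)."""
    # Cross-correlation in the spatial domain is an element-wise product
    # in the Fourier domain: IFFT( FFT(a) * conj(FFT(b)) ).
    fa = np.fft.fft2(teach_frame.astype(np.float32))
    fb = np.fft.fft2(repeat_frame.astype(np.float32))
    corr = np.fft.ifft2(fa * np.conj(fb)).real

    # The correlation peak gives the circular shift between the frames.
    py, px = np.unravel_index(np.argmax(corr), corr.shape)

    # Wrap indices into signed offsets in [-H/2, H/2) and [-W/2, W/2).
    dy = py - corr.shape[0] if py >= corr.shape[0] // 2 else py
    dx = px - corr.shape[1] if px >= corr.shape[1] // 2 else px
    return dy, dx

# Two synthetic 240x320 binary event frames standing in for teach- and
# repeat-pass observations; the second is shifted 12 pixels to the right.
teach = np.zeros((240, 320), dtype=np.uint8)
teach[100:140, 150:160] = 1
repeat = np.roll(teach, 12, axis=1)
print(fft_cross_correlate(teach, repeat))  # (0, -12) with this correlation order
```

Because the Fourier-domain product and inverse FFT cost O(N log N) regardless of the offset being searched, the full search over translations comes essentially for free compared with sliding-window correlation, which is what makes the high update rates plausible.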

Leading Vision Publication Features Prophesee’s Solution for Enhancing XR Wearables

Prophesee’s event-based vision technology for XR wearables is transforming smart glasses, enabling lightweight, high-performance devices with improved usability, low power consumption, precise eye tracking, and foveated rendering. Featured in Imaging & Machine Vision Europe, the technology addresses key XR challenges and supports more immersive, intuitive, and energy-efficient experiences.

Neuromorphic Imaging Flow Cytometry combined with Adaptive Recurrent Spiking Neural Networks

In this paper, an experimental imaging flow cytometer built around an event-based CMOS camera is presented, with the event data processed by adaptive feedforward and recurrent spiking neural networks. PMMA particles flowing in a microfluidic channel are classified, and analysis of the experimental data shows that spiking recurrent architectures, including LSTM- and GRU-based models, achieve high accuracy by leveraging temporal dependencies in the event streams. Adaptation mechanisms further improve the performance of lightweight feedforward spiking networks. This work provides a roadmap for neuromorphic-assisted biomedical applications, improving classification while maintaining low latency and sparse computation.
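
As a rough illustration of the adaptation mechanism referenced above, the sketch below implements a single adaptive leaky integrate-and-fire (ALIF) neuron in plain NumPy: each output spike raises an adaptive threshold that slowly decays, so sustained input yields progressively sparser firing. All constants and names are illustrative assumptions, not the networks used in the paper:

```python
import numpy as np

def adaptive_lif(input_current, tau_mem=20.0, tau_adapt=200.0,
                 v_th=1.0, adapt_gain=0.2, dt=1.0):
    """Single adaptive leaky integrate-and-fire neuron (illustrative sketch).

    Each output spike increments an adaptation variable that is added to the
    firing threshold, so a sustained input produces progressively fewer
    spikes -- the spike-frequency adaptation used in adaptive SNNs.
    """
    v, a = 0.0, 0.0                      # membrane potential, adaptation variable
    decay_v = np.exp(-dt / tau_mem)      # per-step membrane leak
    decay_a = np.exp(-dt / tau_adapt)    # per-step adaptation decay
    spikes = np.zeros_like(input_current)

    for t, i_t in enumerate(input_current):
        v = decay_v * v + i_t            # leaky integration of the input
        a = decay_a * a                  # adaptation slowly relaxes to zero
        if v >= v_th + adapt_gain * a:   # effective threshold grows with adaptation
            spikes[t] = 1.0
            v = 0.0                      # reset membrane after a spike
            a += 1.0                     # raise the adaptive threshold
    return spikes

# Constant drive: the spike rate drops over time as the threshold adapts.
drive = np.full(300, 0.12)
print(int(adaptive_lif(drive).sum()), "spikes over 300 steps")
```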

Best Linear Unbiased Estimation for 2D and 3D Flow with Event-based Cameras

In this paper, a novel probabilistic model is proposed that leverages the stochastic distribution of events along moving edges. A lightweight, patch-based algorithm is introduced that employs a linear combination of event spatial coordinates, making it highly suitable for specialized hardware. The approach scales linearly with dimensionality, making it compatible with emerging event-based 3D sensors such as Light-Field DVS (LF-DVS). Experimental results demonstrate the efficiency and scalability of the method, establishing a solid foundation for real-time, ultra-efficient 2D and 3D motion estimation in event-based sensing systems.
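
For intuition about what a linear, patch-based flow estimate from event coordinates can look like, the sketch below fits event timestamps in a small patch as a linear function of (x, y) by ordinary least squares and reads the normal flow off the fitted time gradient. This plane-fitting illustration uses assumed variable names and synthetic data and is not necessarily the BLUE estimator derived in the paper:

```python
import numpy as np

def patch_flow(xs, ys, ts):
    """Estimate 2D flow (vx, vy) in pixels/second from events in one patch.

    Fits t ~ a*x + b*y + c by least squares; the gradient (a, b) is the
    inverse of the local speed along the edge normal, so the flow is
    recovered as grad / |grad|^2 (plane-fitting sketch, not the paper's
    exact BLUE formulation).
    """
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *_ = np.linalg.lstsq(A, ts, rcond=None)
    g2 = a * a + b * b
    if g2 < 1e-12:                      # degenerate patch: no usable gradient
        return 0.0, 0.0
    return a / g2, b / g2               # pixels per second along the edge normal

# Synthetic patch: a vertical edge moving at +50 px/s in x produces events
# whose timestamps grow linearly with x (illustrative data only).
rng = np.random.default_rng(0)
xs = rng.uniform(0, 8, 200)
ys = rng.uniform(0, 8, 200)
ts = xs / 50.0 + rng.normal(0, 1e-4, 200)   # t = x / vx + noise
print(patch_flow(xs, ys, ts))               # approximately (50.0, 0.0)
```

Because the estimate reduces to a small linear solve over event coordinates within each patch, the per-patch cost stays low and extends naturally to a third spatial dimension, which is consistent with the linear scaling claimed above.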