Welcome to the Prophesee Research Library, where academic innovation meets the world’s most advanced event-based vision technologies.

We have brought together groundbreaking research from scholars who are pushing the boundaries with Prophesee event-based vision technologies, to inspire collaboration and drive new breakthroughs in the academic community.

Introducing the Prophesee Research Library, the largest curated collection of academic papers leveraging Prophesee event-based vision.

Together, let’s reveal the invisible and shape the future of Computer Vision.

Neuromorphic Imaging Flow Cytometry combined with Adaptive Recurrent Spiking Neural Networks

In this paper, an experimental imaging flow cytometer using an event-based CMOS camera is presented, with data processed by adaptive feedforward and recurrent spiking neural networks. PMMA particles flowing in a microfluidic channel are classified, and analysis of experimental data shows that spiking recurrent networks, including LSTM and GRU models, achieve high accuracy by leveraging temporal dependencies. Adaptation mechanisms in lightweight feedforward spiking networks further improve performance. This work provides a roadmap for neuromorphic-assisted biomedical applications, enhancing classification while maintaining low latency and sparsity.

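The adaptation mechanism referenced in this summary generally denotes spiking neurons whose firing threshold rises after each spike and relaxes over time, which keeps the output sparse. As a minimal sketch of that idea (an illustrative adaptive leaky integrate-and-fire neuron in NumPy, not the authors' implementation), consider:

```python
import numpy as np

def alif_response(input_current, dt=1e-3, tau_mem=20e-3, tau_adapt=200e-3,
                  v_thresh=1.0, beta=0.2):
    """Simulate a single adaptive leaky integrate-and-fire (ALIF) neuron.

    The effective threshold is v_thresh + beta * a, where the adaptation
    variable `a` jumps after each spike and decays with tau_adapt, so a
    strongly driven neuron fires progressively more sparsely.
    """
    v, a = 0.0, 0.0
    spikes = np.zeros_like(input_current)
    for t, i_in in enumerate(input_current):
        v += dt / tau_mem * (i_in - v)      # leaky membrane integration
        a -= dt / tau_adapt * a             # adaptation decays back to zero
        if v >= v_thresh + beta * a:        # spike against the adaptive threshold
            spikes[t] = 1.0
            v = 0.0                         # reset membrane potential
            a += 1.0                        # raise the threshold for later spikes
    return spikes

# Example: a constant drive yields an initially dense, then sparser spike train.
spk = alif_response(np.full(500, 1.5))
print(int(spk.sum()), "spikes")
```
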
Best Linear Unbiased Estimation for 2D and 3D Flow with Event-based Cameras

In this paper, a novel probabilistic model is proposed that leverages the stochastic distribution of events along moving edges. A lightweight, patch-based algorithm is introduced that employs a linear combination of event spatial coordinates, making it highly suitable for specialized hardware. The approach scales linearly with dimensionality, making it compatible with emerging event-based 3D sensors such as Light-Field DVS (LF-DVS). Experimental results demonstrate the efficiency and scalability of the method, establishing a solid foundation for real-time, ultra-efficient 2D and 3D motion estimation in event-based sensing systems.

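The "linear combination of event spatial coordinates" refers to estimators that compute flow directly from the per-event (x, y, t) tuples inside a small patch. The paper derives the best linear unbiased weights; the sketch below only illustrates the general flow-from-events idea using the classic least-squares plane fit to event timestamps, under the assumption of a single moving edge per patch:

```python
import numpy as np

def patch_normal_flow(xs, ys, ts):
    """Estimate normal optical flow from events (xs, ys, ts) in one patch.

    Fits the plane t = a*x + b*y + c by least squares; the spatial gradient
    (a, b) of the time surface then gives the normal flow v = (a, b) / |(a, b)|^2
    in pixels per second. This is the standard local plane-fit estimator, shown
    only to illustrate the idea; it is not the paper's BLUE estimator.
    """
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *_ = np.linalg.lstsq(A, ts, rcond=None)
    g2 = a * a + b * b
    if g2 < 1e-12:                      # degenerate patch: no measurable motion
        return np.zeros(2)
    return np.array([a, b]) / g2

# Synthetic check: an edge moving at 100 px/s along +x produces t = x / 100.
xs = np.random.uniform(0, 10, 200)
ys = np.random.uniform(0, 10, 200)
ts = xs / 100.0 + np.random.normal(0, 1e-4, xs.size)
print(patch_normal_flow(xs, ys, ts))    # approximately [100, 0]
```
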
A VCSEL based Photonic Neuromorphic Processor for Event-Based Imaging Flow Cytometry Applications

This paper presents a novel approach that combines a photonic neuromorphic spiking computing scheme with a bio-inspired event-based image sensor. Designed for real-time processing of sparse image data, the system uses a time-delayed spiking extreme learning machine implemented via a two-section laser. Tested on high-flow imaging cytometry data, it classifies artificial particles of varying sizes with 97.1% accuracy while reducing parameters by a factor of 6.25 compared to conventional neural networks. These results highlight the potential of fast, low-power event-based neuromorphic systems for biomedical analysis, environmental monitoring, and smart sensing.

A New Stereo Fisheye Event Camera for Fast Drone Detection and Tracking

In this paper, a new compact vision sensor consisting of two fisheye event cameras mounted back to back is presented, offering a full 360-degree view of the surrounding environment. The optical design and projection model of the novel stereo camera, called SFERA, are described, together with a practical calibration procedure that works directly from the incoming event streams. Its potential for real-time target tracking is evaluated using a Bayesian estimator adapted to the sphere’s geometry. Real-world experiments with a prototype including two synchronized Prophesee EVK4 cameras and a DJI Mavic Air 2 quadrotor demonstrate the system’s effectiveness for aerial surveillance.

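Back-to-back fisheye pairs such as SFERA are typically handled by mapping every event onto the unit sphere and running detection and tracking in spherical coordinates rather than on the image plane. The sketch below assumes a generic ideal equidistant fisheye model (r = f·θ), not SFERA's actual calibrated projection:

```python
import numpy as np

def pixel_to_bearing(u, v, fx, fy, cx, cy):
    """Back-project a fisheye pixel to a unit bearing vector on the sphere.

    Assumes an ideal equidistant model (r = f * theta), where r is the radial
    distance from the principal point and theta the angle from the optical axis.
    Real calibrations add distortion terms; this is only the generic geometry.
    """
    x = (u - cx) / fx
    y = (v - cy) / fy
    theta = np.hypot(x, y)      # normalized radial distance = angle from axis (rad)
    phi = np.arctan2(y, x)      # azimuth around the optical axis
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# Events from the two back-to-back cameras cover complementary hemispheres;
# the second camera's bearings would be rotated by 180 degrees before fusion.
print(pixel_to_bearing(640, 360, 320.0, 320.0, 640.0, 360.0))  # optical axis -> [0, 0, 1]
```
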
Asynchronous Multi-Object Tracking with an Event Camera

In this paper, the Asynchronous Event Multi-Object Tracking (AEMOT) algorithm is presented for detecting and tracking multiple objects by processing individual raw events asynchronously. AEMOT detects salient event blob features by identifying regions of consistent optical flow using a novel Field of Active Flow Directions built from the Surface of Active Events. Detected features are tracked as candidate objects using the recently proposed Asynchronous Event Blob (AEB) tracker to construct small intensity patches of each candidate object.

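The Surface of Active Events mentioned here is, in general, a per-pixel map of the timestamp of the most recent event, updated asynchronously as each event arrives; flow directions and blob features are then derived from that surface. A minimal sketch of such an update (a generic SAE, not AEMOT's Field of Active Flow Directions) could look like:

```python
import numpy as np

HEIGHT, WIDTH = 720, 1280      # e.g. an HD event sensor

# Surface of Active Events: timestamp (seconds) of the latest event per pixel,
# kept separately for positive (ON) and negative (OFF) polarity.
sae = np.full((2, HEIGHT, WIDTH), -np.inf)

def process_event(x, y, t, polarity):
    """Asynchronously update the SAE for one event and return the local patch.

    Downstream modules (flow-direction fields, blob detectors, trackers) read
    the 5x5 neighbourhood of recent timestamps around each new event instead
    of waiting for a full frame.
    """
    sae[polarity, y, x] = t
    y0, y1 = max(0, y - 2), min(HEIGHT, y + 3)
    x0, x1 = max(0, x - 2), min(WIDTH, x + 3)
    return sae[polarity, y0:y1, x0:x1]

# Example: each (x, y, t, p) tuple is handled the moment it arrives.
for x, y, t, p in [(100, 50, 0.0012, 1), (101, 50, 0.0013, 1)]:
    patch = process_event(x, y, t, p)
```
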
FRED: The Florence RGB-Event Drone Dataset

The Florence RGB-Event Drone dataset (FRED) is a novel multimodal dataset specifically designed for drone detection, tracking, and trajectory forecasting, combining RGB video and event streams. FRED features more than 7 hours of densely annotated drone trajectories, using five different drone models and including challenging scenarios such as rain and adverse lighting conditions.


