Welcome to the Prophesee Research Library, where academic innovation meets the world’s most advanced event-based vision technologies.
We have brought together groundbreaking research from scholars who are pushing the boundaries with Prophesee Event-based Vision technologies to inspire collaboration and drive forward new breakthroughs in the academic community.
Introducing the Prophesee Research Library, the largest curated collection of academic papers leveraging Prophesee event-based vision.
Together, let’s reveal the invisible and shape the future of Computer Vision.
Graph Neural Network Combining Event Stream and Periodic Aggregation for Low-Latency Event-based Vision
This paper proposes a novel architecture combining an asynchronous, accumulation-free event branch with a periodic aggregation branch to break the accuracy-latency trade-off. The solution enables ultra-low-latency, low-power optical flow prediction from event cameras, achieving per-event prediction with a latency of tens of microseconds.
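To make the two-branch idea concrete, here is a minimal, purely illustrative sketch (not the paper's actual model or API): a fast branch updates per-pixel state and emits an output on every incoming event, while a slow branch aggregates global context only at a fixed period. All names (`TwoBranchFlow`, `on_event`, the blending weights) are hypothetical.

```python
import numpy as np

class TwoBranchFlow:
    """Illustrative two-branch event processor (not the paper's model).

    - Asynchronous branch: per-event state update and output, no frame
      accumulation, so latency is bounded by a single event's processing.
    - Periodic branch: context refreshed at a fixed interval (in microseconds).
    """

    def __init__(self, shape, period_us=10_000):
        self.context = np.zeros(shape)  # slow, periodically refreshed context
        self.state = np.zeros(shape)    # fast per-event state
        self.period_us = period_us
        self.last_agg_t = 0

    def on_event(self, x, y, t, polarity):
        # Asynchronous branch: update local state immediately for this event.
        self.state[y, x] += 1 if polarity else -1
        if t - self.last_agg_t >= self.period_us:
            # Periodic branch: aggregate global context at a fixed rate
            # (here a simple exponential moving average, for illustration).
            self.context = 0.9 * self.context + 0.1 * self.state
            self.last_agg_t = t
        # Per-event output blends the fast state with the slow context,
        # so a prediction is available after every single event.
        return self.state[y, x] + self.context[y, x]

model = TwoBranchFlow(shape=(4, 4))
out1 = model.on_event(x=1, y=2, t=0, polarity=True)       # before any aggregation
out2 = model.on_event(x=1, y=2, t=20_000, polarity=True)  # triggers aggregation
```

The point of the sketch is the control flow, not the arithmetic: output is produced per event, while the heavier aggregation runs only once per period.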
Evaluation of Commercial-off-the-Shelf Event-Based Cameras for Space Surveillance Applications
The paper evaluates Prophesee’s 3rd-generation event-based cameras for space domain awareness, showing potential for efficient, low-power temporal sensing despite sensitivity limits.
Astrometric Calibration and Source Characterisation of the Latest Generation Neuromorphic Event-based Cameras for Space Imaging
In this paper, the traditional techniques of conventional astronomy are reconsidered to properly utilise the event-based camera for space imaging and space situational awareness.
Motion Segmentation for Neuromorphic Aerial Surveillance
This paper introduces a novel motion segmentation method that leverages self-supervised vision transformers on both event data and optical flow information. The approach eliminates the need for human annotations and reduces dependency on scene-specific parameters.
CoSEC: A Coaxial Stereo Event Camera Dataset for Autonomous Driving
This paper introduces hybrid coaxial event-frame devices to build a multimodal system and proposes the Coaxial Stereo Event Camera (CoSEC) dataset for autonomous driving. The system first uses a microcontroller to achieve time synchronization, then spatially calibrates the different sensors through intra- and inter-calibration of the stereo coaxial devices.
Ev-Layout: A Large-scale Event-based Multi-modal Dataset for Indoor Layout Estimation and Tracking
This paper presents Ev-Layout, a novel large-scale event-based multi-modal dataset designed for indoor layout estimation and tracking. Ev-Layout's key contribution to the community is a hybrid data collection platform (with a head-mounted display and VR interface) that integrates both RGB and bio-inspired event cameras to capture indoor layouts in motion.