This paper proposes a novel architecture combining an asynchronous accumulation-free event branch and a periodic aggregation branch to break the accuracy-latency trade-off. The solution enables ultra-low-latency, low-power optical flow prediction from event cameras, achieving per-event prediction with a latency of tens of microseconds.
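As a rough illustration of such a dual-branch design, the sketch below pairs a lightweight per-event MLP with a periodically refreshed aggregation encoder. It is a minimal sketch assuming PyTorch; all module names, layer sizes, and the context-fusion scheme are hypothetical and not taken from the paper.

```python
# Hypothetical dual-branch sketch: a slow, periodic aggregation branch provides
# context; a fast, asynchronous branch predicts flow for each incoming event.
import torch
import torch.nn as nn


class EventBranch(nn.Module):
    """Asynchronous branch: predicts flow per event, with no frame accumulation."""
    def __init__(self, ctx_dim=32):
        super().__init__()
        # Each event is (x, y, t, polarity); context comes from the slow branch.
        self.mlp = nn.Sequential(
            nn.Linear(4 + ctx_dim, 64), nn.ReLU(),
            nn.Linear(64, 2),  # per-event (u, v) flow
        )

    def forward(self, event, context):
        return self.mlp(torch.cat([event, context], dim=-1))


class AggregationBranch(nn.Module):
    """Periodic branch: encodes a time-binned event voxel grid into a context vector."""
    def __init__(self, bins=5, ctx_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(bins, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, ctx_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, voxel_grid):
        return self.encoder(voxel_grid).flatten(1)  # (B, ctx_dim)


class DualBranchFlow(nn.Module):
    """Fuses slow periodic context with fast per-event prediction."""
    def __init__(self):
        super().__init__()
        self.slow = AggregationBranch()
        self.fast = EventBranch()
        self.context = torch.zeros(1, 32)  # refreshed at the slow branch's rate

    @torch.no_grad()
    def update_context(self, voxel_grid):
        self.context = self.slow(voxel_grid)

    @torch.no_grad()
    def predict_event(self, event):
        # Called once per incoming event; only the tiny MLP runs on this path.
        return self.fast(event, self.context)


model = DualBranchFlow()
model.update_context(torch.rand(1, 5, 64, 64))  # periodic aggregation step
flow = model.predict_event(torch.rand(1, 4))    # asynchronous per-event prediction
print(flow.shape)  # torch.Size([1, 2])
```

The split keeps the per-event path tiny, which is the property that makes microsecond-scale latencies plausible; the heavier aggregation work runs only at the slow branch's refresh rate.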
This collaboration combines Tobii’s best-in-class eye-tracking platform with Prophesee’s pioneering event-based sensor technology. Together, the companies aim to develop an ultra-fast and power-efficient eye-tracking solution, specifically designed to meet the stringent power and form-factor requirements of compact, battery-constrained smart eyewear.
The paper evaluates Prophesee’s 3rd-generation event-based cameras for space domain awareness, showing potential for efficient, low-power temporal sensing despite sensitivity limits.
This paper reconsiders the techniques of conventional astronomy to properly utilise event-based cameras for space imaging and space situational awareness.
This paper introduces a novel motion segmentation method that leverages self-supervised vision transformers on both event data and optical flow. The approach eliminates the need for human annotations and reduces dependency on scene-specific parameters.
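The sketch below illustrates the general idea of combining self-supervised ViT features from an event representation and an optical-flow map, then grouping patches without any annotations. It is a minimal sketch assuming PyTorch and scikit-learn; the DINO backbone from torch.hub, the 3-channel event/flow encodings, and the simple k-means grouping are illustrative assumptions rather than the paper's actual pipeline.

```python
# Hypothetical sketch: frozen self-supervised ViT features on event and flow
# images, clustered per patch into motion segments without human labels.
import torch
from sklearn.cluster import KMeans

# Self-supervised ViT (DINO, ViT-S/16) used purely as a frozen feature extractor.
vit = torch.hub.load("facebookresearch/dino:main", "dino_vits16")
vit.eval()

def patch_features(x):
    """Per-patch features from the last transformer block (drops the CLS token)."""
    with torch.no_grad():
        tokens = vit.get_intermediate_layers(x, n=1)[0]  # (B, 1+N, 384)
    return tokens[:, 1:, :]  # (B, N, 384)

# Stand-ins for a 3-channel event representation and an optical-flow map
# rendered to 3 channels (e.g. a magnitude/angle colour coding).
event_img = torch.rand(1, 3, 224, 224)
flow_img = torch.rand(1, 3, 224, 224)

# Concatenate event and flow features per patch, then group patches into
# motion segments without any human annotation.
feats = torch.cat([patch_features(event_img), patch_features(flow_img)], dim=-1)
labels = KMeans(n_clusters=3, n_init=10).fit_predict(feats[0].numpy())
print(labels.reshape(14, 14))  # 224/16 = 14x14 patch-level segment map
```

The number of clusters here is a fixed assumption; the appeal of the approach described in the paper is precisely that it avoids hand-tuned, scene-specific parameters of that kind.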