EvTTC: An Event Camera Dataset for Time-to-Collision Estimation

To explore the potential of event cameras in such challenging cases, this paper proposes EvTTC, the first multi-sensor dataset focused on time-to-collision (TTC) estimation under high-relative-speed scenarios. EvTTC consists of data collected with both standard cameras and event cameras, covering a variety of potential collision scenarios in daily driving and multiple types of collision objects.
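
For context, the quantity such datasets benchmark has a simple classical form: for an object approaching at roughly constant speed, TTC ≈ θ / (dθ/dt), where θ is the object's apparent size in the image. The sketch below is a textbook discretization of that estimate, not the EvTTC baseline; the bounding-box sizes and frame interval are assumed inputs.

```python
# Minimal sketch (not the EvTTC baseline): classical "looming" TTC estimate
# from the relative expansion of an object's image between two measurements.

def ttc_from_scale(size_prev: float, size_curr: float, dt: float) -> float:
    """Time-to-collision from apparent-size expansion.

    With theta the object's apparent size, TTC = theta / (d(theta)/dt) for
    constant approach speed. Discretized over two measurements dt apart:
    TTC ~= dt * size_prev / (size_curr - size_prev).
    """
    expansion = size_curr - size_prev
    if expansion <= 0:
        raise ValueError("object is not expanding; no imminent collision")
    return dt * size_prev / expansion

# Example: a car's bounding box grows from 100 px to 104 px over 50 ms
print(ttc_from_scale(100.0, 104.0, 0.05))  # ~1.25 s to collision
```

At high relative speeds the per-frame expansion becomes large and fast, which is precisely where the microsecond-latency, blur-free output of event cameras is expected to help.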

Eoptic, Inc. and Prophesee Forge Strategic Partnership to Evolve Multimodal, High-Speed Imaging Systems

Eoptic, Inc., a leader in advanced imaging and optics systems integration, and Prophesee, the global pioneer in neuromorphic vision systems, today announced a strategic collaboration to integrate high-speed event detection into Eoptic’s innovative and flexible prismatic sensor module. By combining Eoptic’s Cambrian Edge imaging platform with Prophesee’s cutting-edge, event-based Metavision® sensors, the partnership aims to tackle real-time imaging challenges and open new frontiers in dynamic visual processing.

Event-based vision in magneto-optic Kerr effect microscopy

This paper explores the use of event cameras as an add-on to traditional MOKE microscopy to enhance time resolution for observing magnetic domains. Event cameras improve temporal resolution to 1 µs, enabling real-time monitoring and post-processing of fast magnetic dynamics. A proof-of-concept feedback control experiment demonstrated a latency of just 25 ms, highlighting the potential for dynamic material research. Limitations of current event cameras in this application are also discussed.
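
Because each event carries its own timestamp, a feedback trigger can react without waiting for a full frame, which is what makes microsecond-scale monitoring and low-latency control plausible. The following is a minimal illustration of that idea, not the paper's implementation: the window length, threshold, and region of interest are made-up parameters.

```python
# Illustrative sketch: event-driven trigger for a feedback loop. Each event
# is (x, y, t_us, polarity); we fire when event activity inside a region of
# interest exceeds a threshold within a sliding time window.
from collections import deque

WINDOW_US = 1_000            # 1 ms sliding window (assumed)
THRESHOLD = 500              # events in window needed to trigger (assumed)
ROI = (100, 200, 100, 200)   # x_min, x_max, y_min, y_max (assumed)

recent = deque()

def on_event(x: int, y: int, t_us: int, polarity: int) -> bool:
    """Return True when event activity in the ROI crosses THRESHOLD."""
    x0, x1, y0, y1 = ROI
    if not (x0 <= x < x1 and y0 <= y < y1):
        return False
    recent.append(t_us)
    # Drop events that have fallen out of the sliding window.
    while recent and recent[0] < t_us - WINDOW_US:
        recent.popleft()
    return len(recent) >= THRESHOLD
```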

Learned Event-based Visual Perception for Improved Space Object Detection

This paper presents a hybrid image- and event-based architecture for detecting dim space objects in geosynchronous orbit using dynamic vision sensing. By combining a conventional image feature extractor with point-cloud networks such as PointNet, the approach improves detection performance in scenes with high background activity. An event-based imaging simulator is also developed for model training and sensor-parameter optimization, and the method demonstrates improved recall for dim objects under challenging conditions.
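
To make the hybrid idea concrete, here is a short PyTorch sketch of one way such an architecture can be wired: a small CNN branch over frames fused with a PointNet-style branch that treats raw events (x, y, t, polarity) as a point cloud. The channel widths, concatenation-based fusion, and classification head are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn

class HybridDetector(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Conventional branch: image frames -> global feature vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # PointNet-style branch: shared per-event MLP, then a max-pool
        # over the event dimension.
        self.point_mlp = nn.Sequential(
            nn.Linear(4, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
        )
        self.head = nn.Linear(32 + 128, num_classes)

    def forward(self, frame: torch.Tensor, events: torch.Tensor):
        # frame: (B, 1, H, W); events: (B, N, 4) with columns (x, y, t, p)
        img_feat = self.cnn(frame)                          # (B, 32)
        pt_feat = self.point_mlp(events).max(dim=1).values  # (B, 128)
        return self.head(torch.cat([img_feat, pt_feat], dim=1))

# Smoke test with random inputs
model = HybridDetector()
logits = model(torch.randn(2, 1, 64, 64), torch.randn(2, 512, 4))
print(logits.shape)  # torch.Size([2, 2])
```

Max-pooling over the event dimension is the core PointNet device: it makes the event branch invariant to both the ordering and the number of events, which suits the sparse, asynchronous output of a dynamic vision sensor.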