Collision detection for UAVs using Event Cameras

This paper explores the use of event cameras for collision detection in unmanned aerial vehicles (UAVs). Traditional cameras have been widely used in UAVs for obstacle avoidance and navigation, but they suffer from high latency and low dynamic range. Event cameras, on the other hand, capture only the changes in the scene and can operate at high speeds with low latency.
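The change-only sensing principle mentioned above can be illustrated with the standard event-camera contrast model (a hedged sketch, not code from the paper; the function name and threshold value are illustrative): a pixel emits an event when its log-intensity changes by more than a contrast threshold, with polarity indicating brightening or darkening.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.2, eps=1e-6):
    """Toy event-camera model: a pixel fires an event when its
    log-intensity changes by more than a contrast threshold.

    Returns (rows, cols, polarities) for pixels that fired, where
    polarity is +1 for brightening and -1 for darkening.
    """
    # Work in log-intensity, since event pixels respond to relative change.
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_curr = np.log(curr_frame.astype(np.float64) + eps)
    diff = log_curr - log_prev

    fired = np.abs(diff) >= threshold
    rows, cols = np.nonzero(fired)
    polarities = np.sign(diff[rows, cols]).astype(np.int8)
    return rows, cols, polarities

# A static scene produces no events; only the changed pixel fires.
a = np.full((4, 4), 100.0)
b = a.copy()
b[1, 2] = 200.0  # one pixel brightens
rows, cols, pols = frames_to_events(a, b)
```

Because the unchanged pixels produce no output, the data rate scales with scene activity rather than with a fixed frame rate, which is the property the paper exploits for low-latency collision detection.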

Demystifying event-based camera latency: sensor speed dependence on pixel biasing, light, and spatial activity

This report explores how various mechanisms affect the response time of event-based cameras (EBCs). EBCs are unconventional electro-optical IR vision sensors that are sensitive only to changing light. Because their operation is essentially “frameless,” their response time does not depend on a frame rate or readout time, but rather on the number of activated pixels, the magnitude of background light, local fabrication defects, and the analog configuration of the pixel.

On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing

This paper presents a methodology and a software pipeline for generating event-based vision datasets from optimal landing trajectories during the approach of a target body. It constructs sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility (PANGU) at different viewpoints along a set of optimal descent trajectories obtained by varying the boundary conditions.
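The viewpoint-sampling step of such a pipeline could be sketched as follows (a hypothetical illustration: the function name and the straight-line descent are assumptions for simplicity, whereas the paper renders viewpoints with PANGU along optimal descent trajectories):

```python
import numpy as np

def sample_viewpoints(start, end, n_views):
    """Hypothetical helper: sample camera positions at equally spaced
    points along a straight-line descent from `start` to `end` (both
    3-D points in metres). A real pipeline would instead follow an
    optimal descent trajectory and render an image at each pose."""
    ts = np.linspace(0.0, 1.0, n_views)[:, None]  # interpolation weights
    start = np.asarray(start, dtype=np.float64)
    end = np.asarray(end, dtype=np.float64)
    return (1.0 - ts) * start + ts * end

# Descend from 1000 m altitude toward a landing site at the origin.
views = sample_viewpoints([500.0, 0.0, 1000.0], [0.0, 0.0, 0.0], n_views=5)
```

Each sampled position would then be handed to the renderer to produce a photorealistic frame, and the resulting image sequence converted into an event stream.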