This paper proposes a simple but effective event-based pose estimation system using active LED markers (ALM) for fast and accurate pose estimation. The proposed algorithm operates in real time with a latency below 0.5 ms while maintaining an output rate of 3 kHz.
This paper develops a Hardware-in-the-Loop imaging setup for experimenting with an event-based and a frame-based camera under simulated space conditions. The generated datasets were used to compare visual navigation approaches, pitting an event-based feature detection and tracking algorithm against a frame-based one.
This paper explores the use of event cameras for collision detection in unmanned aerial vehicles (UAVs). Traditional cameras have been widely used in UAVs for obstacle avoidance and navigation, but they suffer from high latency and low dynamic range. Event cameras, on the other hand, capture only the changes in the scene and can operate at high speeds with low latency.
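The change-only sensing principle described above can be illustrated with a minimal sketch: an event camera emits an event at a pixel whenever the log-intensity change there exceeds a contrast threshold. The function below is a simplified, frame-based emulation of that model (function name, threshold value, and event tuple layout are illustrative assumptions, not any specific camera's interface):

```python
import numpy as np

def events_from_frames(frames, timestamps, threshold=0.2):
    """Emit (t, x, y, polarity) events wherever the log-intensity change
    at a pixel exceeds the contrast threshold, mimicking how an event
    camera reports only scene changes instead of full frames."""
    eps = 1e-6  # avoid log(0) on dark pixels
    ref = np.log(frames[0] + eps)  # per-pixel reference log intensity
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            pol = 1 if diff[y, x] > 0 else -1  # ON/OFF polarity
            events.append((t, int(x), int(y), pol))
            ref[y, x] = log_i[y, x]  # reset reference at fired pixels
    return events
```

Because only pixels that cross the threshold produce output, a mostly static scene yields very little data, which is the basis of the low-latency, low-bandwidth behavior the paper exploits for collision detection.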
This report explores how various mechanisms affect the response time of event-based cameras (EBCs), which are built on unconventional electro-optical IR vision sensors that are sensitive only to changing light. Because their operation is essentially “frameless,” their response time does not depend on a frame rate or readout time, but rather on the number of activated pixels, the magnitude of background light, local fabrication defects, and the analog configuration of the pixel.
This paper presents a methodology and a software pipeline for generating event-based vision datasets from optimal landing trajectories during approach to a target body. It constructs sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility (PANGU) at different viewpoints along a set of optimal descent trajectories obtained by varying the boundary conditions.
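PANGU's own interface is not shown here, but the viewpoint-sampling step along a descent trajectory can be sketched independently of the renderer. The helper below (names and the nadir-pointing assumption are illustrative, not the paper's actual pipeline) resamples a discretized trajectory at equal arc-length spacing to obtain the camera positions that would be handed to the image generator:

```python
import numpy as np

def sample_viewpoints(trajectory, n_views):
    """Resample n_views camera positions at equal arc-length spacing
    along a descent trajectory given as an (N, 3) array of positions.
    Each position is paired with a nadir-pointing boresight direction
    (surface assumed at z = 0), a simplifying assumption for the sketch."""
    seg = np.diff(trajectory, axis=0)
    arc = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    targets = np.linspace(0.0, arc[-1], n_views)  # equal arc-length stops
    positions = np.stack(
        [np.interp(targets, arc, trajectory[:, k]) for k in range(3)], axis=1
    )
    boresights = np.tile([0.0, 0.0, -1.0], (n_views, 1))  # straight down
    return positions, boresights
```

Varying the boundary conditions of the optimal-control problem yields a family of such trajectories; rendering each resampled viewpoint then produces the image sequences from which events are derived.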