APEIRON: a Multimodal Drone Dataset Bridging Perception and Network Data in Outdoor Environments

In this work, we introduce APEIRON, a rich multimodal aerial dataset that simultaneously collects perception data from a stereo camera and an event-based camera, along with measurements of wireless network links obtained using an LTE module. The assembled dataset pairs perception and network data, making it suitable for both perception and communication applications.

Helios: An extremely low power event-based gesture recognition for always-on smart eyewear

This paper introduces Helios, the first extremely low-power, real-time, event-based hand gesture recognition system designed for always-on smart eyewear. Helios recognizes seven classes of gestures, including subtle microgestures like swipes and pinches, with 91% accuracy. We also demonstrate real-time performance across 20 users at a remarkably low latency of 60 ms.

SEVD: Synthetic Event-based Vision Dataset for Ego and Fixed Traffic Perception

In this paper, we present SEVD, a first-of-its-kind multi-view synthetic event-based dataset for ego and fixed traffic perception, generated using multiple dynamic vision sensors within the CARLA simulator. We evaluate the dataset using state-of-the-art event-based (RED, RVT) and frame-based (YOLOv8) methods on traffic participant detection tasks and provide baseline benchmarks for assessment.