This paper introduces a dataset from a subterranean (SubT) environment, captured with state-of-the-art sensors such as RGB, RGB-D, event-based, and thermal cameras, along with 2D/3D lidars, IMUs, and UWB positioning systems. Synchronized raw data is provided in ROS message format, enabling the evaluation of navigation, localization, and mapping algorithms.
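As a sketch of how such synchronized raw data in ROS format can be consumed, the snippet below iterates over a multi-sensor bag with the rosbag Python API; the bag file name and topic names are assumptions, not the dataset's actual ones.

```python
# Minimal sketch: replaying a multi-sensor ROS bag with the rosbag API.
# File name and topics below are hypothetical placeholders.
import rosbag

bag = rosbag.Bag("subt_sequence.bag")  # hypothetical bag file
topics = ["/camera/image_raw", "/lidar/points", "/imu/data"]  # assumed topics
for topic, msg, t in bag.read_messages(topics=topics):
    # Each message carries its own header stamp, so the sensor streams
    # can be re-synchronized by timestamp during playback.
    print(topic, t.to_sec())
bag.close()
```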
This paper introduces EVIMO2, a new event-camera dataset that improves on the popular EVIMO dataset by providing more data, captured with better cameras, in more complex scenarios. As with its predecessor, EVIMO2 provides labels in the form of per-pixel ground-truth depth and segmentation, as well as camera and object poses.
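A minimal sketch of how per-pixel depth and segmentation labels of this kind can be paired, assuming they are stored as aligned NumPy arrays; the file names, units, and background convention here are hypothetical, not EVIMO2's actual on-disk format.

```python
# Minimal sketch: pairing a depth map with a per-pixel segmentation mask.
# File names, units, and the background id are assumptions.
import numpy as np

depth = np.load("frame_000123_depth.npy")  # hypothetical: HxW depth in meters
masks = np.load("frame_000123_mask.npy")   # hypothetical: HxW per-pixel object ids

for obj_id in np.unique(masks):
    if obj_id == 0:
        continue  # assume id 0 marks background
    obj_depth = depth[masks == obj_id]
    print(f"object {obj_id}: mean depth {obj_depth.mean():.2f} m "
          f"over {obj_depth.size} pixels")
```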
This paper presents the implementation details and experimental validation of a relatively low-cost motion capture system for multi-quadrotor motion planning using an event camera. Real-time multi-quadrotor detection and tracking are performed using the You-Only-Look-Once (YOLOv5) deep learning network and a k-dimensional (k-d) tree, respectively.
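The sketch below illustrates the k-d tree association step, assuming YOLOv5 has already produced detection centers for consecutive frames; scipy's cKDTree stands in for the paper's k-d tree, and the coordinates and 20-pixel gate are illustrative, not the paper's parameters.

```python
# Minimal sketch: nearest-neighbor track association with a k-d tree.
# Detection centers and the distance gate are illustrative values.
import numpy as np
from scipy.spatial import cKDTree

prev_centers = np.array([[120.0, 80.0], [300.0, 210.0]])                 # tracks, frame t-1
new_centers = np.array([[123.0, 82.0], [298.0, 215.0], [50.0, 50.0]])    # detections, frame t

tree = cKDTree(prev_centers)
# Query each new detection against existing tracks; matches beyond the
# gate come back with an infinite distance.
dist, idx = tree.query(new_centers, distance_upper_bound=20.0)

for det, (d, i) in enumerate(zip(dist, idx)):
    if np.isfinite(d):
        print(f"detection {det} continues track {i} (moved {d:.1f} px)")
    else:
        print(f"detection {det} starts a new track")
```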
This paper concerns research on the motion of a load carried by a rotary crane. For this purpose, a laboratory crane model was designed in SolidWorks, and numerical simulations were performed using its Motion module. The developed laboratory model is a scaled equivalent of the real Liebherr LTM 1020 crane.
This paper proposes to use event cameras, bio-inspired silicon sensors that respond to radiance changes, to recover precise radiance values. It shows that, under active lighting conditions, the transient frequency at which events are triggered is linearly related to the radiance value.
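A minimal sketch of this linear rate-to-radiance idea: count events per pixel over a fixed accumulation window and map the resulting rate through an assumed linear model L = a * rate + b; the event array and calibration coefficients are hypothetical, not the paper's calibration.

```python
# Minimal sketch: per-pixel event rate mapped to radiance via an assumed
# linear model. Event coordinates and coefficients are illustrative.
import numpy as np

H, W = 4, 4
events_xy = np.array([[1, 2], [1, 2], [3, 0], [1, 2]])  # (x, y) of events in window
window_s = 0.01  # 10 ms accumulation window

counts = np.zeros((H, W))
np.add.at(counts, (events_xy[:, 1], events_xy[:, 0]), 1)  # accumulate per pixel
rate = counts / window_s  # events per second, per pixel

a, b = 0.5, 0.1  # hypothetical calibration coefficients
radiance = a * rate + b
print(radiance)
```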