This letter presents a novel dynamic-vision-enabled contactless cross-domain fault diagnosis method based on neuromorphic computing. An event-based camera is adopted to capture machine vibration states from a visual perspective. A specially designed bio-inspired deep transfer spiking neural network (SNN) model is proposed to process the visual event streams, extract features, and diagnose faults.
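To make the general processing pipeline concrete, here is a minimal sketch, assuming a time-binned event representation and a single leaky integrate-and-fire (LIF) layer. It illustrates the idea of event-stream-to-spike classification only; it is not the authors' transfer SNN, and all names, shapes, and hyperparameters are assumptions.

```python
# Minimal sketch (assumption, not the authors' model): bin an event stream
# into frames and classify with a tiny leaky integrate-and-fire (LIF) network.
import torch
import torch.nn as nn

def events_to_frames(events, num_bins, height, width):
    """events: (N, 4) tensor of (t, x, y, polarity), with t normalized to [0, 1)."""
    frames = torch.zeros(num_bins, height, width)
    t, x, y, p = events[:, 0], events[:, 1].long(), events[:, 2].long(), events[:, 3]
    b = (t * num_bins).long().clamp(max=num_bins - 1)          # time bin per event
    vals = torch.where(p > 0, torch.ones_like(p), -torch.ones_like(p))
    frames.index_put_((b, y, x), vals, accumulate=True)        # signed event counts
    return frames

class LIFClassifier(nn.Module):
    def __init__(self, height, width, num_classes, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(height * width, num_classes)
        self.beta, self.threshold = beta, threshold

    def forward(self, frames):                                  # frames: (T, H, W)
        mem = torch.zeros(self.fc.out_features)                 # membrane potential
        spike_count = torch.zeros(self.fc.out_features)
        for frame in frames:                                    # iterate over time bins
            current = self.fc(frame.flatten())                  # input current
            mem = self.beta * mem + current                     # leaky integration
            spikes = (mem >= self.threshold).float()            # fire above threshold
            mem = mem - spikes * self.threshold                 # soft reset
            spike_count += spikes
        return spike_count                                      # most spikes = predicted class

# Usage with fake events on a 64x64 sensor, 10 time bins, 4 fault classes.
events = torch.rand(5000, 4) * torch.tensor([1.0, 64, 64, 2]) - torch.tensor([0.0, 0.0, 0.0, 1.0])
frames = events_to_frames(events, num_bins=10, height=64, width=64)
print(LIFClassifier(64, 64, num_classes=4)(frames).argmax().item())
```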
Here we propose a hybrid event- and frame-based object detector that preserves the advantages of each modality and thus does not suffer from the trade-offs of either sensor alone. Our method exploits the high temporal resolution and sparsity of events and the rich but low-temporal-resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency.
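As a rough illustration of the high-rate detection idea (not the paper's architecture), the sketch below runs a placeholder frame-based detector on each image and then nudges the resulting boxes using the events that arrive before the next frame. `detect_on_frame`, `Box`, and the centroid-shift heuristic are all hypothetical stand-ins.

```python
# Minimal sketch (assumption): frame detections refreshed at a high rate with events.
from dataclasses import dataclass
import numpy as np

@dataclass
class Box:
    x: float
    y: float
    w: float
    h: float
    label: int

def detect_on_frame(frame):
    """Placeholder for any frame-based detector (e.g. a CNN); returns boxes."""
    return [Box(x=40.0, y=30.0, w=20.0, h=20.0, label=0)]

def update_with_events(boxes, events):
    """Shift each box by the displacement of the event centroid inside it
    between the early and late halves of the event window.
    events: (N, 4) array of (t, x, y, polarity)."""
    updated = []
    for b in boxes:
        inside = events[(events[:, 1] >= b.x) & (events[:, 1] < b.x + b.w) &
                        (events[:, 2] >= b.y) & (events[:, 2] < b.y + b.h)]
        dx = dy = 0.0
        if len(inside) >= 4:
            t_mid = np.median(inside[:, 0])
            early, late = inside[inside[:, 0] < t_mid], inside[inside[:, 0] >= t_mid]
            if len(early) and len(late):
                dx = float(late[:, 1].mean() - early[:, 1].mean())
                dy = float(late[:, 2].mean() - early[:, 2].mean())
        updated.append(Box(b.x + dx, b.y + dy, b.w, b.h, b.label))
    return updated

# Usage: one frame detection, then an event-based update before the next frame.
boxes = detect_on_frame(np.zeros((120, 160)))
events = np.column_stack([np.sort(np.random.rand(500)),        # timestamps
                          np.random.uniform(40, 60, 500),      # x inside the box
                          np.random.uniform(30, 50, 500),      # y inside the box
                          np.random.randint(0, 2, 500)])       # polarity
print(update_with_events(boxes, events)[0])
```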
We propose DSEC, a new dataset captured under demanding illumination conditions that provides a rich set of sensory data. DSEC offers data from a wide-baseline stereo setup of two color frame cameras and two high-resolution monochrome event cameras. In addition, we collect lidar data and RTK GPS measurements, both hardware-synchronized with all camera data.
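As a hedged sketch of how such hardware-synchronized streams are typically aligned, the snippet below slices an event-timestamp array per frame interval with a binary search. The timestamps are fabricated, and the actual DSEC file layout and loaders are documented by the dataset itself and not reproduced here.

```python
# Minimal sketch (assumption): aligning hardware-synchronized streams by timestamp.
import numpy as np

def events_between(event_ts, start_us, end_us):
    """Return the index slice of events whose timestamps fall in [start_us, end_us)."""
    lo = np.searchsorted(event_ts, start_us, side="left")
    hi = np.searchsorted(event_ts, end_us, side="left")
    return slice(lo, hi)

# Usage with fake microsecond timestamps: 20 Hz frames, ~1 MHz event rate.
frame_ts = np.arange(0, 1_000_000, 50_000)                      # one second of frames
event_ts = np.sort(np.random.randint(0, 1_000_000, 1_000_000))  # sorted event times

for t0, t1 in zip(frame_ts[:-1], frame_ts[1:]):
    sl = events_between(event_ts, t0, t1)
    # Each frame interval gets its synchronized chunk of events; the same lookup
    # applies to lidar scans or RTK GPS fixes with their own timestamp arrays.
    print(t0, t1, sl.stop - sl.start)
```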
Ultraleap has announced a strategic partnership with Prophesee and TCL RayNeo to develop low-power tracking technology for augmented reality (AR) glasses, pushing the boundaries of AR experiences.
Developers can now take full advantage of the performance, power efficiency, and speed of the Prophesee event-based Metavision® sensor and AI to create the next generation of Edge AI machine vision applications running on AMD platforms.