Prophesee Appoints Jean Ferré as Chief Executive Officer to Lead Event-based Vision Sensing Pioneer in Next Stage of Growth

Prophesee has appointed Jean Ferré as Chief Executive Officer as the company enters a new phase of commercialization and growth, building on a strong technological and organizational foundation and welcoming new investors. The company is sharpening its near-term focus on the sectors that show the strongest demand and adoption momentum for high-value use cases today: security, defense and aerospace, and industrial automation.

NeuroCamTags: Long-Range, Battery-free, Wireless Sensing with Neuromorphic Cameras

This paper introduces NeuroCamTags, a battery-free platform designed to detect a range of human interactions and activities across entire rooms and floors. The system comprises low-cost tags that harvest ambient light energy and communicate wirelessly via high-frequency LED modulation; the visual signals are captured by a neuromorphic camera with high temporal resolution. NeuroCamTags localizes and identifies multiple tags, offering battery-free sensing for temperature, contact, button presses, key presses, and sound cues, with accurate detection at ranges up to 200 feet.
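The paper does not publish its decoder, but the idea of reading an LED-modulated signal with an event camera can be illustrated with a hypothetical sketch: an "on" symbol produces a burst of brightness-change events at the tag's pixel location, an "off" symbol produces almost none, so binning event timestamps into symbol periods and thresholding the counts recovers an on-off-keyed bit stream. The function name, symbol rate, and threshold below are illustrative assumptions, not the NeuroCamTags protocol.

```python
# Hypothetical sketch (not NeuroCamTags code): decode on-off keying from
# event-camera timestamps observed at one tag's pixel location.
def decode_ook(timestamps, symbol_period, n_symbols, threshold=2):
    """Bin event timestamps into symbol periods and threshold the counts.

    timestamps: event times in seconds for the tag's pixel region.
    symbol_period: duration of one modulation symbol in seconds.
    Returns one bit per symbol: 1 if enough events fell in that period.
    """
    counts = [0] * n_symbols
    for t in timestamps:
        idx = int(t // symbol_period)  # which symbol slot this event falls in
        if 0 <= idx < n_symbols:
            counts[idx] += 1
    return [1 if c >= threshold else 0 for c in counts]

# Example at an assumed 1 kHz symbol rate: event bursts in symbols 0, 2, 3.
events = [0.0001, 0.0003, 0.0005,   # symbol 0: LED on
          0.0021, 0.0024, 0.0027,   # symbol 2: LED on
          0.0031, 0.0035]           # symbol 3: LED on
bits = decode_ook(events, symbol_period=0.001, n_symbols=4)  # → [1, 0, 1, 1]
```

In a real system the symbol clock would have to be recovered from the event stream itself, and a higher modulation frequency per tag would let multiple tags be separated and identified in the same view.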

Low-latency neuromorphic air hockey player

This paper focuses on using spiking neural networks (SNNs) to control a robotic manipulator in an air-hockey game. The system processes data from an event-based camera, tracking the puck’s movements and responding to a human player in real time. It demonstrates the potential of SNNs to perform fast, low-power, real-time tasks on massively parallel hardware. The air-hockey platform offers a versatile testbed for evaluating neuromorphic systems and exploring advanced algorithms, including trajectory prediction and adaptive learning, to enhance real-time decision-making and control.
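The trajectory-prediction extension mentioned above can be sketched in its simplest form: track the puck's position from the event stream at two recent times, assume straight-line motion, and extrapolate to the paddle's defense line. This is a hypothetical illustration, not the paper's SNN controller; the function and parameter names are invented for the example.

```python
# Hypothetical sketch: linear extrapolation of a tracked puck to the
# manipulator's defense line (the SNN in the paper does this neurally).
def predict_intercept(p0, t0, p1, t1, defense_x):
    """Given two tracked puck positions (x, y) at times t0 < t1, return the
    predicted (x, y, t) where the puck crosses x = defense_x, or None if it
    is not moving toward that line."""
    vx = (p1[0] - p0[0]) / (t1 - t0)
    vy = (p1[1] - p0[1]) / (t1 - t0)
    if vx <= 0:  # puck moving away from (or parallel to) the defense line
        return None
    dt = (defense_x - p1[0]) / vx          # time until the crossing
    return (defense_x, p1[1] + vy * dt, t1 + dt)

# Puck moves from (0, 0) to (1, 1) in one second; defense line at x = 3.
hit = predict_intercept((0.0, 0.0), 0.0, (1.0, 1.0), 1.0, defense_x=3.0)
# → (3.0, 3.0, 3.0): intercept two seconds after the last observation
```

Because an event camera reports puck motion with microsecond-scale latency, even this naive predictor leaves the controller far more reaction time than a frame-based pipeline would; wall bounces would require reflecting the y-velocity at the table edges.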

Features for Classifying Insect Trajectories in Event Camera Recordings

This paper focuses on classifying insect trajectories recorded with a stereo event-camera setup. It presents the steps for generating a labeled dataset of trajectory segments, along with methods for propagating labels to unlabeled trajectories. Features are extracted from trajectory point clouds using FoldingNet and PointNet++, with dimensionality reduction via t-SNE. The PointNet++ features form clusters corresponding to insect groups, achieving 90.7% classification accuracy across five groups. Algorithms for estimating insect speed and size are also developed as additional features.
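The speed feature mentioned at the end is conceptually simple once the stereo setup yields timestamped 3D points: sum the segment's path length and divide by its duration. The sketch below is an assumed minimal version, not the paper's algorithm, which would additionally need to smooth event-tracking noise before differencing.

```python
# Hypothetical sketch: mean speed of an insect trajectory segment given
# timestamped 3D points (t, x, y, z) triangulated from the stereo cameras.
import math

def trajectory_speed(points):
    """Return mean speed (distance units per time unit) over the segment."""
    points = sorted(points)  # order samples by timestamp
    dist = sum(math.dist(a[1:], b[1:])          # 3D step lengths
               for a, b in zip(points, points[1:]))
    duration = points[-1][0] - points[0][0]
    return dist / duration if duration > 0 else 0.0

# Two straight segments of length 5 and 12 covered in 2 time units:
pts = [(0.0, 0.0, 0.0, 0.0), (1.0, 3.0, 4.0, 0.0), (2.0, 3.0, 4.0, 12.0)]
speed = trajectory_speed(pts)  # → 8.5
```

A scalar like this complements the learned point-cloud features: fast, erratic flyers and slow crawlers separate on speed alone even when their shapes in feature space overlap.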

Event Sensors Bring Just the Right Data to Device Makers

Event-based sensors are redefining machine vision by mimicking the human eye. Rather than capturing full frames at fixed intervals, each pixel reacts independently, sending data only when brightness changes or motion occurs. This means devices capture only what truly matters, significantly reducing data and energy load while improving speed and dynamic range. From drones to AR wearables to medical robots, these neuromorphic sensors enable smarter, more efficient edge-device vision.
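The per-pixel behavior described above can be modeled in a few lines: a pixel keeps a reference log-intensity and emits a signed event each time the log intensity moves a contrast threshold away from it, rather than reporting an absolute value every frame. This is a standard idealized model of an event-camera pixel, sketched here for illustration; real sensors add refractory periods and noise.

```python
# Idealized event-camera pixel: emit (+1/-1, t) events whenever the log
# intensity crosses a contrast threshold relative to the stored reference.
import math

def pixel_events(intensities, times, threshold=0.5):
    """Simulate one pixel over a sequence of (intensity, time) samples."""
    ref = math.log(intensities[0])       # reference log-intensity at reset
    events = []
    for intensity, t in zip(intensities[1:], times[1:]):
        log_i = math.log(intensity)
        while log_i - ref >= threshold:  # brightness rose: ON events
            ref += threshold
            events.append((+1, t))
        while ref - log_i >= threshold:  # brightness fell: OFF events
            ref -= threshold
            events.append((-1, t))
    return events                        # constant brightness → no events

# A single brightness jump of 1.2 log units crosses the 0.5 threshold twice:
evts = pixel_events([1.0, math.exp(1.2)], [0.0, 1.0])  # → [(1, 1.0), (1, 1.0)]
```

The "just the right data" claim falls out of the model directly: a static scene produces no events at all, while a fast-moving edge produces a dense, microsecond-timestamped stream only at the pixels it crosses.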