Leading Vision Publication Features Prophesee’s Solution for Enhancing XR Wearables

PARIS, Nov 4, 2025

The smart glasses space is one of the hottest areas in vision-enabled electronics, with major players bringing XR devices to both consumer and industrial markets. Event-based vision can significantly improve the usability, performance, comfort and overall user experience of this emerging category of devices. Read more about how Prophesee’s technology is being used in real-world designs in this exclusive article in Imaging & Machine Vision Europe.

Could event-based vision hold the key to improving the usability of XR wearables?

This article explores how event-based vision is revolutionising XR wearables, making smart glasses lighter, more immersive and more energy-efficient. 

The XR (extended reality) wearables market – consisting of both the initial generation of headset-style devices and the increasingly popular traditional eyeglasses-style models – has captured the imagination of both consumer and industrial sectors, with innovative designs and notable progress addressing usability concerns. Despite some missteps, wearable vision-enriched systems are experiencing substantial growth in key segments – first in gaming, but now expanding into consumer entertainment, education, healthcare and industrial automation.

Market forecasts estimate a roughly 30% annual growth rate over the next several years, as startups and tech giants partner with consumer lifestyle and luxury eyewear makers to bring smart eyewear into the mainstream, validating design innovation and consumer appetite. This growth is propelled by several converging trends, many of which are still evolving. Smaller, lighter and more varied form factors backed by notable eyewear brands are being developed to improve user comfort and experience, while advances in display technology, motion detection and haptic feedback are enhancing immersion. Meanwhile, increased integration of AI is expanding XR functionality and appeal across use cases.

However, enthusiasm is tempered by persistent challenges that limit broader adoption. Many headsets remain bulky and uncomfortable; battery life is often insufficient; device costs remain high; and latency-induced motion sickness persists. Users frequently report that virtual experiences still feel artificial and lack full immersion.

Enhanced vision sensing can address XR’s challenges

Addressing these challenges while leveraging growth drivers is crucial to continued expansion of the category. One promising area is advanced vision sensing, which can unlock significant improvements in usability, realism and interaction – all while reducing size, cost and power consumption in next-generation wearables of all types. Efficient vision sensing is especially important because achieving lifelike detail is computationally intensive, and smarter, lower-power sensors enable better visual fidelity with reduced system demands. Furthermore, highly accurate hand, gesture and eye tracking capabilities – essential to intuitive user interfaces – can be greatly improved by cutting-edge vision sensing technologies. Additionally, compact and efficient sensors reduce the physical footprint and weight of the devices they’re used in, thus extending wearability and usage time.

Event-based vision: a paradigm shift in XR sensing

Event-based vision is an emerging technology gaining traction in the wearables space. Inspired by human vision, it mimics biological processes by capturing only the changes within a scene, unlike traditional sensors that record full frames at fixed intervals.

This method of selective sensing provides critical advantages for XR applications. Event-based vision sensors are transforming eye tracking, for example, with ultra-fast (up to 1kHz), high-efficiency tracking that only activates when motion is detected, making them significantly more power-efficient. The design also enhances the wearables’ responsiveness – particularly in applications like eye tracking, where fine visual detail is needed only at the user’s point of focus.
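To make the contrast with frame-based capture concrete, here is a minimal sketch of the change-detection idea behind event sensing. It is a toy frame-differencing model with an assumed contrast threshold, not Prophesee's sensor design – real event pixels fire asynchronously in hardware rather than comparing whole frames.

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2):
    """Toy event-generation model: emit an event wherever the
    log-intensity change at a pixel exceeds a contrast threshold.
    Real event sensors do this per pixel, asynchronously, in silicon."""
    delta = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A static scene produces no output; only the changed pixel fires.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200  # one pixel brightens
events = events_from_frames(prev, curr)  # -> [(2, 1, 1)]
```

The key property XR designers care about falls out directly: when nothing moves, the sensor emits nothing, so idle gaze or a still scene costs almost no bandwidth or power.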

Power efficiency is critical

As with most applications in the wearables market, power efficiency – along with battery size and battery life – is critical to the user experience and practicality of the system. Because they process only what changes, event-based sensors reduce data loads and thus save power. With consumption in the microwatt range, they can also support “always-on” operation and improve real-time responsiveness.

Event-based vision supports extremely low power operation by enabling greater efficiency at the system level, making the key processes of sensing, processing, illumination and data transfer much less impactful from a power consumption standpoint. For example, event-based vision’s ability to enable low latency translates into shorter LED pulses, which conserve energy. Additionally, because event-based sensors inherently require less data, processing overhead is minimised, also reducing power requirements.
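A back-of-envelope calculation illustrates why sparse output matters so much for the data-transfer budget. All the numbers below (sensor resolution, activity level, bytes per event) are illustrative assumptions, not measured figures for any specific sensor:

```python
# Back-of-envelope data-rate comparison (illustrative numbers only).
width, height, fps = 1280, 720, 60

# Frame-based: every pixel is read out every frame, busy scene or not.
frame_rate_bytes = width * height * fps * 1  # 8-bit mono -> ~55 MB/s

# Event-based: only changing pixels produce output. Assume 1% of
# pixels fire per frame interval and each event packs into 4 bytes.
active_fraction, bytes_per_event = 0.01, 4
event_rate_bytes = int(width * height * fps * active_fraction * bytes_per_event)

reduction = frame_rate_bytes / event_rate_bytes  # 25x under these assumptions
```

The ratio scales with scene sparsity: a nearly still scene (far below 1% activity) would widen the gap further, which is what makes "always-on" operation plausible at microwatt budgets.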

Foveated rendering enables immersive, natural experiences

XR developers are particularly excited about the role of event-based vision in improving foveated rendering – a technique that concentrates processing power wherever the user is looking. Prophesee’s Metavision platform, for example, has demonstrated this approach by delivering high-resolution imagery only in the focal area while reducing peripheral detail, which minimises GPU load and power use.

Foveated rendering mirrors how the human eye works, perceiving fine detail only in the fovea, a small central area of the retina. Rendering in this way dramatically improves visual quality, reduces latency and enhances battery life by avoiding unnecessary computation.

Event-based sensors elevate this technique by offering rapid, precise eye movement detection. Where traditional sensors may lag or miss subtle gaze shifts, event-driven sensors respond in real time, allowing for immediate adjustment of the rendered scene to deliver a smoother, more immersive experience. Moreover, their high dynamic range ensures consistent rendering performance even under difficult lighting conditions. Developers benefit from lower system demands, while users enjoy clearer, more engaging visuals.

Inside-out tracking: another win for event-based vision

Event-based vision also enhances inside-out tracking, a self-contained method where XR devices track their position and orientation using onboard sensors, eliminating the need for external markers or cameras. This simplifies setup, improves portability and enables the seamless integration of virtual elements into real-world settings. Event-based sensors also excel at detecting motion patterns used in “constellation tracking,” which identifies positions via patterns of LEDs or markers. These sensors deliver real-time positional data with high accuracy and reliability, bolstering the realism of user interactions and supporting dynamic spatial environments.
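One way constellation tracking can exploit event timestamps is frequency-coded marker identification. The sketch below assumes a hypothetical scheme in which each LED blinks at a unique nominal frequency and the sensor's fine-grained timestamps recover it from inter-event intervals; the LED names and frequencies are invented for illustration:

```python
def identify_led(event_timestamps_us, known_leds):
    """Map a stream of event timestamps from one tracked blob to an
    LED ID via the blink frequency implied by inter-event intervals.
    Toy model: assumes clean, regularly spaced on-events."""
    intervals = [b - a for a, b in zip(event_timestamps_us, event_timestamps_us[1:])]
    period_us = sum(intervals) / len(intervals)
    freq_hz = 1_000_000 / period_us
    # Pick the known LED whose nominal frequency is closest.
    return min(known_leds, key=lambda led: abs(known_leds[led] - freq_hz))

known_leds = {"led_A": 500.0, "led_B": 1000.0, "led_C": 2000.0}  # Hz (assumed)
timestamps = [0, 1000, 2000, 3000]   # events ~1 ms apart -> ~1 kHz blink
led_id = identify_led(timestamps, known_leds)  # -> "led_B"
```

Once individual markers are identified, their observed positions can feed a standard pose solver, which is what yields the real-time positional data the article describes.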

An expanding ecosystem unlocks new potential in smart eyewear

Interest in event-based vision has expanded greatly in the past few years from its initial foundation. By taking an open systems approach and tapping into the creativity and enthusiasm of a global network of developers in industry, academia and research organisations, new use cases and applications are emerging, and commercial adoption is happening across many sectors.

As eyewear becomes more integrated into daily life, the competitive edge will come from technologies that balance style, usability and continuous functionality. Event-based vision aligns well with this need, offering high performance without sacrificing comfort or battery life. For example, the recent collaboration between Prophesee and Tobii, a leader in eye-tracking technology, integrates Prophesee’s event-based sensors with Tobii’s proven gaze-tracking algorithms, offering a turnkey solution for OEMs and developers building next-generation wearable devices.

The synergy between the two technologies enables ultra-low latency and high-precision eye tracking optimised for foveated rendering, context-aware user interfaces and hands-free navigation. The combined system drastically reduces processing overhead and power consumption – both common concerns in wearable form factors like smart glasses. For developers, this integration simplifies implementation. Tobii’s robust developer tools and APIs, when paired with the sensor’s hardware efficiency, reduce time-to-market while unlocking rich UX possibilities such as attention-based interfaces, real-time interaction cues and adaptive visual displays tailored to individual user intent.

Eye-tracking device maker 7Invensun has similarly leveraged the advantages of event-based vision to release its latest system aimed at the medical, health and research sectors. The combination of the companies’ technologies is particularly useful for medical diagnostics and assistive devices, as well as automotive and aviation safety and smart eyewear. The power- and size-efficiency of the latest event-based sensors enable developers like 7Invensun to produce lightweight designs that are comfortable to wear and don’t burden users during prolonged use. Furthermore, the device features detachable lenses and supports use with contact lenses, catering to the needs of users with different vision requirements.

A significant leap forward for eye-tracking technology in natural-scene research and dynamic behaviour capture, event-based vision offers a more powerful tool for research and industrial applications. The ability to accurately track eye movements is being applied to a range of use cases, including laboratory neuroscience, sports science, medical diagnosis, human factors and ergonomics, and cognitive process research.

Event-based vision will accelerate the evolution of smart eyewear and XR development

XR systems promise transformative impacts across entertainment, education, industry and more – but achieving mass adoption requires solving fundamental UX challenges. This is where event-based vision technology is proving to be a breakthrough, enabling smarter, lighter and more intuitive devices.

From real-time foveated rendering to precise inside-out tracking and power-efficient operation, event-based vision – especially when paired with commercial partners – is helping the industry overcome barriers to comfort, cost and realism. These advances are not only improving today’s devices but opening the door to entirely new categories of wearable computing. As smart eyewear continues its evolution from a niche gadget to an everyday companion, event-based vision could prove to be the enabling layer that allows fashion brands and technology partners to deliver devices that are both stylish and seamlessly intelligent. 

Originally published on imveurope.com

ABOUT PROPHESEE

Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. The company developed a breakthrough Event-based Vision approach to machine vision. This new vision category allows for significant reductions of power, latency and data processing requirements to reveal what was invisible to traditional frame-based sensors until now. Prophesee’s patented Metavision sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, mobile and AR/VR. Prophesee is based in Paris, with local offices in Grenoble, Shanghai, and Silicon Valley.

Visit prophesee.ai for more.
