Piotr Krukowski | Inseye: How can you track eye movement without a camera, using nearly 100x less power?
00:12:52 - 00:13:58
Other snippets from this talk
Summary of the clip:
How can you track eye movement without a camera, using nearly 100x less power?
Inseye's eye-tracking technology fundamentally differs from conventional camera-based systems by eliminating image processing entirely. Instead of a camera sensor array, the system uses a set of photodiodes strategically arranged on the glasses frame. This approach shifts the problem from computationally intensive image analysis to lightweight signal processing, which is the key to its ultra-low power consumption.
The core principle relies on measuring the intensity of infrared light reflected from different parts of the eye's surface. The sclera (the white part), the iris (the colored part), and the retina reflect light at different intensities. As the eye moves, the pattern of reflected light captured by the distributed photodiodes changes in a predictable way, creating a unique signal signature for each gaze position.
This raw signal data is then fed into a software model pre-trained on population data. The algorithm doesn't "see" an image of the eye; it interprets these dynamic light-intensity patterns and maps them directly to gaze coordinates. This signal-based method avoids the power and processing overhead of cameras, making it ideal for the strict constraints of all-day-wear smart glasses.
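The mapping described above can be illustrated with a minimal sketch. Everything here is a hypothetical stand-in: the diode count, the linear map, and the random weights are assumptions for illustration, not Inseye's actual (unpublished) model, which is likely far more sophisticated.

```python
import numpy as np

# Hypothetical sketch: N photodiodes on the frame each report one IR
# intensity per sample. A pre-trained mapping (here a toy linear layer
# standing in for the population-trained model) converts the intensity
# vector straight into 2D gaze coordinates -- no image is ever formed.

N_DIODES = 8  # assumed diode count, for illustration only

rng = np.random.default_rng(0)
W = rng.normal(size=(N_DIODES, 2))  # stand-in "pre-trained" weights
b = np.zeros(2)

def intensities_to_gaze(intensities: np.ndarray) -> np.ndarray:
    """Map one frame of photodiode readings to (x, y) gaze angles."""
    return intensities @ W + b

sample = rng.uniform(0.0, 1.0, size=N_DIODES)  # normalized IR intensities
gaze = intensities_to_gaze(sample)
print(gaze.shape)  # (2,)
```

The point of the sketch is the shape of the computation: a handful of scalar reads and one small matrix multiply per frame, versus decoding and analyzing a full camera image.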
In this short video, you can learn:
* The fundamental principle of camera-free, photodiode-based eye tracking.
* How differential light reflectivity from the sclera, iris, and retina is used to determine gaze.
* The critical difference between a signal processing and an image processing approach for this application.
**Clip Abstract** This clip reveals the core technology behind Inseye's camera-less eye tracking. It explains how a system of photodiodes and signal processing can determine gaze direction by analyzing the differential reflectivity of the eye's surface, enabling a massive reduction in power consumption.
Link in comments
#PhotodiodeEyeTracking #CameraFreeEyeTracking #SignalProcessing #OcularReflectivity #SmartGlasses #AREyeTracking
This is a highlight of the presentation:
MicroLEDs, AR/VR Displays, Micro-Optics 2025: Innovations, Start-Ups, Market Trends
Online | TechBlick platform
Organised By:
TechBlick
MicroLED Connect
More Highlights from the same talk.
00:13:59 - 00:15:22
Is 2.8 milliwatts for eye tracking the new benchmark for all-day AR glasses?
This segment details the market-driven performance specifications of Inseye's first reference design for OEMs and ODMs. The module achieves 2.5 degrees of accuracy within a 30-degree field of view, a specification set developed in close collaboration with industry leaders to meet the practical needs of AI and AR smart glasses applications. These parameters are sufficient for key use cases like gaze-based UI interaction and providing contextual data to AI models.
The most significant technical specification is the power consumption, which is a mere 2.8 milliwatts. This figure represents a paradigm shift compared to incumbent technologies. Traditional camera-based eye-tracking solutions typically consume around 200 milliwatts, making them unsuitable for the limited power budgets of sleek, all-day-wear consumer eyewear.
This ultra-low power consumption is not just an incremental improvement; it's an enabling factor for the entire category of lightweight AI smart glasses. By reducing the power draw by nearly two orders of magnitude, Inseye's technology removes a major roadblock to integrating eye tracking as a standard feature. This allows designers to add valuable functionality without sacrificing battery life or increasing the device's weight and bulk.
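A back-of-envelope calculation shows what the two-orders-of-magnitude reduction means in practice. The 2.8 mW and ~200 mW figures come from the talk; the 0.5 Wh battery capacity is an illustrative assumption for slim eyewear, not a number from the presentation.

```python
# Back-of-envelope battery impact, using the power figures quoted in
# the talk. The battery size is an assumed value for slim glasses.

BATTERY_WH = 0.5    # assumed battery capacity (Wh), illustrative only
CAMERA_MW = 200.0   # typical camera-based eye tracker (from the talk)
INSEYE_MW = 2.8     # Inseye reference design (from the talk)

def hours_if_tracker_only(power_mw: float, battery_wh: float = BATTERY_WH) -> float:
    """Hours the battery would last if the eye tracker were the sole load."""
    return battery_wh * 1000.0 / power_mw

print(round(hours_if_tracker_only(CAMERA_MW), 1))  # 2.5 h
print(round(hours_if_tracker_only(INSEYE_MW), 1))  # 178.6 h
print(round(CAMERA_MW / INSEYE_MW, 1))             # ~71x lower draw
```

Even though neither tracker would run alone, the comparison makes the categorical difference clear: at ~200 mW the tracker dominates a small battery budget, while at 2.8 mW it becomes a rounding error next to the display and radios.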
In this short video, you can learn:
* The key performance metrics (accuracy, FoV, power) of Inseye's first reference design.
* A direct power consumption comparison against traditional camera-based eye trackers.
* Why achieving sub-5 milliwatt power consumption is a critical enabler for the consumer smart glasses market.
**Clip Abstract** This clip presents the key performance metrics of Inseye's first eye-tracking module, highlighting its 2.5-degree accuracy and groundbreaking 2.8-milliwatt power consumption. This performance is contrasted with the ~200-milliwatt draw of camera systems, establishing a new benchmark for all-day AR glasses.
Link in comments
#EyeTracking #UltraLowPower #ARSmartGlasses #GazeUI #WearableElectronics #AugmentedReality
00:20:06 - 00:21:53
Can a camera-less eye tracker solve the two hardest problems: bright sunlight and detecting focal depth?
This Q&A segment addresses the challenging technical problem of determining the user's focal depth, also known as vergence. While precisely tracking gaze in 3D space is difficult for all eye-tracking technologies, especially at longer distances, this system has a unique capability. It can distinguish between a limited number of pre-defined focal planes, which is sufficient for many AR UI interactions, such as focusing on a notification versus the distant background.
The ability to sense depth comes from the richness of the data captured by the photosensors, not from a 3D camera. Each photodiode provides a high bit-depth signal representing light intensity. The subtle changes in the reflection patterns as the eye's lens accommodates and the eyes converge (vergence) provide enough information for the model to infer which focal plane the user is attending to. This is a sophisticated use of signal analysis to extract more than just 2D gaze direction.
This approach is a clever workaround to a complex optical problem, avoiding the need for more power-hungry and bulky hardware like depth sensors. While it does not produce a full 3D depth map of the user's gaze, the ability to differentiate between focal planes is a significant step forward. It enables more intuitive and context-aware user interfaces in AR without compromising the system's core advantages of low power and small form factor.
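Distinguishing a small set of pre-defined focal planes is a classification problem over the photodiode signal, not a depth-mapping one. The sketch below is purely illustrative: the plane names, diode count, and nearest-centroid rule are assumptions standing in for Inseye's actual (unpublished) model.

```python
import numpy as np

# Illustrative sketch: classify a photodiode intensity vector into one
# of a few pre-defined focal planes (e.g. a near notification vs. the
# distant background), as the clip describes. A nearest-centroid rule
# stands in for the real model; the signatures here are random stand-ins
# for per-plane patterns that would come from calibration.

FOCAL_PLANES = ["near_ui", "mid", "far_background"]  # assumed labels
N_DIODES = 8                                         # assumed diode count

rng = np.random.default_rng(1)
centroids = rng.uniform(0.0, 1.0, size=(len(FOCAL_PLANES), N_DIODES))

def classify_focal_plane(intensities: np.ndarray) -> str:
    """Pick the focal plane whose signal signature is closest."""
    dists = np.linalg.norm(centroids - intensities, axis=1)
    return FOCAL_PLANES[int(np.argmin(dists))]

reading = centroids[0] + rng.normal(scale=0.01, size=N_DIODES)  # noisy frame
print(classify_focal_plane(reading))
```

Because the output space is a handful of discrete planes rather than a continuous depth value, the per-frame cost stays at a few distance computations, preserving the low-power character of the whole pipeline.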
In this short video, you can learn:
* The technical challenge of measuring focal depth (vergence) in eye tracking.
* How Inseye's system can distinguish between a limited number of focal planes.
* The role of high bit-depth data from photosensors in enabling vergence recognition without a 3D camera.
**Clip Abstract** This clip explains how Inseye's technology tackles the difficult problem of detecting focal depth (vergence). The system uses high bit-depth data from its photosensors to distinguish between different focal planes, enabling more advanced UI interactions without the need for power-hungry 3D cameras.
Link in comments
#VergenceTracking #FocalPlaneDetection #HighBitDepthPhotosensors #CameraLessEyeTracking #AREyeTracking #WearableElectronics