Tomas Sluka | CREAL: How is a retinal light field microdisplay engineered to deliver authentic focus depth and parallax?
07:16 - 08:35
Summary of the clip:
The core of this retinal light field microdisplay system is a laser-based architecture incorporating a light multiplexer, which utilizes MEMS technology and a custom photonics chip. This multiplexer precisely illuminates an 8-kilohertz, one-megapixel FLCOS modulator. The modulated light then passes through freeform optics before reflecting off a holographic film integrated into a conventional prescription lens. This innovative combination allows the system to project an array of perspective images, each delivered to the eye through a distinct sub-pupil, effectively reconstructing the light rays typically received from real-world objects with varying focus depths.
The image processing pipeline generates slightly different perspectives with negligible overhead, which are then sequentially presented by the FLCOS modulator. Crucially, each frame is illuminated by the light multiplexer at a different angle. This angular variation is translated into distinct viewpoints, ensuring that the eye perceives a comprehensive light field. As a result, users experience natural focus depth and even observe subtle parallax shifts when moving their head relative to the glasses, providing a visually coherent and comfortable augmented reality experience that mimics natural vision.
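The time-multiplexing idea described above can be sketched in code. This is a minimal illustration, not CREAL's actual pipeline: the grid size, pupil diameter, and angle-to-offset mapping below are hypothetical parameters chosen only to show how each sub-view pairs a viewpoint offset inside the eye's pupil with one illumination angle of the light multiplexer.

```python
from dataclasses import dataclass

@dataclass
class Subframe:
    viewpoint_mm: tuple   # (x, y) viewpoint offset within the eye pupil
    illum_angle_deg: tuple  # multiplexer angle that steers this view
                            # through the matching sub-pupil

def build_subframes(grid=3, pupil_diameter_mm=4.0, deg_per_mm=2.0):
    """Lay a grid of viewpoints across the pupil; each maps to one
    illumination angle of the light multiplexer (illustrative mapping)."""
    step = pupil_diameter_mm / grid
    subframes = []
    for i in range(grid):
        for j in range(grid):
            x = (i - (grid - 1) / 2) * step
            y = (j - (grid - 1) / 2) * step
            subframes.append(Subframe((x, y), (x * deg_per_mm, y * deg_per_mm)))
    return subframes

views = build_subframes()
# With a 3x3 grid, an 8 kHz modulator still refreshes the full
# light field at roughly 8000 / 9 ≈ 889 Hz.
print(len(views), round(8000 / len(views)))
```

Because the sub-views differ only by a small viewpoint shift, rendering them adds little cost, which matches the "negligible overhead" claim in the talk.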
In this short video, you can learn:
* The components of a sequential light field projection system.
* The role of MEMS and photonics in light multiplexing.
* How an FLCOS modulator and holographic film contribute to image generation.
* The mechanism for achieving true focus depth and parallax in AR.
#RetinalLightField, #MEMSPhotonics, #FLCOSModulator, #HolographicFilm, #AugmentedReality, #NearEyeDisplays
This is a highlight of the presentation:
Vision Care at the core of AR
More Highlights from the same talk.
02:27 - 04:11
Why do conventional augmented reality displays induce visual discomfort and eye strain?
Mainstream augmented reality (AR) displays present digital imagery as optically flat, fixed-focal-distance planes to each eye. This approach creates a strong stereoscopic illusion of 3D but fundamentally ignores the natural focus depth present in real-world vision. Consequently, when a user attempts to focus on a digital object at a perceived close distance while their eyes naturally accommodate to a different depth, a vergence-accommodation conflict arises. This physiological mismatch between the eyes' vergence (angle of convergence) and accommodation (lens focus) leads to visual system confusion, often manifesting as dizziness and eye strain within short periods of use.
The inherent limitation of displaying flat images means that digital content can only appear sharp when the eye's focus precisely matches the display's fixed focal distance. If the user's gaze shifts to a real-world object at a different depth, the digital overlay becomes blurred, and vice versa. This forces the visual system to constantly attempt to reconcile conflicting depth cues, leading to the reported discomfort. To mitigate this, current AR solutions often advise pushing digital content further away, which is a workaround rather than a fundamental resolution to the problem of lacking true focus depth.
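The mismatch described above is conventionally measured in diopters (inverse meters). The snippet below is a simple worked example, not from the talk: it computes the gap between where the eyes converge and where a fixed-focal-plane display forces them to focus. The ~0.25 D comfort threshold in the comment is a commonly cited rule of thumb, not a figure from the source.

```python
def vac_mismatch_diopters(vergence_distance_m, focal_plane_m):
    """Vergence-accommodation conflict in diopters: the difference
    between the distance the eyes converge to (where the content
    appears) and the display's fixed focal distance."""
    return abs(1 / vergence_distance_m - 1 / focal_plane_m)

# A display with its focal plane fixed at 2 m, showing content
# that appears to float 0.5 m away:
mismatch = vac_mismatch_diopters(0.5, 2.0)
print(mismatch)  # 1.5 D, far above the ~0.25 D often considered comfortable
```

This also shows why "push content further away" helps: moving both distances beyond a few meters shrinks the diopter difference, but it sidesteps rather than solves the problem.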
In this short video, you can learn:
* The concept of vergence-accommodation conflict in AR displays.
* How fixed-focal-distance displays cause visual discomfort.
* The physiological impact of flat digital imagery on the human eye.
* Why current mitigation strategies are insufficient.
#VergenceAccommodationConflict, #FixedFocalAR, #ARVisualStrain, #ARDisplayOptics, #AugmentedReality, #NearEyeDisplays
08:10 - 10:52
What are the current miniaturization achievements and performance specifications for advanced AR display prototypes?
Significant engineering effort has been dedicated to miniaturizing the retinal light field display from initial large prototypes to a practical form factor. The current design integrates the holographic film directly onto a standard prescription lens, allowing the front frame to resemble conventional eyewear. The projector unit is discreetly positioned on the side; the remaining electronics, still bulky because they are not yet implemented as a chip, are targeted for future miniaturization. Notably, the illumination system, which accounts for two-thirds of the projector's volume in the MEMS-based version, represents the primary opportunity for further size reduction.
The latest prototype demonstrates product-grade image quality, achieving a resolution of 40 pixels per degree (PPD) across a field of view (FOV) ranging from 36 to 40 degrees. While the optical design is capable of supporting a wider FOV, up to 70 degrees, the current limitation is a trade-off between resolution and the desired field of view. The focus has also been on refining the display to achieve full contrast, ensuring that the digital content is rendered with optimal clarity and integration into the user's natural vision.
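The resolution/FOV trade-off above is simple arithmetic: with a fixed pixel budget across the image width, angular resolution in pixels per degree falls as the field of view widens. The 1600-pixel row below is an illustrative number (40 PPD x 40 deg), not a specification from the talk.

```python
def ppd(pixels_across, fov_degrees):
    """Angular resolution: pixels available per degree of field of view."""
    return pixels_across / fov_degrees

pixels = 1600  # illustrative horizontal pixel count
for fov in (40, 70):
    print(f"{fov} deg FOV -> {ppd(pixels, fov):.1f} PPD")
```

Stretching the same pixel budget from the current ~40 deg to the optics' 70 deg limit would drop angular resolution from 40 PPD to roughly 23 PPD, which is why the prototype holds the narrower FOV.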
In this short video, you can learn:
* The evolution of retinal light field display prototypes towards miniaturization.
* The current form factor and integration of the projector and holographic film.
* Key performance metrics including PPD and field of view.
* The engineering considerations and trade-offs in achieving wider FOV.
#RetinalLightFieldDisplay, #HolographicWaveguide, #MEMSIlluminationSystem, #PPDResolution, #AugmentedReality, #MicroOptics