Vision Systems

Believable Augmented Reality (AR) experiences require precision engineering and insight into how eyes and brains work together. Our people-centered approach is the foundation of our vision systems, helping devices understand the user's surroundings in real time.

A pair of waveguide lenses set in a frame, resting on a manufacturing cradle.

Input to Immersion

Inside our vision systems.

Our perception of the world is complex — and our vision systems reflect that. We start by combining camera and sensor input with advanced computer vision and perception algorithms. Then we deliver bright, crisp digital content using micro-displays, fine-tuned optics, and world-class waveguide displays. Together, these elements form an end-to-end pipeline that seamlessly integrates digital content into the physical world, where the user can see and interact with it naturally.
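At a very high level, the pipeline described above runs sensing, perception, and display in sequence. The sketch below is purely illustrative: the names `Frame`, `Pose`, `estimate_pose`, and `render_for_waveguide` are hypothetical placeholders, not our actual software.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: list   # camera image data (placeholder)
    imu: tuple     # accelerometer / gyroscope reading

@dataclass
class Pose:
    position: tuple      # head position in the world (x, y, z)
    orientation: tuple   # head orientation as a quaternion (w, x, y, z)

def estimate_pose(frame: Frame) -> Pose:
    # Stand-in for the computer-vision and perception stage: real
    # systems fuse camera features with inertial data to track the head.
    return Pose(position=(0.0, 0.0, 0.0), orientation=(1.0, 0.0, 0.0, 0.0))

def render_for_waveguide(pose: Pose, content: str) -> str:
    # Stand-in for the display stage: place content in the user's
    # view given the estimated head pose.
    return f"{content} @ {pose.position}"

def pipeline(frame: Frame, content: str) -> str:
    # End to end: sensor input -> perception -> display.
    return render_for_waveguide(estimate_pose(frame), content)
```

The point of the structure is the one described in the text: perception output feeds the display stage every frame, so rendered content tracks the real world.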

Biometric-Informed Design

Designed for human variation.

A grid of human head scans, representing the diversity of AR device users.

Every design starts with the diversity of people in mind. With nearly 2,000 head scans and robust biometric data points, our extensive database gives us a detailed understanding of face shape, eye position, and human geometry — a world-class foundation for understanding variation in head and facial anatomy. Those insights help us design AR devices that offer both visual clarity and all-day wearability, all while feeling like they were custom-made for each user.

Projection + Displays

Photon to Eyeball and Back.

A close up view of a human eye, showing the intricate texture of the iris and opening of the pupil.

We build our vision systems around the way light moves through the eye — and how the eye reacts. Our interdisciplinary teams work hand-in-hand to optimize the interaction among projectors, lenses, waveguides, and the complete AR device design. Every element is tuned for brightness, clarity, and visual consistency. The result? Digital content that blends seamlessly with the physical world, delivered by a device so comfortable it becomes an indispensable part of the user’s daily experience.

Perception-Driven Engineering

Digital Content Made Real.

A woman walking down a hallway wearing an AR device, following digital navigation arrows.

It’s one thing to beam brilliant and crisp images to the eye — but for AR to truly work, those images have to feel at home in the physical world. That’s where perception engineering comes into play. Our vision systems are designed so virtual objects can behave like real ones. Whether it’s a navigation arrow anchored to the sidewalk or a reminder pinned to your door, digital content should feel natural.
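The anchoring idea above — a virtual object pinned to a fixed spot in the world — can be sketched as a coordinate transform: the anchor stays fixed in world coordinates, and each frame it is re-expressed in the viewer's frame. This is an illustrative sketch under simple assumptions (a 3x3 rotation matrix for head orientation; the function name `world_to_view` is hypothetical), not our actual rendering code.

```python
import numpy as np

def world_to_view(anchor_world: np.ndarray,
                  head_position: np.ndarray,
                  head_rotation: np.ndarray) -> np.ndarray:
    """Re-express a world-anchored point in the viewer's coordinate frame.

    head_rotation is a 3x3 rotation matrix giving the head's orientation
    in world coordinates. Because the anchor never moves in world
    coordinates, recomputing this every frame keeps the content pinned
    in place as the head moves.
    """
    return head_rotation.T @ (anchor_world - head_position)

# A navigation arrow anchored 2 m ahead of the world origin:
arrow = np.array([0.0, 0.0, 2.0])

# After the user steps 1 m toward it, the arrow appears 1 m away:
view = world_to_view(arrow,
                     head_position=np.array([0.0, 0.0, 1.0]),
                     head_rotation=np.eye(3))
```

The design choice this illustrates: virtual content is stored in world coordinates and transformed per frame, rather than being drawn at a fixed screen position, which is what makes it "behave like a real object."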

Built for Resilience

Stability in Motion.

We design in the lab, but we build for everyday life. That means accounting for how devices shift, flex, and respond as the user moves through their day — from taking them on and off to accidental drops along the way. Through ongoing stress testing, process refinement, and a deep understanding of how people move, we’re working to ensure digital content stays right where it's expected.