Feb 28, 2019

Optimizations in VR: Foveated Rendering

The Performance Problem in VR

Standard games require around 60fps to feel smooth. VR demands significantly more — approximately 90fps — to prevent disorientation and nausea. Drop below that, and the user feels it immediately. This challenge intensifies on mobile and standalone headsets like the Oculus Go, where compute budgets are severely constrained.
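The gap between those targets is easy to quantify as a per-frame time budget. A minimal back-of-the-envelope sketch (the specific numbers are just the frame rates above; the halving for stereo is a simplification that ignores shared work between eyes):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

flat_budget = frame_budget_ms(60)   # ~16.7 ms for a standard game
vr_budget = frame_budget_ms(90)     # ~11.1 ms for VR

# VR also renders the scene twice, once per eye, so the effective
# per-eye budget is roughly half again (~5.6 ms), before accounting
# for lens-distortion correction and compositor overhead.
per_eye_budget = vr_budget / 2
```

Roughly a third of the frame time of a flat game, per eye, on hardware that (for standalone headsets) is far weaker than a desktop GPU.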

Foveated rendering is a GPU optimization strategy that addresses this by rendering only the part of the image the user is actually looking at in full resolution, while reducing quality in the periphery.

Fixed Foveated Rendering

The simpler implementation. The center of the screen is rendered at full resolution; regions toward the edges are rendered at progressively lower resolutions. This mimics human peripheral vision — our eyes have high acuity at the fovea (center) and much lower resolution toward the edges of the visual field.

The result is a multi-resolution framebuffer: stepped resolution changes moving outward from the center. The GPU does significantly less fragment shading work at the edges, cutting the required pixel throughput without visible quality loss in the center of focus.

This works well as a static optimization — no eye tracking hardware required.
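The stepped falloff can be sketched as a function from screen position to a resolution scale. This is illustrative only: the ring boundaries and scale factors below are made-up values, not taken from any particular SDK, which would typically express the same idea as a fragment density map or tile-level shading rate.

```python
def resolution_scale(x: float, y: float) -> float:
    """Fraction of full resolution used at normalized screen
    coordinates (x, y), where (0.5, 0.5) is the screen center.
    Ring radii and scales are illustrative assumptions."""
    # Distance from the fixed screen center, in normalized units.
    dist = ((x - 0.5) ** 2 + (y - 0.5) ** 2) ** 0.5
    if dist < 0.25:    # inner region: full resolution
        return 1.0
    elif dist < 0.40:  # middle ring: half resolution
        return 0.5
    else:              # periphery: quarter resolution
        return 0.25
```

With a scheme like this, a fragment near a screen corner is shaded at a quarter of the density of one at the center, which is where the savings come from.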

Foveated Rendering with Eye Tracking

The more advanced approach. Rather than hardcoding the high-resolution region at screen center, eye tracking hardware detects where the user is actually looking and shifts the high-resolution region to match their gaze point in real time.

FOVE, a Tokyo-based startup, pioneered eye tracking in VR headsets in 2014. Since then, HTC, Meta/Oculus, and others have integrated eye tracking into higher-end devices. The system dynamically increases resolution around the gaze point and reduces it elsewhere — providing both better visual quality where it matters and significantly better performance everywhere else.

The quality improvement over fixed foveated rendering is meaningful: users can look anywhere in the scene and the high-resolution region follows, rather than being locked to the screen center.

Applications Beyond VR

Eye tracking enables far more than just rendering optimization. In both VR and non-VR contexts, knowing exactly where a user is looking enables: more natural UI interaction, accessibility tools, attention analytics, gaze-driven input, and social presence features (eye contact in avatars). The rendering optimization is just the most immediately quantifiable benefit.