User, Metric, and Computational Evaluation of Foveated Rendering Methods
by
Nicholas T. Swafford, Jose A. Iglesias-Guitian, Charalampos Koniaris, Bochang Moon, Darren Coskery, Kenny Mitchell
To appear in Proceedings of the ACM Symposium on Applied Perception 2016
Abstract
Perceptually lossless foveated rendering methods exploit the limits of human peripheral vision by rendering at reduced quality away from the point of gaze, lowering computational cost while still maintaining the user's perception of a full-quality render. We consider three foveated rendering methods and propose practical rules of thumb for each to achieve significant performance gains in real-time rendering frameworks. Additionally, we contribute a new metric for perceptual foveated rendering quality that builds on HDR-VDP2 and, unlike traditional metrics, accounts for the loss of fidelity in peripheral vision by lowering the model's contrast sensitivity with visual eccentricity according to the Cortical Magnification Factor (CMF). The new metric is parameterized on user-test data generated in this study. Finally, we run our metric on a novel foveated rendering method for real-time immersive 360° content with motion parallax.
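The eccentricity-dependent attenuation described above can be sketched in a few lines. The snippet below is an illustrative assumption, not the paper's implementation: it uses the common hyperbolic approximation M(E) = M0 / (E + E2) for the Cortical Magnification Factor (with constants in the spirit of Horton and Hoyt's estimate), and normalizes it by the foveal value to obtain a relative contrast-sensitivity scale. The function names, constants, and the exact coupling to HDR-VDP2 are hypothetical; the paper fits its own parameters to user-test data.

```python
def cortical_magnification(eccentricity_deg: float,
                           m0: float = 17.3,
                           e2: float = 0.75) -> float:
    """Approximate linear cortical magnification (mm of cortex per degree
    of visual angle) using M(E) = M0 / (E + E2). Constants are a common
    literature estimate, used here only for illustration."""
    return m0 / (eccentricity_deg + e2)

def sensitivity_scale(eccentricity_deg: float, e2: float = 0.75) -> float:
    """Relative contrast-sensitivity weight in (0, 1]: the CMF at the given
    eccentricity normalized by its foveal (0 deg) value. A weight like this
    could attenuate a visibility model's contrast sensitivity with gaze
    eccentricity, as the metric in the paper does via the CMF."""
    return (cortical_magnification(eccentricity_deg, e2=e2)
            / cortical_magnification(0.0, e2=e2))

# The weight is 1.0 at the fovea and falls off monotonically.
for ecc in (0.0, 5.0, 20.0, 60.0):
    print(f"{ecc:5.1f} deg -> scale {sensitivity_scale(ecc):.3f}")
```

With these constants, sensitivity drops steeply within the first few degrees of eccentricity, which is what lets a foveated renderer shed most of its work outside the foveal region without a visible quality loss.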
Contents
Main Report (pdf, 4.54 MB)
Video (mov, 179 MB)