Abstract
As head-mounted displays (HMDs) commonly present a single, fixed-focus display plane, a conflict arises between the vergence and accommodation responses of the viewer. Multifocal HMDs have long been investigated as a potential solution, in which multiple image planes span the viewer's accommodation range. Such displays require a scene decomposition algorithm to distribute the depiction of objects across the image planes, and previous work has shown that simple decompositions can be computed in real time. However, recent optimal decompositions further improve image quality, particularly for complex content. Such decompositions are more computationally involved and likely require better alignment of the image planes with the viewer's eyes, both of which are potential barriers to practical applications.
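As context for the decomposition step, the simple real-time rule referenced above splits each pixel's intensity between the two image planes nearest its depth in diopters. A minimal sketch of that linear blending rule (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def linear_blend_decomposition(image, depth_d, plane_d):
    """Split each pixel's intensity between the two image planes
    nearest to it in diopters (the simple real-time rule, not the
    optimal decomposition discussed below)."""
    plane_d = np.asarray(plane_d, dtype=float)        # ascending, in diopters
    h, w = depth_d.shape
    planes = np.zeros((len(plane_d), h, w))
    d = np.clip(depth_d, plane_d[0], plane_d[-1])
    lo = np.clip(np.searchsorted(plane_d, d, side="right") - 1,
                 0, len(plane_d) - 2)                 # index of the nearer plane
    w_hi = (d - plane_d[lo]) / (plane_d[lo + 1] - plane_d[lo])
    rows, cols = np.indices((h, w))
    planes[lo, rows, cols] += (1.0 - w_hi) * image    # nearer plane's share
    planes[lo + 1, rows, cols] += w_hi * image        # farther plane's share
    return planes
```

Because the two weights sum to one, the plane stack conserves intensity (`planes.sum(axis=0)` reproduces the input image), which is the property that makes the blend appear continuous as the eye accommodates between planes.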
Our goal is to enable interactive optimal decomposition algorithms capable of driving a vergence- and accommodation-tracked multifocal testbed. Ultimately, such a testbed is necessary to establish the requirements for the practical use of multifocal displays, in terms of computational demand and hardware accuracy. To this end, we present an efficient algorithm for optimal decompositions, incorporating insights from vision science. Our method is amenable to GPU implementations and achieves a three-orders-of-magnitude speedup over previous work. We further show that eye tracking can be used for adequate plane alignment with efficient image-based deformations, adjusting for both eye rotation and head movement relative to the display. We also build the first binocular multifocal testbed with integrated eye tracking and accommodation measurement, paving the way to establish practical eye tracking and rendering requirements for this promising class of display. Finally, we report preliminary results from a pilot user study utilizing our testbed, investigating the accommodation response of users to dynamic stimuli presented under optimal decomposition.
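An optimal decomposition can be framed as a least-squares problem: choose non-negative plane images whose defocus-blurred sum best matches the target retinal image at every focus distance. A toy 1-D sketch using projected gradient descent (the Gaussian defocus model, step size, and iteration count are illustrative assumptions, not the paper's GPU solver):

```python
import numpy as np

def defocus_kernel(sigma, radius=8):
    """1-D Gaussian stand-in for the defocus point-spread function."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / max(sigma, 1e-6)) ** 2)
    return k / k.sum()

def optimal_decomposition_1d(targets, plane_d, focus_d,
                             blur_scale=2.0, iters=200, lr=0.5):
    """Find plane images in [0, 1] whose blurred sum approximates the
    target retinal image at each focus distance (projected gradient
    descent on the least-squares objective)."""
    n_planes, n = len(plane_d), targets.shape[1]
    # Kernel for plane j viewed while accommodated at focus f:
    # blur grows with the dioptric gap between plane and focus.
    K = [[defocus_kernel(blur_scale * abs(p - f)) for p in plane_d]
         for f in focus_d]
    planes = np.full((n_planes, n), targets.mean() / n_planes)
    for _ in range(iters):
        grad = np.zeros_like(planes)
        for f in range(len(focus_d)):
            retinal = sum(np.convolve(planes[j], K[f][j], mode="same")
                          for j in range(n_planes))
            resid = retinal - targets[f]
            for j in range(n_planes):
                # Kernels are symmetric, so correlation == convolution.
                grad[j] += np.convolve(resid, K[f][j], mode="same")
        planes = np.clip(planes - lr * grad / len(focus_d), 0.0, 1.0)
    return planes
```

Each gradient step reduces to a handful of convolutions and pointwise updates per plane, which is one reason this class of solver maps well onto GPU implementations.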