Abstract

Head-mounted displays enable immersive virtual reality experiences and are expected to appear in a growing number of recreational and professional applications. In this context, recent years have witnessed significant advances in rendering techniques based on physical models of lighting and shading. The aim of this paper is to assess the visual fidelity of real objects that are captured with a 3D scanner, rendered on a personal computer, and displayed in a virtual reality device. We compared forward versus deferred real-time rendering under two different illumination conditions using five artwork replicas. The survey contains seven items for each artwork (color, shading, texture, definition, geometry, chromatic aberration, and pixelation) plus an additional item on overall realism. The results confirm recent advances in virtual reality, showing considerable fidelity of the generated images to their real-world counterparts, with mean ratings close to 4 on a 5-point perceptual scale. They also show that the sensation of realism correlates strongly with the fidelity of color reproduction, material texture, and definition of the artwork replicas. Moreover, statistically significant differences between the two rendering modes are found, with higher perceived realism in the deferred rendering mode.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
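The analysis summarized in the abstract rests on mean opinion scores (MOS) averaged over observers and a significance test between the two rendering modes. A minimal sketch of that kind of analysis, using pure-Python statistics; the ratings, group size, and score values below are invented for illustration and are not the study's data:

```python
import statistics

# Hypothetical MOS ratings (1-5) from 20 observers for one artwork,
# forward vs. deferred rendering. Illustrative values only.
forward  = [4, 3, 4, 4, 3, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4]
deferred = [4, 4, 5, 4, 4, 4, 4, 5, 4, 4, 4, 5, 4, 4, 4, 4, 5, 4, 4, 4]

def mos(scores):
    """Mean opinion score: the average of the 5-point ratings."""
    return statistics.mean(scores)

def paired_t(a, b):
    """Paired t statistic on per-observer rating differences (df = n - 1)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample standard deviation
    return mean_d / (sd_d / n ** 0.5)

print("MOS forward:", mos(forward))
print("MOS deferred:", mos(deferred))
print("paired t:", paired_t(deferred, forward))
```

With 20 observers (df = 19), a paired t statistic above roughly 2.09 indicates a difference significant at the 0.05 level, matching the kind of comparison the paper reports between rendering modes.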




Figures (3)

Fig. 1. Real pictures of the five artwork replicas used in this work in the LED light booth.
Fig. 2. Experimental setup, including a real light booth (left) and an HMD in which the observer can see the virtual scene, which is also displayed on an external monitor (right).
Fig. 3. Simulation of a real artwork illuminated by a filtered halogen lamp at an incidence angle of 45° with respect to the observer's point of view at the light booth (0°). (a) Metal warrior. (b) Wood fish. (c) Roman mosaic. (d) Greek amphora. (e) Roman stele.

Tables (4)

Table 1. Characteristics of Main Virtual Reality Devices

Table 2. Mean Opinion Score Scale Used in this Work

Table 3. Average Values and Standard Deviation of MOS Scores Calculated over 20 Observers for Eight Items and Four Setups

Table 4. Correlation Coefficients between Perceived Realism, Remaining Visual Properties, and Linear Model
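Table 4 reports correlation coefficients between perceived realism and the other visual properties, together with a linear model. A small sketch of that computation: Pearson correlation between realism ratings and one visual-property item, plus a one-variable least-squares fit. The ratings below are invented for illustration; they are not the study's measurements:

```python
import math

# Hypothetical per-observer ratings (1-5) for one artwork; illustrative only.
color_fidelity = [3, 4, 4, 5, 3, 4, 5, 4]
realism        = [3, 4, 5, 5, 3, 4, 4, 4]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length rating lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                           * sum((b - my) ** 2 for b in y))

def linear_fit(x, y):
    """Least-squares fit of y ~ slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

r = pearson(color_fidelity, realism)
slope, intercept = linear_fit(color_fidelity, realism)
print("r:", r)
print("realism ~ %.2f * color + %.2f" % (slope, intercept))
```

A high r for an item (as the paper reports for color, texture, and definition) means that item's MOS tracks the overall realism rating across setups, and the linear model quantifies that relationship.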
