Abstract

This paper introduces a foveation-based driving scheme that extends line times for high-resolution, wide-viewing-angle displays in virtual reality applications. In the proposed scheme, the panel receives an image whose vertical resolution is substantially reduced according to the human visual system, based on the distance from the foveation point; this also dramatically reduces the data bandwidth. The full-resolution foveated-rendering image is then recovered on the panel by means of multi-output driving shift registers that apply the same gate pulse to multiple lines. These multiple lines are charged simultaneously with the same line data, so that multiple resolutions are presented automatically on a high-resolution display. It is verified that the effective number of lines is reduced to $30.3\%$ and $21.0\%$ for 4,800 $\times$ 4,800 and 9,600 $\times$ 9,600 resolutions, respectively. Consequently, line times can be extended to $330.0\%$ and $476.2\%$. In addition, a subjective evaluation confirms that the foveation-based driving scheme is applicable to high-resolution, wide-viewing-angle displays without any perceivable degradation of image quality.
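The reported line-time extensions follow directly from the reduced effective line counts: with a fixed frame time, each line's charging time scales with the reciprocal of the effective line ratio. A minimal sketch (the helper name is ours, not from the paper; the ratios are those stated above):

```python
# Line-time extension implied by a reduced effective line count,
# assuming a fixed frame time shared equally among effective lines.

def line_time_extension(effective_line_ratio):
    """Fewer effective lines per frame means proportionally more
    charging time per line."""
    return 1.0 / effective_line_ratio

# 4,800 x 4,800 panel: effective lines reduced to 30.3% of the original
print(f"{line_time_extension(0.303):.1%}")  # -> 330.0%

# 9,600 x 9,600 panel: effective lines reduced to 21.0% of the original
print(f"{line_time_extension(0.210):.1%}")  # -> 476.2%
```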

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement


References


  1. J. Steuer, “Defining virtual reality: Dimensions determining telepresence,” J. Commun. 42(4), 73–93 (1992).
    [Crossref]
  2. M. Slater and M. V. Sanchez-Vives, “Enhancing our lives with immersive virtual reality,” Front. Robot. AI 3, 74 (2016).
    [Crossref]
  3. E. Bastug, M. Bennis, M. Medard, and M. Debbah, “Toward interconnected virtual reality: Opportunities, challenges, and enablers,” IEEE Commun. Mag. 55(6), 110–117 (2017).
    [Crossref]
  4. J. Han and H.-J. Suk, “Do users perceive the same image differently? Comparison of OLED and LCD in mobile HMDs and smartphones,” J. Inf. Disp. 20(1), 31–38 (2019).
    [Crossref]
  5. B. Bastani, E. Turner, C. Vieri, H. Jiang, B. Funt, and N. Balram, “Foveated pipeline for AR/VR head-mounted displays,” Inf. Disp. 33(6), 14–35 (2017).
    [Crossref]
  6. B. Young, “OLED displays and the immersive experience,” Inf. Disp. 34(2), 16–36 (2018).
    [Crossref]
  7. A. K. Bhowmik, “Advances in virtual, augmented, and mixed reality technologies,” Inf. Disp. 34(5), 18–21 (2018).
    [Crossref]
  8. H. J. Jang, J. Y. Lee, J. Kwak, D. Lee, J.-H. Park, B. Lee, and Y. Y. Noh, “Progress of display performances: AR, VR, QLED, OLED, and TFT,” J. Inf. Disp. 20(1), 1–8 (2019).
    [Crossref]
  9. C. Vieri, G. Lee, N. Balram, S. H. Jung, J. Y. Yang, S. Y. Yoon, and I. B. Kang, “An 18 megapixel 4.3′′ 1443 ppi 120 Hz OLED display for wide field of view high acuity head mounted displays,” J. Soc. Inf. Disp. 26(5), 314–324 (2018).
    [Crossref]
  10. A. Patney, M. Salvi, J. Kim, A. Kaplanyan, C. Wyman, N. Benty, D. Luebke, and A. Lefohn, “Towards foveated rendering for gaze-tracked virtual reality,” ACM Trans. Graph. 35(6), 1–12 (2016).
    [Crossref]
  11. A. Patney, J. Kim, M. Salvi, A. Kaplanyan, C. Wyman, N. Benty, A. Lefohn, and D. Luebke, “Perceptually-based foveated virtual reality,” in SIGGRAPH ’16 ACM SIGGRAPH 2016 Emerging Technologies (ACM, 2016), p. 17
  12. R. Albert, A. Patney, D. Luebke, and J. Kim, “Latency requirements for foveated rendering in virtual reality,” ACM Trans. Appl. Percept. 14(4), 1–13 (2017).
    [Crossref]
  13. W. S. Geisler and J. S. Perry, “A real-time foveated multiresolution system for low-bandwidth video communication,” Proc. SPIE 3299, 294–305 (1998).
    [Crossref]
  14. Z. Wang and A. C. Bovik, “Embedded foveation image coding,” IEEE Trans. Image Process. 10(10), 1397–1410 (2001).
    [Crossref]
  15. Z. Wang, L. Lu, and A. C. Bovik, “Foveation scalable video coding with automatic fixation selection,” IEEE Trans. Image Process. 12(2), 243–254 (2003).
    [Crossref]
  16. Y. Kim, S.-J. Park, and H. Nam, “Node-sharing low-temperature poly silicon TFT shift register without bootstrapping degradation for narrow bezel displays,” Electron. Lett. 54(20), 1162–1164 (2018).
    [Crossref]
  17. R. C. Gonzalez and R. E. Woods, Digital Image Processing (John Wiley & Sons, Ltd, 2010).
  18. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13(4), 600–612 (2004).
    [Crossref]
  19. R. L. Myers, Display Interfaces: Fundamentals and Standards (John Wiley & Sons, Ltd, 2002).




Figures (15)

Fig. 1.
Fig. 1. Viewing geometry for parameters of viewing distance (${d}_{view}$), display height (vertical resolution, $V$), eccentricity ($e$), and the distance from a foveation point ($d_{pixel}$)
Fig. 2.
Fig. 2. Cutoff frequency ($f_c$) plot over the distance from a foveation point ($d_{pixel}$). $f_{HVS}$ in red is the maximum perceivable frequency of HVS, $f_{RES}$ in gray is the half Nyquist frequency of a given display, and $f_c$ in black is given as the minimum frequency among them.
Fig. 3.
Fig. 3. Spatial frequency of the reduced resolution driving ($f_{RR}$). The half Nyquist frequencies ($f_{RES/2}$, $f_{RES/4}$, $f_{RES/8}$) of resolutions reduced by 1/2, 1/4, and 1/8 are presented as orange, green, and blue dotted lines, respectively. The overlapped regions of $f_{RR}$ and the half Nyquist frequencies are displayed at the corresponding reduced resolutions. Because the overlapped regions lie above $f_{HVS}$, $f_c$ is unchanged.
Fig. 4.
Fig. 4. Block diagram of the proposed foveation-based reduced resolution driving scheme
Fig. 5.
Fig. 5. Vertical resolution reduction scheme. White boxes represent pixels. The number of pixels in a line is maintained, but the number of lines is reduced by 1/2, 1/4, or 1/8 by averaging pixel values in the column direction, according to the corresponding reduced resolution.
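The reduction in Fig. 5 can be sketched as row-group averaging: the pixel count per line is kept, while groups of 2, 4, or 8 consecutive lines are averaged in the column direction. Function and variable names below are ours, not from the paper.

```python
import numpy as np

def reduce_lines(region, factor):
    """Average every `factor` consecutive rows of a (rows, cols) region,
    keeping the number of columns (pixels per line) unchanged."""
    rows, cols = region.shape
    assert rows % factor == 0, "region height must divide evenly"
    return region.reshape(rows // factor, factor, cols).mean(axis=1)

region = np.arange(16, dtype=float).reshape(8, 2)  # 8 lines, 2 pixels each
half = reduce_lines(region, 2)                     # -> 4 lines
print(half.shape)  # (4, 2)
```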
Fig. 6.
Fig. 6. Proposed 8-output shift register [16]. STP and RSP are start and reset pulses, CLK1 to CLK8 are clock signals, G0 to G7 are 8 output signals, and VGH and VGL are positive and negative supply voltages.
Fig. 7.
Fig. 7. Gate driver circuit. (a) Configuration for 4 stages of $f_{RES}$, $f_{RES/2}$, $f_{RES/4}$, and $f_{RES/8}$ regions. (b) Timing diagram for clock and output signals. It should be noted that output signals of 4 stages are presented in the same rows.
Fig. 8.
Fig. 8. Foveated-rendering image construction at the original resolution on a panel by multi-output driving. One line of data from the reduced-resolution image is transferred to 2, 4, or 8 panel lines simultaneously according to the reduced resolution, restoring the original resolution.
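Electrically, the multi-output driving in Fig. 8 charges 2, 4, or 8 panel lines with the same line data, which is equivalent to row replication of the reduced-resolution image. A minimal sketch (names are ours, not from the paper):

```python
import numpy as np

def replicate_lines(reduced_region, factor):
    """Repeat each data line `factor` times, modeling multiple panel
    lines being charged simultaneously with the same line data."""
    return np.repeat(reduced_region, factor, axis=0)

data = np.array([[10.0, 20.0], [30.0, 40.0]])  # 2 lines of reduced data
panel = replicate_lines(data, 4)               # charged into 8 panel lines
print(panel.shape)  # (8, 2)
```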
Fig. 9.
Fig. 9. Comparison of input and displayed images in the proposed foveation-based driving scheme. The foveation point is marked at the center with a red plus symbol, and $f_{RES/2}$, $f_{RES/4}$, and $f_{RES/8}$ regions are presented in red, orange, and gray, respectively.
Fig. 10.
Fig. 10. HVS simulation results for original and proposed foveated images.
Fig. 11.
Fig. 11. Simulation results of multi-output shift registers with the same configuration as Fig. 7 for (a) $f_{RES}$, (b) $f_{RES/2}$, (c) $f_{RES/4}$, and (d) $f_{RES/8}$ regions.
Fig. 12.
Fig. 12. Experimental environment for subjective evaluation.
Fig. 13.
Fig. 13. 25 test images at a resolution of 2,160 $\times$ 3,840. Six images are computer graphics and 19 are real pictures.
Fig. 14.
Fig. 14. Subjective evaluation procedure. The original and reduced-resolution images are shown consecutively for 5 seconds each, and then the visibility of their difference is evaluated.
Fig. 15.
Fig. 15. Evaluation scores for 25 test images and 15 subjects. Scores of 0 and 1 denote detection and no detection of image differences, respectively. The average score for each test image is higher than 0.6, and the average score over all images is 0.843.

Tables (1)


Table 1. Regions and reduced vertical resolutions for the displays of 4,800 and 9,600 lines.

Equations (8)


$$d_{pixel} = \sqrt{(x - x_1)^2 + (y - y_1)^2} \quad [\mathrm{pixels}]$$

$$e = \tan^{-1}\!\left(\frac{d_{pixel}}{d_{view}}\right) = \tan^{-1}\!\left(\frac{d_{pixel}}{\beta V}\right) \quad [\mathrm{degrees}]$$

$$f_{HVS}(e) = \frac{e_2 \ln(1/CT_0)}{\alpha\,(e_2 + e)} \quad [\mathrm{cycles/degree}]$$

$$f_{RES}(e) = \frac{\pi \beta V}{360} \cos^2\!\left(\frac{\pi e}{180}\right) \quad [\mathrm{cycles/degree}]$$

$$f_c = \min(f_{HVS}, f_{RES})$$

$$f_{RES/2^n}(e) = \frac{\pi \beta V}{360} \cos^2\!\left(\frac{\pi e}{180}\right) \times 2^{-n}, \quad n = 1, 2, 3$$

$$LPF(x, x_1) = \begin{cases} \dfrac{1}{N} \operatorname{sinc}\!\left(\dfrac{f_c}{f_R}(x - x_1)\right) & \text{if } |x - x_1| \le \dfrac{f_R}{f_c} \\ 0 & \text{elsewhere} \end{cases}$$

$$HVS(x, y) = LPF(x, x_1)\, LPF(y, y_1)$$
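The first five equations can be evaluated numerically to obtain the per-pixel cutoff frequency. In the sketch below, the model constants `alpha`, `e2`, and `CT0` follow the Geisler-Perry foveation model cited as Ref. 13, but the specific values are illustrative assumptions, as is `beta` (viewing distance in units of display height); function names are ours.

```python
import numpy as np

alpha, e2, CT0 = 0.106, 2.3, 1.0 / 64.0  # assumed Geisler-Perry constants
beta, V = 1.0, 4800                       # assumed geometry: d_view = beta * V

def eccentricity(d_pixel):
    """Eccentricity in degrees for a pixel at distance d_pixel
    from the foveation point (second equation above)."""
    return np.degrees(np.arctan(d_pixel / (beta * V)))

def f_hvs(e):
    """Maximum perceivable HVS frequency at eccentricity e (third equation)."""
    return e2 * np.log(1.0 / CT0) / (alpha * (e2 + e))

def f_res(e):
    """Half Nyquist frequency of the display at eccentricity e (fourth equation)."""
    return np.pi * beta * V / 360.0 * np.cos(np.radians(e)) ** 2

def f_cutoff(d_pixel):
    """Cutoff frequency: the minimum of the two limits (fifth equation)."""
    e = eccentricity(d_pixel)
    return min(f_hvs(e), f_res(e))

for d in (0, 1000, 4000):
    print(f"d={d:>4} px  e={eccentricity(d):5.1f} deg  f_c={f_cutoff(d):5.1f} cpd")
```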
