Abstract

Since Canon released the first dual-pixel autofocus sensor in 2013, this technique has spread to many cameras and smartphones. Quad-pixel sensors, in which one microlens covers 2x2 sub-pixels, will be the next development. In this paper we describe the design of such sensors, the associated wave optics simulations, and the resulting performance, in particular the angular response. We then propose a new method that combines wave optics simulations with ray tracing simulations in order to generate physically accurate synthetic images. These images support a co-design approach linking the pixel architecture, the main lens design and the computer vision algorithms.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
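To make the combination of wave optics and ray tracing described in the abstract concrete, the sketch below shows one possible coupling: instead of simply accumulating ray energy per pixel, each ray's contribution is split over the four sub-pixels according to a pre-computed wave-optics (e.g. FDTD) angular response. The function name, array shapes and axes (`split_ray_energy`, `angular_response`, `theta_axis`, `phi_axis`) are illustrative assumptions, not the implementation from the full article.

```python
import numpy as np

def split_ray_energy(theta_deg, phi_deg, energy,
                     angular_response, theta_axis, phi_axis):
    """Split one ray's energy over the four sub-pixels (A, B, C, D) of a
    quad-pixel using a pre-computed wave-optics angular response.

    angular_response has shape (4, len(theta_axis), len(phi_axis)) and gives
    the absorbed-power fraction of each sub-pixel versus incidence angle.
    All names and shapes here are illustrative assumptions.
    """
    i = int(np.abs(theta_axis - theta_deg).argmin())  # nearest-neighbour lookup;
    j = int(np.abs(phi_axis - phi_deg).argmin())      # interpolation would be smoother
    return energy * angular_response[:, i, j]         # one contribution per sub-pixel

# Toy example: a flat response table where each sub-pixel absorbs 25%.
theta_axis = np.linspace(-30.0, 30.0, 61)
phi_axis = np.linspace(0.0, 360.0, 73)
response = np.full((4, theta_axis.size, phi_axis.size), 0.25)
print(split_ray_energy(-12.0, 45.0, 1.0, response, theta_axis, phi_axis))
```

In practice a per-wavelength response table and an interpolated lookup would likely be used, but the nearest-neighbour version keeps the idea visible.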


Figures (15)

Fig. 1. General design of a quad-pixel sensor. (a) and (c) show the view from the side and from the top (not to scale). Dashed orange rectangles show the tungsten isolation described in Section 2.1.3. (b) and (d) show the orientation of the angles $\theta$ and $\varphi$.
Fig. 2. Angular responses for a microlens height of 2.48 µm and a radius of curvature of 3.5 µm (a) and 2.48 µm (b). Plots at the top show $P_{\textrm{abs}}^{\textrm{A}}$, $P_{\textrm{abs}}^{\textrm{B}}$ and their sum. Plots at the bottom show the rejection ratio between the two sub-pixels for $\theta = -30^\circ$ to $30^\circ$. (c) presents the different setups (not to scale) for each combination of RoC and height. The volume between the surface of the microlens and the other layers is filled with the same material.
Fig. 3. Angular response of the 3.5 µm pixel with a 750 nm color filter (a) and the 2.8 µm pixel with a 750 nm color filter and inter-pixel tungsten isolation (b).
Fig. 4. (a) and (b) compare telecentric and non-telecentric lenses. With a non-telecentric lens, the cone of light is not centered on the pixel, which causes vignetting (c).
Fig. 5. Angular response of the CRA simulation. The first row shows the results for $\varphi_{cra} = 0^\circ$; the second row shows the angular responses for $\varphi_{cra} = 22.5^\circ/45^\circ$ and $\theta_{cra} = 20^\circ/30^\circ$. $\varphi_{cra}$ and $\theta_{cra}$ are represented as the angular and radial coordinates, respectively.
Fig. 6. (a) and (b) show the classic ray tracing procedure and its ideal angular response. (c) and (d) show our modified ray tracing procedure and its angular response.
Fig. 7. Description of the experiments. Experiment 1 (a) is a moving disk with a Lambertian texture. Experiment 2 (b) is an infinite plane with a Lambertian texture and angle filtering. In experiment 2, the illuminated area varies with $\theta$, so a normalization step is necessary (c).
Fig. 8. Validation of our method showing the normalized angular response of sub-pixel A in different cases: (a) FDTD angular response, (b) Experiment 1 (moving small disk), (c) Experiment 2 (fixed large texture).
Fig. 9. (a) ML-1 from US8320061 B2, (b) ML-2 from US9316810 B2, (c) abcd texture.
Fig. 10. Examples of synthetic images using a quad-pixel sensor made of 1.75 µm sub-pixels and ML-1 as the main lens. (a) and (b) are the SAI-A of the ISO-12233 and "abcd" test charts. (c) and (d) are the sum of the 4 SAIs of the "abcd" test chart and the San Miguel scene, respectively.
Fig. 11. Impact of sub-pixel size on angular response.
Fig. 12. Sub-aperture images of sub-pixels A and D at the center, using 0.8 µm and 1.75 µm sub-pixels, in classical and diffractive modes. The difference between the classical and diffractive modes of SAI-A is shown on the left.
Fig. 13. Wavelength dependency with 0.8 µm sub-pixels for a centered microlens (a) and a microlens shifted for CRA correction of $\varphi_{cra} = 0^\circ$, $\theta_{cra} = 25^\circ$ (b).
Fig. 14. Main-lens aberrations seen by a quad-pixel sensor. (a) illustrates one aberration of the main lens at one corner (orange and yellow rays); blue rays are unaberrated rays at the center. (b) shows the rays falling on different quad-pixels (colored disks). (c) shows the 4 SAIs and the impact of the rays on the sub-pixels. The position difference $\Delta$ in sensor space translates into the disparity $\rho$ in image space.
Fig. 15. Results of the aberration correction algorithm for different setups, at the center and for $\theta_{cra} = 10^\circ$ / $\varphi_{cra} = 45^\circ$. The three setups are: ML-1 in classic mode, ML-1 in "diffractive" mode, and ML-2 in diffractive mode with simulated manufacturing defects.
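Figures 14 and 15 describe how a main-lens aberration shows up as a disparity $\rho$ between the sub-aperture images, and how it can be compensated before the SAIs are summed. The exact correction algorithm is given in the full article; the fragment below is only a rough sketch of the re-alignment idea, assuming a known integer disparity per SAI. `recombine_sais` and its arguments are hypothetical names, not the authors' code.

```python
import numpy as np

def recombine_sais(sais, disparities):
    """Sum four sub-aperture images after compensating their disparities.

    sais: list of four 2-D arrays (SAI-A..D); disparities: list of four
    (dy, dx) integer offsets, one per SAI. This only illustrates the idea
    sketched in Figs. 14-15; the article's algorithm also estimates the
    disparity itself (e.g. using a sharpness measure).
    """
    corrected = []
    for img, (dy, dx) in zip(sais, disparities):
        # np.roll shifts each SAI back by its disparity; a real implementation
        # would use sub-pixel interpolation and handle image borders explicitly.
        corrected.append(np.roll(img, shift=(-dy, -dx), axis=(0, 1)))
    return np.sum(corrected, axis=0)

# Toy usage with random SAIs and small, assumed disparities.
rng = np.random.default_rng(0)
sais = [rng.random((64, 64)) for _ in range(4)]
out = recombine_sais(sais, [(0, 0), (0, 1), (1, 0), (1, 1)])
print(out.shape)
```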

Tables (2)

Table 1. Manufacturing defects for experiment 3 with ML-2

Table 2. Summary of image quality gains

Equations (3)

$$\textrm{ratio} = \begin{cases} P_{\textrm{abs}}^{\textrm{A}} / P_{\textrm{abs}}^{\textrm{B}} & \text{if } \theta < 0 \\ P_{\textrm{abs}}^{\textrm{B}} / P_{\textrm{abs}}^{\textrm{A}} & \text{if } \theta \geq 0 \end{cases} \tag{1}$$
$$\theta_L = \arctan\left(\frac{\text{sub-pixel size}}{\text{$\mu$lens focal length}}\right) \tag{2}$$
$$N = \underbrace{2\,\Delta\varphi}_{\text{(a)}} \times \underbrace{f_{\mu\textrm{Lens}}^{2}\left[\tan^{2}(\theta + \Delta\theta) - \tan^{2}(\theta - \Delta\theta)\right]}_{\text{(b)}} \tag{3}$$
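For convenience, the three equations can be evaluated numerically as in the sketch below. The function names and the sample values are assumptions chosen only to exercise the formulas; they are not parameters from the paper.

```python
import numpy as np

def rejection_ratio(p_abs_a, p_abs_b, theta_deg):
    """Eq. (1): rejection ratio between sub-pixels A and B."""
    return p_abs_a / p_abs_b if theta_deg < 0 else p_abs_b / p_abs_a

def theta_l(subpixel_size_um, ulens_focal_length_um):
    """Eq. (2): limit angle theta_L in degrees."""
    return np.degrees(np.arctan(subpixel_size_um / ulens_focal_length_um))

def n_samples(delta_phi_rad, f_ulens_um, theta_deg, delta_theta_deg):
    """Eq. (3): product of the azimuthal term (a) and the radial term (b)."""
    t_plus = np.tan(np.radians(theta_deg + delta_theta_deg))
    t_minus = np.tan(np.radians(theta_deg - delta_theta_deg))
    return 2.0 * delta_phi_rad * f_ulens_um**2 * (t_plus**2 - t_minus**2)

print(rejection_ratio(0.8, 0.2, theta_deg=-10.0))          # 4.0
print(theta_l(0.8, 2.5))                                   # about 17.7 degrees
print(n_samples(0.1, 2.5, theta_deg=10.0, delta_theta_deg=1.0))
```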
