Abstract

We present a deep-learning approach for solving the problem of 2π phase ambiguities in two-dimensional quantitative phase maps of biological cells, using a multi-layer encoder-decoder residual convolutional neural network. We test the trained network, PhUn-Net, on various types of biological cells, captured with various interferometric setups, as well as on simulated phantoms. These tests demonstrate the robustness and generality of the network, even for cells of morphologies or illumination conditions different from those PhUn-Net was trained on. In this paper, for the first time, we make the trained network publicly available in a global format, so that it can be easily deployed on any platform, yielding fast and robust phase unwrapping that requires neither prior knowledge nor complex implementation. We therefore expect our phase unwrapping approach to be widely used, replacing conventional and more time-consuming phase unwrapping algorithms.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
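For context, the ambiguity addressed here can be written in the standard form used in the phase-unwrapping literature (e.g., Refs. [1,13]); the notation below is a generic restatement, not a formula reproduced from the paper:

    \varphi(x,y) = \psi(x,y) + 2\pi\, k(x,y), \qquad k(x,y) \in \mathbb{Z}, \quad \psi(x,y) \in [-\pi, \pi),

where \psi is the measured wrapped phase, \varphi is the true (unwrapped) phase, and two-dimensional phase unwrapping amounts to recovering \varphi, or equivalently the integer cycle map k(x,y), from \psi alone.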

References

  1. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software (Wiley, 1998).
  2. A. Barty, K. A. Nugent, D. Paganin, and A. Roberts, “Quantitative optical phase microscopy,” Opt. Lett. 23(11), 817–819 (1998).
    [Crossref]
  3. A. Roberts, E. Ampem-Lassen, A. Barty, and K. A. Nugent, “Refractive-index profiling of optical fibers with axial symmetry by use of quantitative phase microscopy,” Opt. Lett. 27(23), 2061–2063 (2002).
    [Crossref]
  4. G. Dardikman and N. T. Shaked, “Review on methods of solving the refractive index–thickness coupling problem in digital holographic microscopy of biological cells,” Opt. Commun. 422, 8–16 (2018).
    [Crossref]
  5. G. Popescu, Y. Park, N. Lue, C. Best-Popescu, L. Deflores, R. R. Dasari, M. S. Feld, and K. Badizadegan, “Optical imaging of cell mass and growth dynamics,” Am. J. Physiol. Cell Physiol. 295(2), C538–C544 (2008).
    [Crossref]
  6. M. Haifler, P. Girshovitz, G. Band, G. Dardikman, I. Madjar, and N. T. Shaked, “Interferometric phase microscopy for label-free morphological evaluation of sperm cells,” Fertil. Steril. 104(1), 43–47.e2 (2015).
    [Crossref]
  7. C. Martinez-Torres, B. Laperrousaz, L. Berguiga, E. Boyer-Provera, J. Elezgaray, F. E. Nicolini, V. Maguer-Satta, A. Arneodo, and F. Argoul, “Deciphering the internal complexity of living cells with quantitative phase microscopy: a multiscale approach,” J. Biomed. Opt. 20(9), 096005 (2015).
    [Crossref]
  8. D. Roitshtain, L. Wolbromsky, E. Bal, H. Greenspan, L. L. Satterwhite, and N. T. Shaked, “Quantitative phase microscopy spatial signatures of cancer cells,” Cytometry, Part A 91(5), 482–493 (2017).
    [Crossref]
  9. G. Dardikman, Y. N. Nygate, I. Barnea, N. A. Turko, G. Singh, B. Javidi, and N. T. Shaked, “Integral refractive index imaging of flowing cell nuclei using quantitative phase microscopy combined with fluorescence microscopy,” Biomed. Opt. Express 9(3), 1177–1189 (2018).
    [Crossref]
  10. P. Girshovitz and N. T. Shaked, “Real-time quantitative phase reconstruction in off-axis digital holography using multiplexing,” Opt. Lett. 39(8), 2262–2265 (2014).
    [Crossref]
  11. C. M. Vest, Holographic Interferometry (Wiley, 1979).
  12. N. T. Shaked, “Quantitative phase microscopy of biological samples using a portable interferometer,” Opt. Lett. 37(11), 2016–2018 (2012).
    [Crossref]
  13. K. Itoh, “Analysis of the phase unwrapping algorithm,” Appl. Opt. 21(14), 2470 (1982).
    [Crossref]
  14. U. S. Kamilov, I. N. Papadopoulos, M. H. Shoreh, D. Psaltis, and M. Unser, “Isotropic inverse-problem approach for two-dimensional phase unwrapping,” J. Opt. Soc. Am. A 32(6), 1092–1100 (2015).
    [Crossref]
  15. S. A. Karout, “Two-dimensional phase unwrapping, Chapter 3: Artificial intelligence,” Ph.D. Thesis, Liverpool John Moores University (2007).
  16. R. Cusack, J. M. Huntley, and H. T. Goldrein, “Improved noise-immune phase-unwrapping algorithm,” Appl. Opt. 34(5), 781–789 (1995).
    [Crossref]
  17. S. A. Karout, M. A. Gdeisat, D. R. Burton, and M. J. Lalor, “Two-dimensional phase unwrapping using a hybrid genetic algorithm,” Appl. Opt. 46(5), 730–743 (2007).
    [Crossref]
  18. D. J. Tipper, D. R. Burton, and M. J. Lalor, “A neural network approach to the phase unwrapping problem in fringe analysis,” Nondestr. Test. Eval. 12(6), 391–400 (1996).
    [Crossref]
  19. S. Hamzah, J. D. Pearson, P. J. Lisboa, and C. A. Hobson, “Phase unwrapping in 3-D shape measurement using artificial neural networks,” Proc. IEE Conf. Image Proc. and its Applications, 443 (1997).
  20. T. M. Kreis, R. Biedermann, and W. P. O. Juptner, “Evaluation of holographic interference patterns by artificial networks,” Proc. SPIE 2544, 11–24 (1995).
    [Crossref]
  21. Z. Wang, D. Yan, F. Liu, and A. He, “Phase unwrapping by a random artificial neural network,” Optical Technology in Fluid, Thermal, and Combustion Flow III, SPIE, San Diego, CA, 28–31 (1997).
  22. W. Schwartzkopf, T. E. Milner, J. Ghosh, B. L. Evans, and A. C. Bovik, “Two-dimensional phase unwrapping using neural networks,” In Proceedings of Image Analysis and Interpretation (4th IEEE Southwest Symposium, 2000), pp. 274–277.
  23. C. Tang, W. Lu, S. Chen, Z. Zhang, B. Li, W. Wang, and L. Han, “Denoising by coupled partial differential equations and extracting phase by backpropagation neural networks for electronic speckle pattern interferometry,” Appl. Opt. 46(30), 7475–7484 (2007).
    [Crossref]
  24. F. Sawaf and R. M. Groves, “Statistically guided improvements in speckle phase discontinuity predictions by machine learning systems,” Opt. Eng. 52(10), 101907 (2013).
    [Crossref]
  25. Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature 521(7553), 436–444 (2015).
    [Crossref]
  26. A. Krizhevsky, I. Sutskever, and G. Hinton, “ImageNet classification with deep convolutional neural networks,” Commun. ACM 25(2), 1090–1098 (2012).
  27. V. Badrinarayanan, A. Kendall, and R. Cipolla, “Segnet: A deep convolutional encoder-decoder architecture for image segmentation,” IEEE Trans. Pattern Anal. Mach. Intell. 39(12), 2481–2495 (2017).
    [Crossref]
  28. K. H. Jin, M. T. McCann, and M. Unser, “Deep convolutional neural network for inverse problems in imaging,” IEEE Trans. Image Process. 26(9), 4509–4522 (2017).
    [Crossref]
  29. U. S. Kamilov, I. N. Papadopoulos, M. H. Shoreh, A. Goy, C. Vonesch, M. Unser, and D. Psaltis, “Learning approach to optical tomography,” Optica 2(6), 517–522 (2015).
    [Crossref]
  30. Y. Rivenson, Z. Gorocs, H. Günaydın, Y. Zhang, H. Wang, and A. Ozcan, “Deep learning microscopy,” Optica 4(11), 1437–1443 (2017).
    [Crossref]
  31. A. Sinha, J. Lee, S. Li, and G. Barbastathis, “Solving inverse problems using residual neural networks,” in Digital Holography and Three-Dimensional Imaging (Optical Society of America, 2017), paper W1A.3.
  32. A. Sinha, J. Lee, S. Li, and G. Barbastathis, “Lensless computational imaging through deep learning,” Optica 4(9), 1117–1125 (2017).
    [Crossref]
  33. Y. Rivenson, Y. Zhang, H. Günaydın, D. Teng, and A. Ozcan, “Phase recovery and holographic image reconstruction using deep learning in neural networks,” Light: Sci. Appl. 7(2), 17141 (2018).
    [Crossref]
  34. G. Barbastathis, A. Ozcan, and G. Situ, “On the use of deep learning for computational imaging,” Optica 6(8), 921–943 (2019).
    [Crossref]
  35. Y. Rivenson, Y. Wu, and A. Ozcan, “Deep learning in holography and coherent imaging,” Light: Sci. Appl. 8(1), 1–8 (2019).
    [Crossref]
  36. K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” IEEE Conference on Computer Vision and Pattern Recognition, 770–778 (2016).
  37. G. Dardikman, N. A. Turko, and N. T. Shaked, “Deep learning approaches for unwrapping phase images with steep spatial gradients: a simulation,” in 2018 IEEE International Conference on the Science of Electrical Engineering in Israel (ICSEE) (IEEE, 2018).
  38. M. D. Pritt, “Congruence in least-squares phase unwrapping,” in 1997 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), vol. 2, pp. 875–877 (Singapore, 1997).
  39. G. E. Spoorthi, S. Gorthi, and R. K. S. S. Gorthi, “PhaseNet: A deep convolutional neural network for two-dimensional phase unwrapping,” IEEE Signal Process. Lett. 26(1), 54–58 (2019).
    [Crossref]
  40. K. Wang, Y. Li, Q. Kemao, J. Di, and J. Zhao, “One-step robust deep learning phase unwrapping,” Opt. Express 27(10), 15100–15115 (2019).
    [Crossref]
  41. T. Zhang, S. Jiang, Z. Zhao, K. Dixit, X. Zhou, J. Hou, Y. Zhang, and C. Yan, “Rapid and robust two-dimensional phase unwrapping via deep learning,” Opt. Express 27(16), 23173–23185 (2019).
    [Crossref]
  42. J. Zhang, X. Tian, J. Shao, H. Luo, and R. Liang, “Phase unwrapping in optical metrology via denoised and convolutional segmentation networks,” Opt. Express 27(10), 14903–14912 (2019).
    [Crossref]
  43. L. Hervé, C. Allier, O. Cioni, F. Navarro, M. Menneteau, and S. Morales, “Deep learning for phase unwrapping in lens-free imaging,” in Advances in Microscopic Imaging II, Proc. SPIE 11076 (2019).
  44. W. Yin, Q. Chen, S. Feng, T. Tao, L. Huang, M. Trusiak, A. Asundi, and C. Zuo, “Temporal phase unwrapping using deep learning,” arXiv preprint arXiv:1903.09836 (2019).
  45. G. Dardikman-Yoffe, D. Roitshtain, S. K. Mirsky, N. A. Turko, M. Habaza, and N. T. Shaked, “ONNX file of the deep neural network for 2D phase unwrapping of biological cells in watery medium,” figshare, https://doi.org/10.6084/m9.figshare.9926627 (2020).
  46. S. K. Mirsky, I. Barnea, M. Levi, H. Greenspan, and N. T. Shaked, “Automated analysis of individual sperm cells using stain-free interferometric phase microscopy and machine learning,” Cytometry, Part A 91(9), 893–900 (2017).
    [Crossref]
  47. P. Girshovitz and N. T. Shaked, “Compact and portable low-coherence interferometer with off-axis geometry for quantitative phase microscopy and nanoscopy,” Opt. Express 21(5), 5701–5714 (2013).
    [Crossref]
  48. G. Dardikman and N. T. Shaked, “Is multiplexed off-axis holography for quantitative phase imaging more spatial bandwidth-efficient than on-axis holography?” J. Opt. Soc. Am. A 36(2), A1–A11 (2019).
    [Crossref]
  49. M. A. Herráez, D. R. Burton, M. J. Lalor, and M. A. Gdeisat, “Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path,” Appl. Opt. 41(35), 7437–7444 (2002).
    [Crossref]
  50. D. Roitshtain, N. Turko, B. Javidi, and N. T. Shaked, “Flipping interferometry and its application for quantitative phase microscopy in a micro-channel,” Opt. Lett. 41(10), 2354–2357 (2016).
    [Crossref]
  51. N. A. Turko, P. Jacob Eravuchira, I. Barnea, and N. T. Shaked, “Simultaneous three-wavelength unwrapping using external digital holographic multiplexing module,” Opt. Lett. 43(9), 1943–1946 (2018).
    [Crossref]

Supplementary Material (1)

Name: Dataset 1
Description: ONNX file of a deep neural network for 2D phase unwrapping of biological cells in watery medium. Meant for 512×512-pixel images. Distributed for academic non-commercial use only. If you use this network in your manuscript, please cite Dardikman-Yoffe et al.
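A minimal loading sketch for Dataset 1 is given below, assuming the ONNX Runtime Python bindings and a single-channel 512×512 float input. The file name, tensor layout, and any required preprocessing (e.g., normalization) are assumptions to be checked against the model itself and the paper, not specifications taken from them.

    import numpy as np
    import onnxruntime as ort  # generic ONNX inference engine

    # Load the trained PhUn-Net graph distributed as Dataset 1.
    session = ort.InferenceSession("phun_net.onnx")   # placeholder file name
    input_name = session.get_inputs()[0].name
    output_name = session.get_outputs()[0].name

    # 'wrapped' is a 512x512 float32 wrapped-phase map in radians;
    # the (1, 1, 512, 512) NCHW layout is an assumption and may need adjusting.
    wrapped = np.load("wrapped_phase.npy").astype(np.float32)
    raw_output = session.run([output_name],
                             {input_name: wrapped.reshape(1, 1, 512, 512)})[0]
    unwrapped_raw = raw_output.squeeze()

The raw output can then be made congruent with the wrapped input, as described in the figure captions below (see the sketch after the figure list).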

Figures (9)

Fig. 1. (a) Network architecture. Conv: 2-D convolutional layer with 1×1 stride and padding maintaining constant size in the spatial dimension; Pool: max pooling; deCo: deconvolution (learned upsampling). (b) The structure of a Res4 block, composed of four residual blocks.
Fig. 2. Results for applying PhUn-Net on unseen wrapped phase images of human sperm cells in a watery medium (similar to those in the training set). Each row features a different sample, where the top row presents an empty slide, used to verify the absence of hallucinations. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π. (d) GT, with phase unwrapping computed by applying the sorting by reliability following a noncontinuous path algorithm [49] on the first column. (e) Binary error map between (c) and (d). The red arrows point to error zones. Each colorbar corresponds to the images above it. The accuracy for rows 1-4 is: 99.97%, 100%, 100%, 99.65%, respectively.
Fig. 3. Results for applying PhUn-Net on unseen wrapped phase images of human cancer cells in a watery medium (similar to those in the training set). Each row features a different image, where rows 1-3 are SW-480 cells, rows 4-5 are WM-115 cells, and rows 6-7 are WM-255-4 cells. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π.
Fig. 4. Results for applying PhUn-Net on a cell morphology and illumination coherence different from those used in the training stage: breast cancer cells from the MDA-MB-468 cell line, acquired under coherent light. Each row features a different sample. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π. (d) GT, with phase unwrapping computed by applying the sorting by reliability following a noncontinuous path algorithm [49] on the first column. (e) Binary error map between (c) and (d). The red arrows point to error zones. Each colorbar corresponds to the images above it. The accuracy for rows 1-4 is: 99.97%, 99.90%, 99.95%, 99.99%, respectively.
Fig. 5. Results for applying PhUn-Net on a cell morphology and illumination coherence different from those used in the training stage: yeast cells, acquired under coherent light. Each row features a different sample. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π. (d) GT, with phase unwrapping computed by applying the sorting by reliability following a noncontinuous path algorithm [49] on the first column. (e) Binary error map between (c) and (d). The red arrows point to error zones. Each colorbar corresponds to the images above it. The accuracy for rows 1-3 is: 99.96%, 99.99%, 99.56%, respectively.
Fig. 6. Results for applying PhUn-Net on a cell morphology and illumination coherence different from those used in the training stage: red blood cells, acquired under coherent light. Each row features a different sample. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π. (d) GT, with phase unwrapping computed by applying the sorting by reliability following a noncontinuous path algorithm [49] on the first column. (e) Binary error map between (c) and (d). The red arrows point to error zones. Each colorbar corresponds to the images above it. The accuracy for rows 1-3 is: 100%, 100%, 99.99%, respectively.
Fig. 7. Results for applying PhUn-Net on a simulated phantom with various types of artifacts. The first row presents the phantom in an ideal state, without noise. The second row presents a phase image subjected to diffraction, caused by a microscope objective with NA = 1.34. The third row presents a phase image with an added non-flat illumination surface, in addition to diffraction. The fourth row presents a phase map with all above artifacts, as well as shot (Poisson) noise. The fifth and sixth rows present phase maps with added speckle noise with variance 0.025 and 0.15, respectively, on top of all the other artifacts. (a) PhUn-Net input (wrapped phase). (b) Raw PhUn-Net output. (c) PhUn-Net output made consistent with input by rounding to the closest integer addition of 2π. (d) GT. (e) Binary error map between (c) and (d). Each colorbar corresponds to the images above it. The accuracy for rows 1-6 is: 100%, 100%, 100%, 99.99%, 100%, 99.99%, respectively.
Fig. 8. Accuracy result for applying PhUn-Net on a simulated phantom subjected to 8-bit quantization, diffraction due to a microscope objective with NA = 1.34, non-flat illumination surface, shot noise, and speckle noise with various values of variance.
Fig. 9. Results for applying a neural network trained on data with synthetically added coherent noise to data acquired under partially coherent illumination. Each row features a different sample, corresponding to a sample shown in Figs. 2–3. Rows 1-2: sperm cells, rows 3-4: SW-480 cells, row 5: WM-115 cells, row 6: WM-255-4 cells. (a) Network input (wrapped phase). (b) Raw network output. (c) Network output made consistent with input by rounding to the closest integer addition of 2π. (d) GT, with phase unwrapping computed by applying the sorting by reliability following a noncontinuous path algorithm [49] on the first column. (e) Binary error map between (c) and (d). The red arrows point to error zones. Each colorbar corresponds to the images above it. The accuracy for rows 1-6 is: 99.6%, 100%, 99.95%, 100%, 99.99%, 99.99%, respectively.
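The congruence step described in panels (c) of the captions above (rounding the raw network output to the nearest value that differs from the wrapped input by an integer multiple of 2π, cf. Ref. [38]) and one plausible reading of the per-pixel accuracy reported in the captions can be sketched as follows; this is an illustrative NumPy snippet under those assumptions, not the authors' code:

    import numpy as np

    TWO_PI = 2.0 * np.pi

    def make_congruent(wrapped, raw_output):
        # Shift the network estimate by the integer number of 2*pi cycles
        # that makes it congruent with the wrapped input.
        k = np.round((raw_output - wrapped) / TWO_PI)
        return wrapped + TWO_PI * k

    def accuracy_percent(congruent, ground_truth):
        # Binary error map: pixels whose recovered 2*pi offset differs from
        # the ground-truth unwrapping of the same wrapped map (column (e)).
        error_map = np.abs(congruent - ground_truth) > np.pi
        return 100.0 * (1.0 - error_map.mean())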
