Abstract

A video see-through augmented reality three-dimensional display method is presented. The system, designed for dense-viewpoint augmented reality presentation, naturally fuses the light fields of the real scene and the virtual model. Benefiting from the rich information inherent in the light field, depth perception and occlusion are handled without any a priori depth information about the real scene. A series of processing steps is proposed to optimize the augmented reality performance. Experimental results show that the reconstructed fused 3D light field is well presented on the autostereoscopic display. The virtual model is naturally integrated into the real scene, with consistency between binocular parallax and monocular depth cues.

© 2016 Optical Society of America






Figures (12)

Fig. 1 3D light field.
Fig. 2 Basic concept of light field fusion.
Fig. 3 Framework of light field fusion.
Fig. 4 Local depth computation: (a) EPI, (b) region with significant texture in the EPI, (c) directional statistics, (d) histogram of directional scores, (e) local depth map.
Fig. 5 Global depth statistics: (a) principle of global depth statistics, (b) depth line, (c) global depth line after complement.
Fig. 6 Light field fusion in the EPI layer: (a) EPI of the real light field, (b) EPI of the virtual light field, (c) EPI of the fused light field.
Fig. 7 (a) Middle perspective of the scene, (b) corresponding vertical gradient, (c) EPIs, (d) front EPI-strip extraction.
Fig. 8 Results of consistency refinement.
Fig. 9 Edge-burr elimination: (a) details of the EPI-strip edge, where red points represent the actual edge of each multi-perspective image and the statistical regression result is the depth line, (b) fusion effect without edge-burr elimination, (c) fusion effect with edge-burr elimination.
Fig. 10 3D registration and virtual model generation: (a) and (b) the rightmost and leftmost real-scene perspectives, respectively, (c) the virtual scene built in OSG, (d) and (e) the rightmost and leftmost virtual-scene perspectives, respectively.
Fig. 11 Autostereoscopic augmented reality display: (a) perspectives of the light field from left to right, (b) zoomed-in views of each perspective.
Fig. 12 A virtual box randomly placed on the ground with correct occlusion.

Equations (9)

Equations on this page are rendered with MathJax.

\[ s = (x - x_p)\tan\theta + s_p \]
\[ \Phi(P,\theta) = \left\{ P\!\left(\frac{s - s_p}{\tan\theta} + x_p,\; s\right) \;\middle|\; s = 1,\dots,n \right\} \]
\[ D_s = \sum_{P(x,s)\in\Phi} T\!\left[\mathrm{Dis}_{RGB}\big(P(x,s),\, P(x_p,s_p)\big)\right] \]
\[ T(x) = \begin{cases} 1 - |x/t|^2, & |x/t| \le 1 \\ 0, & |x/t| > 1 \end{cases} \]
\[ s = (x - x_p)\tan\theta_p + s_p \]
\[ \sigma = \frac{Q}{N^2}, \qquad Q = \sum_{q \in \{q \mid W(q)=1\}} \left( x_q - \frac{s_q - s_p}{\tan\theta_p} - x_p \right)^{\!2}, \qquad N = \sum_{q \in \{q \mid W(q)=1\}} W(q) \]
\[ W(q) = \begin{cases} 1, & \left| x_q - \dfrac{s_q - s_p}{\tan\theta_p} - x_p \right| \le w \ \text{and}\ \theta_q = \theta_p \\ 0, & \text{else} \end{cases} \]
\[ \rho = \frac{N}{s_{\max} - s_{\min}}, \qquad s_{\max} = \max_{q \in \{q \mid W(q)=1\}} s_q, \qquad s_{\min} = \min_{q \in \{q \mid W(q)=1\}} s_q \]
\[ s = (x - x_p)\tan\theta_p + s_p, \qquad s_{\min} \le s \le s_{\max} \]
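For concreteness, the first four equations (the EPI line, its pixel set Φ(P, θ), the directional score, and the truncated weighting kernel) can be sketched in Python. This is a minimal illustration, not the authors' implementation: the function names, the rounding of x to integer pixel columns, and the kernel width t = 30 are assumptions made for the example.

```python
import numpy as np

def epi_line(x_p, s_p, theta, n):
    """Pixel set Phi(P, theta): solving s = (x - x_p) tan(theta) + s_p
    for x gives, for each EPI row s = 0..n-1 (assumes tan(theta) != 0),
    x = (s - s_p) / tan(theta) + x_p."""
    s = np.arange(n)
    x = (s - s_p) / np.tan(theta) + x_p
    return x, s

def kernel_T(d, t):
    """Truncated quadratic kernel T: 1 - |d/t|^2 if |d/t| <= 1, else 0."""
    r = np.abs(d / t)
    return np.where(r <= 1.0, 1.0 - r ** 2, 0.0)

def directional_score(epi, x_p, s_p, theta, t=30.0):
    """Directional score of slope angle theta at EPI pixel (x_p, s_p):
    sum the kernel response of the RGB distance between the reference
    pixel P(x_p, s_p) and every pixel on the candidate EPI line.
    The EPI is an (n, width, 3) array."""
    n, width, _ = epi.shape
    x, s = epi_line(x_p, s_p, theta, n)
    xi = np.round(x).astype(int)
    ok = (xi >= 0) & (xi < width)          # keep samples inside the EPI
    ref = epi[s_p, x_p].astype(float)
    dist = np.linalg.norm(epi[s[ok], xi[ok]].astype(float) - ref, axis=1)
    return float(kernel_T(dist, t).sum())
```

The local depth estimate at (x_p, s_p) would then follow from the candidate angle that maximizes this score, since the slope of an EPI line encodes the disparity, and hence the depth, of the corresponding scene point.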
