Abstract

The reflection spectrum of an object characterizes its surface material, but for non-Lambertian scenes the recorded spectrum often deviates from it owing to specular contamination. Compensating for this deviation requires the illumination spectrum, which can be estimated from the specularity itself. However, existing illumination-estimation methods often degenerate in challenging cases, especially when only weak specularity is present. By adopting the dichromatic reflection model, which formulates a specular-influenced image as a linear combination of diffuse and specular components, this paper explores two individual priors and one mutual prior on these two components: (i) The chromaticity of the specular component is identical over all pixels. (ii) The diffuse component of a specular-contaminated pixel can be reconstructed from a specular-free counterpart describing the same material. (iii) The spectrum of the illumination usually has low correlation with that of the diffuse reflection. A general optimization framework is proposed to estimate the illumination spectrum from the specular component robustly and accurately. The results of both simulated and real experiments demonstrate the robustness and accuracy of our method.
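The dichromatic model underlying the abstract can be sketched numerically. The toy NumPy example below (all array sizes, spectra, and coefficients are invented for illustration, not taken from the paper) builds per-pixel spectra as a linear mix of diffuse and specular parts, then verifies prior (i): the specular residual of every pixel shares one chromaticity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_pixels = 31, 5            # hypothetical: 31 spectral bands, 5 pixels

s = rng.random(n_bands)              # illumination (specular) spectrum
s /= np.linalg.norm(s)               # its normalized chromaticity, shared by all pixels

d = rng.random((n_bands, n_pixels))  # per-pixel diffuse spectra (material-dependent)
d /= np.linalg.norm(d, axis=0)

r = rng.random(n_pixels)             # diffuse shading coefficients
l = rng.random(n_pixels)             # specular strength coefficients

# Dichromatic model: recorded spectrum = diffuse term + specular term
I = d * r + np.outer(s, l)

# Prior (i): subtracting the diffuse part leaves a residual whose normalized
# chromaticity is identical (equal to s) at every pixel.
residual = I - d * r
chroma = residual / np.linalg.norm(residual, axis=0)
```

Under this model, estimating the illumination spectrum amounts to recovering `s` from the specular residuals, which is exactly what the proposed optimization framework targets.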

© 2015 Optical Society of America


References


  1. C. P. Huynh and A. Robles-Kelly, “A solution of the dichromatic model for multi-spectral photometric invariance,” Int. J. Comput. Vision 90(1), 1–27 (2010).
    [Crossref]
  2. A. Gijsenij, T. Gevers, and J. van De Weijer, “Computational color constancy: Survey and experiments,” IEEE Trans. Image Process. 20(9), 2475–2489 (2011).
    [Crossref] [PubMed]
  3. T. Zickler, S. P. Mallick, D. J. Kriegman, and P. N. Belhumeur, “Color subspaces as photometric invariants,” Int. J. Comput. Vision 79(1), 13–30 (2008).
    [Crossref]
  4. J. M. Geusebroek, R. van den Boomgaard, A. W. M. Smeulders, and H. Geerts, “Color invariance,” IEEE Trans. Pattern Anal. 23(12), 1338–1350 (2001).
    [Crossref]
  5. A. Artusi, F. Banterle, and D. Chetverikov, “A survey of specularity removal methods,” Comput. Graph. Forum 30(8), 2208–2230 (2011).
    [Crossref]
  6. P. Koirala, P. Pant, M. Hauta-Kasari, and J. Parkkinen, “Highlight detection and removal from spectral image,” J. Opt. Soc. Am. A 28(11), 2284–2291 (2011).
    [Crossref]
  7. Q. Yang, S. Wang, and S. Ahuja, “Real-time specular highlight removal using bilateral filtering,” in Proceedings of European Conference on Computer Vision (Springer, 2010), pp. 87–100.
  8. R. T. Tan, K. Nishino, and K. Ikeuchi, “Color constancy through inverse-intensity chromaticity space,” J. Opt. Soc. Am. A 21(3), 321–334 (2004).
    [Crossref]
  9. K. Barnard, V. Cardei, and B. Funt, “A comparison of computational color constancy algorithms. I: Methodology and experiments with synthesized data,” IEEE Trans. Image Process. 11(9), 972–984 (2002).
    [Crossref]
  10. K. Barnard, L. Martin, A. Coath, and B. Funt, “A comparison of computational color constancy algorithms. II: Experiments with image data,” IEEE Trans. Image Process. 11(9), 985–996 (2002).
    [Crossref]
  11. J. V. D. Weijer, T. Gevers, and A. Gijsenij, “Edge-based color constancy,” IEEE Trans. Image Process. 16(9), 2207–2214 (2007).
    [Crossref] [PubMed]
  12. L. Shi and B. Funt, “MaxRGB reconsidered,” J. Imaging Sci. Technol. 56(2), 20501-1–20501-10 (2012).
    [Crossref]
  13. M. P. Lucassen, T. Gevers, A. Gijsenij, and N. Dekker, “Effects of chromatic image statistics on illumination induced color differences,” J. Opt. Soc. Am. A 30(9), 1871–1884 (2013).
    [Crossref]
  14. D. Cheng, D. K. Prasad, and M. S. Brown, “Illuminant estimation for color constancy: why spatial-domain methods work and the role of the color distribution,” J. Opt. Soc. Am. A 31(5), 1049–1058 (2014).
    [Crossref]
  15. K. Barnard, “Improvements to gamut mapping colour constancy algorithms,” in Proceedings of European Conference on Computer Vision (Springer, 2000), pp. 390–403.
  16. D. A. Forsyth, “A novel algorithm for color constancy,” Int. J. Comput. Vision 5(1), 5–35 (1990).
    [Crossref]
  17. L. Shi, W. Xiong, and B. Funt, “Illumination estimation via thin-plate spline interpolation,” J. Opt. Soc. Am. A 28(5), 940–948 (2011).
    [Crossref]
  18. P. V. Gehler, C. Rother, A. Blake, T. Minka, and T. Sharp, “Bayesian color constancy revisited,” in Proceedings of International Conference on Computer Vision and Pattern Recognition (IEEE, 2008), pp. 1–8.
  19. S. M. Newhall, R. W. Burnham, and R. M. Evans, “Color constancy in shadows,” J. Opt. Soc. Am. A 48(12), 976–984 (1958).
    [Crossref]
  20. R. Kawakami, J. Takamatsu, and K. Ikeuchi, “Color constancy from black body illumination,” J. Opt. Soc. Am. A 24(7), 1886–1893 (2007).
    [Crossref]
  21. M. S. Drew and B. V. Funt, “Variational approach to interreflection in color images,” J. Opt. Soc. Am. A 9(8), 1255–1265 (1992).
    [Crossref]
  22. S. A. Shafer, “Using color to separate reflection components,” Color Res. Appl. 10(4), 210–218 (1985).
    [Crossref]
  23. G. D. Finlayson and G. Schaefer, “Convex and non-convex illuminant constraints for dichromatic colour constancy,” in Proceedings of International Conference on Computer Vision and Pattern Recognition (IEEE, 2001), pp. 598–604.
  24. H. C. Lee, “Method for computing the scene-illuminant chromaticity from specular highlights,” J. Opt. Soc. Am. A 3(10), 1694–1699 (1986).
    [Crossref] [PubMed]
  25. T. M. Lehmann and C. Palm, “Color line search for illuminant estimation in real-world scenes,” J. Opt. Soc. Am. A 18(11), 2679–2691 (2001).
    [Crossref]
  26. L. Shi and B. Funt, “Dichromatic illumination estimation via Hough transforms in 3D,” in European Conference on Colour in Graphics, Imaging, and Vision (IS&T, 2008), pp. 259–262.
  27. J. Toro and B. Funt, “A multilinear constraint on dichromatic planes for illumination estimation,” IEEE Trans. Image Process. 16(1), 92–97 (2007).
    [Crossref] [PubMed]
  28. J. Toro, “Dichromatic illumination estimation without pre-segmentation,” Pattern Recogn. Lett. 29(7), 871–877 (2008).
    [Crossref]
  29. K. He, J. Sun, and X. Tang, “Single image haze removal using dark channel prior,” IEEE Trans. Pattern Anal. 33(12), 2341–2353 (2011).
    [Crossref]
  30. M. S. Drew, H. R. V. Joze, and G. D. Finlayson, “Specularity, the zeta-image, and information-theoretic illuminant estimation,” in Proceedings of European Conference on Computer Vision Workshops and Demonstrations (Springer, 2012), pp. 411–420.
  31. M. Aharon, M. Elad, and A. Bruckstein, “K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation,” IEEE Trans. Signal Process. 54(11), 4311–4322 (2006).
    [Crossref]
  32. Z. Lin, M. Chen, and Y. Ma, “The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices,” in Technical Report UILU-ENG-09-2215 (University of Illinois Urbana-Champaign, 2009).
  33. R. Rubinstein, M. Zibulevsky, and M. Elad, “Efficient implementation of the K-SVD algorithm using batch orthogonal matching pursuit,” in CS Technical Report (Technion–Israel Institute of Technology, 2008).
  34. J. Suo, L. Bian, F. Chen, and Q. Dai, “Bispectral coding: compressive and high-quality acquisition of fluorescence and reflectance,” Opt. Express 22(2), 1697–1712 (2014).
    [Crossref] [PubMed]
  35. F. Yasuma, T. Mitsunaga, D. Iso, and S. K. Nayar, “Multispectral Image Database,” http://www.cs.columbia.edu/CAVE/databases/multispectral/.
  36. J. Jun and J. Gu, “Recovering spectral reflectance under commonly available lighting conditions,” in Proceedings of International Conference on Computer Vision and Pattern Recognition Workshops (IEEE, 2012), pp. 1–8.




Figures (10)

Fig. 1 An example showing the influence of illumination-estimation accuracy on spectrum clustering. (a) Multi-spectral image visualized by integrating with the RGB response curves of a Canon 20D. (b), (c), and (d) illustrate the spectrum-clustering results of, respectively, the original image, the recovered specular-free image under inaccurate illumination (estimated by [8]), and the recovered specular-free image under correct illumination (calibrated with a color checker). The specular-free images in (c) and (d) are obtained using the method proposed in [7].
Fig. 2 A glossy mask illuminated by three different light sources: (a) a single incandescent lamp, (b) a set of fluorescent tubes with the same normalized spectrum, and (c) an incandescent lamp and multiple fluorescent tubes.
Fig. 3 Demonstration of our illumination-estimation method on the image in Fig. 1(a). (a) Top: dark-channel image. Bottom: a binary image denoting the pixel grouping. The white region Ω_l contains the pixels used in illumination optimization, and the dark region Ω_r contains the pixels used to learn the over-complete dictionary. (b) The learned dictionary. The line color of each basis is obtained by integrating its chromaticity with the RGB response curves of the Canon 20D. (c) Our initial and final estimates of the illumination spectrum compared with the ground truth. The chromaticity used in (b) and (c) is defined in Eq. (6).
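The dark-channel grouping shown in Fig. 3(a) can be sketched in a few lines. The following is a minimal NumPy illustration of the idea only; the threshold `tau`, the function names, and the hard thresholding rule are our own assumptions, not the paper's implementation:

```python
import numpy as np

def dark_channel(img):
    """Per-pixel minimum over the spectral channels of an H x W x C cube."""
    return img.min(axis=2)

def group_pixels(img, tau=0.1):
    """Split pixels into a specular-candidate set (Omega_l, bright dark
    channel) and a specular-free set (Omega_r) for dictionary learning."""
    dc = dark_channel(img)
    omega_l = dc > tau   # likely specular-contaminated pixels
    omega_r = ~omega_l   # pixels used to learn the over-complete dictionary
    return omega_l, omega_r
```

The intuition: a purely diffuse pixel of a saturated color has at least one near-zero channel, so a large dark-channel value suggests an additive (illumination-colored) specular term, sending that pixel to Ω_l.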
Fig. 4 Geometric interpretation of our mutual prior. (a) Top: synthetic image of a specular sphere. Middle: pure diffuse component. Bottom: specular component. (b) Illustration of the adaptive balance between the low-correlation prior and the sparsity constraint. Ĩ_d and Ĩ_s are the ground-truth chromaticities of the diffuse and specular reflection, respectively. The yellow area is the region spanned by all the spectra in the scene. We label two locations A and B with image coordinates x_A and x_B, respectively, where the former has the strongest specularity. I(x_A) = r(x_A)Ĩ_d + l(x_A)Ĩ_s and I(x_B) = r(x_B)Ĩ_d + l(x_B)Ĩ_s describe the dichromatic model at points A and B, respectively. Ĩ_d^⊥ denotes the direction orthogonal to the spectrum Ĩ_d in the hyperplane spanned by Ĩ_d and Ĩ_s. The blue arrow (#1) and red arrow (#2) denote the effects of the sparsity constraint, and the green arrow (#3) denotes that of the low-correlation prior.
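The dichromatic decomposition underlying Fig. 4 can be illustrated numerically: if the two chromaticity directions are known, the per-pixel weights r(x) and l(x) follow from a two-column least-squares fit. A toy sketch (the function name and variables are illustrative, and in practice Ĩ_s is the unknown being estimated):

```python
import numpy as np

def dichromatic_weights(I_x, d_tilde, s_tilde):
    """Recover r(x), l(x) such that I(x) = r(x)*d_tilde + l(x)*s_tilde,
    by least squares over the two chromaticity directions."""
    A = np.stack([d_tilde, s_tilde], axis=1)   # C x 2 basis matrix
    (r, l), *_ = np.linalg.lstsq(A, I_x, rcond=None)
    return r, l
```

This is exactly the geometry of Fig. 4(b): every observed spectrum lies in the 2-D hyperplane spanned by Ĩ_d and Ĩ_s, and the component along Ĩ_d^⊥ is attributable only to specularity.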
Fig. 5 Illumination-estimation results for synthetic multi-spectral data with one specular pattern. (a) Diffuse component. (b) Three specular components with the same distribution but different strengths (top) and the corresponding synthetic specularity-contaminated images (bottom). Owing to limited space, we show only the data synthesized under white illumination, not all three illumination colors. (c) Performance comparison on the simulated data with the different illuminations and the varying specularity strengths in (b).
Fig. 6 Illumination-estimation results for synthetic multi-spectral data with two specular patterns. (a) Diffuse component. (b) Three specular components with different strengths (top) and corresponding specularity-contaminated images (bottom) with two specular regions. Again, we only show white illumination here. (c) Performance comparison of the simulated data with different illuminations and the three specularity strengths in (b).
Fig. 7 Performance for an example with similar diffuse and specular chromaticity. (a) A red sphere under white illumination. (b) The same red sphere of (a) under red illumination. (c) Diffuse chromaticity under red illumination and the estimated illuminations.
Fig. 8 Illumination-estimation results for real captured data. (a) Results for data with strong specularity from [35]. We use the pixels below the red line to estimate the illumination and the color checker above this line to obtain the ground truth. (b) Results for examples with weak specularity; the true illumination spectra are obtained using a color checker.
Fig. 9 (a) Normalized spectrum of the halogen lamp. (b) Sensor spectral response of Canon 20D. (c) Transmission curves of the narrow-band filters.
Fig. 10 Illumination estimation from non-Lambertian scenes containing white and grey colors, using the data from [35]. (a) The source images. (b) The threshold binary map. (c) The learned dictionary. (d) The ground-truth illumination and estimation results.

Tables (1)


Algorithm 1 Algorithm for illumination-spectrum estimation

Equations (14)


(1) $I_c(\mathbf{x}) = \omega_r(\mathbf{x}) \int_{\Lambda} r(\mathbf{x},\lambda)\, e(\lambda)\, q_c(\lambda)\, d\lambda + \omega_l(\mathbf{x}) \int_{\Lambda} e(\lambda)\, q_c(\lambda)\, d\lambda.$
(2) $\mathbf{I}(\mathbf{x}) = \mathbf{I}_d(\mathbf{x}) + \mathbf{I}_s(\mathbf{x}),$
(3) $I = I_d + I_s,$
(4) $I = DC + I_s.$
(5) $I_{\mathrm{dark}}(\mathbf{x}) = \min_{c} \big( I_c(\mathbf{x}) \big).$
(6) $\tilde{\mathbf{I}}_d(\mathbf{x}) = \mathbf{I}_d(\mathbf{x}) / \|\mathbf{I}_d(\mathbf{x})\|_2 \quad \text{and} \quad \tilde{\mathbf{I}}_s(\mathbf{x}) = \mathbf{I}_s(\mathbf{x}) / \|\mathbf{I}_s(\mathbf{x})\|_2.$
(7) $\mathbf{I}(\mathbf{x}) = r(\mathbf{x})\, \tilde{\mathbf{I}}_d(\mathbf{x}) + l(\mathbf{x})\, \tilde{\mathbf{I}}_s,$
(8) $\arg\min_{I_s}\ \|I_s\|_* + \alpha_1 \|C\|_1 + \alpha_2 \|N\|_2^2 + \alpha_3 \|D^{\top} I_s\|_2^2 \quad \text{subject to} \quad I = DC + I_s + N,\ \ I_s \ge 0.$
(9) $\mathrm{Lag} = \|S\|_* + \alpha_1 \|C\|_1 + \alpha_2 \|N\|_2^2 + \alpha_3 \|D^{\top} I_s\|_2^2 + \frac{\beta}{2} \|I - DC - I_s - N\|_2^2 + \langle Y_1,\ I - DC - I_s - N \rangle + \frac{\beta}{2} \|S - I_s\|_2^2 + \langle Y_2,\ S - I_s \rangle,$
(10) $S^{t+1} = \arg\min_S \Big\{ \|S\|_* + \frac{\beta^t}{2} \big\| S - I_s^t + \tfrac{Y_2^t}{\beta^t} \big\|_2^2 \Big\} = U\, \mathcal{S}_{1/\beta^t}[\Sigma]\, V^{\top},$ where $U \Sigma V^{\top}$ is the SVD of $I_s^t - Y_2^t/\beta^t$ and $\mathcal{S}_{\tau}[\cdot]$ soft-thresholds the singular values by $\tau$.
(11) $N^{t+1} = \arg\min_N \Big\{ \alpha_2 \|N\|_2^2 + \frac{\beta^t}{2} \big\| I - DC^t - N - I_s^t + \tfrac{Y_1^t}{\beta^t} \big\|_2^2 \Big\} = \frac{\beta^t}{2\alpha_2 + \beta^t} \Big( I - DC^t - I_s^t + \tfrac{Y_1^t}{\beta^t} \Big).$
(12) $C^{t+1} = \mathrm{OMP}\big( D,\ I - I_s^t - N^{t+1},\ 1 \big).$
(13) $I_s^{t+1} = \max\Big\{ 0,\ \arg\min_{I_s}\ \alpha_3 \|D^{\top} I_s\|_2^2 + \frac{\beta^t}{2} \Big( \big\| I - DC^{t+1} - N^{t+1} - I_s + \tfrac{Y_1^t}{\beta^t} \big\|_2^2 + \big\| S^{t+1} - I_s + \tfrac{Y_2^t}{\beta^t} \big\|_2^2 \Big) \Big\} = \max\Big\{ 0,\ \Big( 2E + \tfrac{2\alpha_3}{\beta^t} D D^{\top} \Big)^{-1} \Big( S^{t+1} + I - DC^{t+1} - N^{t+1} + \tfrac{Y_1^t + Y_2^t}{\beta^t} \Big) \Big\},$
(14) $Y_1^{t+1} = Y_1^t + \beta^t \big( I - DC^{t+1} - I_s^{t+1} - N^{t+1} \big), \quad Y_2^{t+1} = Y_2^t + \beta^t \big( S^{t+1} - I_s^{t+1} \big), \quad \beta^{t+1} = \min\{ \rho \beta^t,\ \beta_{\max} \},$
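One pass of the alternating updates above (the S, N, C, and I_s subproblems followed by the multiplier updates) can be sketched in NumPy. This is a schematic rendering under our own naming, not the authors' code: the S-step is standard singular-value thresholding, the C-step uses a sparsity-1 orthogonal matching pursuit, and the parameter values and update ordering are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def omp1(D, Y):
    """OMP with sparsity 1: each column of Y is coded by its single
    best-matching atom (columns of D assumed unit-norm)."""
    corr = D.T @ Y                       # atom-pixel correlations
    idx = np.abs(corr).argmax(axis=0)    # best atom per pixel
    cols = np.arange(Y.shape[1])
    C = np.zeros((D.shape[1], Y.shape[1]))
    C[idx, cols] = corr[idx, cols]
    return C

def alm_step(I, D, C, Is, N, Y1, Y2, beta, a2=1.0, a3=1.0):
    """One schematic pass of the alternating minimization."""
    S = svt(Is - Y2 / beta, 1.0 / beta)                          # S-step (SVT)
    N = beta / (2 * a2 + beta) * (I - D @ C - Is + Y1 / beta)    # N-step
    C = omp1(D, I - Is - N)                                      # C-step (OMP)
    A = np.linalg.inv(2 * np.eye(I.shape[0])
                      + (2 * a3 / beta) * (D @ D.T))
    Is = np.maximum(0.0, A @ (S + I - D @ C - N                  # I_s-step,
                              + (Y1 + Y2) / beta))               # clipped >= 0
    Y1 = Y1 + beta * (I - D @ C - Is - N)                        # multipliers
    Y2 = Y2 + beta * (S - Is)
    return S, C, N, Is, Y1, Y2
```

In an outer loop one would repeat `alm_step` while growing `beta` (multiply by `rho` up to `beta_max`) until the two constraint residuals fall below a tolerance; the non-negativity clipping on I_s reflects the physical constraint that a specular spectrum cannot be negative.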
