Scene visibility in outdoor images is often deteriorated by bad weather conditions such as snow, haze, and rain. In particular, degradation due to haze typically manifests as faded colors and low contrast. To overcome this degradation, dehazing algorithms based on the atmospheric scattering model employ transmission map estimation, where the transmission is related to the haze density across the depth of the scene. However, estimating the depth of an outdoor scene without additional information is challenging, and an erroneously estimated depth leads to a dehazed image of poor quality. In this paper, we propose a fusion-based dehazing algorithm that does not require direct estimation of the transmission map. Intermediate latent images are obtained by restoring the scene radiance with several globally constant transmission values based on the atmospheric scattering model. With this approach, in each latent image, only the regions where the assumed transmission matches the ground truth are dehazed successfully. These images are merged into a haze-free result via a fusion algorithm that selectively uses the information of patches from the latent images. Experimental results show that the proposed algorithm effectively removes haze and outperforms several existing dehazing methods in both quantitative and qualitative evaluations.
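The pipeline described above can be sketched in a few lines. The atmospheric scattering model is I(x) = J(x)t(x) + A(1 - t(x)), so for a constant transmission t the radiance is recovered as J = (I - A)/t + A. The sketch below is a simplified illustration, not the paper's actual method: the atmospheric light A is assumed known, and the patch-wise fusion uses local standard deviation as a crude contrast proxy in place of the paper's fusion criterion.

```python
import numpy as np

def latent_image(I, A, t):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t)
    with a single, globally constant transmission t."""
    J = (I - A) / t + A
    return np.clip(J, 0.0, 1.0)

def fuse_by_contrast(latents, patch=8):
    """Toy patch-wise fusion: for each patch location, keep the
    candidate whose patch has the highest standard deviation
    (a crude contrast proxy, not the paper's fusion rule)."""
    H, W = latents[0].shape[:2]
    out = np.zeros_like(latents[0])
    for y in range(0, H, patch):
        for x in range(0, W, patch):
            blocks = [L[y:y + patch, x:x + patch] for L in latents]
            best = max(blocks, key=lambda b: b.std())
            out[y:y + patch, x:x + patch] = best
    return out

# Hazy input in [0, 1]; atmospheric light A assumed known here.
I = np.random.rand(32, 32, 3).astype(np.float32)
A = 0.9
latents = [latent_image(I, A, t) for t in (0.3, 0.5, 0.7, 0.9)]
result = fuse_by_contrast(latents)
```

Each constant-t latent image correctly restores only the depth slice where t happens to match the true transmission; the fusion step then picks, per patch, the candidate that looks best locally.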
Bibliographical note
Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2019R1A2C2002167) and partially supported by the Samsung Advanced Institute of Technology (SAIT) in 2019.
All Science Journal Classification (ASJC) codes
- Control and Systems Engineering
- Signal Processing
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering