Image haze removal is essential in autonomous driving because outdoor images captured under unfavorable weather conditions, such as haze or snow, suffer from poor visibility. Much research has been devoted to overcoming haze-induced image degradation such as low contrast and faded color. However, the traditional model neglects the phenomenon that several particles are simultaneously involved in light acquisition. To address this problem, we propose a novel single-image dehazing method based on a spatially adaptive atmospheric point spread function (APSF). We developed a module that estimates the APSF to overcome the limitations of the spatially invariant APSF used in existing dehazing algorithms. The key factor in the estimation is that hazy road scenes have statistical characteristics, in color and resolution, that differ from those of common hazy images. Furthermore, the APSF on traffic signs and lights is estimated by generating superpixels to prevent halo artifacts around sharp edges in the images. We adopted the total variation model as a regularization functional to reduce halo and other unnatural artifacts that may occur during deconvolution. The haze-free images produced by the proposed method were used to test whether it can enhance the performance of vision algorithms for autonomous driving. The experimental results demonstrate that the proposed method outperforms state-of-the-art image dehazing methods in enhancing the performance of these vision algorithms. Moreover, additional quantitative and qualitative comparisons with state-of-the-art algorithms confirmed the effectiveness of the proposed method.
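The total variation regularization mentioned above can be illustrated with a minimal sketch. The function below performs smoothed-TV denoising by gradient descent; the function name, parameter values, and synthetic data are illustrative assumptions for exposition, not the paper's actual APSF-deconvolution implementation.

```python
import numpy as np

def tv_denoise(y, lam=0.15, step=0.2, iters=300, eps=1e-6):
    """Smoothed total-variation (TV) denoising by gradient descent.

    Minimizes 0.5 * ||x - y||^2 + lam * TV_eps(x), where TV_eps is the
    isotropic total variation smoothed with a small constant eps.
    Illustrative sketch only; the paper pairs TV with APSF deconvolution.
    """
    x = y.astype(float).copy()
    for _ in range(iters):
        # forward differences (replicated boundary gives zero at the far edge)
        dx = np.diff(x, axis=1, append=x[:, -1:])
        dy = np.diff(x, axis=0, append=x[-1:, :])
        mag = np.sqrt(dx**2 + dy**2 + eps)
        px, py = dx / mag, dy / mag
        # divergence of the normalized gradient field (adjoint of the
        # forward difference: backward difference with zero padding)
        div = (
            px - np.concatenate([np.zeros_like(px[:, :1]), px[:, :-1]], axis=1)
            + py - np.concatenate([np.zeros_like(py[:1, :]), py[:-1, :]], axis=0)
        )
        grad = (x - y) - lam * div  # gradient of the objective
        x -= step * grad
    return x
```

TV regularization penalizes total gradient magnitude rather than squared gradients, which suppresses noise and ringing while preserving sharp edges such as sign boundaries, which is why it is a common choice for avoiding halo artifacts in deconvolution.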
Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea Government (MSIT) (No. 2019R1A2C2002167).
© 2013 IEEE.