Deep learning-based methods have achieved considerable success in single image dehazing in recent years. However, these methods often suffer performance degradation when confronted with domain shifts. In particular, haze density gaps exist among the existing datasets, which often leads to poor performance when these methods are tested across datasets. To address this issue, we propose a density-aware mixup augmentation (DAMix). DAMix generates samples that minimize the Wasserstein distance to the hazy images in the target domain. Each sample is generated by combining a hazy image with its corresponding haze-free ground truth and the atmospheric light via two density-aware matrices. In this manner, the DAMix-ed samples not only narrow the domain gap but are also proven to comply with the atmospheric scattering model. As a result, DAMix achieves comprehensive quantitative and qualitative improvements in domain adaptation. In particular, it helps state-of-the-art networks mitigate color shift and over-enhancement when dealing with hazy images outside the training distribution. Furthermore, we show that DAMix improves data efficiency from several perspectives. Specifically, a network trained with DAMix on only half of the source dataset achieves better adaptivity than one trained on the whole source dataset without DAMix. Owing to its low computational overhead, DAMix can be easily plugged into existing dehazing methods to achieve better performance when confronting domain shifts.
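To make the claim about compliance with the atmospheric scattering model concrete, the following is a minimal worked sketch rather than the exact DAMix formulation: it assumes, purely for illustration, that a blended sample \(\hat{I}\) is an element-wise combination of the hazy image \(I\), its haze-free ground truth \(J\), and the atmospheric light \(A\), weighted by two density-aware matrices \(M_1\) and \(M_2\) (the symbols \(\hat{I}\), \(M_1\), \(M_2\), \(t\), and \(\tilde{t}\) are notational assumptions introduced here, not definitions taken from the thesis). Substituting the scattering model \(I(x) = J(x)\,t(x) + A\,(1 - t(x))\) into the blend gives

\begin{align*}
% illustrative blend of hazy image I, ground truth J, and atmospheric light A
\hat{I}(x) &= M_1(x)\,I(x) + M_2(x)\,J(x) + \bigl(1 - M_1(x) - M_2(x)\bigr)\,A \\
           &= M_1(x)\bigl[J(x)\,t(x) + A\bigl(1 - t(x)\bigr)\bigr] + M_2(x)\,J(x) + \bigl(1 - M_1(x) - M_2(x)\bigr)\,A \\
           &= J(x)\,\tilde{t}(x) + A\bigl(1 - \tilde{t}(x)\bigr),
\qquad \tilde{t}(x) = M_1(x)\,t(x) + M_2(x).
\end{align*}

Under this assumed form, whenever \(0 \le M_1(x)\,t(x) + M_2(x) \le 1\), the blended sample again obeys the atmospheric scattering model with an effective transmission \(\tilde{t}(x)\); choosing the two weight matrices pixel-wise is what would allow the effective haze density to be swept toward that of the target domain.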