{"title":"Image Dehazing Via Cycle Generative Adversarial Network","authors":"Changyou Shi, Jianping Lu, Qian Sun, Shiliang Cheng, Xin Feng, Wei Huang","doi":"10.1145/3503047.3503135","DOIUrl":null,"url":null,"abstract":"Recovering a clear image from single hazy image has been widely investigated in recent researches. Due to the lack of the real hazed image dataset, most studies use artificially synthesized dataset to train the models. Nonetheless, the real word foggy image is far different from the synthesized image. As a result, the existing methods could not defog the real foggy image well, when inputting the real foggy images. In this paper, we introduce a new dehazing algorithm, which adds cycle consistency constraints to the generative adversarial network (GAN). It implements the translation from foggy images to clean images without supervised learning, that is, the model does not need paired data to training. We assume that clear and foggy images come from different domains. There are two generators that act as domain translators, one from foggy image domain to clean image domain, and the other from foggy image to clean image. Two discriminators in the GAN are used for assessing each domain translator. The GAN loss, combined with the cycle consistency loss are used to regularize the model. We carried out experiments to evaluate the proposed method, and the results demonstrate the effectiveness in dehazing and there is indeed difference between the real-fog images and the synthetic images.","PeriodicalId":190604,"journal":{"name":"Proceedings of the 3rd International Conference on Advanced Information Science and System","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 3rd International Conference on Advanced Information Science and System","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3503047.3503135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Recovering a clear image from a single hazy image has been widely investigated in recent research. Due to the lack of real hazy image datasets, most studies use artificially synthesized datasets to train their models. However, real-world foggy images differ greatly from synthesized ones, so existing methods do not dehaze real foggy images well. In this paper, we introduce a new dehazing algorithm that adds cycle consistency constraints to a generative adversarial network (GAN). It learns the translation from foggy images to clean images without supervised learning; that is, the model does not need paired data for training. We assume that clear and foggy images come from different domains. Two generators act as domain translators, one from the foggy image domain to the clean image domain and the other from the clean image domain to the foggy image domain. Two discriminators in the GAN assess each domain translator. The GAN loss, combined with the cycle consistency loss, is used to regularize the model. We carried out experiments to evaluate the proposed method; the results demonstrate its effectiveness in dehazing and confirm that there is indeed a difference between real-fog images and synthetic images.
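The abstract describes a CycleGAN-style objective: two generators translating between the hazy and clean domains, two discriminators, and a GAN loss combined with a cycle consistency loss. The sketch below illustrates that combined generator objective in PyTorch; the module names (G_h2c, G_c2h, D_clean, D_hazy), the choice of least-squares adversarial loss, the L1 cycle term, and the weight lambda_cyc are illustrative assumptions not specified in the abstract.

```python
# Minimal sketch of a CycleGAN-style dehazing objective, assuming
# PyTorch and least-squares GAN / L1 cycle losses (both assumptions).
import torch
import torch.nn as nn

adv_loss = nn.MSELoss()   # adversarial loss (assumed least-squares GAN)
cyc_loss = nn.L1Loss()    # cycle-consistency loss (assumed L1)
lambda_cyc = 10.0         # weight of the cycle term (assumption)

def generator_step(G_h2c, G_c2h, D_clean, D_hazy, real_hazy, real_clean):
    """One generator update: adversarial terms for both translators
    plus cycle-consistency terms in both directions."""
    fake_clean = G_h2c(real_hazy)    # hazy  -> clean
    fake_hazy  = G_c2h(real_clean)   # clean -> hazy

    # Adversarial terms: each translator tries to fool its discriminator.
    pred_clean = D_clean(fake_clean)
    pred_hazy  = D_hazy(fake_hazy)
    loss_adv = adv_loss(pred_clean, torch.ones_like(pred_clean)) \
             + adv_loss(pred_hazy,  torch.ones_like(pred_hazy))

    # Cycle-consistency terms: translating there and back should
    # recover the original image in each domain.
    rec_hazy  = G_c2h(fake_clean)    # hazy  -> clean -> hazy
    rec_clean = G_h2c(fake_hazy)     # clean -> hazy  -> clean
    loss_cyc = cyc_loss(rec_hazy, real_hazy) + cyc_loss(rec_clean, real_clean)

    return loss_adv + lambda_cyc * loss_cyc
```

Because the cycle term only compares each input with its own reconstruction, the objective needs no paired hazy/clean images, which is the unsupervised property the abstract emphasizes.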