{"title":"FastPhotoStyle在军用车辆检测合成数据中的应用","authors":"Hyeongkeun Lee, Kyungmin Lee, Hunmin Yang, Se-Yoon Oh","doi":"10.23919/ICCAS50221.2020.9268331","DOIUrl":null,"url":null,"abstract":"Object detection is one of the main task for the deep learning applications. Deep learning performance has already exceeded human’s detection ability, in the case when there are lots of data for training deep neural networks. In the case of military fields, there are needs to resolve the data shortage problem to employ deep learning system efficiently with benefits. Generating the synthetic data can be a solution, but the domain gap between the synthetic and real data is still an obstacle for training the model. In this paper, we propose a method for decreasing the domain gap by applying style transfer techniques to synthetic data for military vehicle detection. Utilizing FastPhotoStyle to the synthetic data aids efficiently improving the accuracy of object detection when the real data is insufficiency for training. Specifically, we show that stylization which enables artificial data more realistic diminishes the domain gap by evaluating the visualization of their distributions using principal component analysis and Fréchet inception distance score. As a result, the performance has been improved about 8% in the AP@50 metric for stylized synthetic data.","PeriodicalId":6732,"journal":{"name":"2020 20th International Conference on Control, Automation and Systems (ICCAS)","volume":"12 1","pages":"137-140"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Applying FastPhotoStyle to Synthetic Data for Military Vehicle Detection\",\"authors\":\"Hyeongkeun Lee, Kyungmin Lee, Hunmin Yang, Se-Yoon Oh\",\"doi\":\"10.23919/ICCAS50221.2020.9268331\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Object detection is one of the main task for the deep learning applications. Deep learning performance has already exceeded human’s detection ability, in the case when there are lots of data for training deep neural networks. In the case of military fields, there are needs to resolve the data shortage problem to employ deep learning system efficiently with benefits. Generating the synthetic data can be a solution, but the domain gap between the synthetic and real data is still an obstacle for training the model. In this paper, we propose a method for decreasing the domain gap by applying style transfer techniques to synthetic data for military vehicle detection. Utilizing FastPhotoStyle to the synthetic data aids efficiently improving the accuracy of object detection when the real data is insufficiency for training. Specifically, we show that stylization which enables artificial data more realistic diminishes the domain gap by evaluating the visualization of their distributions using principal component analysis and Fréchet inception distance score. 
As a result, the performance has been improved about 8% in the AP@50 metric for stylized synthetic data.\",\"PeriodicalId\":6732,\"journal\":{\"name\":\"2020 20th International Conference on Control, Automation and Systems (ICCAS)\",\"volume\":\"12 1\",\"pages\":\"137-140\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 20th International Conference on Control, Automation and Systems (ICCAS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.23919/ICCAS50221.2020.9268331\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 20th International Conference on Control, Automation and Systems (ICCAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/ICCAS50221.2020.9268331","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Applying FastPhotoStyle to Synthetic Data for Military Vehicle Detection
Object detection is one of the main tasks in deep learning applications. When large amounts of training data are available, deep neural networks already exceed human detection ability. In military applications, however, the data shortage problem must be resolved before deep learning systems can be employed effectively. Generating synthetic data is one solution, but the domain gap between synthetic and real data remains an obstacle to training the model. In this paper, we propose a method for reducing this domain gap by applying style transfer to synthetic data for military vehicle detection. Applying FastPhotoStyle to the synthetic data efficiently improves object detection accuracy when real data is insufficient for training. Specifically, we show that stylization, which makes artificial data more realistic, diminishes the domain gap, as evaluated by visualizing the data distributions with principal component analysis and by the Fréchet inception distance score. As a result, performance on stylized synthetic data improves by about 8% in the AP@50 metric.
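
For illustration, the domain-gap evaluation mentioned in the abstract (PCA visualization of feature distributions and the Fréchet inception distance) can be sketched roughly as follows. This is a minimal sketch, not the authors' code: the use of torchvision's InceptionV3 as the feature extractor, the image folders (data/real, data/synthetic, data/stylized), and the JPEG file format are assumptions made for the example.

# Illustrative sketch of the evaluation protocol described in the abstract:
# extract InceptionV3 pooled features for real, synthetic, and stylized images,
# project all three sets with a shared PCA, and compute FID against the real set.
import numpy as np
import torch
from pathlib import Path
from PIL import Image
from scipy import linalg
from sklearn.decomposition import PCA
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# InceptionV3 with the classifier head removed, so it outputs 2048-d pooled features.
inception = models.inception_v3(weights=models.Inception_V3_Weights.DEFAULT)
inception.fc = torch.nn.Identity()
inception.eval().to(device)

preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(image_dir):
    """Return an (N, 2048) array of Inception features for all JPEGs in a folder."""
    feats = []
    for path in sorted(Path(image_dir).glob("*.jpg")):  # assumed file layout
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
        feats.append(inception(img).cpu().numpy())
    return np.concatenate(feats, axis=0)

def fid(feats_a, feats_b):
    """Fréchet distance between Gaussians fitted to the two feature sets."""
    mu_a, mu_b = feats_a.mean(0), feats_b.mean(0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean, _ = linalg.sqrtm(cov_a @ cov_b, disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(np.sum((mu_a - mu_b) ** 2) + np.trace(cov_a + cov_b - 2 * covmean))

real = extract_features("data/real")          # real photographs (assumed path)
synth = extract_features("data/synthetic")    # raw synthetic renders
stylized = extract_features("data/stylized")  # synthetic renders after FastPhotoStyle

# Shared 2-D PCA projection for visual comparison of the three distributions.
pca = PCA(n_components=2).fit(np.concatenate([real, synth, stylized]))
real_2d, synth_2d, styl_2d = pca.transform(real), pca.transform(synth), pca.transform(stylized)

print("FID(real, synthetic):", fid(real, synth))
print("FID(real, stylized): ", fid(real, stylized))

A lower FID for the stylized set than for the raw synthetic set, together with tighter overlap of the PCA clusters, is the kind of evidence the paper uses to argue that stylization shrinks the domain gap.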