{"title":"TCP-RBA: Semi-supervised learning for traditional chinese painting classification with random brushwork augment","authors":"Yahui Ding, Hongjuan Wang, Nan Liu, Tong Li","doi":"10.3233/jifs-236533","DOIUrl":null,"url":null,"abstract":"Traditional Chinese painting (TCP), culturally significant, reflects China’s rich history and aesthetics. In recent years, TCP classification has shown impressive performance, but obtaining accurate annotations for these tasks is time-consuming and expensive, involving professional art experts. To address this challenge, we present a semi-supervised learning (SSL) method for traditional painting classification, achieving exceptional results even with a limited number of labels. To improve global representation learning, we employ the self-attention-based MobileVit model as the backbone network. Furthermore, We present a data augmentation strategy, Random Brushwork Augment (RBA), which integrates brushwork to enhance the performance. Comparative experiments confirm the effectiveness of TCP-RBA in Chinese painting classification, demonstrating outstanding accuracy of 88.27% on the test dataset, even with only 10 labels, each representing a single class.","PeriodicalId":509313,"journal":{"name":"Journal of Intelligent & Fuzzy Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intelligent & Fuzzy Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3233/jifs-236533","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Traditional Chinese painting (TCP) is culturally significant, reflecting China’s rich history and aesthetics. In recent years, TCP classification has shown impressive performance, but obtaining accurate annotations for these tasks is time-consuming and expensive, requiring professional art experts. To address this challenge, we present a semi-supervised learning (SSL) method for traditional painting classification that achieves strong results even with a limited number of labels. To improve global representation learning, we employ the self-attention-based MobileViT model as the backbone network. Furthermore, we present a data augmentation strategy, Random Brushwork Augment (RBA), which integrates brushwork to enhance performance. Comparative experiments confirm the effectiveness of TCP-RBA in Chinese painting classification, achieving an accuracy of 88.27% on the test dataset with only 10 labels, one per class.
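
The abstract names MobileViT as the backbone but does not specify which SSL algorithm is used. The following is a minimal sketch of a common pseudo-labeling/consistency approach (in the style of FixMatch) with a MobileViT backbone from the timm library; the model name, loaders (`labeled_loader`, `unlabeled_loader`), class count, and hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F
import timm

# Assumed setup: MobileViT-S backbone, 10 painting classes.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = timm.create_model("mobilevit_s", pretrained=False, num_classes=10).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

def train_step(labeled_batch, unlabeled_batch, threshold=0.95, lambda_u=1.0):
    # labeled_batch yields (image, label); unlabeled_batch yields (weak_view, strong_view).
    x_l, y_l = (t.to(device) for t in labeled_batch)
    x_weak, x_strong = (t.to(device) for t in unlabeled_batch)

    # Supervised loss on the few labeled examples.
    loss_sup = F.cross_entropy(model(x_l), y_l)

    # Pseudo-labels from weakly augmented views, kept only when confident.
    with torch.no_grad():
        probs = F.softmax(model(x_weak), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = (conf >= threshold).float()

    # Consistency loss on strongly augmented views (e.g. with a brushwork augment).
    loss_unsup = (F.cross_entropy(model(x_strong), pseudo, reduction="none") * mask).mean()

    loss = loss_sup + lambda_u * loss_unsup
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```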
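
The abstract also does not describe how Random Brushwork Augment is implemented. As one possible reading, a brushwork-style augmentation might overlay random, ink-like strokes on the input image; the sketch below (class name, stroke parameters, and compositing scheme are all hypothetical) only illustrates that idea.

```python
import random
from PIL import Image, ImageDraw, ImageFilter

class RandomBrushworkAugment:
    """Hypothetical brushwork-style augmentation: overlays a few random,
    semi-transparent, ink-like strokes on the input image. Illustrative
    sketch only, not the authors' RBA implementation."""

    def __init__(self, max_strokes=5, max_width=12, alpha_range=(40, 120), p=0.5):
        self.max_strokes = max_strokes
        self.max_width = max_width
        self.alpha_range = alpha_range
        self.p = p

    def __call__(self, img: Image.Image) -> Image.Image:
        if random.random() > self.p:
            return img
        base = img.convert("RGBA")
        overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        w, h = base.size
        for _ in range(random.randint(1, self.max_strokes)):
            # A "stroke" is a short polyline with a random dark tone and opacity.
            points = [(random.randint(0, w), random.randint(0, h))
                      for _ in range(random.randint(3, 6))]
            tone = random.randint(0, 80)
            alpha = random.randint(*self.alpha_range)
            draw.line(points, fill=(tone, tone, tone, alpha),
                      width=random.randint(2, self.max_width), joint="curve")
        # Soften stroke edges before compositing onto the painting.
        overlay = overlay.filter(ImageFilter.GaussianBlur(radius=1))
        return Image.alpha_composite(base, overlay).convert("RGB")
```

In an SSL pipeline such as the one sketched above, a transform like this would typically sit in the "strong" augmentation branch applied to unlabeled images.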