Incorporation of Contextual Information into BERT for Dialog Act Classification in Japanese
Shun Katada, Kiyoaki Shirai, S. Okada
2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)
Published: 2021-12-21
DOI: 10.1109/iSAI-NLP54397.2021.9678172
Citations: 0
Abstract
The recently developed Bidirectional Encoder Representations from Transformers (BERT) model outperforms the state of the art in many English natural language processing tasks. Although contextual information is known to be useful for dialog act classification, fine-tuning BERT with contextual information has not been investigated, especially for head-final languages such as Japanese. This paper investigates whether BERT with contextual information performs well on dialog act classification in Japanese open-domain conversation. In our proposed model, not only the utterance itself but also information about previous utterances and turn-taking is taken into account. Experiments on a Japanese dialog corpus showed that incorporating this contextual information improved the F1-score by 6.7 points.
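The abstract does not specify how the previous utterances and turn-taking information are fed to BERT. A minimal sketch of one plausible input construction is shown below: previous utterances are concatenated with separator tokens, and a speaker-change marker is prepended to the current utterance. The marker token names, the context window size, and the function itself are illustrative assumptions, not the authors' actual design.

```python
def build_contextual_input(history, current_utterance, speaker_changed,
                           sep_token="[SEP]", turn_token="[TURN]",
                           context_size=2):
    """Build one BERT input string from the dialog context.

    history          -- list of preceding utterances (oldest first)
    current_utterance -- the utterance to classify
    speaker_changed  -- True if the speaker differs from the previous turn

    Separator and turn-marker tokens are hypothetical placeholders;
    the paper's actual encoding may differ.
    """
    # Keep only the most recent `context_size` utterances as context.
    context = history[-context_size:]
    # Mark turn-taking by prepending a special token to the current utterance.
    marked = (turn_token + " " if speaker_changed else "") + current_utterance
    # Join context and current utterance with the separator token.
    return f" {sep_token} ".join(context + [marked])
```

The resulting string would then be tokenized and passed to a BERT classifier fine-tuned on dialog act labels; encoding turn-taking as a token keeps the model architecture unchanged while exposing the contextual signal.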