{"title":"Adaptive Slot-Filling for Turkish Natural Language Understanding","authors":"A. Balcioglu","doi":"10.1109/UBMK55850.2022.9919492","DOIUrl":null,"url":null,"abstract":"Slot-filling is a key part of natural language under-standing that aims to extract words which hold certain attributes for the dialogue system. Although slot-filling is traditionally considered to be a data demanding and expensive task, advances in transformer models can help to solve this problem via transfer learning. In this paper, we propose an adaptive transfer-learning based slot filling model using BERT and conditional random fields (CRFs). We also introduce and discuss the stemming problem for agglutinative languages in slot-filling, which we define as the ambiguity of meaning between extracting the whole word or extracting a part of the word for the slot. We propose a novel definition of stemming specifically for wordpiece tokenizers used in transformer models and use it to solve the stemming issue. Our experiments with the BERT-CRF model out perform previous models on Turkish slot filling. We also show that under the new definition, wordpiece tokenizers perform on par with current state-of-the-art stemming models. Finally, we contend transformer based models like ours can overcome the stemming issue with the help of labelling.","PeriodicalId":417604,"journal":{"name":"2022 7th International Conference on Computer Science and Engineering (UBMK)","volume":"39 5-6","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 7th International Conference on Computer Science and Engineering (UBMK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UBMK55850.2022.9919492","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Slot-filling is a key part of natural language understanding that aims to extract words carrying specific attributes for the dialogue system. Although slot-filling is traditionally considered a data-demanding and expensive task, advances in transformer models can help solve this problem via transfer learning. In this paper, we propose an adaptive, transfer-learning-based slot-filling model using BERT and conditional random fields (CRFs). We also introduce and discuss the stemming problem for agglutinative languages in slot-filling, which we define as the ambiguity between extracting the whole word and extracting only part of the word as the slot value. We propose a novel definition of stemming specifically for the wordpiece tokenizers used in transformer models and use it to address the stemming issue. Our experiments with the BERT-CRF model outperform previous models on Turkish slot-filling. We also show that under the new definition, wordpiece tokenizers perform on par with current state-of-the-art stemming models. Finally, we contend that transformer-based models like ours can overcome the stemming issue with the help of labelling.
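To make the described architecture concrete, below is a minimal sketch of a BERT-CRF slot tagger: a pretrained BERT encoder produces per-wordpiece representations, a linear layer maps them to slot-tag emission scores, and a CRF layer scores tag sequences for training and Viterbi decoding. This is an illustrative sketch only, not the authors' exact implementation; it assumes the Hugging Face transformers library and the pytorch-crf package, and the encoder name, tag count, and class name are placeholders.

```python
# Illustrative BERT-CRF slot tagger (assumed dependencies: transformers, pytorch-crf).
import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # from the pytorch-crf package

class BertCrfSlotFiller(nn.Module):  # hypothetical class name
    def __init__(self, encoder_name: str, num_slot_tags: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_slot_tags)
        self.crf = CRF(num_slot_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # Per-wordpiece contextual embeddings from the BERT encoder.
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        emissions = self.classifier(hidden)      # per-wordpiece tag scores
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)
```

Because tags are assigned at the wordpiece level, labelling only the pieces that cover a word's stem (rather than the whole surface form) is one way such a model could realize the wordpiece-level stemming definition discussed in the abstract.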