{"title":"基于概率后缀树语言模型的文本生成","authors":"S. Marukatat","doi":"10.1109/iSAI-NLP54397.2021.9678167","DOIUrl":null,"url":null,"abstract":"During last decade, language modeling has been dominated by neural structures; RNN, LSTM or Transformer. These neural language models provide excellent performance to the detriment of very high computational cost. This work investigates the use of probabilistic language model that requires much less computational cost. In particular, we are interested in variable-order Markov model that can be efficiently implemented on a probabilistic suffix tree (PST) structure. The PST construction is cheap and can be easily scaled to very large dataset. Experimental results show that this model can be used to generated realistic sentences.","PeriodicalId":339826,"journal":{"name":"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Text generation by probabilistic suffix tree language model\",\"authors\":\"S. Marukatat\",\"doi\":\"10.1109/iSAI-NLP54397.2021.9678167\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"During last decade, language modeling has been dominated by neural structures; RNN, LSTM or Transformer. These neural language models provide excellent performance to the detriment of very high computational cost. This work investigates the use of probabilistic language model that requires much less computational cost. In particular, we are interested in variable-order Markov model that can be efficiently implemented on a probabilistic suffix tree (PST) structure. The PST construction is cheap and can be easily scaled to very large dataset. 
Experimental results show that this model can be used to generated realistic sentences.\",\"PeriodicalId\":339826,\"journal\":{\"name\":\"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"volume\":\"77 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iSAI-NLP54397.2021.9678167\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iSAI-NLP54397.2021.9678167","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Text generation by probabilistic suffix tree language model
During the last decade, language modeling has been dominated by neural architectures: RNNs, LSTMs, and Transformers. These neural language models provide excellent performance at the cost of very high computational expense. This work investigates the use of a probabilistic language model that requires much less computation. In particular, we are interested in the variable-order Markov model, which can be efficiently implemented on a probabilistic suffix tree (PST) structure. The PST construction is cheap and scales easily to very large datasets. Experimental results show that this model can be used to generate realistic sentences.
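To illustrate the idea behind a variable-order Markov model of this kind, here is a minimal sketch in Python. It is not the paper's implementation: the class name, the `max_order` bound, and the flat dictionary standing in for the suffix-tree nodes are all illustrative assumptions. It stores counts for every context up to a fixed length and, at prediction time, backs off to the longest context actually seen in training.

```python
from collections import defaultdict


class ProbabilisticSuffixTree:
    """Illustrative variable-order Markov model over token sequences.

    A flat dict keyed by context tuples stands in for the suffix-tree
    nodes; a real PST would share prefixes and prune rare contexts.
    """

    def __init__(self, max_order=3):
        self.max_order = max_order
        # context (tuple of preceding tokens) -> {next_token: count}
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, tokens):
        # Count every (context, next-token) pair for all context
        # lengths from 0 up to max_order.
        for i, tok in enumerate(tokens):
            for k in range(min(i, self.max_order) + 1):
                context = tuple(tokens[i - k:i])
                self.counts[context][tok] += 1

    def next_token_distribution(self, context):
        """Back off from the longest matching suffix of `context`."""
        context = tuple(context[-self.max_order:])
        while context not in self.counts and context:
            context = context[1:]  # drop the oldest token and retry
        dist = self.counts[context]
        total = sum(dist.values())
        return {tok: c / total for tok, c in dist.items()}


pst = ProbabilisticSuffixTree(max_order=2)
pst.train("the cat sat on the mat".split())
# After "the", both "cat" and "mat" were observed once each:
print(pst.next_token_distribution(["the"]))  # {'cat': 0.5, 'mat': 0.5}
```

Sampling from such a distribution token by token is one simple way to generate text; training is a single pass over the corpus, which is why construction is cheap compared with fitting a neural model.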