Title: Domain Specific Word Embedding Matrix for Training Neural Networks
Authors: Dorde Petrovic, S. Janicijevic
DOI: 10.1109/IC-AIAI48757.2019.00022
Published in: 2019 International Conference on Artificial Intelligence: Applications and Innovations (IC-AIAI), September 2019
Citations: 4
Abstract
Text is one of the most widespread forms of sequential data and is therefore well suited to deep learning models designed for sequences. Deep learning for natural language processing is pattern recognition applied to words, sentences, and paragraphs. This study describes the process of creating a pre-trained word embedding matrix and its subsequent use in various neural network models for domain-specific text classification. Word embedding is one of the most popular ways to associate vectors with words. A word embedding matrix maps words to vectors that capture semantic relationships between them, and these relationships can vary from task to task.
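The construction the abstract describes, mapping each vocabulary word to a pre-trained vector and assembling the results into a matrix that a neural network's embedding layer can be initialized with, can be sketched as follows. This is a minimal illustration, not the authors' code: the toy vocabulary, the `word_index` layout (index 0 reserved for padding), and the two-dimensional vectors are all assumptions made for the example.

```python
import numpy as np

def build_embedding_matrix(word_index, pretrained_vectors, dim):
    """Assemble a (vocab_size + 1, dim) matrix of pre-trained vectors.

    Row 0 is reserved for padding; words without a pre-trained
    vector are left as zero rows (a common fallback).
    """
    matrix = np.zeros((len(word_index) + 1, dim))
    for word, idx in word_index.items():
        vec = pretrained_vectors.get(word)
        if vec is not None:
            matrix[idx] = vec
    return matrix

# Hypothetical toy vocabulary and pre-trained vectors for illustration.
word_index = {"deep": 1, "learning": 2, "text": 3}
pretrained_vectors = {
    "deep": np.array([0.1, 0.2]),
    "learning": np.array([0.3, 0.4]),
}
emb = build_embedding_matrix(word_index, pretrained_vectors, dim=2)
```

In a typical classification pipeline, `emb` would then be passed as the initial weights of an embedding layer (e.g. Keras's `Embedding(..., weights=[emb])`), optionally frozen so the domain-specific vectors are not overwritten during training.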