Ensemble Model for Chunking

Nilamadhaba Mohapatra, Namrata Sarraf, S. Sahu
{"title":"Ensemble Model for Chunking","authors":"Nilamadhaba Mohapatra, Namrata Sarraf, S. Sahu","doi":"10.5121/csit.2021.110811","DOIUrl":null,"url":null,"abstract":"Transformer Models have taken over most of the Natural language Inference tasks. In recent times they have proved to beat several benchmarks. Chunking means splitting the sentences into tokens and then grouping them in a meaningful way. Chunking is a task that has gradually moved from POS tag-based statistical models to neural nets using Language models such as LSTM, Bidirectional LSTMs, attention models, etc. Deep neural net Models are deployed indirectly for classifying tokens as different tags defined under Named Recognition Tasks. Later these tags are used in conjunction with pointer frameworks for the final chunking task. In our paper, we propose an Ensemble Model using a fine-tuned Transformer Model and a recurrent neural network model together to predict tags and chunk substructures of a sentence. We analyzed the shortcomings of the transformer models in predicting different tags and then trained the BILSTM+CNN accordingly to compensate for the same.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer science & information technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/csit.2021.110811","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Transformer models have taken over most natural language inference tasks and have recently beaten several benchmarks. Chunking means splitting a sentence into tokens and then grouping them in a meaningful way. The task has gradually moved from POS-tag-based statistical models to neural networks built on language models such as LSTMs, bidirectional LSTMs, and attention models. Deep neural network models are deployed indirectly to classify tokens into the tags defined for named entity recognition tasks; these tags are then used in conjunction with pointer frameworks for the final chunking step. In this paper, we propose an ensemble model that uses a fine-tuned transformer model together with a recurrent neural network model to predict tags and chunk substructures of a sentence. We analyze the shortcomings of the transformer model in predicting different tags and train the BiLSTM+CNN model accordingly to compensate for them.
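The grouping step the abstract describes, turning per-token tags into chunk substructures, is commonly done over BIO-style labels (B- begins a chunk, I- continues it, O is outside any chunk). The following minimal sketch illustrates that grouping only; the `bio_to_chunks` helper, the NP/VP tag set, and the example sentence are illustrative and not taken from the paper's pipeline.

```python
def bio_to_chunks(tokens, tags):
    """Group a BIO-tagged token sequence into labeled chunks.

    B-X starts a new chunk of type X, I-X continues the open chunk
    of the same type, and O (or an inconsistent I- tag) closes any
    open chunk. Returns a list of (chunk_type, tokens) pairs.
    """
    chunks = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_tokens:
                chunks.append((current_type, current_tokens))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)
        else:
            if current_tokens:
                chunks.append((current_type, current_tokens))
            current_type, current_tokens = None, []
    if current_tokens:
        chunks.append((current_type, current_tokens))
    return chunks


tokens = ["He", "reckons", "the", "current", "account", "deficit"]
tags = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tokens, tags))
# → [('NP', ['He']), ('VP', ['reckons']),
#    ('NP', ['the', 'current', 'account', 'deficit'])]
```

In an ensemble setting, the tag sequence fed to such a grouping step would come from combining the transformer's and the BiLSTM+CNN's per-token predictions; the grouping itself is model-agnostic.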