Title: Sarcasm Recognition on News Headlines Using Multiple Channel Embedding Attention BLSTM
Authors: Azika Syahputra Azwar, Suharjito Suharjito
Journal: Jurnal Sarjana Teknik Informatika, vol. 21, no. 1
DOI: 10.15408/jti.v15i2.28417
Published: 2022-12-23 (Journal Article)
Citations: 0
Abstract
Sarcasm is a statement that conveys an opposing viewpoint through positive or exaggeratedly positive phrasing. Because of this intentional ambiguity, sarcasm identification has become an important factor in sentiment analysis, prompting many researchers in natural language processing to study sarcasm detection intensively. This research applies a multiple-channel embedding attention bidirectional long short-term memory (MCEA-BLSTM) model to sarcasm detection in news headlines, taking a different approach from previously developed models that rely on lexical, semantic, and pragmatic properties. The results show that the multiple-channel embedding attention mechanism improves the performance of the BLSTM, making it superior to the other models compared. The proposed method achieves 96.64% accuracy with an F-measure of 97%.
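The abstract names the architecture but not its internals. As a rough illustration of what a multiple-channel embedding attention BLSTM can look like, the sketch below (in PyTorch) embeds each token through several independent embedding tables ("channels"), concatenates them, runs a bidirectional LSTM, and pools the time steps with additive attention before a sigmoid classifier. All hyperparameters (vocabulary size, embedding dimension, number of channels, hidden size) are placeholders, not values from the paper, and the paper's exact channel construction may differ.

```python
import torch
import torch.nn as nn

class MCEABLSTM(nn.Module):
    """Illustrative sketch of a multiple-channel embedding attention BLSTM.

    Hyperparameters are assumptions for demonstration, not taken from
    the paper being summarized.
    """

    def __init__(self, vocab_size=10000, embed_dim=100, n_channels=3, hidden=128):
        super().__init__()
        # One embedding table per channel (e.g. different pretrained
        # embedding spaces could initialize each table).
        self.channels = nn.ModuleList(
            nn.Embedding(vocab_size, embed_dim) for _ in range(n_channels)
        )
        self.blstm = nn.LSTM(
            embed_dim * n_channels, hidden,
            batch_first=True, bidirectional=True,
        )
        # Additive attention: one score per time step.
        self.attn = nn.Linear(2 * hidden, 1)
        # Binary output: sarcastic vs. not sarcastic.
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, token_ids):
        # Concatenate per-channel embeddings along the feature axis.
        x = torch.cat([emb(token_ids) for emb in self.channels], dim=-1)
        h, _ = self.blstm(x)                          # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq, 1)
        context = (weights * h).sum(dim=1)            # attention-weighted sum
        return torch.sigmoid(self.out(context)).squeeze(-1)

model = MCEABLSTM()
# Two toy "headlines" of 12 token ids each.
probs = model(torch.randint(0, 10000, (2, 12)))
print(probs.shape)  # one sarcasm probability per headline
```

A model of this shape would typically be trained with binary cross-entropy (`nn.BCELoss`) on headline/label pairs; the attention weights also give a per-token view of which words drove the prediction.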