{"title":"A Novel Model of Supervised Clustering using Sentiment and Contextual Analysis for Fake News Detection","authors":"Suman De, Dhriti Agarwal","doi":"10.1109/MPCIT51588.2020.9350457","DOIUrl":null,"url":null,"abstract":"Unorganized data is a massive source of cluttered information available over the web. It possesses a major problem when this data originates from unauthenticated sources creating confusion among the general public. The amount of fake news regarding the current COVID-19 scenario and political movements have had an adverse effect on the world. It is necessary to devise models and a step by step algorithm to tackle this challenge. This paper talks about a model that identifies data available over the web and performs crawling to get information about the data sources and maps the information with regards to the authenticity of the source. We look at possible web perspectives of data sources, official social media handles, reviewed agency lists, sentiment analysis, and calculate a value for a piece of particular news. The observed critical value looks for identifying the authenticity of the news and forms the basis of this idea. This paper also looks at a model that uses supervised learning to classify various news items depending on the defined criteria.","PeriodicalId":136514,"journal":{"name":"2020 Third International Conference on Multimedia Processing, Communication & Information Technology (MPCIT)","volume":"97 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Third International Conference on Multimedia Processing, Communication & Information Technology (MPCIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MPCIT51588.2020.9350457","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Unorganized data is a massive source of cluttered information available over the web. It poses a major problem when this data originates from unauthenticated sources, creating confusion among the general public. The volume of fake news regarding the current COVID-19 scenario and political movements has had an adverse effect on the world. It is necessary to devise models and a step-by-step algorithm to tackle this challenge. This paper presents a model that identifies data available over the web, crawls it to gather information about the data sources, and maps that information to the authenticity of each source. We examine available web signals about data sources, official social media handles, reviewed agency lists, and sentiment analysis, and calculate a value for a particular piece of news. The observed critical value serves to identify the authenticity of the news and forms the basis of this idea. The paper also presents a model that uses supervised learning to classify news items according to the defined criteria.
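The abstract outlines a pipeline that combines a source-authenticity check with sentiment analysis to produce a "critical value" per news item, alongside a supervised classifier. A minimal Python sketch of such a pipeline is given below. The trusted-source set, the sentiment lexicon, the 0.7/0.3 weighting, and the function names (`critical_value`, `sentiment_polarity`) are all illustrative assumptions for exposition, not the paper's actual method or data.

```python
# Sketch of a source-trust + sentiment "critical value" score paired with a
# supervised text classifier, loosely following the pipeline in the abstract.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in for the reviewed agency list / official social-media handles.
trusted_sources = {"who.int", "reuters.com", "apnews.com"}

# Toy lexicon-based sentiment scorer; a real system would use a trained
# sentiment model. Returns a polarity in [-1, 1].
POSITIVE = {"confirmed", "verified", "official"}
NEGATIVE = {"shocking", "secret", "miracle", "exposed"}

def sentiment_polarity(text: str) -> float:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1)))

def critical_value(source: str, text: str) -> float:
    """Combine source authenticity and sentiment into one score in [0, 1].
    The 0.7/0.3 weighting is an arbitrary placeholder."""
    source_trust = 1.0 if source in trusted_sources else 0.0
    return 0.7 * source_trust + 0.3 * (sentiment_polarity(text) + 1) / 2

# Supervised classifier over the news text (labels: 1 = genuine, 0 = fake).
train_texts = [
    "WHO confirmed new vaccine trial results",
    "Shocking secret miracle cure exposed by anonymous blogger",
]
train_labels = [1, 0]
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_texts, train_labels)

item = ("unknownblog.example", "Miracle cure exposed, officials hide the truth")
print("critical value:", round(critical_value(*item), 3))
print("classifier P(genuine):", round(clf.predict_proba([item[1]])[0][1], 3))
```

In this sketch the critical value and the classifier's probability are reported separately; how the paper combines the two signals into a final fake/genuine decision is not specified in the abstract.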