Yu Wang, Shunping Zhou, Qingfeng Guan, Fang Fang, Ni Yang, Kanglin Li, Yuanyuan Liu
{"title":"通过地理标记照片的多视角情感识别加强地方情感分析:全球旅游景点视角","authors":"Yu Wang, Shunping Zhou, Qingfeng Guan, Fang Fang, Ni Yang, Kanglin Li, Yuanyuan Liu","doi":"10.3390/ijgi13070256","DOIUrl":null,"url":null,"abstract":"User−generated geo−tagged photos (UGPs) have emerged as a valuable tool for analyzing large−scale tourist place emotions with unprecedented detail. This process involves extracting and analyzing human emotions associated with specific locations. However, previous studies have been limited to analyzing individual faces in the UGPs. This approach falls short of representing the contextual scene characteristics, such as environmental elements and overall scene context, which may contain implicit emotional knowledge. To address this issue, we propose an innovative computational framework for global tourist place emotion analysis leveraging UGPs. Specifically, we first introduce a Multi−view Graph Fusion Network (M−GFN) to effectively recognize multi−view emotions from UGPs, considering crowd emotions and scene implicit sentiment. After that, we designed an attraction−specific emotion index (AEI) to quantitatively measure place emotions based on the identified multi−view emotions at various tourist attractions with place types. Complementing the AEI, we employ the emotion intensity index (EII) and Pearson correlation coefficient (PCC) to deepen the exploration of the association between attraction types and place emotions. The synergy of AEI, EII, and PCC allows comprehensive attraction−specific place emotion extraction, enhancing the overall quality of tourist place emotion analysis. Extensive experiments demonstrate that our framework enhances existing place emotion analysis methods, and the M−GFN outperforms state−of−the−art emotion recognition methods. 
Our framework can be adapted for various geo−emotion analysis tasks, like recognizing and regulating workplace emotions, underscoring the intrinsic link between emotions and geographic contexts.","PeriodicalId":48738,"journal":{"name":"ISPRS International Journal of Geo-Information","volume":"56 1","pages":""},"PeriodicalIF":2.8000,"publicationDate":"2024-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Enhancing Place Emotion Analysis with Multi-View Emotion Recognition from Geo-Tagged Photos: A Global Tourist Attraction Perspective\",\"authors\":\"Yu Wang, Shunping Zhou, Qingfeng Guan, Fang Fang, Ni Yang, Kanglin Li, Yuanyuan Liu\",\"doi\":\"10.3390/ijgi13070256\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"User−generated geo−tagged photos (UGPs) have emerged as a valuable tool for analyzing large−scale tourist place emotions with unprecedented detail. This process involves extracting and analyzing human emotions associated with specific locations. However, previous studies have been limited to analyzing individual faces in the UGPs. This approach falls short of representing the contextual scene characteristics, such as environmental elements and overall scene context, which may contain implicit emotional knowledge. To address this issue, we propose an innovative computational framework for global tourist place emotion analysis leveraging UGPs. Specifically, we first introduce a Multi−view Graph Fusion Network (M−GFN) to effectively recognize multi−view emotions from UGPs, considering crowd emotions and scene implicit sentiment. After that, we designed an attraction−specific emotion index (AEI) to quantitatively measure place emotions based on the identified multi−view emotions at various tourist attractions with place types. 
Complementing the AEI, we employ the emotion intensity index (EII) and Pearson correlation coefficient (PCC) to deepen the exploration of the association between attraction types and place emotions. The synergy of AEI, EII, and PCC allows comprehensive attraction−specific place emotion extraction, enhancing the overall quality of tourist place emotion analysis. Extensive experiments demonstrate that our framework enhances existing place emotion analysis methods, and the M−GFN outperforms state−of−the−art emotion recognition methods. Our framework can be adapted for various geo−emotion analysis tasks, like recognizing and regulating workplace emotions, underscoring the intrinsic link between emotions and geographic contexts.\",\"PeriodicalId\":48738,\"journal\":{\"name\":\"ISPRS International Journal of Geo-Information\",\"volume\":\"56 1\",\"pages\":\"\"},\"PeriodicalIF\":2.8000,\"publicationDate\":\"2024-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS International Journal of Geo-Information\",\"FirstCategoryId\":\"89\",\"ListUrlMain\":\"https://doi.org/10.3390/ijgi13070256\",\"RegionNum\":3,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS International Journal of Geo-Information","FirstCategoryId":"89","ListUrlMain":"https://doi.org/10.3390/ijgi13070256","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Enhancing Place Emotion Analysis with Multi-View Emotion Recognition from Geo-Tagged Photos: A Global Tourist Attraction Perspective
User-generated geo-tagged photos (UGPs) have emerged as a valuable tool for analyzing large-scale tourist place emotions in unprecedented detail. This process involves extracting and analyzing human emotions associated with specific locations. However, previous studies have been limited to analyzing individual faces in UGPs. This approach falls short of representing contextual scene characteristics, such as environmental elements and overall scene context, which may contain implicit emotional knowledge. To address this issue, we propose an innovative computational framework for global tourist place emotion analysis leveraging UGPs. Specifically, we first introduce a Multi-view Graph Fusion Network (M-GFN) to effectively recognize multi-view emotions from UGPs, considering both crowd emotions and implicit scene sentiment. We then design an attraction-specific emotion index (AEI) to quantitatively measure place emotions, based on the identified multi-view emotions, at various tourist attractions categorized by place type. Complementing the AEI, we employ the emotion intensity index (EII) and the Pearson correlation coefficient (PCC) to deepen the exploration of the association between attraction types and place emotions. The synergy of AEI, EII, and PCC allows comprehensive attraction-specific place emotion extraction, enhancing the overall quality of tourist place emotion analysis. Extensive experiments demonstrate that our framework improves on existing place emotion analysis methods, and that the M-GFN outperforms state-of-the-art emotion recognition methods. Our framework can be adapted to various geo-emotion analysis tasks, such as recognizing and regulating workplace emotions, underscoring the intrinsic link between emotions and geographic contexts.
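The abstract names the Pearson correlation coefficient (PCC) as the tool used to relate attraction types to place emotions. As a minimal sketch of that final analysis step: the function below computes the standard PCC between two equal-length series. The per-attraction emotion scores shown are entirely hypothetical illustration values, not data from the paper, and the paper's exact AEI/EII definitions are not reproduced here.

```python
from math import sqrt

def pearson_correlation(x, y):
    """Standard Pearson correlation coefficient (PCC) between two series.

    r = cov(x, y) / (std(x) * std(y)), computed from raw sums.
    """
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

# Hypothetical positive-emotion scores for five attractions of two place
# types (illustration only; not the paper's data or its AEI formula):
natural_parks = [0.72, 0.65, 0.80, 0.58, 0.75]
museums       = [0.60, 0.55, 0.70, 0.50, 0.66]
print(round(pearson_correlation(natural_parks, museums), 3))
```

A high positive r would suggest the two place types elicit similar emotion patterns across comparable attractions, which is the kind of type-to-emotion association the AEI/EII/PCC combination is used to probe.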
Journal Introduction:
ISPRS International Journal of Geo-Information (ISSN 2220-9964) provides an advanced forum for the science and technology of geographic information. ISPRS International Journal of Geo-Information publishes regular research papers, reviews and communications. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of the papers. The full experimental details must be provided so that the results can be reproduced.
The 2018 IJGI Outstanding Reviewer Award has been launched! This award acknowledges those who have generously dedicated their time to reviewing manuscripts submitted to IJGI. See full details at http://www.mdpi.com/journal/ijgi/awards.