{"title":"图像质量与视觉注意力的互动:在显著性空间走向更可靠的分析","authors":"J. Redi, I. Heynderickx","doi":"10.1109/QoMEX.2011.6065705","DOIUrl":null,"url":null,"abstract":"Understanding bottom-up and top-down visual attention mechanisms related to visual quality perception can be greatly beneficial for the design of effective objective quality metrics. Subjective studies based on eye-movement tracking have been recently published that try to get more insight in these interactions. However, it is still not easy to find coherence across their results, also due to the different methodologies adopted to analyze eye-tracking data. In this paper we propose a robust methodology to measure differences between eye-tracking data collected under different experimental conditions. The proposed method takes into account inter-observer variability and content effects, producing results that give an accurate insight in attention variations.","PeriodicalId":6441,"journal":{"name":"2011 Third International Workshop on Quality of Multimedia Experience","volume":"213 1","pages":"201-206"},"PeriodicalIF":0.0000,"publicationDate":"2011-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":"{\"title\":\"Image quality and visual attention interactions: Towards a more reliable analysis in the saliency space\",\"authors\":\"J. Redi, I. Heynderickx\",\"doi\":\"10.1109/QoMEX.2011.6065705\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Understanding bottom-up and top-down visual attention mechanisms related to visual quality perception can be greatly beneficial for the design of effective objective quality metrics. Subjective studies based on eye-movement tracking have been recently published that try to get more insight in these interactions. However, it is still not easy to find coherence across their results, also due to the different methodologies adopted to analyze eye-tracking data. In this paper we propose a robust methodology to measure differences between eye-tracking data collected under different experimental conditions. The proposed method takes into account inter-observer variability and content effects, producing results that give an accurate insight in attention variations.\",\"PeriodicalId\":6441,\"journal\":{\"name\":\"2011 Third International Workshop on Quality of Multimedia Experience\",\"volume\":\"213 1\",\"pages\":\"201-206\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-11-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"14\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 Third International Workshop on Quality of Multimedia Experience\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/QoMEX.2011.6065705\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 Third International Workshop on Quality of Multimedia Experience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/QoMEX.2011.6065705","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Image quality and visual attention interactions: Towards a more reliable analysis in the saliency space
Understanding the bottom-up and top-down visual attention mechanisms involved in visual quality perception can be greatly beneficial for the design of effective objective quality metrics. Subjective studies based on eye-movement tracking have recently been published that try to gain more insight into these interactions. However, it is still not easy to find coherence across their results, partly due to the different methodologies adopted to analyze eye-tracking data. In this paper we propose a robust methodology to measure differences between eye-tracking data collected under different experimental conditions. The proposed method takes into account inter-observer variability and content effects, producing results that give accurate insight into attention variations.
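
A minimal, hypothetical sketch of the kind of analysis the abstract describes, not the authors' actual method: it builds fixation-density ("saliency") maps from eye-tracking fixations for two experimental conditions and compares them with a linear correlation coefficient, using a leave-one-observer-out loop as a rough baseline for inter-observer variability. All function and parameter names (fixation_map, condition_difference, sigma) are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_map(fixations, shape, sigma=25):
    """Accumulate one observer's fixations into a Gaussian-blurred density map.

    fixations: iterable of (x, y) pixel coordinates.
    shape:     (height, width) of the stimulus in pixels.
    sigma:     blur width in pixels, loosely approximating foveal extent.
    """
    m = np.zeros(shape, dtype=float)
    for x, y in fixations:
        if 0 <= int(y) < shape[0] and 0 <= int(x) < shape[1]:
            m[int(y), int(x)] += 1.0
    m = gaussian_filter(m, sigma)
    return m / m.sum() if m.sum() > 0 else m

def map_correlation(a, b):
    """Pearson correlation between two flattened saliency maps."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def condition_difference(cond_a, cond_b, shape):
    """Compare pooled saliency maps of two viewing conditions.

    cond_a, cond_b: lists of per-observer fixation lists (assume >= 2
    observers per condition). Returns the between-condition correlation and
    the mean leave-one-observer-out correlation within condition A, which
    serves as a crude reference for inter-observer variability.
    """
    map_a = sum(fixation_map(f, shape) for f in cond_a)
    map_b = sum(fixation_map(f, shape) for f in cond_b)
    between = map_correlation(map_a, map_b)

    # Leave-one-observer-out: how strongly does condition A agree with itself?
    within = []
    for i in range(len(cond_a)):
        left_out = fixation_map(cond_a[i], shape)
        rest = sum(fixation_map(f, shape) for j, f in enumerate(cond_a) if j != i)
        within.append(map_correlation(left_out, rest))
    return between, float(np.mean(within))
```

If the between-condition correlation falls clearly below the within-condition (leave-one-out) baseline, the attention deployment under the two conditions likely differs by more than ordinary inter-observer variability; this is only one plausible way to operationalize the comparison the paper motivates.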