{"title":"响应式设计网页中布局失败的自动视觉验证","authors":"Ibrahim Althomali, G. M. Kapfhammer, Phil McMinn","doi":"10.1109/ICST.2019.00027","DOIUrl":null,"url":null,"abstract":"Responsively designed web pages adjust their layout according to the viewport width of the device in use. Although tools exist to help developers test the layout of a responsive web page, they often rely on humans to flag problems. Yet, the considerable number of web-enabled devices with unique viewport widths makes this manual process both time-consuming and error-prone. Capable of detecting some common responsive layout failures, the ReDeCheck tool partially automates this process. Since ReDeCheck focuses on a web page's document object model (DOM), some of the issues it finds are not observable by humans. This paper presents a tool, called Viser, that renders a ReDeCheck-reported layout issue in a browser, adjusting the opacity of certain elements and checking for a visible difference. Unless Viser classifies an issue as a human-observable layout failure, a web developer can ignore it. This paper's experiments reveal the benefit of using Viser to support automated visual verification of layout failures in responsively designed web pages. Viser automatically classified all of the 117 layout failures that ReDeCheck reported for 20 web pages, each of which had to be manually analyzed in a prior study. Viser's automated manipulation of element opacity also highlighted manual classification's subjectivity: it categorized 28 issues differently to manual analysis, including three correctly reclassified as false positives.","PeriodicalId":446827,"journal":{"name":"2019 12th IEEE Conference on Software Testing, Validation and Verification (ICST)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Automatic Visual Verification of Layout Failures in Responsively Designed Web Pages\",\"authors\":\"Ibrahim Althomali, G. M. Kapfhammer, Phil McMinn\",\"doi\":\"10.1109/ICST.2019.00027\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Responsively designed web pages adjust their layout according to the viewport width of the device in use. Although tools exist to help developers test the layout of a responsive web page, they often rely on humans to flag problems. Yet, the considerable number of web-enabled devices with unique viewport widths makes this manual process both time-consuming and error-prone. Capable of detecting some common responsive layout failures, the ReDeCheck tool partially automates this process. Since ReDeCheck focuses on a web page's document object model (DOM), some of the issues it finds are not observable by humans. This paper presents a tool, called Viser, that renders a ReDeCheck-reported layout issue in a browser, adjusting the opacity of certain elements and checking for a visible difference. Unless Viser classifies an issue as a human-observable layout failure, a web developer can ignore it. This paper's experiments reveal the benefit of using Viser to support automated visual verification of layout failures in responsively designed web pages. Viser automatically classified all of the 117 layout failures that ReDeCheck reported for 20 web pages, each of which had to be manually analyzed in a prior study. 
Viser's automated manipulation of element opacity also highlighted manual classification's subjectivity: it categorized 28 issues differently to manual analysis, including three correctly reclassified as false positives.\",\"PeriodicalId\":446827,\"journal\":{\"name\":\"2019 12th IEEE Conference on Software Testing, Validation and Verification (ICST)\",\"volume\":\"24 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-04-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 12th IEEE Conference on Software Testing, Validation and Verification (ICST)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICST.2019.00027\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 12th IEEE Conference on Software Testing, Validation and Verification (ICST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICST.2019.00027","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Responsively designed web pages adjust their layout according to the viewport width of the device in use. Although tools exist to help developers test the layout of a responsive web page, they often rely on humans to flag problems. Yet, the considerable number of web-enabled devices with unique viewport widths makes this manual process both time-consuming and error-prone. Capable of detecting some common responsive layout failures, the ReDeCheck tool partially automates this process. Since ReDeCheck focuses on a web page's document object model (DOM), some of the issues it finds are not observable by humans. This paper presents a tool, called Viser, that renders a ReDeCheck-reported layout issue in a browser, adjusting the opacity of certain elements and checking for a visible difference. Unless Viser classifies an issue as a human-observable layout failure, a web developer can ignore it. This paper's experiments reveal the benefit of using Viser to support automated visual verification of layout failures in responsively designed web pages. Viser automatically classified all of the 117 layout failures that ReDeCheck reported for 20 web pages, each of which had to be manually analyzed in a prior study. Viser's automated manipulation of element opacity also highlighted manual classification's subjectivity: it categorized 28 issues differently to manual analysis, including three correctly reclassified as false positives.
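The abstract describes Viser's core idea: re-render the page at the viewport width where ReDeCheck reported an issue, toggle the opacity of the implicated element, and check whether the rendering visibly changes. The snippet below is a minimal sketch of that idea only, assuming Selenium WebDriver and Pillow; the URL, CSS selector, and viewport width are hypothetical placeholders, and this is not Viser's actual implementation.

```python
# Sketch: decide whether hiding an element produces a visible difference at a
# given viewport width. Assumes chromedriver is installed; not Viser's code.
import io

from PIL import Image, ImageChops
from selenium import webdriver


def visible_difference(url: str, css_selector: str, viewport_width: int) -> bool:
    driver = webdriver.Chrome()
    try:
        driver.set_window_size(viewport_width, 800)  # height chosen arbitrarily
        driver.get(url)

        def screenshot() -> Image.Image:
            # Capture the current rendering as an RGB image.
            return Image.open(io.BytesIO(driver.get_screenshot_as_png())).convert("RGB")

        before = screenshot()
        # Hide the implicated element by setting its opacity to zero.
        driver.execute_script(
            "document.querySelector(arguments[0]).style.opacity = '0';", css_selector
        )
        after = screenshot()

        # If the two renderings differ anywhere, the element contributes visible
        # pixels, so the reported issue is observable by a human viewer.
        return ImageChops.difference(before, after).getbbox() is not None
    finally:
        driver.quit()


# Hypothetical usage: classify a ReDeCheck report for a sidebar at a 320px viewport.
# if visible_difference("http://example.com", "#sidebar", 320):
#     print("Human-observable layout failure")
```

If the opacity change leaves the screenshot unchanged, the element has no visible effect at that viewport width, which corresponds to the kind of DOM-only issue the paper says a developer can safely ignore.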