Lowering the Barrier for Successful Replication and Evaluation
Hendrik Lücke-Tieke, Marcel Beuth, Philipp Schader, T. May, J. Bernard, J. Kohlhammer
Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization (BELIV), October 2018
DOI: 10.1109/BELIV.2018.8634201
Citations: 5
Abstract
Evaluation of a visualization technique is complex and time-consuming. We present a system that aims to ease the design, creation, and execution of controlled experiments for web-based visualizations. The system includes parameterizable visualization generation services, thereby separating the visualization implementation from study design and execution. This enables experimenters to design and run multiple experiments on the same visualization service in parallel, to replicate experiments, and to compare different visualization services quickly. The system supports tasks ranging from simple questionnaires to visualization-specific interaction techniques, as well as automated task generation based on dynamic sampling of parameter spaces. We present two examples to demonstrate our service-based approach: one shows how a suite of successive experiments can be conducted, while the other includes an extended replication study.
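To make the service-based idea concrete, the following is a minimal, purely illustrative sketch of how an experiment platform might parameterize an external visualization generation service and sample tasks from a parameter space. The service URL, function names, and parameters below are assumptions for illustration and are not taken from the paper's actual system or API.

```python
import itertools
import random
from urllib.parse import urlencode

# Hypothetical base URL of a parameterizable visualization generation service;
# the paper's actual endpoints are not specified here.
SERVICE_URL = "https://vis-service.example.org/render"


def visualization_url(technique, **params):
    """Build a request URL asking the service to render one stimulus.

    The experiment platform only passes parameters; the visualization
    implementation lives entirely behind the service.
    """
    query = urlencode({"technique": technique, **params})
    return f"{SERVICE_URL}?{query}"


def sample_tasks(parameter_space, n_tasks, seed=0):
    """Draw task configurations by randomly sampling a parameter space.

    This stands in for the 'dynamic sampling of parameter spaces' mentioned
    in the abstract; the real system may use a different sampling strategy.
    """
    rng = random.Random(seed)
    keys = list(parameter_space)
    combinations = list(itertools.product(*(parameter_space[k] for k in keys)))
    chosen = rng.sample(combinations, min(n_tasks, len(combinations)))
    return [dict(zip(keys, combo)) for combo in chosen]


if __name__ == "__main__":
    # Example parameter space for a scatterplot study (illustrative values).
    space = {"n_points": [100, 500, 1000], "noise": [0.0, 0.1, 0.2]}
    for task in sample_tasks(space, n_tasks=4):
        print(visualization_url("scatterplot", **task))
```

Because the study design only exchanges parameter values with the service, the same experiment definition could, in principle, be pointed at a different visualization service for comparison or replication, which is the separation the abstract emphasizes.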