
Latest publications in IUI Companion '14

Silent speech decoder using adaptive collection
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559190
M. Matsumoto
We investigated a classification method using brain computer interfaces (BCIs) for silent speech. We recorded event-related potentials (ERPs) while four subjects, remaining silent and immobile, imagined vocalizing two Japanese vowels. We used an adaptive collection (AC) method that adaptively selects suitable output signals of common spatial pattern (CSP) filters and their time windows for classification. The classification accuracies (CAs) were 73-92% for the pairwise classification /a/ vs. /u/ using 63 channels, significantly better than in a previous study.
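The abstract does not spell out the adaptive-collection procedure itself, so the following is only a minimal sketch of the CSP-plus-classifier pipeline such a decoder builds on; the array names and shapes (`epochs_a`, `epochs_u` as trials x channels x samples) are hypothetical.

```python
# Minimal sketch of a CSP-based binary ERP classifier (not the paper's exact
# adaptive-collection procedure). epochs_a / epochs_u are hypothetical arrays
# of shape (trials, channels, samples) for imagined /a/ and /u/.
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_a, epochs_u, n_pairs=3):
    """Compute CSP spatial filters from two classes of EEG epochs."""
    def avg_cov(epochs):
        covs = [np.cov(trial) for trial in epochs]   # channel x channel covariance per trial
        return np.mean(covs, axis=0)
    ca, cu = avg_cov(epochs_a), avg_cov(epochs_u)
    # Generalized eigenvalue problem: ca w = lambda (ca + cu) w
    vals, vecs = eigh(ca, ca + cu)
    order = np.argsort(vals)
    # Keep filters from both ends of the spectrum (most discriminative for either class)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T                          # (2 * n_pairs, channels)

def log_var_features(epochs, filters):
    """Project epochs through CSP filters and take log-variance as features."""
    projected = np.einsum("fc,tcs->tfs", filters, epochs)
    return np.log(projected.var(axis=2))

# Hypothetical usage with a linear classifier:
# from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
# W = csp_filters(train_a, train_u)
# X = np.vstack([log_var_features(train_a, W), log_var_features(train_u, W)])
# y = np.array([0] * len(train_a) + [1] * len(train_u))
# clf = LinearDiscriminantAnalysis().fit(X, y)
```

An adaptive-collection step in the spirit of the paper would additionally search over which filter outputs and which time windows to keep before fitting the classifier.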
Citations: 6
Toward emotion regulation via physical interaction
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559186
A. D. Rooij
Emotions can be regulated to fit a task in order to enhance task performance. Motor expressions can help regulate emotion. This paper briefly reports ongoing work on the design of physical interactions based on motor expressions that can help regulate emotion to fit a task. We argue that to be effective, such interactions must be made meaningful in relation to ongoing appraisal processes, and that such interactions can help regulate emotion via congruence, suppression, or incompatibility. We present previous work on the validation of these arguments within the context of supporting idea generation, and develop a roadmap for research that aims to translate these results to the design of physical interactions under device constraints. The research will enable designers of interactive technology to develop physical interactions that help regulate emotion with the aim to help people get the most out of their own capabilities.
Citations: 3
Demo: making plans scrutable with argumentation and natural language generation
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559202
N. Tintarev, Roman Kutlák
Autonomous systems perform tasks without human guidance. Techniques for making autonomous systems scrutable and, hence, more transparent are required in order to support humans working with such systems. The Scrutable Autonomous Systems (SAsSy) demo shows a novel way of combining argumentation and natural language to generate a human-understandable explanation dialogue. By interacting with SAsSy, users are able to ask why a certain plan was selected for execution and why other alternatives were not; users can also modify information in the system.
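As an illustration of the idea rather than SAsSy's actual implementation, a "why not?" answer can be produced by verbalising the arguments that defeated a rejected plan; the data model below is hypothetical.

```python
# Toy sketch of a "why not?" explanation over plan alternatives (hypothetical
# data model; not the SAsSy implementation). Each rejected plan is mapped to
# the arguments that defeated it, which are then verbalised as one sentence.
from dataclasses import dataclass, field

@dataclass
class Plan:
    name: str
    defeated_by: list = field(default_factory=list)  # arguments against this plan

def why_not(plan: Plan) -> str:
    if not plan.defeated_by:
        return f"Plan '{plan.name}' was not ruled out by any argument."
    reasons = "; ".join(plan.defeated_by)
    return f"Plan '{plan.name}' was rejected because: {reasons}."

# Example dialogue turn:
alt = Plan("deliver-by-truck", ["the bridge on route A is closed",
                                "truck capacity is below the cargo weight"])
print(why_not(alt))
```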
Citations: 8
See-through mobile AR system for natural 3D interaction
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559198
Yuko Unuma, T. Niikura, T. Komuro
In this paper, we propose an interaction system that displays see-through images on a mobile display and allows a user to interact, using their hand, with virtual objects overlaid on the see-through image. In this system, a camera that tracks the user's viewpoint is attached to the front of the mobile display, and a depth camera that captures color and depth images of the user's hand and the background scene is attached to the back. Natural hand-based interaction with virtual objects is realized by rendering images so that the space seen through the mobile display is consistent with the real space from the user's viewpoint. We implemented two applications on the system and showed its usefulness for various AR applications.
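The core of viewpoint-consistent rendering can be pictured as a ray-plane intersection: each virtual point is drawn where the line from the tracked eye to that point crosses the display plane. The sketch below is a hedged illustration with hypothetical coordinates and names, not the authors' implementation.

```python
# Minimal geometric sketch of viewpoint-consistent ("see-through") rendering:
# a world point is drawn where the line from the tracked eye position to the
# point intersects the display plane. Coordinates and names are hypothetical.
import numpy as np

def project_to_display(eye, point, plane_origin, plane_normal):
    """Intersect the eye->point ray with the display plane (all in one frame, metres)."""
    direction = point - eye
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        raise ValueError("Ray is parallel to the display plane")
    t = np.dot(plane_normal, plane_origin - eye) / denom
    return eye + t * direction   # 3D point on the display plane

# Example: eye tracked 30 cm in front of a display lying in the z = 0 plane,
# with a virtual object placed 20 cm behind the display.
eye = np.array([0.0, 0.0, 0.3])
virtual_obj = np.array([0.05, 0.02, -0.2])
on_screen = project_to_display(eye, virtual_obj,
                               np.array([0.0, 0.0, 0.0]),
                               np.array([0.0, 0.0, 1.0]))
```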
Citations: 8
Recognition of student intentions in a virtual reality training environment
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559189
Yecheng Gu, Sergey Sosnovsky
This paper introduces a novel method for detecting and modeling intentions of students performing training tasks in a Virtual Reality (VR) environment enhanced with intelligent tutoring capabilities. Our VR-setup provides students with an immersive user interface, but produces noisy and low-level input, from which we need to recognize higher-level cognitive information about the student. The complexity of this task is amplified by the requirements of the target domain (child pedestrian safety), where students need to train complex skills in dynamic settings. We present an approach for this task, which combines the logic-based Event Calculus (EC) and probabilistic modeling.
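As a hedged illustration of how event-calculus-style reasoning might be combined with probabilistic modeling, the toy sketch below updates a set of fluents per observed event and performs a Bayesian update over candidate intentions; the events, fluents and probabilities are invented for illustration and are not the paper's formalisation.

```python
# Toy sketch of intention recognition: event-calculus-style fluents updated by
# observed events, plus a Bayesian update over candidate intentions.
def update_fluents(fluents, event):
    """Apply an event's initiates/terminates effects to the current fluent set."""
    return (fluents | set(event.get("initiates", []))) - set(event.get("terminates", []))

def update_beliefs(beliefs, likelihoods, event_name):
    """Bayesian update: P(intention | event) ~ P(event | intention) * P(intention)."""
    posterior = {i: beliefs[i] * likelihoods[i].get(event_name, 0.01) for i in beliefs}
    norm = sum(posterior.values())
    return {i: p / norm for i, p in posterior.items()}

# Hypothetical example: did the student intend to cross safely or carelessly?
beliefs = {"cross_safely": 0.5, "cross_carelessly": 0.5}
likelihoods = {
    "cross_safely": {"looked_left": 0.9, "stepped_onto_road": 0.4},
    "cross_carelessly": {"looked_left": 0.1, "stepped_onto_road": 0.8},
}
fluents = {"at_curb"}
for ev in [{"name": "looked_left"},
           {"name": "stepped_onto_road", "initiates": ["on_road"], "terminates": ["at_curb"]}]:
    fluents = update_fluents(fluents, ev)
    beliefs = update_beliefs(beliefs, likelihoods, ev["name"])
```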
Citations: 11
Enhancing understanding of safety aspects in embedded systems through an interactive visual tool
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559196
Ragaad Altarawneh, J. Bauer, S. Humayoun, A. Ebert, P. Liggesmeyer
In this work, we present a demonstration of a visual interactive tool called ESSAVis that helps engineers collaborate in understanding the failure mechanisms of complex embedded systems. ESSAVis provides a 2D-plus-3D visual user interface that intuitively integrates different data sets related to embedded-system failure mechanisms. The tool accepts a CFT model describing a specific hazard in the underlying system and a CAD model describing the geometry of the system components. In this paper, we present the interaction options of ESSAVis used for intuitively extracting safety aspects of the underlying embedded system.
Citations: 7
Exploratory search interfaces: blending relevance, diversity, relationships and categories
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559187
Sivan Yogev
Exploratory search of scientific literature plays an essential part in a researcher's work. Efforts to provide interfaces supporting this task have made significant progress, but the field is open to further evolution. In this paper I present four basic design concepts identified in exploratory search interfaces: relevance, diversity, relationships and categories, and propose a novel browsing layout featuring a unique combination of these concepts.
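The abstract does not detail how the four concepts are blended, but a standard baseline for combining relevance with diversity is maximal marginal relevance (MMR) re-ranking, sketched below as a generic illustration rather than the layout proposed in the paper.

```python
# Illustrative baseline for blending relevance and diversity in result ranking:
# greedy maximal marginal relevance (MMR). Generic technique, not the paper's layout.
def mmr_rank(candidates, relevance, similarity, k=5, lam=0.7):
    """Trade off relevance against similarity to items already selected."""
    selected = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        def score(doc):
            redundancy = max((similarity(doc, s) for s in selected), default=0.0)
            return lam * relevance(doc) - (1 - lam) * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Usage would supply domain-specific relevance(doc) and similarity(a, b) functions,
# e.g. query-document scores and citation- or topic-based similarity.
```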
Citations: 3
A facial affect mapping engine
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559203
L. Impett, P. Robinson, T. Baltrušaitis
Facial expressions play a crucial role in human interaction. Interactive digital games can help teach people to both express and recognise them. Such interactive games can benefit from the ability to alter user expressions dynamically and in real time. In this demonstration, we present the Facial Affect Mapping Engine (FAME), a framework for mapping and manipulating facial expressions across images and video streams. Our system is fully automatic, runs in real time, and does not require any specialist hardware. FAME presents new possibilities for the designers of intelligent interactive digital games.
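A heavily simplified way to picture expression mapping is to transfer landmark displacements from a source face's neutral pose onto a target face; the sketch below is only that simplification with hypothetical inputs, not FAME's actual pipeline.

```python
# Minimal sketch of expression transfer via landmark displacements: take the
# offset of the source face's landmarks from its neutral pose and add it to the
# target's neutral landmarks. Hypothetical simplification, not FAME's method.
import numpy as np

def transfer_expression(src_neutral, src_expressive, tgt_neutral, strength=1.0):
    """All inputs are (n_landmarks, 2) arrays of aligned facial landmarks."""
    displacement = src_expressive - src_neutral     # how the source face moved
    return tgt_neutral + strength * displacement    # apply the same motion to the target

# Hypothetical usage with 68-point landmark arrays from any face tracker:
# warped = transfer_expression(src_neutral, src_frame, tgt_neutral, strength=0.8)
# The target image would then be warped to these landmarks before rendering.
```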
Citations: 5
Microcosm: visual discovery, exploration and analysis of social communities
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559195
Haggai Roitman, Ariel Raviv, S. Hummel, Shai Erera, D. Konopnicki
Social communities play an important role in many domains. While a lot of attention has been given to developing efficient methods for detecting and analyzing social communities, it remains a great challenge to provide intuitive search interfaces for end users who wish to discover and explore such communities. To help fill this gap, in this demonstration we present Microcosm: a holistic solution for visual discovery, exploration and analysis of social communities.
Citations: 2
Deploying recommender system for the masses
Pub Date: 2014-02-24 DOI: 10.1145/2559184.2559194
David Ben-Shimon, Michael Friedmann, J. Hörle, Alexander Tsikinovsky, Roland Gude, Rodion Aluchanov
Many small and mid-sized e-businesses wish to integrate a recommender system into their website. Integrating an existing recommender system into a website often requires certain expertise and programming effort, and thus incurs substantial investment that may not be justified by the added value of the recommender system. This demo presents a solution for integrating a recommender system as a service into an existing e-business without any programming effort. The integration method is analogous to Google AdSense integration, and the business model is adapted from the advertising world. Initial feedback from real website owners indicates that such integration has great benefits for both sides: the website owner and the recommender system (RS) provider.
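The AdSense-like model amounts to the shop page (or its backend) calling a hosted recommendation service and rendering whatever items come back; the endpoint URL, parameters and response fields in the sketch below are hypothetical, not the demoed system's API.

```python
# Sketch of the "recommendations as a hosted service" idea: a shop backend
# calls a hosted endpoint and renders the returned items into a page widget,
# much like an ad slot is filled by an embedded AdSense snippet.
import requests

def fetch_recommendations(shop_id, product_id, n=4):
    resp = requests.get(
        "https://recommender.example.com/api/v1/recommend",   # hypothetical endpoint
        params={"shop": shop_id, "item": product_id, "count": n},
        timeout=2,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

# Hypothetical usage on a product page render:
# items = fetch_recommendations("my-shop", "sku-1234")
```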
Citations: 7