Movie+
A. Fedosov, B. Stancu, Elena Di Lascio, D. Eynard, Marc Langheinrich
Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, May 2, 2019
DOI: 10.1145/3290607.3313261
Collaborative movie viewing with loved ones strengthens connectedness and social bonds among family members and friends. Furthermore, with the rapid adoption of personal mobile devices, people often engage in this activity while geographically separated. However, conveying our feelings and emotions about a recently watched movie or video clip is often limited to a post on social media or a short blurb in an instant messaging app. Drawing on the popular interest in the quantified self, which envisions collecting and sharing biophysical information from everyday routines (e.g., workouts), we have designed and developed Movie+, a mobile application that uses personal biophysical data to construct an individual's "emotional fingerprint" while viewing a video clip. Movie+ allows selective sharing of this information through different visualization options, as well as rendering others' emotional fingerprints over the same clip. In this submission, we outline the design rationale and briefly describe our application prototype.
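The abstract does not specify how an "emotional fingerprint" is computed from biophysical data. As a purely illustrative sketch (not the authors' method), one simple interpretation is to align timestamped heart-rate samples with video playback time and average them over equal-length segments of the clip, yielding a per-segment timeline that could be visualized or overlaid with another viewer's:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Sample:
    t: float    # seconds into the clip when the sample was taken
    bpm: float  # heart rate reading at that moment

def fingerprint(samples: List[Sample], clip_len: float,
                bins: int = 10) -> List[Optional[float]]:
    """Hypothetical sketch: bin heart-rate samples into `bins` equal
    video segments and average each bin. The paper does not describe
    this computation; it is an assumption for illustration only."""
    width = clip_len / bins
    sums = [0.0] * bins
    counts = [0] * bins
    for s in samples:
        i = min(int(s.t / width), bins - 1)  # clamp final sample into last bin
        sums[i] += s.bpm
        counts[i] += 1
    # Segments with no samples are left as None rather than guessed.
    return [sums[i] / counts[i] if counts[i] else None for i in range(bins)]
```

Such a timeline is trivially small to share selectively, which fits the abstract's emphasis on letting users choose what to disclose and in which visualization.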