Search and rescue: dog and handler collaboration through wearable and mobile interfaces
C. Zeagler, C. Byrne, Giancarlo Valentin, Larry Freil, Eric Kidder, James Crouch, Thad Starner, M. Jackson
Search and Rescue (SAR) is a critical component of disaster recovery efforts. Every second saved in the search increases the chances of finding survivors, and the majority of these teams prefer using canines [5]. Our goal is to help SAR dog and handler teams work together more effectively. Using semi-structured interviews and guidance from K9-SAR experts as we iterate through designs, we develop a two-part system consisting of a wearable computer interface for working SAR dogs that communicates with their handler via a mobile application. Additionally, we discuss the system around a heuristic framework that includes dogs as active participants. Finally, we show the viability of our tool by evaluating it with feedback from three SAR experts.
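The abstract describes the two-part architecture (a dog-worn interface plus a handler mobile application) only at a high level. The Python sketch below is a hypothetical illustration of how such an alert path might look; the sensor name, activation threshold, and message fields are assumptions for illustration, not the paper's actual hardware or protocol.

```python
# Hypothetical sketch of a dog-vest-to-handler alert path.
# Sensor names, thresholds, and message fields are illustrative assumptions;
# the transport (Bluetooth LE, Wi-Fi, radio) is omitted entirely.
import json
import time

class VestSensor:
    """A single bite- or tug-activated sensor on the dog's vest."""
    def __init__(self, name, threshold):
        self.name = name
        self.threshold = threshold

    def is_activated(self, reading):
        # A sustained pull above the threshold counts as an intentional activation.
        return reading >= self.threshold

def build_alert(sensor_name):
    """Package an alert for the handler's mobile application."""
    return json.dumps({
        "type": "dog_alert",
        "sensor": sensor_name,        # e.g. "bite_tab_left" (hypothetical name)
        "meaning": "victim_found",    # trained meaning of this sensor
        "timestamp": time.time(),
    })

# Usage: poll the sensor and forward activations to the handler's phone app.
sensor = VestSensor("bite_tab_left", threshold=0.8)
if sensor.is_activated(reading=0.93):
    print(build_alert(sensor.name))
```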
{"title":"Search and rescue: dog and handler collaboration through wearable and mobile interfaces","authors":"C. Zeagler, C. Byrne, Giancarlo Valentin, Larry Freil, Eric Kidder, James Crouch, Thad Starner, M. Jackson","doi":"10.1145/2995257.2995390","DOIUrl":"https://doi.org/10.1145/2995257.2995390","url":null,"abstract":"Search and Rescue (SAR) is a critical component of disaster recovery efforts. Every second saved in the search increases the chances of finding survivors and the majority of these teams prefer using canines [5]. Our goal is to help enable SAR dog and handler teams to work together more effectively. Using a semi-structured interviews and guidance from K9-SAR experts as we iterate through designs, we develop a two-part system consisting of a wearable computer interface for working SAR dogs that communicates with their handler via a mobile application. Additionally, we discuss the system around a heuristic framework that includes dogs as active participants. Finally, we show the viability of our tool by evaluating it with feedback from three SAR experts.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"89 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123593954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Canine computer interaction: towards designing a touchscreen interface for working dogs
C. Zeagler, Jay Zuerndorfer, Andrea Lau, Larry Freil, Scott M. Gilliland, Thad Starner, M. Jackson
Touchscreens can provide a way for service dogs to relay emergency information about their handlers from a home or office environment. In this paper, we build on work exploring the ability of canines to interact with touchscreen interfaces. We observe new requirements for training and explain best practices found in training techniques. Learning from previous work, we also begin to test new dog interaction techniques such as lift-off selection and sliding gestural motions. Our goal is to understand the affordances needed to make touchscreen interfaces usable for canines and help the future design of touchscreen interfaces for assistance dogs in the home.
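Lift-off selection, mentioned above as one of the canine interaction techniques under test, commits a target only when the nose or paw leaves the screen, so a dog can slide across the display without triggering every target it passes over. Below is a minimal Python sketch of that logic, assuming purely illustrative target names and coordinates.

```python
# Minimal sketch of "lift-off" selection for a canine touchscreen.
# Target names and rectangles are illustrative assumptions, not from the paper.

def hit_target(point, targets):
    """Return the name of the target containing the touch point, if any."""
    x, y = point
    for name, (x0, y0, x1, y1) in targets.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def lift_off_selection(touch_trace, targets):
    """Select based on the last touch position before lift-off,
    ignoring whatever targets the slide passed through."""
    if not touch_trace:
        return None
    return hit_target(touch_trace[-1], targets)

# Example: two large targets, since canine touchscreens need big hit areas.
targets = {"call_help": (0, 0, 600, 800), "cancel": (600, 0, 1200, 800)}
trace = [(100, 400), (450, 420), (700, 410)]   # dog slides from left to right
print(lift_off_selection(trace, targets))      # -> "cancel"
```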
{"title":"Canine computer interaction: towards designing a touchscreen interface for working dogs","authors":"C. Zeagler, Jay Zuerndorfer, Andrea Lau, Larry Freil, Scott M. Gilliland, Thad Starner, M. Jackson","doi":"10.1145/2995257.2995384","DOIUrl":"https://doi.org/10.1145/2995257.2995384","url":null,"abstract":"Touchscreens can provide a way for service dogs to relay emergency information about their handlers from a home or office environment. In this paper, we build on work exploring the ability of canines to interact with touchscreen interfaces. We observe new requirements for training and explain best practices found in training techniques. Learning from previous work, we also begin to test new dog interaction techniques such as lift-off selection and sliding gestural motions. Our goal is to understand the affordances needed to make touchscreen interfaces usable for canines and help the future design of touchscreen interfaces for assistance dogs in the home.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132350579","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Exploring human perceptions of dog-tablet playful interactions
Sofya Baskin, A. Zamansky, V. Kononova
The increasing use of digital games for pets' enrichment and entertainment calls for a better understanding of this phenomenon. While there is a lot of anecdotal evidence of pets "playing" digital games on computers, the nature of such interactions is yet to be understood. Humans, both pet owners and pet professionals, play a pivotal role in shaping the way pets interact with technology, both by promoting pet-oriented technologies and by posing requirements for them. We present initial results of an exploratory empirical study of human perceptions of, and attitudes towards, playful interactions of dogs with tablets.
{"title":"Exploring human perceptions of dog-tablet playful interactions","authors":"Sofya Baskin, A. Zamansky, V. Kononova","doi":"10.1145/2995257.3012023","DOIUrl":"https://doi.org/10.1145/2995257.3012023","url":null,"abstract":"The increasing use of digital games for pets' enrichment and entertainment calls for better understanding of this phenomenon. While there is a lot of anecdotal evidence of pets \"playing\" digital games on computers, the nature of such interactions is yet to be understood. Humans - both pet owners and pet professionals, play a pivotal role in shaping the way pets interact with technology, both in terms of promoting pet-oriented technologies, as well as posing requirements for them. We present some first results of an exploratory empirical study of human perceptions and attitudes towards playful interactions of dogs with tablets.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"122 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116098089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Training collar-sensed gestures for canine communication
Joelle Alcaidinho, Giancarlo Valentin, G. Abowd, M. Jackson
We describe how we trained two dogs to perform precise gestures that can be sensed by an inertial measurement unit worn on a dog's collar, as part of a larger research effort. These gestures could then be used to relay specific alerts to humans through a companion smartphone application. For example, a guide dog could use a set of two gestures to signal whether an obstacle requires the human to 'wait' or to 'go around'.
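As a rough illustration of collar-sensed gestures, the sketch below classifies a short window of accelerometer samples into one of the two alerts. The gesture definitions, features, and thresholds are assumptions made for illustration; the authors' actual training protocol and recognizer are not described here.

```python
# Illustrative sketch: distinguish two trained collar gestures from a window
# of vertical acceleration samples. Axis conventions and thresholds are
# assumptions, not the authors' classifier.
import statistics

def classify_gesture(accel_z):
    """Classify a short window of vertical acceleration samples (in g).

    A repeated up-down head bob produces high variance on the vertical
    axis, while a slow sustained head dip shifts the mean downward.
    """
    mean_z = statistics.mean(accel_z)
    var_z = statistics.pvariance(accel_z)
    if var_z > 0.5:
        return "go_around"   # vigorous, repetitive motion
    if mean_z < -0.3:
        return "wait"        # sustained dip
    return None              # no trained gesture detected

# Usage: a window of samples from the collar IMU, forwarded to the
# companion smartphone application when a gesture is recognized.
window = [-0.1, 0.9, -1.0, 1.1, -0.8, 1.0]
alert = classify_gesture(window)
if alert:
    print(f"Relay alert to handler app: {alert}")
```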
{"title":"Training collar-sensed gestures for canine communication","authors":"Joelle Alcaidinho, Giancarlo Valentin, G. Abowd, M. Jackson","doi":"10.1145/2995257.3012020","DOIUrl":"https://doi.org/10.1145/2995257.3012020","url":null,"abstract":"We describe how we trained two dogs to perform precise gestures to be sensed from an inertial measurement unit worn on a dog's collar as part of a larger research effort. These gestures could then be used to relay specific alerts to humans through a companion smartphone application. For example, a guide dog could use a set of two gestures to alert between obstacles requiring a human to 'wait' or 'go around'.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127795068","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
TalkingNemo: aquarium fish talks its mind for breeding support
Naohiro Isokawa, Yuuki Nishiyama, T. Okoshi, J. Nakazawa, K. Takashio, H. Tokuda
Have you ever talked to fish? If you could incorporate the feelings of your fish into a breeding system, life with your fish could become more comfortable, fun, and a greater experience overall. When keeping pets, owners often cannot properly understand their pet's internal state, which can lead to impaired health, or worse, loss of life, for the pet. In the case of fish, such situations are more likely to occur than with dogs or cats, since owner-pet communication is more difficult with fish. Although fish are the most commonly bred pets [1], information and communications technology (ICT)-based solutions for enhancing owner-pet interaction have not yet been sufficiently explored. We present "TalkingNemo," an aquarium fish breeding system that enables interaction between owners and their fish. TalkingNemo detects the condition of the fish and their aquarium via a camera and sensors attached to the aquarium, and notifies users with speech balloons written from the perspective of the fish (Figure 1). TalkingNemo helps owners keep their fish in good condition over longer periods. This paper introduces the architecture of TalkingNemo and evaluates the system.
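The sketch below illustrates the general idea of translating aquarium sensor readings into first-person "speech balloon" messages. The thresholds and message wording are guesses for illustration, not values from the TalkingNemo paper.

```python
# Rough sketch of the idea behind TalkingNemo: sensor readings from the
# aquarium become first-person messages for the owner. Thresholds and
# wording are illustrative assumptions.
from datetime import datetime, timedelta

def fish_message(water_temp_c, hours_since_feeding):
    """Return a speech-balloon string from the fish's perspective, or None."""
    if water_temp_c > 29.0:
        return "The water feels too warm for me!"
    if water_temp_c < 22.0:
        return "Brr, the water is getting cold."
    if hours_since_feeding > 24:
        return "I'm hungry, it's been a whole day!"
    return None

# Usage: poll the aquarium sensors and show any message in the UI.
last_fed = datetime.now() - timedelta(hours=30)
hours = (datetime.now() - last_fed).total_seconds() / 3600
msg = fish_message(water_temp_c=25.5, hours_since_feeding=hours)
if msg:
    print(f"Speech balloon: {msg}")
```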
{"title":"TalkingNemo: aquarium fish talks its mind for breeding support","authors":"Naohiro Isokawa, Yuuki Nishiyama, T. Okoshi, J. Nakazawa, K. Takashio, H. Tokuda","doi":"10.1145/2995257.3012017","DOIUrl":"https://doi.org/10.1145/2995257.3012017","url":null,"abstract":"Have you ever talked to fish? If you can incorporate the feelings of your fish into breeding systems, life with your fish can become more comfortable, fun, and a greater experience overall. In breeding pets, owners often cannot properly understand their pet's internal state, which can lead to impaired health-or worse, loss of life-for the pet. In case of fish, such situations are more likely to occur than with dogs or cats, since owner-pet communication is more difficult with fish. Although fish are the most bred pets [1], information and communications technology (ICT)-based solutions for accelerating owner-pet interaction have not yet been sufficiently explored. We present \"TalkingNemo,\" an aquarium fish breeding system that enables interaction between owners and their fish. TalkingNemo detects the condition of fish and their aquarium via a camera and sensors attached to the aquarium, and notifies users with speech balloons as from the perspective of the fish (Figure 1). TalkingNemo empowers us to breed fish in good condition over a longer period. This paper introduces the architecture and evaluates the results of TalkingNemo.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127413338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Becoming with: towards the inclusion of animals as participants in design processes
Michelle Westerlaken, S. Gualeni
In this exploratory paper, we advocate for a way to mitigate the anthropocentrism inherent in interaction-design methodologies. We propose to involve animals that live in anthropic environments as participants in design processes. The current relationships between animals and technology have an inevitable impact on their well-being and raise fundamental ethical questions concerning our design policies. Drawing from the work of Bruno Latour and Donna Haraway, we argue for a situated approach in which we reflect upon concrete design contexts. We explore the notion of "becoming with" as a conceptual framework for the intuitive and bodily understanding that takes place between humans and animals when they encounter one another in shared contexts. Adopting a research-through-design approach, we further explore this notion by reflecting upon two different participatory design projects with two dogs. We found these reflections to offer valuable perspectives for designers to analyse and discuss their iterative processes.
{"title":"Becoming with: towards the inclusion of animals as participants in design processes","authors":"Michelle Westerlaken, S. Gualeni","doi":"10.1145/2995257.2995392","DOIUrl":"https://doi.org/10.1145/2995257.2995392","url":null,"abstract":"In this exploratory paper, we advocate for a way to mitigate the anthropocentrism inherent in interaction-design methodologies. We propose to involve animals that live in anthropic environments as participants in design processes. The current relationships between animals and technology have an inevitable impact on their well-being and raise fundamental ethical questions concerning our design policies. Drawing from the work of Bruno Latour and Donna Haraway, we argue for a situated approach in which we reflect upon concrete design contexts. We explore the notion of becoming with as a conceptual framework for the intuitive and bodily understanding that takes place between humans and animals when they encounter one-another in shared contexts. Adopting a research through design approach, we further explore this notion by reflecting upon two different participatory design projects with two dogs. We found these reflections to offer valuable perspectives for designers to analyse and discuss their iterative processes.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128288313","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dog-drone interactions: towards an ACI perspective
A. Zamansky
As drones quickly become part of our everyday lives, dogs are inevitably exposed to them. Moreover, dog-drone interactions have far-reaching applications in search and rescue operations and other domains. This short note calls for taking an ACI, user-centric perspective on dog-drone interaction, informing the design of interactions that are safe, stress-free, and enriching for our canine companions.
{"title":"Dog-drone interactions: towards an ACI perspective","authors":"A. Zamansky","doi":"10.1145/2995257.3012021","DOIUrl":"https://doi.org/10.1145/2995257.3012021","url":null,"abstract":"As drones are quickly becoming part of our everyday lives, dogs become inevitably exposed to them. Moreover, dog-drone interactions have far-reaching applications in search and rescue operations and other domains. This short note calls for taking an ACI, user-centric perspective on dog-drone interaction, informing the design of interactions which are safe, stress-free and enriching for our canine companions.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133152606","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A dog using skype
A. P. Rossi, Sarah Rodriguez, Cassia Rabelo Cardoso dos Santos
This paper discusses the rise of technology and its use to enhance the lives of animals. As the bond between humans and domesticated dogs grows in importance in today's society, the need to stay connected to pets from a distant location has piqued inquiries into how technology can be used to enhance that bond. While it is not clear how different types of technology-mediated interaction affect the human-dog bond, recent work between a trainer and his canine demonstrated the dog's ability to correctly respond to verbal cues given through video chat. The steps to achieve this will be explored, and the benefits of such an interaction will be discussed.
{"title":"A dog using skype","authors":"A. P. Rossi, Sarah Rodriguez, Cassia Rabelo Cardoso dos Santos","doi":"10.1145/2995257.3012019","DOIUrl":"https://doi.org/10.1145/2995257.3012019","url":null,"abstract":"This paper discusses the rise of technology and its use to enhance the lives of animals. As the bond between humans and domesticated dogs grows in importance in today's society, the need to stay connected to pets from a distant location has peaked inquiries into how technology can be used to enhance that bond. While it is not clear how different types of technology mediated interaction affects the human - dog bond, recent work between a trainer and his canine displayed the ability of a canine to correctly respond to verbal cues given through video chat. The steps to achieve this will be explored, and the benefits of such an interaction will be discussed.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"68 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125026604","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Don't cut to the chase: hunting experiences for zoo animals and visitors
Fiona French, Mark Kingston-Jones, David T. Schaller, S. Webber, H. Väätäjä, M. Campbell
This workshop explores different ways to use technology to facilitate hunting behaviour enrichment for zoo-housed animals and parallel gaming experiences for zoo visitors.
{"title":"Don't cut to the chase: hunting experiences for zoo animals and visitors","authors":"Fiona French, Mark Kingston-Jones, David T. Schaller, S. Webber, H. Väätäjä, M. Campbell","doi":"10.1145/2995257.3014066","DOIUrl":"https://doi.org/10.1145/2995257.3014066","url":null,"abstract":"This workshop explores different ways to use technology to facilitate hunting behaviour enrichment for zoo-housed animals and parallel gaming experiences for zoo visitors.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129641268","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Semi-supervised classification of static canine postures using the Microsoft Kinect
Sean P. Mealin, Ignacio X. Domínguez, D. Roberts
3D sensing hardware, such as the Microsoft Kinect, allows new interaction paradigms that would be difficult to accomplish with traditional RGB cameras alone. One basic step in realizing these new methods of animal-computer interaction is posture and behavior detection and classification. In this paper, we present a system capable of identifying static postures for canines that does not rely on hand-labeled data at any point during the process. We create a model of the canine based on measurements automatically obtained from the first few captured frames, reducing the burden on users. We also present a preliminary evaluation of the system with five dogs, which shows that the system can identify the "standing," "sitting," and "lying" postures with approximately 70%, 69%, and 94% accuracy, respectively.
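As a hedged illustration of how a per-dog model obtained from the first few frames might support rule-based posture classification, consider the Python sketch below. The features (back and front heights relative to an estimated standing height) and the thresholds are assumptions; the paper's actual features and classifier may differ.

```python
# Hedged sketch of a rule that separates static canine postures once a
# per-dog model (here, just an estimated standing back height) has been
# obtained automatically from the first few depth frames.

def calibrate_standing_height(first_frame_heights):
    """Estimate the dog's standing back height (metres above the floor)
    from measurements taken over the first few captured frames."""
    return sum(first_frame_heights) / len(first_frame_heights)

def classify_posture(back_height, front_height, standing_height):
    """Classify a frame as standing, sitting, or lying (thresholds assumed)."""
    back_ratio = back_height / standing_height
    front_ratio = front_height / standing_height
    if back_ratio > 0.8 and front_ratio > 0.8:
        return "standing"
    if front_ratio > 0.8 and back_ratio <= 0.6:
        return "sitting"     # front end up, rear end dropped
    return "lying"

# Usage with heights (in metres) that a depth sensor might report.
standing = calibrate_standing_height([0.52, 0.50, 0.51])
print(classify_posture(back_height=0.28, front_height=0.45, standing_height=standing))
```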
{"title":"Semi-supervised classification of static canine postures using the Microsoft Kinect","authors":"Sean P. Mealin, Ignacio X. Domínguez, D. Roberts","doi":"10.1145/2995257.3012024","DOIUrl":"https://doi.org/10.1145/2995257.3012024","url":null,"abstract":"3D sensing hardware, such as the Microsoft Kinect, allows new interaction paradigms that would be difficult to accomplish with traditional RGB cameras alone. One basic step in realizing these new methods of animal-computer interaction is posture and behavior detection and classification. In this paper, we present a system capable of identifying static postures for canines that does not rely on hand-labeled data at any point during the process. We create a model of the canine based on measurements automatically obtained in from the first few captured frames, reducing the burden on users. We also present a preliminary evaluation of the system with five dogs, which shows that the system can identify the \"standing,\" \"sitting,\" and \"lying\" postures with approximately 70%, 69%, and 94% accuracy, respectively.","PeriodicalId":197703,"journal":{"name":"Proceedings of the Third International Conference on Animal-Computer Interaction","volume":"207 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133679899","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}