Proceedings of the Ninth International Conference on Animal-Computer Interaction (December 5, 2022)

Do AI Models “Like” Black Dogs? Towards Exploring Perceptions of Dogs with Vision-Language Models
Marcelo Feighelstein, Einat Kovalyo, Jennifer Abrams, Sarah-Elisabeth Byosiere, A. Zamansky
DOI: https://doi.org/10.1145/3565995.3566022
Large-scale pretrained vision-language models such as OpenAI’s CLIP are a game changer in computer vision due to their unprecedented ‘zero-shot’ image classification capabilities. Because they are pretrained without supervision on huge amounts of web-scraped data, they suffer from inherent biases reflecting human perceptions, norms and beliefs. This position paper highlights the potential of studying models such as CLIP in the context of human-animal relationships, in particular for understanding human perceptions of, and preferences regarding, the physical attributes of pets and their adoptability.
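
As a sense of what such a probe could look like in practice, here is a minimal zero-shot classification sketch using OpenAI's clip package; the prompts and the image file are illustrative assumptions, not the authors' experimental protocol.

    import torch
    import clip  # pip install git+https://github.com/openai/CLIP.git
    from PIL import Image

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model, preprocess = clip.load("ViT-B/32", device=device)

    # Hypothetical prompts probing the model's "perception" of adoptability.
    prompts = ["a photo of an adoptable dog", "a photo of an unadoptable dog"]
    image = preprocess(Image.open("dog.jpg")).unsqueeze(0).to(device)  # any dog photo
    text = clip.tokenize(prompts).to(device)

    with torch.no_grad():
        logits_per_image, _ = model(image, text)
        probs = logits_per_image.softmax(dim=-1).cpu().numpy()

    # Systematic differences in these scores across, say, coat colours would
    # be one concrete signal of the biases the paper discusses.
    for prompt, p in zip(prompts, probs[0]):
        print(f"{prompt}: {p:.3f}")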

Investigation on Enhancement of the Sense of Life in Safari Park Online Tours with Animal Breathing Reproduction System
Minori Tsuji, Yoshinari Takegawa, Kohei Matsumura, Keiji Hirata
DOI: https://doi.org/10.1145/3565995.3566024
Interacting with animals in a safari park is an important opportunity to think about animals, and plays a role beyond the healing effect that animals provide. However, safari park online tours do not allow users to touch the animals, and thus cannot make them feel that the animals are alive (a sense of life). In this study, we developed a system that reproduces the breathing of animals, based on the hypothesis that reproducing breathing is important for conveying a sense of animal life to viewers of online tours. A user study conducted to verify the usefulness of the proposed system found that it conveyed a significantly stronger sense of life than a comparison method that did not reproduce animal breathing.
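
The abstract does not detail the implementation; as a purely hypothetical sketch, a breathing reproduction could be driven by a smooth periodic waveform sampled in real time and sent to an actuator:

    import math
    import time

    BREATHS_PER_MINUTE = 20            # assumed resting rate; species-dependent
    PERIOD_S = 60.0 / BREATHS_PER_MINUTE

    def breathing_amplitude(t: float) -> float:
        """Chest displacement in [0, 1] at time t, as a raised cosine so
        inhalation and exhalation blend smoothly."""
        phase = (t % PERIOD_S) / PERIOD_S
        return 0.5 - 0.5 * math.cos(2.0 * math.pi * phase)

    def run(actuator, duration_s: float = 10.0, rate_hz: float = 50.0) -> None:
        """Sample the waveform and forward each value to whatever renders
        the motion (servo, pneumatic pump, haptic surface)."""
        start = time.monotonic()
        while (t := time.monotonic() - start) < duration_s:
            actuator(breathing_amplitude(t))
            time.sleep(1.0 / rate_hz)

    run(print)  # smoke test: prints the waveform instead of moving hardware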

A Loggable Aid to Speech: A Research Proposal
Jérémy Félix Barbay, Camila Labarca-Rosenbluth, Brandon Peña-Haipas
DOI: https://doi.org/10.1145/3565995.3566031
Validating that non-human animals can communicate with humans using Augmentative and Alternative Communication (AAC) requires extensive logging, and traditional techniques are costly in resources and time. We propose to implement 1) a configurable “communication board” application aimed at small non-human animals able to use touch interfaces, which not only emits the human words associated with each button but also logs such interactions; 2) a hardware keyboard extending the application to larger non-human animals unable to use a touch screen but able to use large keys; and 3) a centralized back-end gathering the logs from various devices, facilitating their study by researchers. We propose to validate the usability of these prototype solutions with two monk parakeets for the application, a dog and two cats for the keyboard (and application), and a researcher in comparative psychology for the back-end's website.
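
To make the logging requirement concrete, here is a minimal sketch of what one interaction record and its upload to the centralized back-end might look like; the field names and the endpoint are assumptions, not the authors' schema.

    import json
    import time
    import urllib.request
    from dataclasses import dataclass, asdict

    @dataclass
    class ButtonPress:
        device_id: str      # which board or keyboard produced the event
        subject_id: str     # which animal was using the device
        button_label: str   # the human word the button emits, e.g. "play"
        timestamp_s: float  # UNIX time of the interaction

    def log_press(press: ButtonPress, endpoint: str) -> None:
        """POST one interaction record to the centralized back-end."""
        data = json.dumps(asdict(press)).encode("utf-8")
        req = urllib.request.Request(
            endpoint, data=data, headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    # Example (placeholder endpoint; replace with the real back-end URL):
    # log_press(ButtonPress("board-01", "parakeet-a", "play", time.time()),
    #           "https://example.org/api/presses")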

Politicising Animal-Computer Interaction: An Approach to Political Engagement with Animal-Centred Design
Clara Mancini, Orit Hirsch-Matsioulas, Daniel Metcalfe
DOI: https://doi.org/10.1145/3565995.3566034
While ACI researchers aspire to design animal-centred technology, they must operate within socio-economic systems that are not necessarily animal-centred. This creates a tension between, on the one hand, researchers’ endeavour to address the immediate needs of animals in specific situations through technological interventions and, on the other, those interventions’ wider implications and consequences for the situation and life of various animal stakeholders beyond specific ACI projects. In this paper, we focus on the political nature of ACI, drawing on literature on political interaction design, which argues that designers should work towards social justice. Drawing on political philosophies, we then explore how extending the notion of social justice to animals might help define a political notion of animal-centredness in ACI. Finally, through the lens of this notion, we consider the relevance of previously proposed strategies for political interaction design and propose an approach that could support ACI researchers’ political engagement in animal-centred design.

Watching Animal-Computer Interaction: Effects on Perceptions of Animal Intellect
S. Webber, Wally Smith, M. Carter, F. Vetere
DOI: https://doi.org/10.1145/3565995.3566035
Watching animals use digital technology is known to affect our attitudes towards them, but there has been little empirical study of this topic. There is a need for greater understanding of how technology can shape people's perceptions of other species, since human attitudes are a significant factor in animal welfare. We studied the effects of a digital installation, created as enrichment for zoo-housed orangutans. It was hypothesised that seeing the installation in use would strengthen zoo visitors’ perceptions of orangutans’ intellect and strengthen support for their conservation. Effects were investigated through visitor interviews (n=39) and surveys (n=101), comparing responses of people who saw the installation with those who did not. Seeing primates use the digital installation was found to be associated with stronger attribution of cognitive abilities. Watching animals comprehend game rules, and seeing their human-like patterns of interaction, seemed to contribute to this effect. However, no overall impact was found on attitudes to orangutan conservation. This research provides insights into the potential effects of animal-computer interaction on the attitudes of human observers, and suggests avenues for technology design to strengthen people's understanding of animal minds.

Blind dogs need guides too: towards technological support for blind dog caregiving
Alexandra Morgan, D. van der Linden
DOI: https://doi.org/10.1145/3565995.3566026
Blind or visually impaired pet dogs face additional challenges in their daily life in environments typically built for sighted humans, as do the caregivers who support them. These challenges range from simple activities, like finding food in a (hopefully stable) home environment, to more complex ones, like safely navigating an ever-changing outdoor environment. While some support exists for blind and visually impaired dogs, frequently in the form of physical safety products and veterinary caregiving guidelines, little interactive technology yet exists to inform and complement caregivers’ abilities. In this paper, we present the results of an interview-based study with caregivers of blind and visually impaired dogs, using thematic analysis to construct core themes of needed support, which we translated into a prototype app. Our findings show that, while caregivers adapt quickly to supporting a blind or visually impaired pet dog in their own home, a gap remains in coping with ever-changing outdoor environments, in particular identifying safe and suitable walking routes. We show an initial design of a mobile app for this purpose, and discuss to what extent software for informed caregiving of visually impaired pet dogs could benefit from further work.

Sensory Jam 2022: Exploring other sensibilities – beyond human senses and aesthetics
Fiona French, Clara Mancini, Christopher Flynn Martin
DOI: https://doi.org/10.1145/3565995.3566045
This workshop aims to help human participants become more aware of other animals’ sensory and aesthetic sensibilities, raising points for discussion and future research within ACI. For all animals, being able to make sense of the environment is crucial for gaining control and making informed choices, as well as for achieving competence in daily activities. Although human perception is limited by evolution, technology can enable us to perceive signals that may be meaningful to other species, thereby gaining insight and possibly empathy. Moreover, pursuing a multi-species perspective may foster inclusive approaches to design that aim for a lighter environmental impact by taking into account the sensory experiences of other species. We will offer participants a range of activities that challenge human senses and sense-making abilities, and then invite them to collaboratively design and test a system that incorporates animal-centred sensory stimulation inspired by those activities.

Popping Up Balloons for Science: a Research Proposal
Jérémy Félix Barbay, Daniel Freire-Fernández, Danko Lobos-Bustamante
DOI: https://doi.org/10.1145/3565995.3566029
Some video games have been developed to entertain non-human animals while measuring their abilities, with interactions logged to a file for later analysis. Using such games to measure the limits of those abilities is problematic, as it requires exposing subjects to instances they cannot solve, potentially frustrating them. Could presenting subjects with a mix of instances of various difficulties at the same time, and measuring their interactions with those, yield useful information about their abilities and inabilities without frustrating them? We propose to design, develop and validate a web game that presents several instances at once, inspired by existing games such as ‘Pop the Balloons’, so that the subject is exposed to a mix of “easy” and “difficult” instances in parallel; and to validate that this does not produce frustration by studying the correlation between the subjects’ assiduity in playing the game and the rate of “difficult” instances.
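
The proposed validation reduces to a simple statistic. A sketch of it over hypothetical session records (the arrays below are placeholders, not study data):

    import numpy as np

    # Per-session placeholders: the fraction of "difficult" instances shown
    # and how long the subject kept playing (a proxy for assiduity).
    difficult_rate = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
    minutes_played = np.array([12.0, 11.5, 12.3, 10.8, 11.9, 11.2])

    # A correlation near zero would support the hypothesis that mixing in
    # difficult instances does not frustrate the subject.
    r = np.corrcoef(difficult_rate, minutes_played)[0, 1]
    print(f"correlation(difficult-instance rate, assiduity) = {r:+.2f}")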

Spatial and Temporal Analytic Pipeline for Evaluation of Potential Guide Dogs Using Location and Behavior Data
Yifan Wu, Timothy R. N. Holder, Marc Foster, Evan Williams, M. Enomoto, B. Lascelles, A. Bozkurt, David L. Roberts
DOI: https://doi.org/10.1145/3565995.3566033
Training guide dogs for visually impaired people is a resource-consuming task for guide dog schools. The task is further complicated by a dearth of capabilities to objectively measure and analyze candidate guide dogs’ temperaments while they are placed with volunteer raisers, away from the schools, for months during the raising process. In this work, we demonstrate a preliminary data-analysis workflow that provides detailed information about candidate guide dogs’ day-to-day physical exercise levels and gait activities, using objective environmental and behavioral data collected from a wearable, collar-based Internet of Things device. We trained and tested machine learning models to analyze different gait types, including walking, pacing, trotting, and a mixture of walking and trotting. By analyzing the data both spatially and temporally, we generate a location and behavior summary for each candidate dog that gives guide dog training experts insight, so that they can more accurately and comprehensively evaluate the candidate's likelihood of future success. The preliminary analysis revealed movement patterns for different location types that reflected the behaviors of candidate guide dogs.
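
The abstract does not name the models used; as a hedged sketch of the general approach, gait classification from collar accelerometry is commonly framed as windowed feature extraction plus a standard classifier, shown here with a scikit-learn random forest over synthetic placeholder windows:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    GAITS = ["walking", "pacing", "trotting", "walk-trot mix"]

    def window_features(acc: np.ndarray) -> np.ndarray:
        """Summary statistics over one fixed-length window of tri-axial
        accelerometer samples, shape (n_samples, 3)."""
        return np.concatenate([acc.mean(axis=0), acc.std(axis=0),
                               np.abs(np.diff(acc, axis=0)).mean(axis=0)])

    # X: one feature row per window; y: a labeled gait index per window.
    # Random placeholders stand in for real collar recordings here.
    rng = np.random.default_rng(0)
    X = np.stack([window_features(rng.normal(size=(100, 3)))
                  for _ in range(200)])
    y = rng.integers(0, len(GAITS), size=200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))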

A Face Recognition System for Bears: Protection for Animals and Humans in the Alps
Oliver Bendel, Ali Yürekkirmaz
DOI: https://doi.org/10.1145/3565995.3566030
Face recognition, in the sense of identifying people, is controversial from a legal, social, and ethical perspective. In particular, opposition has been expressed to its use in public spaces for mass surveillance purposes. Face recognition in animals, by contrast, seems to be uncontroversial from a social and ethical point of view and could even have potential for animal welfare and protection. This paper explores how face recognition for bears (understood here as brown bears) in the Alps could be implemented within a system that would help animals as well as humans. It sets out the advantages and disadvantages of wildlife cameras, ground robots, and camera drones that would be linked to artificial intelligence. Based on this, the authors make a proposal for deployment. They favour a three-stage plan that first deploys fixed cameras and then incorporates camera drones and ground robots. These are all connected to a control centre that assesses images and developments and intervenes as needed. The paper then discusses social and ethical, technical and scientific, and economic and structural perspectives. In conclusion, it considers what could happen in the future in this context.
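
The paper proposes the system without prescribing a recognition model; a common pattern for individual identification, sketched here under that assumption, is to embed each face crop and match new sightings against enrolled individuals by nearest neighbour (the random vectors below stand in for real CNN embeddings):

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    EMB_DIM = 512  # assumed embedding size of a pretrained face-crop encoder

    # Enrolment gallery: one embedding per known bear (placeholders here).
    known_embeddings = np.random.rand(10, EMB_DIM)
    known_ids = [f"bear-{i:02d}" for i in range(10)]

    index = NearestNeighbors(n_neighbors=1, metric="cosine")
    index.fit(known_embeddings)

    def identify(face_embedding: np.ndarray, max_distance: float = 0.3) -> str:
        """Return the closest enrolled bear, or 'unknown' if no match is
        close enough (a new individual for the control centre to enrol)."""
        dist, idx = index.kneighbors(face_embedding.reshape(1, -1))
        if dist[0, 0] <= max_distance:
            return known_ids[idx[0, 0]]
        return "unknown"

    print(identify(np.random.rand(EMB_DIM)))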