The femininization of AI-powered voice assistants: Personification, anthropomorphism and discourse ideologies

IF 2.3 · CAS Tier 2 (Literature) · Q1 COMMUNICATION · Discourse Context & Media · Pub Date: 2024-11-15 · DOI: 10.1016/j.dcm.2024.100833
Maria Grazia Sindoni
Volume 62, Article 100833
Citations: 0

Abstract

Intelligent Voice Assistants (IVAs), such as Amazon Alexa, Apple Siri, Microsoft Cortana, and Google Assistant, have been mainstreamed as female by default, through voices, avatars, colour palettes, and conversational cues. Even though tech companies tend to justify this systematic feminization by appealing to customers’ preferences, the ingrained gender biases have raised concerns about the normalization of gendered, abusive, and toxic discourse practices.
In this paper, a multimodal critical discourse approach, combined with feminist philosophy and notions of ‘digital domesticity’, is applied to analyse examples of IVAs’ coded responses, as well as personification and anthropomorphic conversational cues. The analysis aims to uncover the companies’ hidden ideologies as they emerge from coded (i.e., pre-established) conversational practices (i.e., what IVAs are expected to say to engage users) in response to users’ prompts that gender, sexualize, and ultimately harass IVAs – a practice that hard-wires an association between women and subservience. The paper seeks to advance understanding of how the interface design of IVAs intersects with their ideological gendering, which companies actively pursue to increase user engagement.
Source journal: Discourse Context & Media
CiteScore: 5.00
Self-citation rate: 10.00%
Articles per year: 46
Time to review: 55 days