Mohammad Amin Kuhail, Nazik Alturki, Justin Thomas, Amal K. Alkhalifa
{"title":"人类咨询与人工智能咨询:大学生的观点","authors":"Mohammad Amin Kuhail , Nazik Alturki , Justin Thomas , Amal K. Alkhalifa","doi":"10.1016/j.chbr.2024.100534","DOIUrl":null,"url":null,"abstract":"<div><div>Transitioning to college life while navigating the complexities of emerging adulthood can be stressful. In some instances, it may even lead to the onset of mental health problems or the exacerbation of existing issues. While therapeutic resources are typically available in tertiary educational contexts, social stigma may lead to service underutilization. Additionally, high student-to-therapist ratios can create bottlenecks to access when such services are sought. Offering an adjunct to traditional campus counseling services, AI chatbots can potentially address such issues. Chatbots can provide flexible, accessible, anonymous, and cost-effective first-line support, improving access and extending traditional treatment methodologies. This study evaluates college students' perceptions (<em>N</em> = 224) of an AI chatbot (Pi) designed to emulate supportive and empathetic interactions characterized by active listening. Participants blindly assessed transcripts from active listening interactions between a client and Pi versus interactions between a client and a human counselor/therapist. The results indicate that participants could not distinguish between the human-human and human-AI counseling transcripts, answering correctly only 47.5% of the time. Moreover, participants gave higher quality ratings to the human-AI counseling transcripts than the human-human ones. These findings provide tentative support for the user-acceptability of relational AI chatbots during the counseling process's early phases (active listening and problem exploration).</div></div>","PeriodicalId":72681,"journal":{"name":"Computers in human behavior reports","volume":"16 ","pages":"Article 100534"},"PeriodicalIF":4.9000,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Human vs. AI counseling: College students' perspectives\",\"authors\":\"Mohammad Amin Kuhail , Nazik Alturki , Justin Thomas , Amal K. Alkhalifa\",\"doi\":\"10.1016/j.chbr.2024.100534\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Transitioning to college life while navigating the complexities of emerging adulthood can be stressful. In some instances, it may even lead to the onset of mental health problems or the exacerbation of existing issues. While therapeutic resources are typically available in tertiary educational contexts, social stigma may lead to service underutilization. Additionally, high student-to-therapist ratios can create bottlenecks to access when such services are sought. Offering an adjunct to traditional campus counseling services, AI chatbots can potentially address such issues. Chatbots can provide flexible, accessible, anonymous, and cost-effective first-line support, improving access and extending traditional treatment methodologies. This study evaluates college students' perceptions (<em>N</em> = 224) of an AI chatbot (Pi) designed to emulate supportive and empathetic interactions characterized by active listening. Participants blindly assessed transcripts from active listening interactions between a client and Pi versus interactions between a client and a human counselor/therapist. 
The results indicate that participants could not distinguish between the human-human and human-AI counseling transcripts, answering correctly only 47.5% of the time. Moreover, participants gave higher quality ratings to the human-AI counseling transcripts than the human-human ones. These findings provide tentative support for the user-acceptability of relational AI chatbots during the counseling process's early phases (active listening and problem exploration).</div></div>\",\"PeriodicalId\":72681,\"journal\":{\"name\":\"Computers in human behavior reports\",\"volume\":\"16 \",\"pages\":\"Article 100534\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2024-11-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers in human behavior reports\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2451958824001672\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in human behavior reports","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2451958824001672","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0
Human vs. AI counseling: College students' perspectives
Abstract
Transitioning to college life while navigating the complexities of emerging adulthood can be stressful. In some instances, it may even lead to the onset of mental health problems or the exacerbation of existing issues. While therapeutic resources are typically available in tertiary educational contexts, social stigma may lead to service underutilization. Additionally, high student-to-therapist ratios can create bottlenecks to access when such services are sought. Offering an adjunct to traditional campus counseling services, AI chatbots can potentially address such issues. Chatbots can provide flexible, accessible, anonymous, and cost-effective first-line support, improving access and extending traditional treatment methodologies. This study evaluates college students' perceptions (N = 224) of an AI chatbot (Pi) designed to emulate supportive and empathetic interactions characterized by active listening. Participants blindly assessed transcripts from active listening interactions between a client and Pi versus interactions between a client and a human counselor/therapist. The results indicate that participants could not distinguish between the human-human and human-AI counseling transcripts, answering correctly only 47.5% of the time. Moreover, participants gave higher quality ratings to the human-AI counseling transcripts than the human-human ones. These findings provide tentative support for the user-acceptability of relational AI chatbots during the counseling process's early phases (active listening and problem exploration).
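The reported 47.5% identification accuracy is essentially chance level for a binary human-vs-AI judgment. As a rough illustration of how such a figure can be compared against the 50% chance baseline, the sketch below runs an exact two-sided binomial test. It is not the paper's analysis, and the trial count is hypothetical: the abstract does not state how many judgments each of the 224 participants made, so one judgment per participant is assumed here purely for illustration.

```python
# A minimal sketch (not the paper's analysis): checking whether a reported
# accuracy such as 47.5% differs from the 50% chance level for a binary
# human-vs-AI judgment, using an exact two-sided binomial test.
from math import comb

def binomial_two_sided_p(successes: int, trials: int, p0: float = 0.5) -> float:
    """Sum the probabilities of all outcomes no more likely than the
    observed one under the null proportion p0 (exact two-sided test)."""
    pmf = [comb(trials, k) * p0 ** k * (1 - p0) ** (trials - k)
           for k in range(trials + 1)]
    observed = pmf[successes]
    return min(1.0, sum(p for p in pmf if p <= observed * (1 + 1e-9)))

# Hypothetical numbers: 224 judgments (one per participant is assumed here;
# the abstract does not report the actual number of judgments), of which
# about 47.5% were correct.
trials = 224
successes = round(0.475 * trials)  # ~106 correct identifications
p_value = binomial_two_sided_p(successes, trials)
print(f"{successes}/{trials} correct identifications, two-sided p = {p_value:.3f}")
```

Under these assumed numbers the test would not reject the chance-level null, which is consistent with the authors' interpretation that participants could not reliably tell the human-human and human-AI transcripts apart.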