Things Go Wrong and the Captain Has to Handle it

Journal of Cognitive Engineering and Decision Making (IF 2.2, Q3, Engineering, Industrial)
Pub Date: 2024-04-01
DOI: 10.1177/15553434241236536
Amy R. Pritchett
Citations: 0

Abstract

How to characterize automation failures? From the point of view of complex, multi-agent operations, I argue that their definition and modeling is the most useful when it accounts for the important drivers of operational performance and safety. This calls us beyond a focus on one-human, one-system performing one task, to a team simultaneously executing many activities in which things are always failing. Much of this activity is defined by the dynamics of the work environment, which can be modeled and predicted. Further, having a human legally responsible for the outcome of the automation’s actions significantly colors the dynamic.
Source journal: Journal of Cognitive Engineering and Decision Making. CiteScore: 4.60. Self-citation rate: 10.00%. Articles published: 21.