{"title":"系统安全的情境自适应自动化程度","authors":"T. Inagaki","doi":"10.1109/ROMAN.1993.367716","DOIUrl":null,"url":null,"abstract":"This paper discusses responsibility allocation between human and computer, or degrees of automation, in supervisory control of large-complex systems. Strategies for responsibility allocation in emergencies are analyzed in a probabilistic manner by taking into account human's distrust on an alarm subsystem, inappropriate situation awareness, and dynamics of a controlled process under various situations. It is proven that degree of automation should not be fixed but must be changeable dynamically and flexibly depending on the situation. Criteria for setting degree of automation at an appropriate level are given. Thus obtained level for degree of automation may not satisfy the principle that \"a human locus of control is required\", if the principle is interpreted to the letter. That suggests the need for extending the current recognition on human supervisory control if we desire to attain or improve system safety. The situation-adaptive degree of automation is indispensable for realizing human-centered automation.<<ETX>>","PeriodicalId":270591,"journal":{"name":"Proceedings of 1993 2nd IEEE International Workshop on Robot and Human Communication","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1993-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"23","resultStr":"{\"title\":\"Situation-adaptive degree of automation for system safety\",\"authors\":\"T. Inagaki\",\"doi\":\"10.1109/ROMAN.1993.367716\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper discusses responsibility allocation between human and computer, or degrees of automation, in supervisory control of large-complex systems. Strategies for responsibility allocation in emergencies are analyzed in a probabilistic manner by taking into account human's distrust on an alarm subsystem, inappropriate situation awareness, and dynamics of a controlled process under various situations. It is proven that degree of automation should not be fixed but must be changeable dynamically and flexibly depending on the situation. Criteria for setting degree of automation at an appropriate level are given. Thus obtained level for degree of automation may not satisfy the principle that \\\"a human locus of control is required\\\", if the principle is interpreted to the letter. That suggests the need for extending the current recognition on human supervisory control if we desire to attain or improve system safety. 
The situation-adaptive degree of automation is indispensable for realizing human-centered automation.<<ETX>>\",\"PeriodicalId\":270591,\"journal\":{\"name\":\"Proceedings of 1993 2nd IEEE International Workshop on Robot and Human Communication\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1993-11-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"23\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1993 2nd IEEE International Workshop on Robot and Human Communication\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROMAN.1993.367716\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1993 2nd IEEE International Workshop on Robot and Human Communication","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROMAN.1993.367716","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Situation-adaptive degree of automation for system safety
This paper discusses the allocation of responsibility between human and computer, i.e., the degree of automation, in the supervisory control of large, complex systems. Strategies for allocating responsibility in emergencies are analyzed probabilistically, taking into account the human's distrust of the alarm subsystem, inappropriate situation awareness, and the dynamics of the controlled process under various situations. It is proven that the degree of automation should not be fixed but must be varied dynamically and flexibly depending on the situation. Criteria for setting the degree of automation at an appropriate level are given. The level of automation thus obtained may not satisfy the principle that "a human locus of control is required", if that principle is interpreted to the letter. This suggests the need to extend the current understanding of human supervisory control if we wish to attain or improve system safety. A situation-adaptive degree of automation is indispensable for realizing human-centered automation.
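To illustrate the flavor of such a probabilistic trade-off (this is a minimal sketch, not the criteria derived in the paper; all function names, parameters, and cost values are hypothetical assumptions), consider comparing the expected cost of acting automatically on an alarm against the expected cost of waiting for the human operator:

```python
# Illustrative sketch only -- NOT the analysis or criteria from the paper.
# It shows how a situation-dependent choice between automatic action and
# human-mediated action after an alarm might follow from expected costs.
# All parameter names and numeric values are hypothetical assumptions.

def choose_degree_of_automation(p_false_alarm: float,
                                p_human_responds: float,
                                cost_unneeded_trip: float,
                                cost_missed_fault: float) -> str:
    """Pick the lower-expected-cost option for the current situation.

    p_false_alarm      -- probability the alarm is spurious
    p_human_responds   -- probability the operator acts correctly in time
                          (reflects trust in the alarm and situation awareness)
    cost_unneeded_trip -- cost of acting automatically on a false alarm
    cost_missed_fault  -- cost of a real fault left unattended
    """
    # Automatic action: pays the trip cost whenever the alarm is false.
    expected_cost_auto = p_false_alarm * cost_unneeded_trip
    # Human-mediated action: pays the damage cost when the alarm is real
    # but the operator fails to respond in time.
    expected_cost_human = (1.0 - p_false_alarm) * (1.0 - p_human_responds) * cost_missed_fault
    return "automatic" if expected_cost_auto < expected_cost_human else "human-in-the-loop"


if __name__ == "__main__":
    # As the situation degrades (less trust, less time, poorer awareness),
    # p_human_responds drops and the balance shifts toward automatic action.
    print(choose_degree_of_automation(0.10, 0.99, 10.0, 50.0))  # -> human-in-the-loop
    print(choose_degree_of_automation(0.10, 0.30, 10.0, 50.0))  # -> automatic
```

Under this hypothetical comparison, the preferred degree of automation changes with the situation rather than being fixed in advance, which is the general point the abstract argues.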