{"title":"OOD 检测的最新进展:问题与方法","authors":"Shuo Lu, YingSheng Wang, LuJun Sheng, AiHua Zheng, LinXiao He, Jian Liang","doi":"arxiv-2409.11884","DOIUrl":null,"url":null,"abstract":"Out-of-distribution (OOD) detection aims to detect test samples outside the\ntraining category space, which is an essential component in building reliable\nmachine learning systems. Existing reviews on OOD detection primarily focus on\nmethod taxonomy, surveying the field by categorizing various approaches.\nHowever, many recent works concentrate on non-traditional OOD detection\nscenarios, such as test-time adaptation, multi-modal data sources and other\nnovel contexts. In this survey, we uniquely review recent advances in OOD\ndetection from the problem scenario perspective for the first time. According\nto whether the training process is completely controlled, we divide OOD\ndetection methods into training-driven and training-agnostic. Besides,\nconsidering the rapid development of pre-trained models, large pre-trained\nmodel-based OOD detection is also regarded as an important category and\ndiscussed separately. Furthermore, we provide a discussion of the evaluation\nscenarios, a variety of applications, and several future research directions.\nWe believe this survey with new taxonomy will benefit the proposal of new\nmethods and the expansion of more practical scenarios. A curated list of\nrelated papers is provided in the Github repository:\n\\url{https://github.com/shuolucs/Awesome-Out-Of-Distribution-Detection}","PeriodicalId":501301,"journal":{"name":"arXiv - CS - Machine Learning","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Recent Advances in OOD Detection: Problems and Approaches\",\"authors\":\"Shuo Lu, YingSheng Wang, LuJun Sheng, AiHua Zheng, LinXiao He, Jian Liang\",\"doi\":\"arxiv-2409.11884\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Out-of-distribution (OOD) detection aims to detect test samples outside the\\ntraining category space, which is an essential component in building reliable\\nmachine learning systems. Existing reviews on OOD detection primarily focus on\\nmethod taxonomy, surveying the field by categorizing various approaches.\\nHowever, many recent works concentrate on non-traditional OOD detection\\nscenarios, such as test-time adaptation, multi-modal data sources and other\\nnovel contexts. In this survey, we uniquely review recent advances in OOD\\ndetection from the problem scenario perspective for the first time. According\\nto whether the training process is completely controlled, we divide OOD\\ndetection methods into training-driven and training-agnostic. Besides,\\nconsidering the rapid development of pre-trained models, large pre-trained\\nmodel-based OOD detection is also regarded as an important category and\\ndiscussed separately. Furthermore, we provide a discussion of the evaluation\\nscenarios, a variety of applications, and several future research directions.\\nWe believe this survey with new taxonomy will benefit the proposal of new\\nmethods and the expansion of more practical scenarios. 
A curated list of\\nrelated papers is provided in the Github repository:\\n\\\\url{https://github.com/shuolucs/Awesome-Out-Of-Distribution-Detection}\",\"PeriodicalId\":501301,\"journal\":{\"name\":\"arXiv - CS - Machine Learning\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.11884\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11884","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Recent Advances in OOD Detection: Problems and Approaches
Out-of-distribution (OOD) detection aims to identify test samples that fall outside the
training category space, an essential capability for building reliable
machine learning systems. Existing reviews on OOD detection primarily focus on
method taxonomy, surveying the field by categorizing various approaches.
However, many recent works concentrate on non-traditional OOD detection
scenarios, such as test-time adaptation, multi-modal data sources, and other
novel contexts. In this survey, we review recent advances in OOD
detection from the problem-scenario perspective for the first time. According
to whether the training process is fully controlled, we divide OOD
detection methods into training-driven and training-agnostic approaches. In addition,
given the rapid development of pre-trained models, we treat OOD detection
based on large pre-trained models as an additional important category and
discuss it separately. Furthermore, we discuss evaluation
scenarios, a variety of applications, and several future research directions.
We believe this survey, with its new taxonomy, will facilitate the development of new
methods and the extension of OOD detection to more practical scenarios. A curated list of
related papers is provided in the GitHub repository:
https://github.com/shuolucs/Awesome-Out-Of-Distribution-Detection
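To make the training-agnostic (post-hoc) category mentioned above concrete, the sketch below shows the classic maximum-softmax-probability (MSP) score applied to a frozen classifier. This is a minimal illustration under assumed placeholders (`model`, threshold `tau`), not a method proposed in the survey itself.

```python
# Illustrative sketch, not from the survey: a minimal "training-agnostic"
# (post-hoc) OOD detector using the maximum softmax probability (MSP) score.
# `model` and the threshold `tau` are hypothetical placeholders.
import torch
import torch.nn.functional as F


@torch.no_grad()
def msp_score(model: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Per-sample in-distribution confidence; lower values suggest OOD inputs."""
    logits = model(x)                 # shape: (batch, num_classes)
    probs = F.softmax(logits, dim=-1)
    return probs.max(dim=-1).values


def flag_ood(model: torch.nn.Module, x: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Boolean mask marking samples whose MSP confidence falls below `tau`."""
    return msp_score(model, x) < tau
```

In practice, the threshold is usually calibrated on held-out in-distribution data (for example, so that a fixed fraction of in-distribution samples is accepted), rather than set to an arbitrary constant.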