When Multitask Learning Meets Partial Supervision: A Computer Vision Review

IF 23.2 · Tier 1 (Computer Science) · Q1 Engineering, Electrical & Electronic · Proceedings of the IEEE · Pub Date: 2024-08-07 · DOI: 10.1109/JPROC.2024.3435012
Maxime Fontana;Michael Spratling;Miaojing Shi
Proceedings of the IEEE, vol. 112, no. 6, pp. 516–543, 2024.
Citations: 0

Abstract

Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their mutual relationships. By using shared resources to simultaneously calculate multiple outputs, this learning paradigm has the potential to have lower memory requirements and inference times compared to the traditional approach of using separate methods for each task. Previous work in MTL has mainly focused on fully supervised methods, as task relationships (TRs) can not only be leveraged to lower the level of data dependency of those methods but also improve the performance. However, MTL introduces a set of challenges due to a complex optimization scheme and a higher labeling requirement. This article focuses on how MTL could be utilized under different partial supervision settings to address these challenges. First, this article analyses how MTL traditionally uses different parameter sharing techniques to transfer knowledge in between tasks. Second, it presents different challenges arising from such a multiobjective optimization (MOO) scheme. Third, it introduces how task groupings (TGs) can be achieved by analyzing TRs. Fourth, it focuses on how partially supervised methods applied to MTL can tackle the aforementioned challenges. Lastly, this article presents the available datasets, tools, and benchmarking results of such methods. The reviewed articles, categorized following this work, are available at https://github.com/Klodivio355/MTL-CV-Review.
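The abstract's memory argument rests on "hard" parameter sharing, the classic MTL architecture in which one shared backbone feeds several small task-specific heads. The following is a minimal, framework-free sketch of that idea; all names in it (`Backbone`-style layers, `mtl_forward`, the two example tasks) are hypothetical illustrations, not code from the reviewed article.

```python
# A minimal sketch of hard parameter sharing for MTL: one shared backbone
# computes a common representation, and small task-specific heads branch
# off it. Pure Python (no ML framework) to keep the structure visible.
import random

random.seed(0)

def linear(params, x):
    """Apply a dense layer stored as (weights, biases) to vector x."""
    w, b = params
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(w, b)]

def make_layer(n_in, n_out):
    """Randomly initialise a dense layer's parameters."""
    w = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
    b = [0.0] * n_out
    return (w, b)

def n_params(layer):
    """Count the scalar parameters in one dense layer."""
    w, b = layer
    return sum(len(row) for row in w) + len(b)

# Shared backbone: computed once per input, reused by every task head.
backbone = make_layer(8, 16)

# Task-specific heads, e.g. a 10-way classifier and a 1-d regressor.
heads = {"classify": make_layer(16, 10), "regress": make_layer(16, 1)}

def mtl_forward(x):
    """One backbone pass, then each head reads the shared features."""
    shared = linear(backbone, x)
    return {task: linear(head, shared) for task, head in heads.items()}

outputs = mtl_forward([random.uniform(-1, 1) for _ in range(8)])

# The memory argument from the abstract: sharing stores the backbone's
# parameters once, instead of once per task as separate models would.
shared_total = n_params(backbone) + sum(n_params(h) for h in heads.values())
separate_total = len(heads) * n_params(backbone) + sum(n_params(h) for h in heads.values())
print(len(outputs["classify"]), len(outputs["regress"]))  # 10 1
print(shared_total < separate_total)                      # True
```

Inference cost behaves the same way: the backbone's forward pass is paid once and amortized across all tasks, which is the efficiency gain the abstract contrasts with training a separate model per task.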
Source Journal

Proceedings of the IEEE — Engineering: Electrical & Electronic

CiteScore: 46.40
Self-citation rate: 1.00%
Articles per year: 160
Review time: 3-8 weeks

About the journal: Proceedings of the IEEE is the leading journal to provide in-depth review, survey, and tutorial coverage of the technical developments in electronics, electrical and computer engineering, and computer science. Consistently ranked as one of the top journals by Impact Factor, Article Influence Score and more, the journal serves as a trusted resource for engineers around the world.