Fusion of intensity, texture, and color in video tracking based on mutual information

J. Mundy, Chung-Fu Chang
33rd Applied Imagery Pattern Recognition Workshop (AIPR'04), October 13, 2004
DOI: 10.1109/AIPR.2004.26
Next-generation reconnaissance systems (NGRS) offer dynamic tasking of a menu of sensor modalities such as video, multi/hyper-spectral, and polarization data. A key issue is how best to exploit these modes in time-critical scenarios such as target tracking and event detection. It is essential to represent diverse sensor content in a unified measurement space so that each modality's contribution to the exploitation task can be evaluated. In this paper, mutual information is used to represent the content of individual sensor channels. A series of video-tracking experiments demonstrates the effectiveness of mutual information as a fusion framework. These experiments quantify the relative information content of intensity, color, and polarization image channels.
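The abstract's central idea is to score each sensor channel by its mutual information. As an illustrative sketch only (not the authors' implementation, whose details are not given here), the mutual information between two image channels can be estimated from their joint histogram:

```python
import numpy as np

def mutual_information(channel_a, channel_b, bins=32):
    """Estimate mutual information (in bits) between two image channels
    from their joint histogram. A crude plug-in estimator; the paper's
    actual estimator may differ."""
    joint, _, _ = np.histogram2d(channel_a.ravel(), channel_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                    # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)          # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)          # marginal p(y)
    nz = pxy > 0                                 # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

A channel compared against itself yields its own entropy (high information), while two independent channels yield a value near zero, which is the ordering a mutual-information fusion framework relies on when ranking channels.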