Ning Zhang, Tao Mei, Xiansheng Hua, L. Guan, Shipeng Li
Title: Tap-to-search: Interactive and contextual visual search on mobile devices
DOI: 10.1109/MMSP.2011.6093802
Published in: 2011 IEEE 13th International Workshop on Multimedia Signal Processing, December 2011
Citations: 3
Abstract
Mobile visual search has been an emerging topic for both the research and industrial communities. Among the various search modalities, visual search has the merit of providing an alternative where text and voice search are not applicable. This paper proposes an interactive “tap-to-search” approach that combines an individual's intent, expressed by selecting regions of interest via “tap” actions on the mobile touch screen, with a recognition-by-search mechanism over a large-scale image database. An automatic image segmentation technique is applied to provide region candidates. A visual-vocabulary-tree-based search is adopted, incorporating rich contextual information collected from mobile sensors. The proposed approach was evaluated on an image dataset of two million images. We demonstrate that, by using GPS contextual information, the approach achieves satisfactory results under standard information retrieval evaluation.
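The abstract describes two core ingredients: quantizing local descriptors with a visual vocabulary tree, and narrowing candidates using GPS context. The sketch below illustrates both ideas in minimal form; the data layout, function names, and the 5 km radius are illustrative assumptions, not details from the paper.

```python
import math

def quantize(tree, descriptor):
    """Walk a vocabulary tree: at each level descend to the nearest
    child centroid, returning the leaf visual-word id.
    (Hypothetical tree layout: nested dicts with 'children',
    'centroid', and 'word_id' keys.)"""
    node = tree
    word_id = None
    while node.get("children"):
        node = min(
            node["children"],
            key=lambda c: sum((a - b) ** 2
                              for a, b in zip(c["centroid"], descriptor)),
        )
        word_id = node["word_id"]
    return word_id

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def gps_filter(candidates, query_gps, radius_km=5.0):
    """Keep only database images geotagged near the query location --
    one plausible way to fold GPS context into the retrieval step."""
    qlat, qlon = query_gps
    return [c for c in candidates
            if haversine_km(qlat, qlon, c["lat"], c["lon"]) <= radius_km]
```

For example, quantizing a query region's descriptors yields a bag of visual words to score against the (GPS-filtered) inverted index; the real system would use a deep hierarchical tree with high branching factor rather than the flat toy structure shown here.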