Smartphone location identification and transport mode recognition using an ensemble of generative adversarial networks

Lukas Günthermann, Ivor Simpson, D. Roggen
DOI: 10.1145/3410530.3414353
Venue: Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers
Published: 2020-09-10
Citations: 8

Abstract

We present a generative adversarial network (GAN) approach to recognising modes of transportation from smartphone motion sensor data, as part of our contribution to the Sussex-Huawei Locomotion-Transportation (SHL) recognition challenge 2020 as team noname. Our approach identifies the location where the smartphone of the test dataset is carried on the body through heuristics, after which a location-specific model is trained based on the available published data at this location. Performance on the validation data is 0.95, which we expect to be very similar on the test set, if our estimation of the location of the phone on the test set is correct. We are highly confident in this location estimation. If however it were wrong, an accuracy as low as 30% could be expected.
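The abstract describes a two-stage pipeline: a heuristic first identifies where on the body the phone is carried, and a model trained for that specific location then predicts the transport mode. The sketch below illustrates only this dispatch structure; the threshold rule, the location and mode labels, and both placeholder classifiers are illustrative assumptions, not the paper's actual heuristics or GAN ensemble.

```python
import numpy as np

# Hypothetical sketch of the two-stage pipeline from the abstract:
#   1) a heuristic guesses the carrying location of the phone,
#   2) a location-specific classifier predicts the transport mode.
# All thresholds and labels here are illustrative placeholders.

LOCATIONS = ["hand", "hips", "torso", "bag"]
MODES = ["still", "walk", "run", "bike", "car", "bus", "train", "subway"]

def identify_location(accel: np.ndarray) -> str:
    """Toy heuristic: infer the carrying location from the spread of
    acceleration magnitudes in a window (illustrative thresholds)."""
    magnitude = np.linalg.norm(accel, axis=1)
    spread = magnitude.std()
    if spread > 4.0:
        return "hand"
    if spread > 2.0:
        return "hips"
    if spread > 1.0:
        return "torso"
    return "bag"

def classify_mode(accel: np.ndarray, location: str) -> str:
    """Placeholder for the location-specific model: a trivial rule on
    the mean magnitude stands in for the trained classifier."""
    magnitude = np.linalg.norm(accel, axis=1)
    return "walk" if magnitude.mean() > 10.5 else "still"

# Usage: dispatch one window of 3-axis accelerometer samples.
window = np.tile([0.0, 0.0, 9.81], (500, 1))  # phone lying at rest
loc = identify_location(window)
mode = classify_mode(window, loc)
```

The key design point the abstract emphasizes is the dependence of stage two on stage one: a misidentified location routes the data to the wrong model, which is why the authors note accuracy could drop to around 30% if their location estimate were wrong.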