{"title":"OpenMoves","authors":"Samir Amin, J. Burke","doi":"10.1145/3212721.3212846","DOIUrl":null,"url":null,"abstract":"While person-tracking systems can capture very fine-grained, accurate data, the creation of art pieces and interactive experiences making use of captured data often benefits from being able to work with higher-level features. We propose a computational framework for interpreting person-tracking data and publishing the resulting information over a network for use by client applications, and emphasize the recognition of patterns of movement, both over time and instantaneously. Our system consists of four modules for tracking instantaneous features, short-time features, and using unsupervised and supervised machine learning techniques to extract features at higher levels of abstraction. Data used by the system is collected using OpenPTrack, an open-source library for person and object tracking geared towards accessibility to the arts and education communities.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th International Conference on Movement and Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3212721.3212846","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
While person-tracking systems can capture very fine-grained, accurate data, the creation of art pieces and interactive experiences that use such data often benefits from access to higher-level features. We propose a computational framework for interpreting person-tracking data and publishing the resulting information over a network for use by client applications, with an emphasis on recognizing patterns of movement both over time and instantaneously. Our system consists of four modules: one tracking instantaneous features, one tracking short-time features, and two that use unsupervised and supervised machine learning, respectively, to extract features at higher levels of abstraction. Data used by the system is collected with OpenPTrack, an open-source library for person and object tracking geared towards accessibility for the arts and education communities.
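To make the layered-feature idea concrete, the following is a minimal sketch of how a client-side relay might derive an instantaneous feature (frame-to-frame speed) and a short-time feature (windowed mean speed) from streamed person-tracking centroids and republish them over a network for other applications. It is not the authors' implementation or the OpenPTrack API: the JSON message layout ("id", "x", "y", "timestamp") and the port numbers are hypothetical placeholders standing in for whatever the tracking system actually publishes.

```python
import json
import math
import socket
from collections import defaultdict, deque

TRACK_PORT = 21234    # hypothetical port carrying per-frame track messages
FEATURE_PORT = 21235  # hypothetical port where derived features are republished
WINDOW = 30           # recent samples kept per track (short-time window)


def instantaneous_speed(prev, curr):
    """Frame-to-frame ground-plane speed of a track centroid."""
    dt = curr["timestamp"] - prev["timestamp"]
    if dt <= 0:
        return 0.0
    return math.hypot(curr["x"] - prev["x"], curr["y"] - prev["y"]) / dt


def main():
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("0.0.0.0", TRACK_PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    history = defaultdict(lambda: deque(maxlen=WINDOW))  # per-track sample buffer

    while True:
        msg, _ = rx.recvfrom(65536)
        track = json.loads(msg)  # one centroid sample for one tracked person
        buf = history[track["id"]]
        feature = {"id": track["id"], "timestamp": track["timestamp"]}

        if buf:
            # Instantaneous feature: speed from the two most recent samples.
            feature["speed"] = instantaneous_speed(buf[-1], track)
            # Short-time feature: mean speed over the recent window.
            samples = list(buf) + [track]
            speeds = [instantaneous_speed(a, b) for a, b in zip(samples, samples[1:])]
            feature["mean_speed"] = sum(speeds) / len(speeds)

        buf.append(track)
        # Republish derived features for client applications (e.g. an installation).
        tx.sendto(json.dumps(feature).encode("utf-8"), ("127.0.0.1", FEATURE_PORT))


if __name__ == "__main__":
    main()
```

A client such as an interactive installation could listen on the feature port and map quantities like mean speed to visual or sonic parameters; in the framework described above, the unsupervised and supervised learning modules would sit on top of features of this kind to extract higher-level abstractions.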