{"title":"在时间数据中发现潜在趋势的一种模型","authors":"Ity Kaul, É. Martin, V. Puri","doi":"10.1109/ISKE.2017.8258812","DOIUrl":null,"url":null,"abstract":"Trend detection in financial temporal data is a significant problem, with far-reaching applications, that presents researchers with many challenges. Existing techniques require users to choose a given interval, and then provide an approximation of the data on that interval; they always produce some approximation, namely, a member of a class of candidate functions that is \"best\" according to some criteria. Moreover, financial analysis can be performed from different perspectives, at different levels, from short term to long term; it is therefore very desirable to be able to indicate a scale that is suitable and adapted to the analysis of interest. Based on these considerations, our objective was to design a method that lets users input a scale factor, determines the intervals on which an approximation captures a significant trend as a function of the scale factor, and proposes a qualification of the trend. The method we use combines various machine-learning and statistical techniques, a key role being played by a change-point detection method. We describe the architecture of a system that implements the proposed method. Finally, we report on the experiments we ran and use their results to stress how they differ from the results than can be obtained from alternative approaches.","PeriodicalId":208009,"journal":{"name":"2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A model for the detection of underlying trends in temporal data\",\"authors\":\"Ity Kaul, É. Martin, V. Puri\",\"doi\":\"10.1109/ISKE.2017.8258812\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Trend detection in financial temporal data is a significant problem, with far-reaching applications, that presents researchers with many challenges. Existing techniques require users to choose a given interval, and then provide an approximation of the data on that interval; they always produce some approximation, namely, a member of a class of candidate functions that is \\\"best\\\" according to some criteria. Moreover, financial analysis can be performed from different perspectives, at different levels, from short term to long term; it is therefore very desirable to be able to indicate a scale that is suitable and adapted to the analysis of interest. Based on these considerations, our objective was to design a method that lets users input a scale factor, determines the intervals on which an approximation captures a significant trend as a function of the scale factor, and proposes a qualification of the trend. The method we use combines various machine-learning and statistical techniques, a key role being played by a change-point detection method. We describe the architecture of a system that implements the proposed method. 
Finally, we report on the experiments we ran and use their results to stress how they differ from the results than can be obtained from alternative approaches.\",\"PeriodicalId\":208009,\"journal\":{\"name\":\"2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)\",\"volume\":\"24 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISKE.2017.8258812\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISKE.2017.8258812","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A model for the detection of underlying trends in temporal data
Trend detection in financial temporal data is a significant problem with far-reaching applications that presents researchers with many challenges. Existing techniques require users to choose a given interval and then provide an approximation of the data on that interval; they always produce some approximation, namely a member of a class of candidate functions that is "best" according to some criterion. Moreover, financial analysis can be performed from different perspectives and at different levels, from short term to long term; it is therefore very desirable to be able to indicate a scale that is suitable for and adapted to the analysis of interest. Based on these considerations, our objective was to design a method that lets users input a scale factor, determines the intervals on which an approximation captures a significant trend as a function of that scale factor, and proposes a qualification of the trend. The method combines various machine-learning and statistical techniques, with a key role played by a change-point detection method. We describe the architecture of a system that implements the proposed method. Finally, we report on the experiments we ran and use their results to highlight how they differ from the results that can be obtained with alternative approaches.
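The abstract outlines a pipeline in which a user-supplied scale factor governs change-point detection, each resulting interval receives an approximation, and the trend on that interval is qualified. The following is a minimal sketch of that kind of pipeline, not the authors' implementation: it assumes an off-the-shelf PELT change-point detector (the ruptures library) as a stand-in for the paper's change-point method, a hypothetical mapping from the scale factor to the PELT penalty, per-interval linear fits as the approximations, and an illustrative slope threshold for qualifying trends.

```python
# Sketch of a scale-dependent trend-detection pipeline (assumptions noted above;
# this is illustrative, not the method proposed in the paper).
import numpy as np
import ruptures as rpt  # off-the-shelf change-point detection (PELT)


def detect_trends(series: np.ndarray, scale_factor: float, flat_tol: float = 1e-3):
    """Segment `series` at change points and qualify the trend on each interval.

    A larger `scale_factor` yields a larger penalty and hence fewer, longer
    intervals (a coarser, long-term view); a smaller one gives a finer view.
    """
    # Scale factor -> penalty: one plausible mapping, assumed for illustration.
    penalty = scale_factor * np.log(len(series)) * series.var()
    breakpoints = rpt.Pelt(model="l2", min_size=5).fit(series).predict(pen=penalty)

    trends, start = [], 0
    for end in breakpoints:
        # Linear approximation of the data on this interval.
        x = np.arange(start, end)
        slope, _ = np.polyfit(x, series[start:end], deg=1)
        # Qualify the trend from the fitted slope (threshold is an assumption).
        label = "flat" if abs(slope) < flat_tol else ("up" if slope > 0 else "down")
        trends.append({"interval": (start, end), "slope": slope, "trend": label})
        start = end
    return trends


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prices = np.concatenate([
        np.linspace(100, 120, 200),   # upward segment
        np.linspace(120, 110, 150),   # downward segment
    ]) + rng.normal(0, 0.5, 350)
    for t in detect_trends(prices, scale_factor=2.0):
        print(t)
```

Running the example with a larger scale factor merges short-lived fluctuations into fewer, longer intervals, which mirrors the abstract's point that the scale factor adapts the analysis from short-term to long-term perspectives.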