TCDformer-based Momentum Transfer Model for Long-term Sports Prediction
Hui Liu, Jiacheng Gu, Xiyuan Huang, Junjie Shi, Tongtong Feng, Ning He
arXiv - STAT - Applications, 2024-09-16 (arXiv:2409.10176)
Abstract
Accurate sports prediction is a crucial skill for professional coaches, as it supports the development of effective training strategies and scientific competition tactics. Traditional methods often rely on complex mathematical and statistical techniques to improve predictability, but they are frequently limited by dataset scale and struggle with long-term predictions over variable distributions, notably underperforming on point-set-game multi-level matches. To address this challenge, this paper proposes TM2, a TCDformer-based Momentum Transfer Model for long-term sports prediction, which comprises a momentum encoding module and a momentum-transfer-based prediction module. TM2 first encodes momentum in large-scale unstructured time series using a local linear scaling approximation (LLSA) module, then decomposes the reconstructed, momentum-transferred time series into trend and seasonal components. The final prediction is the additive combination of a multilayer perceptron (MLP) that forecasts the trend component and a wavelet attention mechanism that models the seasonal component. Comprehensive experiments on the 2023 Wimbledon men's tournament dataset show that TM2 significantly outperforms existing sports prediction models, reducing MSE by 61.64% and MAE by 63.64%.
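
The two-branch design described above (an MLP on the trend component and attention on the seasonal component, combined additively) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple moving-average series decomposition and uses standard multi-head self-attention as a stand-in for the paper's wavelet attention; the LLSA-based momentum encoding is not modeled here, and all layer names and sizes are hypothetical.

```python
# Minimal sketch of an additive trend/seasonal forecaster in the spirit of TM2.
# Assumptions: moving-average decomposition, plain MLP trend branch, and
# standard multi-head attention in place of the paper's wavelet attention.
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Split a series into a trend (moving average) and a seasonal residual."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return trend, seasonal


class AdditiveForecaster(nn.Module):
    """Additive combination of an MLP trend branch and an attention seasonal branch."""

    def __init__(self, in_len: int, out_len: int, channels: int, d_model: int = 64):
        super().__init__()
        self.decomp = SeriesDecomposition()
        # Trend branch: MLP mapping the input horizon to the output horizon.
        self.trend_mlp = nn.Sequential(
            nn.Linear(in_len, d_model), nn.ReLU(), nn.Linear(d_model, out_len))
        # Seasonal branch: embed, self-attend over time, project back, map horizon.
        self.embed = nn.Linear(channels, d_model)
        self.seasonal_attn = nn.MultiheadAttention(d_model, num_heads=4,
                                                   batch_first=True)
        self.seasonal_proj = nn.Linear(d_model, channels)
        self.seasonal_time = nn.Linear(in_len, out_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_len, channels) -> (batch, out_len, channels)
        trend, seasonal = self.decomp(x)
        trend_out = self.trend_mlp(trend.transpose(1, 2)).transpose(1, 2)
        s = self.embed(seasonal)
        s, _ = self.seasonal_attn(s, s, s)
        s = self.seasonal_proj(s)
        seasonal_out = self.seasonal_time(s.transpose(1, 2)).transpose(1, 2)
        # Final prediction is the additive combination of the two branches.
        return trend_out + seasonal_out


if __name__ == "__main__":
    model = AdditiveForecaster(in_len=96, out_len=24, channels=8)
    y = model(torch.randn(4, 96, 8))
    print(y.shape)  # torch.Size([4, 24, 8])
```

Keeping the branches additive mirrors the decomposition-based design the abstract describes: each branch specializes in the component it models best, and the forecast is simply their sum.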