{"title":"Agent Learning and Autoregressive Modeling","authors":"J. Gibson","doi":"10.1109/ITA50056.2020.9244971","DOIUrl":null,"url":null,"abstract":"Relative entropy is used to investigate whether a sequence is memoryless or has memory and to discern the presence of any structure in the sequence. Particular emphasis is placed on obtaining expressions for finite sequence length N and autoregressive sequences with known and unknown autocorrelations. We relate our results to the terms entropy gain, information gain, and redundancy as defined in agent learning studies, and show that these terms can be bounded using the mean squared error due to linear prediction of a stationary sequence.","PeriodicalId":137257,"journal":{"name":"2020 Information Theory and Applications Workshop (ITA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Information Theory and Applications Workshop (ITA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITA50056.2020.9244971","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Relative entropy is used to investigate whether a sequence is memoryless or has memory, and to discern the presence of any structure in the sequence. Particular emphasis is placed on obtaining expressions for finite sequence length N and for autoregressive sequences with known and unknown autocorrelations. We relate our results to the terms entropy gain, information gain, and redundancy as defined in agent learning studies, and show that these quantities can be bounded using the mean squared error incurred by linear prediction of a stationary sequence.
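The abstract's closing claim can be illustrated numerically. For a stationary Gaussian AR(1) process x[n] = a·x[n-1] + w[n], the redundancy (the gap between the entropy of a memoryless Gaussian with the same marginal variance and the entropy rate of the process) equals ½·log(σ_x² / σ_e²), where σ_e² is the one-step linear prediction MSE; for AR(1) this is −½·log(1 − a²) nats per sample. The sketch below is an illustrative assumption of this standard identity, not the paper's derivation, and all variable names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma_w, N = 0.9, 1.0, 200_000

# Generate a Gaussian AR(1) sequence x[n] = a * x[n-1] + w[n]
w = rng.normal(0.0, sigma_w, N)
x = np.empty(N)
x[0] = w[0]
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

var_x = x.var()  # sample marginal variance

# Fit the optimal one-step linear predictor (lag-1 least squares)
a_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
mse = np.mean((x[1:] - a_hat * x[:-1]) ** 2)  # prediction error variance

# Redundancy in nats/sample: entropy of a memoryless Gaussian with the
# same variance minus the AR entropy rate, via the prediction MSE bound
redundancy_est = 0.5 * np.log(var_x / mse)
redundancy_true = -0.5 * np.log(1.0 - a**2)
```

With a long realization, the prediction-based estimate converges to the closed-form value, showing how linear-prediction MSE quantifies the structure (memory) that relative entropy detects.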