{"title":"2024 Index IEEE Journal on Selected Areas in Information Theory Vol. 5","authors":"","doi":"10.1109/JSAIT.2025.3528825","DOIUrl":"https://doi.org/10.1109/JSAIT.2025.3528825","url":null,"abstract":"","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"702-714"},"PeriodicalIF":0.0,"publicationDate":"2025-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10839060","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142976176","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"JSAIT Issue on Information-Theoretic Methods for Trustworthy and Reliable Machine Learning","authors":"Lalitha Sankar;Oliver Kosut;Flavio Calmon;Ayfer Ozgur;Lele Wang;Ofer Shayevitz;Parastoo Sadeghi","doi":"10.1109/JSAIT.2024.3508492","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3508492","url":null,"abstract":"","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"xii-xv"},"PeriodicalIF":0.0,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10830758","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142938018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Editorial Data, Physics, and Life Through the Lens of Information Theory","authors":"Jun Chen;Jerry Gibson;Ioannis Kontoyiannis;Yingbin Liang;S. Sandeep Pradhan;Andreas Winter;Ram Zamir;Richard E. Blahut;Yasutada Oohama;Aaron B. Wagner;Raymond W. Yeung","doi":"10.1109/JSAIT.2024.3499012","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3499012","url":null,"abstract":"","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"iv-xi"},"PeriodicalIF":0.0,"publicationDate":"2025-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10826512","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142938013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Board of Governors","authors":"","doi":"10.1109/JSAIT.2024.3519913","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3519913","url":null,"abstract":"","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"C2-C2"},"PeriodicalIF":0.0,"publicationDate":"2025-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10826513","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142938012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Rate-Distortion-Perception Tradeoff for Gaussian Vector Sources","authors":"Jingjing Qian;Sadaf Salehkalaibar;Jun Chen;Ashish Khisti;Wei Yu;Wuxian Shi;Yiqun Ge;Wen Tong","doi":"10.1109/JSAIT.2024.3509420","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3509420","url":null,"abstract":"This paper studies the rate-distortion-perception (RDP) tradeoff for a Gaussian vector source coding problem where the goal is to compress the multi-component source subject to distortion and perception constraints. Specifically, the RDP setting with either the Kullback-Leibler (KL) divergence or Wasserstein-2 metric as the perception loss function is examined, and it is shown that for Gaussian vector sources, jointly Gaussian reconstructions are optimal. We further demonstrate that the optimal tradeoff can be expressed as an optimization problem, which can be explicitly solved. An interesting property of the optimal solution is as follows. Without the perception constraint, the traditional reverse water-filling solution for characterizing the rate-distortion (RD) tradeoff of a Gaussian vector source states that the optimal rate allocated to each component depends on a constant, called the water level. If the variance of a specific component is below the water level, it is assigned a zero compression rate. However, with active distortion and perception constraints, we show that the optimal rates allocated to the different components are always positive. Moreover, the water levels that determine the optimal rate allocation for different components are unequal. 
We further treat the special case of perceptually perfect reconstruction and study its RDP function in the high-distortion and low-distortion regimes to obtain insight into the structure of the optimal solution.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"6 ","pages":"1-17"},"PeriodicalIF":0.0,"publicationDate":"2024-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143107142","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
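As background for the rate-allocation property described in the abstract above, the classical reverse water-filling solution (without the perception constraint) can be sketched as follows. This is a minimal illustration of the textbook construction, not code from the paper; the variable names (`sigma2`, `D`, `theta`) are illustrative assumptions.

```python
# Hedged sketch: classical reverse water-filling for the rate-distortion
# function of an independent Gaussian vector source, where each component
# whose variance falls below the water level gets zero compression rate.
import math

def reverse_waterfill(sigma2, D, tol=1e-12):
    """Find the water level theta with sum_i min(theta, sigma2_i) = D by
    bisection, then return (theta, per-component distortions, rates in nats),
    with R_i = 0.5 * log(sigma2_i / D_i) and R_i = 0 when sigma2_i <= theta."""
    assert 0 < D <= sum(sigma2), "distortion budget must be feasible"
    lo, hi = 0.0, max(sigma2)
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        if sum(min(theta, s) for s in sigma2) < D:
            lo = theta  # water level too low: total distortion under budget
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    dist = [min(theta, s) for s in sigma2]
    rates = [0.5 * math.log(s / d) if s > theta else 0.0
             for s, d in zip(sigma2, dist)]
    return theta, dist, rates

# Example: the third component's variance (0.25) sits below the water
# level, so it is assigned zero rate -- exactly the behavior that the
# perception constraint is shown to eliminate.
theta, dist, rates = reverse_waterfill([4.0, 1.0, 0.25], D=1.5)
```

With these numbers the water level is 0.625, the first two components receive positive rates, and the third receives rate zero; under active distortion and perception constraints the paper shows all components instead get strictly positive rates, with unequal water levels.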
{"title":"Source Coding for Markov Sources With Partial Memoryless Side Information at the Decoder","authors":"Yasutada Oohama","doi":"10.1109/JSAIT.2024.3496197","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3496197","url":null,"abstract":"We consider the one helper source coding problem posed and investigated by Ahlswede, Körner, and Wyner for a class of information sources with memory. For this class of information sources we give explicit inner and outer bounds of the admissible rate region. We also give a certain nontrivial class of information sources where the inner and outer bounds match.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"675-693"},"PeriodicalIF":0.0,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10750312","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142880454","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deviation From Maximal Entanglement for Mid-Spectrum Eigenstates of Local Hamiltonians","authors":"Yichen Huang","doi":"10.1109/JSAIT.2024.3487856","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3487856","url":null,"abstract":"In a spin chain governed by a local Hamiltonian, we consider a microcanonical ensemble in the middle of the energy spectrum and a contiguous subsystem whose length is a constant fraction of the system size. We prove that if the bandwidth of the ensemble is greater than a certain constant, then the average entanglement entropy (between the subsystem and the rest of the system) of eigenstates in the ensemble deviates from the maximum entropy by at least a positive constant. This result highlights the difference between the entanglement entropy of mid-spectrum eigenstates of (chaotic) local Hamiltonians and that of random states. We also prove that the former deviates from the thermodynamic entropy at the same energy by at least a positive constant.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"694-701"},"PeriodicalIF":0.0,"publicationDate":"2024-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142825942","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Statistical Inference With Limited Memory: A Survey","authors":"Tomer Berg;Or Ordentlich;Ofer Shayevitz","doi":"10.1109/JSAIT.2024.3481296","DOIUrl":"https://doi.org/10.1109/JSAIT.2024.3481296","url":null,"abstract":"The problem of statistical inference in its various forms has been the subject of decades-long extensive research. Most of the effort has been focused on characterizing the behavior as a function of the number of available samples, with far less attention given to the effect of memory limitations on performance. Recently, this latter topic has drawn much interest in the engineering and computer science literature. In this survey paper, we attempt to review the state-of-the-art of statistical inference under memory constraints in several canonical problems, including hypothesis testing, parameter estimation, and distribution property testing/estimation. We discuss the main results in this developing field, and by identifying recurrent themes, we extract some fundamental building blocks for algorithmic construction, as well as useful techniques for lower bound derivations.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"623-644"},"PeriodicalIF":0.0,"publicationDate":"2024-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142595119","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-09-30 DOI: 10.1109/JSAIT.2024.3469929
Michael G. Jabbour;Nilanjana Datta
Uniform continuity bounds on entropies are generally expressed in terms of a single distance measure between probability distributions or quantum states, typically the total variation or trace distance. However, if an additional distance measure is known, the continuity bounds can be significantly strengthened. Here, we prove a tight uniform continuity bound for the Shannon entropy in terms of both the local and total variation distances, sharpening an inequality in (Sason, 2013). We also obtain a uniform continuity bound for the von Neumann entropy in terms of both the operator norm and trace distances. We then apply our results to compute upper bounds on channel capacities. We first refine the concept of approximate degradable channels by introducing $(\varepsilon,\nu)$
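For context on the kind of single-distance bound this abstract says is being sharpened: the classical Fannes/Csiszár–Körner-type inequality states that for distributions on $n$ points with total variation distance $\varepsilon \le 1 - 1/n$, one has $|H(P) - H(Q)| \le \varepsilon \log(n-1) + h(\varepsilon)$, where $h$ is the binary entropy. The sketch below numerically checks that baseline inequality on a small example; it is not the two-distance bound proved in the paper.

```python
# Hedged sketch: checking the classical single-distance continuity bound
# |H(P) - H(Q)| <= eps*log(n-1) + h(eps), eps = total variation distance.
# This is the baseline that two-distance bounds strengthen, not the
# paper's own result. All computations are in nats.
import math

def entropy(p):
    """Shannon entropy in nats, skipping zero-probability outcomes."""
    return -sum(x * math.log(x) for x in p if x > 0)

def tv(p, q):
    """Total variation distance: half the L1 distance."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def binary_entropy(e):
    if e in (0.0, 1.0):
        return 0.0
    return -e * math.log(e) - (1 - e) * math.log(1 - e)

def classical_bound(p, q):
    """Right-hand side eps*log(n-1) + h(eps), valid for eps <= 1 - 1/n."""
    eps = tv(p, q)
    return eps * math.log(len(p) - 1) + binary_entropy(eps)

# Small example: eps = 0.2 <= 1 - 1/3, so the bound applies.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
gap = abs(entropy(p) - entropy(q))
```

Here the entropy gap is about 0.23 nats while the bound evaluates to about 0.64 nats; knowing a second distance between $P$ and $Q$ lets the bound be tightened, which is the direction the paper pursues.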