Pub Date : 2024-08-06 | DOI: 10.1109/TIT.2024.3439408
Ziling Heng; Xiaoru Li; Yansheng Wu; Qi Wang
Linear codes are widely studied in coding theory because of their applications in distributed storage, combinatorics, lattices, cryptography, and other areas, and constructing linear codes with desirable properties is an interesting research topic. In this paper, based on the augmentation technique, we present two families of linear codes derived from certain functions over finite fields. The first family is constructed from monomial functions over finite fields. The weight distribution of these codes is determined in some cases, and under certain conditions the codes are proved to be both optimally (or almost optimally) extendable and self-orthogonal. The localities of the codes and their duals are also studied, and we obtain an infinite family of optimal or almost optimal locally recoverable codes. The second family is constructed from weakly regular bent functions over finite fields, and its weight distribution is explicitly determined. This family is likewise proved to be both optimally (or almost optimally) extendable and self-orthogonal, and it is shown to have locality 2 or 3 under certain conditions. In particular, we derive two infinite families of optimal locally recoverable codes. As byproducts, several infinite families of 2-designs are obtained from the codes in this paper.
{"title":"Two Families of Linear Codes With Desirable Properties From Some Functions Over Finite Fields","authors":"Ziling Heng;Xiaoru Li;Yansheng Wu;Qi Wang","doi":"10.1109/TIT.2024.3439408","DOIUrl":"10.1109/TIT.2024.3439408","url":null,"abstract":"Linear codes are widely studied in coding theory as they have nice applications in distributed storage, combinatorics, lattices, cryptography and so on. Constructing linear codes with desirable properties is an interesting research topic. In this paper, based on the augmentation technique, we present two families of linear codes from some functions over finite fields. The first family of linear codes is constructed from monomial functions over finite fields. The weight distribution of the codes is determined in some cases. The codes are proved to be both optimally or almost optimally extendable and self-orthogonal under certain conditions. The localities of the codes and their duals are also studied and we obtain an infinite family of optimal or almost optimal locally recoverable codes. The second family of linear codes is constructed from weakly regular bent functions over finite fields and its weight distribution is explicitly determined. This family of codes is also proved to be both optimally or almost optimally extendable and self-orthogonal. Besides, this family of codes has been proven to have locality 2 or 3 under certain conditions. Particularly, we derive two infinite families of optimal locally recoverable codes. Some infinite families of 2-designs are obtained from the codes in this paper as byproducts.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"70 11","pages":"8320-8342"},"PeriodicalIF":2.2,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141947531","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-08-05 | DOI: 10.1109/tit.2024.3435008
Dor Elimelech; Wasim Huleihel
{"title":"Detection of Correlated Random Vectors","authors":"Dor Elimelech, Wasim Huleihel","doi":"10.1109/tit.2024.3435008","DOIUrl":"https://doi.org/10.1109/tit.2024.3435008","url":null,"abstract":"","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"62 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141947534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-08-05 | DOI: 10.1109/TIT.2024.3439136
Yuesheng Xu; Haizhang Zhang
We consider deep neural networks (DNNs) with a Lipschitz continuous activation function and weight matrices of variable widths. We establish a uniform convergence analysis framework that provides sufficient conditions on the weight matrices, bias vectors, and Lipschitz constant to ensure uniform convergence of the DNNs to a meaningful function as the number of layers tends to infinity. Within this framework, we present specific results on the uniform convergence of DNNs with fixed width, bounded widths, and unbounded widths. In particular, since convolutional neural networks are special DNNs whose weight matrices have increasing widths, we put forward conditions on the mask sequence that lead to uniform convergence of the resulting convolutional neural networks. The Lipschitz continuity assumption on the activation functions allows us to include in our theory most of the activation functions commonly used in applications.
{"title":"Uniform Convergence of Deep Neural Networks With Lipschitz Continuous Activation Functions and Variable Widths","authors":"Yuesheng Xu;Haizhang Zhang","doi":"10.1109/TIT.2024.3439136","DOIUrl":"10.1109/TIT.2024.3439136","url":null,"abstract":"We consider deep neural networks (DNNs) with a Lipschitz continuous activation function and with weight matrices of variable widths. We establish a uniform convergence analysis framework in which sufficient conditions on weight matrices and bias vectors together with the Lipschitz constant are provided to ensure uniform convergence of DNNs to a meaningful function as the number of their layers tends to infinity. In the framework, special results on uniform convergence of DNNs with a fixed width, bounded widths and unbounded widths are presented. In particular, as convolutional neural networks are special DNNs with weight matrices of increasing widths, we put forward conditions on the mask sequence which lead to uniform convergence of the resulting convolutional neural networks. The Lipschitz continuity assumption on the activation functions allows us to include in our theory most of commonly used activation functions in applications.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"70 10","pages":"7125-7142"},"PeriodicalIF":2.2,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10623495","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141947533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-08-01 | DOI: 10.1109/TIT.2024.3436923
Minjia Shi; Shitao Li; Tor Helleseth
Determining the weight distribution of a code is an old and fundamental topic in coding theory that has been thoroughly studied. In 1977, Helleseth, Kløve, and Mykkeltveit presented a weight enumerator polynomial of the lifted code over $\mathbb{F}_{q^{\ell}}$
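To illustrate what a lifted code and its weight enumerator look like in the simplest case (a toy example, not the codes treated by Helleseth, Kløve, and Mykkeltveit), the Python sketch below takes the binary [3, 2] even-weight code and compares its weight distribution with that of its lift to $\mathbb{F}_4$, i.e., the $\mathbb{F}_4$-linear code spanned by the same generator matrix; the counts $A_w$ are the coefficients of the weight enumerator $\sum_w A_w x^{n-w} y^{w}$.

from itertools import product
from collections import Counter

# GF(4) = {0, 1, w, w^2} encoded as 0..3 (with w^2 = w + 1): addition is XOR,
# multiplication is given by the table below. GF(2) = {0, 1} sits inside it.
GF4_MUL = [[0, 0, 0, 0],
           [0, 1, 2, 3],
           [0, 2, 3, 1],
           [0, 3, 1, 2]]

def gf4_add(a, b): return a ^ b
def gf4_mul(a, b): return GF4_MUL[a][b]

# Toy base code: the binary [3, 2] even-weight code. Its lift to GF(4) is the
# GF(4)-linear code spanned by the same generator matrix.
G = [[1, 0, 1],
     [0, 1, 1]]

def weight_distribution(G, field):
    """A_w for the code spanned over `field` by the rows of G."""
    k, n = len(G), len(G[0])
    dist = Counter()
    for msg in product(field, repeat=k):
        cw = [0] * n
        for a, row in zip(msg, G):
            cw = [gf4_add(c, gf4_mul(a, g)) for c, g in zip(cw, row)]
        dist[sum(c != 0 for c in cw)] += 1
    return dict(dist)

print("over GF(2):     ", weight_distribution(G, [0, 1]))        # {0: 1, 2: 3}
print("lifted to GF(4):", weight_distribution(G, [0, 1, 2, 3]))  # {0: 1, 2: 9, 3: 6}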