On Kolmogorov Complexity of Unitary Transformations in Quantum Computing
Pub Date: 2021-10-10 | DOI: 10.26634/jmat.11.2.19190
A. Kaltchenko
We introduce a notion of Kolmogorov complexity of a unitary transformation, which can (roughly) be understood as the least possible amount of information required to fully describe and reconstruct a given finite unitary transformation. In the context of quantum computing, it corresponds to the least possible amount of data needed to define and describe a quantum circuit or quantum computer program. Our Kolmogorov complexity of a unitary transformation is built upon the Kolmogorov "qubit complexity" of Berthiaume, van Dam, and Laplante, via a mapping from unitary transformations to unnormalized density operators, which are subsequently "purified" into unnormalized vectors in Hilbert space. We discuss the optimality of our notion of Kolmogorov complexity in a broad sense.
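For concreteness, one plausible realization of the mapping described in the abstract is the standard (unnormalized) Choi–Jamiołkowski construction sketched below. This is only an illustration of the general idea under that assumption, not necessarily the paper's exact construction, and the symbols are ours.

```latex
% Hedged sketch (assumption): one standard way to associate an unnormalized
% density operator and an unnormalized purifying vector with a unitary U
% acting on a d-dimensional Hilbert space H.
\[
  |\Omega\rangle = \sum_{i=1}^{d} |i\rangle \otimes |i\rangle
  \quad \text{(unnormalized maximally entangled vector in } H \otimes H\text{)},
\]
\[
  \rho_U = (U \otimes I)\,|\Omega\rangle\langle\Omega|\,(U^{\dagger} \otimes I),
  \qquad
  |\psi_U\rangle = (U \otimes I)\,|\Omega\rangle ,
\]
% |psi_U> is an unnormalized vector purifying rho_U; a complexity of U could
% then be defined through the qubit complexity of |psi_U> in the sense of
% Berthiaume, van Dam, and Laplante.
```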
{"title":"On Kolmogorov Complexity of Unitary Transformations in Quantum Computing","authors":"A. Kaltchenko","doi":"10.26634/jmat.11.2.19190","DOIUrl":"https://doi.org/10.26634/jmat.11.2.19190","url":null,"abstract":"We introduce a notion of Kolmogorov complexity of unitary transformation, which can (roughly) be understood as the least possible amount of information required to fully describe and reconstruct a given finite unitary transformation. In the context of quantum computing, it corresponds to the least possible amount of data to define and describe a quantum circuit or quantum computer program. Our Kolmogorov complexity of unitary transformation is built upon Kolmogorov \"qubit complexity\" of Berthiaume, W. Van Dam and S. Laplante via mapping from unitary transformations to unnormalized density operators, which are subsequently \"purified\" into unnormalized vectors in Hilbert space. We discuss the optimality of our notion of Kolmogorov complexity in a broad sense.","PeriodicalId":297202,"journal":{"name":"i-manager’s Journal on Mathematics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124499320","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Performance Analysis of Convolutional Neural Networks for Image Classification with Appropriate Optimizers
Pub Date: 1900-01-01 | DOI: 10.26634/jmat.12.1.19398
Danish Sana, Ul Rahman Jamshaid, Haider Gulfam
Optimizers in Convolutional Neural Networks play an important role in many advanced deep learning models. Studies on new optimizers and on modifications of existing ones continue to hold significant importance for machine learning tools and algorithms. A number of studies on the selection of these optimizers illustrate some of the challenges concerning their effectiveness. A comprehensive analysis of the optimizers, combined with the widely used Rectified Linear Unit (ReLU) activation function, is presented to assess their effectiveness. Significance is determined based on the adjustment with the original Softmax and ReLU activations. Experiments were performed with Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD) to examine the performance of Convolutional Neural Networks for image classification on the Canadian Institute for Advanced Research dataset (CIFAR-10).
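As an illustration of the kind of experiment described, a minimal Keras sketch comparing the five named optimizers on CIFAR-10 might look like the following. This is not the authors' code; the architecture, epoch count, and hyperparameters are assumptions chosen for illustration.

```python
# Hedged sketch (not the authors' code): compare five optimizers on CIFAR-10
# with a small ReLU CNN and a Softmax output layer.
from tensorflow import keras
from tensorflow.keras import layers

# Load CIFAR-10 and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_cnn():
    """Small ReLU CNN, rebuilt from scratch for each optimizer."""
    return keras.Sequential([
        keras.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

optimizer_classes = {
    "Adam": keras.optimizers.Adam,
    "RMSprop": keras.optimizers.RMSprop,
    "Adadelta": keras.optimizers.Adadelta,
    "Adagrad": keras.optimizers.Adagrad,
    "SGD": keras.optimizers.SGD,
}

for name, opt_cls in optimizer_classes.items():
    model = build_cnn()
    model.compile(optimizer=opt_cls(),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=64,
              validation_data=(x_test, y_test), verbose=0)
    _, accuracy = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {accuracy:.4f}")
```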
{"title":"Performance analysis of convolutional neural networks for image classification with appropriate optimizers","authors":"Danish Sana, Ul Rahman Jamshaid, Haider Gulfam","doi":"10.26634/jmat.12.1.19398","DOIUrl":"https://doi.org/10.26634/jmat.12.1.19398","url":null,"abstract":"Optimizers in Convolutional Neural Networks play an important role in many advanced deep learning models. Studies on advanced optimizers and modifications of existing optimizers continue to hold significant importance in the study of machine tools and algorithms. There are a number of studies to defend and the selection of these optimizers illustrate some of the challenges on the effectiveness of these optimizers. Comprehensive analysis on the optimizers and alteration with famous activation function Rectified Linear Unit (ReLU) offered to protect effectiveness. Significance is determined based on the adjustment with the original Softmax and ReLU. Experiments were performed with Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad) and Stochastic Gradient Descent (SGD) to examine the performance of Convolutional Neural Networks for image classification using the Canadian Institute for Advanced Research dataset (CIFAR-10).","PeriodicalId":297202,"journal":{"name":"i-manager’s Journal on Mathematics","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128739940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Analytic Rayleigh Wave Speed Formula in Non-Linear Orthotropic Material
Pub Date: 1900-01-01 | DOI: 10.26634/jmat.11.1.18469
A. Rehman, Maqsood-Ul-Hassan
An analytic Rayleigh wave speed formula in a nonlinear orthotropic medium is determined. The speed of Rayleigh waves in iodic acid, a specimen of nonlinear orthotropic material, is calculated and compared with the speed in linear orthotropic iodic acid. In linear iodic acid, three distinct Rayleigh waves propagate, with speeds 53.44 km/s, 80.94 km/s, and 125.37 km/s. In nonlinear iodic acid, these three waves become coincident, and only one Rayleigh wave appears to propagate, with velocity 63.68 km/s.
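For background only, the classical secular equation for the Rayleigh speed in a linear isotropic half-space is recalled below; the paper's formula generalizes this type of relation to a nonlinear orthotropic medium and is not reproduced here.

```latex
% Background (classical linear isotropic case, not the paper's formula):
% the Rayleigh speed c_R is the admissible root of the secular equation
\[
  \left( 2 - \frac{c_R^{2}}{c_T^{2}} \right)^{2}
  = 4 \sqrt{1 - \frac{c_R^{2}}{c_L^{2}}}\,
      \sqrt{1 - \frac{c_R^{2}}{c_T^{2}}},
  \qquad
  c_T = \sqrt{\mu/\rho}, \quad c_L = \sqrt{(\lambda + 2\mu)/\rho},
\]
% where rho is the mass density and lambda, mu are the Lame constants;
% in the orthotropic case the independent elastic constants of the material
% (here iodic acid) enter in place of lambda and mu.
```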
{"title":"Analytic rayleigh wave speed formula in non-linear orthotropic material","authors":"A. Rehman, Maqsood-Ul-Hassan","doi":"10.26634/jmat.11.1.18469","DOIUrl":"https://doi.org/10.26634/jmat.11.1.18469","url":null,"abstract":"Analytic Rayleigh wave speed formula in nonlinear orthotropic medium is determined. Speed of Rayleigh waves in iodic acid, a specimen of non-linear orthotropic materials, is calculated and is compared with that of the speed in linear orthotropic iodic acid. In linear iodic acid, three distinct Rayleigh waves propagate with speeds 53.44 km/s, 80.94 km/s, and 125.37 km/s, respectively. While, in nonlinear iodic acid these three waves become coincident and only one Rayleigh wave seem to propagate with velocity 63.68 km/s.","PeriodicalId":297202,"journal":{"name":"i-manager’s Journal on Mathematics","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126385966","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}