Pablo Díez-Valle, Fernando Martínez-García, Juan José García-Ripoll, Diego Porras
arXiv:2409.08352 · arXiv - PHYS - Statistical Mechanics · published 2024-09-12
Learning Generalized Statistical Mechanics with Matrix Product States
We introduce a variational algorithm based on Matrix Product States that is
trained by minimizing a generalized free energy defined using Tsallis entropy
instead of the standard Gibbs entropy. As a result, our model can generate the
probability distributions associated with generalized statistical mechanics.
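For context, the standard definition of the Tsallis entropy and the generalized free energy it induces (this is the textbook form, supplied here for reference rather than taken from the abstract; \(q\) is the entropic index and \(\beta\) the inverse temperature) reads:

```latex
S_q[p] = \frac{1}{q-1}\left(1 - \sum_x p(x)^q\right),
\qquad
\lim_{q \to 1} S_q[p] = -\sum_x p(x)\,\ln p(x),
```

so that the quantity being minimized is the generalized free energy

```latex
F_q[p] = \sum_x p(x)\,E(x) \;-\; \frac{1}{\beta}\, S_q[p],
```

which reduces to the ordinary Gibbs free energy in the limit \(q \to 1\).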
The model can be trained efficiently, since the free energy
and its gradient can be computed exactly through tensor-network contractions,
in contrast to standard methods, which must estimate the Gibbs entropy by
sampling. We devise a variational annealing scheme that ramps up the inverse
temperature, allowing us to train the model without getting trapped
in local minima. We demonstrate the validity of our approach on Ising spin-glass
problems by comparing it to exact numerical results and quasi-exact analytical
approximations. Our work opens up new possibilities for studying generalized
statistical physics and solving combinatorial optimization problems with tensor
networks.
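The annealing idea described above can be illustrated with a minimal sketch. The following is *not* the paper's MPS algorithm: it replaces the tensor network with an explicit softmax distribution over all 2^N configurations of a small random Ising chain, uses finite-difference gradients instead of exact contraction-based ones, and all hyperparameters (system size, couplings, learning rate, β schedule) are illustrative choices.

```python
import numpy as np

# Toy stand-in for the MPS model: parameterize p(x) as a softmax over all
# 2^N spin configurations of a small Ising chain and minimize the q = 2
# Tsallis free energy F_q = <E> - (1/beta) * S_q by gradient descent,
# ramping up the inverse temperature beta (the annealing schedule).

rng = np.random.default_rng(0)
N = 6                                    # number of spins (toy size)
J = rng.normal(size=N - 1)               # random nearest-neighbour couplings

# Energies of all 2^N configurations of H = -sum_i J_i s_i s_{i+1}
configs = 2 * ((np.arange(2**N)[:, None] >> np.arange(N)) & 1) - 1
E = -np.sum(J * configs[:, :-1] * configs[:, 1:], axis=1)

def tsallis_entropy(p, q):
    # S_q = (1 - sum_x p(x)^q) / (q - 1); recovers Gibbs entropy as q -> 1
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def free_energy(theta, beta, q):
    p = np.exp(theta - theta.max())
    p /= p.sum()
    return p @ E - tsallis_entropy(p, q) / beta

def grad_free_energy(theta, beta, q, eps=1e-6):
    # Finite-difference gradient; the exact contraction-based gradient is
    # what makes the MPS version efficient, but this keeps the sketch short.
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (free_energy(theta + d, beta, q)
                - free_energy(theta - d, beta, q)) / (2 * eps)
    return g

q = 2.0
theta = np.zeros(2**N)                   # start from the uniform distribution
for beta in np.linspace(0.1, 5.0, 50):   # ramp up the inverse temperature
    for _ in range(20):
        theta -= 0.1 * grad_free_energy(theta, beta, q)

p = np.exp(theta - theta.max())
p /= p.sum()
print("ground-state energy:", E.min())
print("most probable energy under the model:", E[np.argmax(p)])
```

At small β the entropy term keeps the distribution close to uniform; as β grows, probability mass concentrates on low-energy configurations, which is the mechanism the abstract credits with avoiding local minima.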