{"title":"选择性修剪和神经元死亡产生重尾网络连通性","authors":"Rodrigo Siqueira Kazu, Kleber Neves, Bruno Mota","doi":"arxiv-2408.02625","DOIUrl":null,"url":null,"abstract":"From the proliferative mechanisms generating neurons from progenitor cells to\nneuron migration and synaptic connection formation, several vicissitudes\nculminate in the mature brain. Both component loss and gain remain ubiquitous\nduring brain development. For example, rodent brains lose over half of their\ninitial neurons and synapses during healthy development. The role of\ndeleterious steps in network ontogeny remains unclear, yet it is unlikely these\ncostly processes are random. Like neurogenesis and synaptogenesis, synaptic\npruning and neuron death likely evolved to support complex, efficient\ncomputations. In order to incorporate both component loss and gain in\ndescribing neuronal networks, we propose an algorithm where a directed network\nevolves through the selective deletion of less-connected nodes (neurons) and\nedges (synapses). Resulting in networks that display scale-invariant degree\ndistributions, provided the network is predominantly feed-forward.\nScale-invariance offers several advantages in biological networks: scalability,\nresistance to random deletions, and strong connectivity with parsimonious\nwiring. Whilst our algorithm is not intended to be a realistic model of\nneuronal network formation, our results suggest selective deletion is an\nadaptive mechanism contributing to more stable and efficient networks. This\nprocess aligns with observed decreasing pruning rates in animal studies,\nresulting in higher synapse preservation. Our overall findings have broader\nimplications for network science. Scale-invariance in degree distributions was\ndemonstrated in growing preferential attachment networks and observed\nempirically. 
Our preferential detachment algorithm offers an alternative\nmechanism for generating such networks, suggesting that both mechanisms may be\npart of a broader class of algorithms resulting in scale-free networks.","PeriodicalId":501044,"journal":{"name":"arXiv - QuanBio - Populations and Evolution","volume":"6 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Selective pruning and neuronal death generate heavy-tail network connectivity\",\"authors\":\"Rodrigo Siqueira Kazu, Kleber Neves, Bruno Mota\",\"doi\":\"arxiv-2408.02625\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"From the proliferative mechanisms generating neurons from progenitor cells to\\nneuron migration and synaptic connection formation, several vicissitudes\\nculminate in the mature brain. Both component loss and gain remain ubiquitous\\nduring brain development. For example, rodent brains lose over half of their\\ninitial neurons and synapses during healthy development. The role of\\ndeleterious steps in network ontogeny remains unclear, yet it is unlikely these\\ncostly processes are random. Like neurogenesis and synaptogenesis, synaptic\\npruning and neuron death likely evolved to support complex, efficient\\ncomputations. In order to incorporate both component loss and gain in\\ndescribing neuronal networks, we propose an algorithm where a directed network\\nevolves through the selective deletion of less-connected nodes (neurons) and\\nedges (synapses). Resulting in networks that display scale-invariant degree\\ndistributions, provided the network is predominantly feed-forward.\\nScale-invariance offers several advantages in biological networks: scalability,\\nresistance to random deletions, and strong connectivity with parsimonious\\nwiring. 
Whilst our algorithm is not intended to be a realistic model of\\nneuronal network formation, our results suggest selective deletion is an\\nadaptive mechanism contributing to more stable and efficient networks. This\\nprocess aligns with observed decreasing pruning rates in animal studies,\\nresulting in higher synapse preservation. Our overall findings have broader\\nimplications for network science. Scale-invariance in degree distributions was\\ndemonstrated in growing preferential attachment networks and observed\\nempirically. Our preferential detachment algorithm offers an alternative\\nmechanism for generating such networks, suggesting that both mechanisms may be\\npart of a broader class of algorithms resulting in scale-free networks.\",\"PeriodicalId\":501044,\"journal\":{\"name\":\"arXiv - QuanBio - Populations and Evolution\",\"volume\":\"6 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - QuanBio - Populations and Evolution\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2408.02625\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Populations and Evolution","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.02625","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Selective pruning and neuronal death generate heavy-tail network connectivity
From the proliferative mechanisms generating neurons from progenitor cells to
neuron migration and synaptic connection formation, several vicissitudes
culminate in the mature brain. Both component loss and gain remain ubiquitous
during brain development. For example, rodent brains lose over half of their
initial neurons and synapses during healthy development. The role of
deleterious steps in network ontogeny remains unclear, yet it is unlikely that these
costly processes are random. Like neurogenesis and synaptogenesis, synaptic
pruning and neuron death likely evolved to support complex, efficient
computations. To incorporate both component loss and gain when describing
neuronal networks, we propose an algorithm in which a directed network evolves
through the selective deletion of less-connected nodes (neurons) and edges
(synapses). This yields networks that display scale-invariant degree
distributions, provided the network is predominantly feed-forward.
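The deletion rule described above can be illustrated with a toy sketch. This is an assumption-laden simplification, not the authors' actual algorithm: it starts from a dense random directed graph, then repeatedly removes the least-connected node (neuron death) and the edge whose endpoints are least connected (synaptic pruning) until a target size is reached. All parameter values (`N0`, `K`, `N_FINAL`) are arbitrary choices for the demo.

```python
import random

random.seed(1)

# Toy "preferential detachment" sketch (a simplification for illustration,
# not the paper's exact algorithm): dense random directed graph, then
# selective deletion of weakly connected nodes and edges.
N0, K, N_FINAL = 300, 8, 100  # hypothetical sizes, chosen for the demo

succ = {u: set() for u in range(N0)}   # out-neighbours
pred = {u: set() for u in range(N0)}   # in-neighbours
for u in range(N0):
    for v in random.sample([w for w in range(N0) if w != u], K):
        succ[u].add(v)
        pred[v].add(u)

def deg(u):
    """Total degree: out-degree plus in-degree."""
    return len(succ[u]) + len(pred[u])

nodes = set(range(N0))
while len(nodes) > N_FINAL:
    # "Neuron death": remove a node of minimal total degree (ties at random).
    dmin = min(deg(u) for u in nodes)
    victim = random.choice([u for u in nodes if deg(u) == dmin])
    for v in succ.pop(victim):
        pred[v].discard(victim)
    for v in pred.pop(victim):
        succ[v].discard(victim)
    nodes.discard(victim)

    # "Synaptic pruning": delete the edge whose endpoints are least connected.
    edges = [(u, v) for u in nodes for v in succ[u]]
    if edges:
        u, v = min(edges, key=lambda e: deg(e[0]) + deg(e[1]))
        succ[u].discard(v)
        pred[v].discard(u)

degrees = sorted((deg(u) for u in nodes), reverse=True)
print("nodes left:", len(nodes))
print("top degrees:", degrees[:5], "min degree:", degrees[-1])
```

Whether this toy version actually reproduces a scale-invariant degree distribution depends on details the abstract points to (in particular, the network being predominantly feed-forward); the sketch only shows the mechanics of degree-based selective deletion.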
Scale-invariance offers several advantages in biological networks: scalability,
resistance to random deletions, and strong connectivity with parsimonious
wiring. Whilst our algorithm is not intended to be a realistic model of
neuronal network formation, our results suggest selective deletion is an
adaptive mechanism contributing to more stable and efficient networks. This
process aligns with the decreasing pruning rates observed in animal studies,
which result in higher synapse preservation. Our overall findings have broader
implications for network science. Scale-invariance in degree distributions has
been demonstrated in growing preferential attachment networks and observed
empirically. Our preferential detachment algorithm offers an alternative
mechanism for generating such networks, suggesting that both mechanisms may be
part of a broader class of algorithms resulting in scale-free networks.
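For contrast with the detachment mechanism above, a minimal sketch of the growth-based route to scale-free networks, in the style of the Barabási-Albert preferential attachment model: the network grows, and each new node links to existing nodes with probability proportional to their current degree. Parameter values are arbitrary choices for the demo, and degree-proportional sampling is implemented via a node list with multiplicity equal to degree.

```python
import random

random.seed(2)

# Preferential *attachment* sketch (Barabási-Albert style): new nodes attach
# to M existing nodes chosen with probability proportional to degree.
M, N = 3, 500                 # edges per new node, final size (demo values)
adj = {u: set() for u in range(M)}   # undirected adjacency; M isolated seeds
repeated = []                 # node list with multiplicity = current degree

for new in range(M, N):
    adj[new] = set()
    chosen = set()
    while len(chosen) < M:
        # Degree-proportional pick; uniform over seeds while no edges exist.
        pick = random.choice(repeated) if repeated else random.randrange(M)
        chosen.add(pick)
    for t in chosen:
        adj[new].add(t)
        adj[t].add(new)
        repeated.extend([new, t])  # both endpoints gained one degree

degrees = sorted((len(adj[u]) for u in adj), reverse=True)
print("max degree:", degrees[0], "median degree:", degrees[len(degrees) // 2])
```

The contrast is the point of the abstract's closing claim: attachment adds well-connected structure during growth, detachment removes poorly connected structure during shrinkage, and both can end in heavy-tailed degree distributions.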