AI-guided electron microscopy accelerates brain mapping
Pub Date: 2025-12-30 | DOI: 10.1038/s41592-025-02930-w
We developed SmartEM, a method that integrates machine learning directly into the image acquisition process of an electron microscope. By allocating imaging time adaptively (scanning all pixels quickly at first, then rescanning only critical areas more slowly), we can accelerate the mapping of neural circuits up to sevenfold without sacrificing accuracy.
{"title":"AI-guided electron microscopy accelerates brain mapping","authors":"","doi":"10.1038/s41592-025-02930-w","DOIUrl":"10.1038/s41592-025-02930-w","url":null,"abstract":"We developed SmartEM, a method that integrates machine learning directly into the image acquisition process of an electron microscope. By allocating imaging time in a specific manner — scanning quickly at first, then rescanning only critical areas more slowly — we are able to accelerate the mapping of neural circuits up to sevenfold without sacrificing accuracy.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 1","pages":"28-29"},"PeriodicalIF":32.1,"publicationDate":"2025-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145863833","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
SmartEM: machine learning-guided electron microscopy
Pub Date: 2025-12-29 | DOI: 10.1038/s41592-025-02929-3
Yaron Meirovitch, Ishaan Singh Chandok, Core Francisco Park, Pavel Potocek, Lu Mi, Shashata Sawmya, Yicong Li, Thomas L. Athey, Vladislav Susoy, Neha Karlupia, Yuelong Wu, Daniel R. Berger, Richard Schalek, Caitlyn A. Bishop, Daniel Xenes, Hannah Martinez, Jordan Matelsky, Brock A. Wester, Hanspeter Pfister, Remco Schoenmakers, Maurice Peemen, Jeff W. Lichtman, Aravinthan D. T. Samuel, Nir Shavit
Connectomics provides nanometer-resolution, synapse-level maps of neural circuits to understand brain activity and behavior. However, few researchers have access to the high-throughput electron microscopes necessary to generate enough data for whole-brain or even whole-circuit reconstruction. To date, machine learning methods have been used after the collection of images by electron microscopy (EM) to accelerate and improve neuronal segmentation, synapse reconstruction and other data analysis. As the computational processing of EM images continues to improve, image acquisition will become the rate-limiting step in automated connectomics. Here, to speed up EM imaging, we integrate machine learning into real-time image acquisition in a single-beam scanning electron microscope (SEM). This SmartEM approach allows an electron microscope to perform data-aware imaging of specimens. SmartEM saves time by allocating the proper imaging time to each region of interest: it first scans all pixels rapidly, then rescans more slowly only the small subareas where a higher-quality signal is required. We demonstrate that SmartEM achieves up to an ~7-fold acceleration of image acquisition on a commercial single-beam SEM with samples from nematode, mouse and human brain. We apply this fast imaging method to reconstruct a portion of mouse cerebral cortex with an accuracy comparable to traditional electron microscopy. SmartEM is thus a ‘smart’ pipeline for EM-based data acquisition in connectomics: to image large datasets efficiently, it images everything at short pixel dwell times, identifies problematic regions and re-images only those at longer dwell times and therefore higher quality.
{"title":"SmartEM: machine learning-guided electron microscopy","authors":"Yaron Meirovitch, Ishaan Singh Chandok, Core Francisco Park, Pavel Potocek, Lu Mi, Shashata Sawmya, Yicong Li, Thomas L. Athey, Vladislav Susoy, Neha Karlupia, Yuelong Wu, Daniel R. Berger, Richard Schalek, Caitlyn A. Bishop, Daniel Xenes, Hannah Martinez, Jordan Matelsky, Brock A. Wester, Hanspeter Pfister, Remco Schoenmakers, Maurice Peemen, Jeff W. Lichtman, Aravinthan D. T. Samuel, Nir Shavit","doi":"10.1038/s41592-025-02929-3","DOIUrl":"10.1038/s41592-025-02929-3","url":null,"abstract":"Connectomics provides nanometer-resolution, synapse-level maps of neural circuits to understand brain activity and behavior. However, few researchers have access to the high-throughput electron microscopes necessary to generate enough data for whole-brain or even whole-circuit reconstruction. To date, machine learning methods have been used after the collection of images by electron microscopy (EM) to accelerate and improve neuronal segmentation, synapse reconstruction and other data analysis. With the continual computational improvements in processing EM images, acquiring EM images will become the rate-limiting step in automated connectomics. Here, in order to speed up EM imaging, we integrate machine learning into real-time image acquisition in a single-beam scanning electron microscope. This SmartEM approach allows an electron microscope to perform data-aware imaging of specimens. SmartEM saves time by allocating the proper imaging time for each region of interest—first scanning all pixels rapidly and then rescanning more slowly only the small subareas where a higher quality signal is required. We demonstrate that SmartEM achieves up to an ~7-fold acceleration of image acquisition time for connectomic samples using a commercial single-beam SEM in samples from nematodes, mice and human brain. We apply this fast imaging method to reconstruct a portion of mouse cerebral cortex with an accuracy comparable to traditional electron microscopy. SmartEM is a ‘smart’ pipeline for electron microscopy-based data acquisition for connectomics. In order to efficiently image large datasets, the approach involves imaging at short pixel dwell times and identifying problematic regions that are then imaged with longer dwell times and therefore higher quality.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 1","pages":"193-204"},"PeriodicalIF":32.1,"publicationDate":"2025-12-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145857319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
TransBrain: a computational framework for translating brain-wide phenotypes between humans and mice
Pub Date: 2025-12-29 | DOI: 10.1038/s41592-025-02961-3
Shangzheng Huang, Tongyu Zhang, Changsheng Dong, Yingchao Shi, Yingjie Peng, Xiya Liu, Kaixin Li, Luqi Cheng, Qi Wang, Yini He, Yitong Guo, Fengqian Xiao, Xiaohan Tian, Junxing Xian, Changjiang Zhang, Qian Wu, Yijuan Zou, Long Li, Bing Liu, Xiaoqun Wang, Ang Li
Despite advances in whole-brain imaging technologies, the lack of quantitative approaches to bridge rodent preclinical and human studies remains a critical challenge. Here we present TransBrain, a computational framework enabling bidirectional translation of brain-wide phenotypes between humans and mice. TransBrain improves the accuracy of human–mouse homology mapping through (1) a region-specific deep neural network, with detached cortical and subcortical branches, trained on integrated multimodal human transcriptomics to improve cortical correspondence (an 89.5% improvement over the original transcriptome) and revealing two evolutionarily conserved gradients, and (2) a graph-based approach that constructs a unified cross-species representational space incorporating anatomical hierarchies and structural connectivity. We demonstrate TransBrain’s utility through three cross-species applications: quantitative assessment of resting-state brain organizational features, inferring human cognitive functions from mouse optogenetic circuits and translating molecular insights from mouse models to individual-level mechanisms in autism. TransBrain enables quantitative cross-species comparison and mechanistic investigation of both normal and pathological brain functions. By translating brain phenotypes between mouse and human via homology mapping, TransBrain makes it possible to capitalize on the wealth of knowledge about the mouse brain and gain insights into the human brain.
{"title":"TransBrain: a computational framework for translating brain-wide phenotypes between humans and mice","authors":"Shangzheng Huang, Tongyu Zhang, Changsheng Dong, Yingchao Shi, Yingjie Peng, Xiya Liu, Kaixin Li, Luqi Cheng, Qi Wang, Yini He, Yitong Guo, Fengqian Xiao, Xiaohan Tian, Junxing Xian, Changjiang Zhang, Qian Wu, Yijuan Zou, Long Li, Bing Liu, Xiaoqun Wang, Ang Li","doi":"10.1038/s41592-025-02961-3","DOIUrl":"10.1038/s41592-025-02961-3","url":null,"abstract":"Despite advances in whole-brain imaging technologies, the lack of quantitative approaches to bridge rodent preclinical and human studies remains a critical challenge. Here we present TransBrain, a computational framework enabling bidirectional translation of brain-wide phenotypes between humans and mice. TransBrain improves human–mouse homology mapping accuracy through (1) a cortical and subcortical detached region-specific deep neural network trained on integrated multimodal human transcriptomics to improve cortical correspondence (89.5% improvement over the original transcriptome), which revealed 2 evolutionarily conserved gradients, and (2) a graph-based approach to construct a unified cross-species representational space incorporating anatomical hierarchies and structural connectivity. We demonstrate TransBrain’s utility through three cross-species applications: quantitative assessment of resting-state brain organizational features, inferring human cognitive functions from mouse optogenetic circuits and translating molecular insights from mouse models to individual-level mechanisms in autism. TransBrain enables quantitative cross-species comparison and mechanistic investigation of both normal and pathological brain functions. TransBrain translates brain phenotypes between mouse and human via homology mapping, thus making it possible to capitalize on the wealth of knowledge about the mouse brain and gain insights into the human brain.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 2","pages":"426-437"},"PeriodicalIF":32.1,"publicationDate":"2025-12-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145857428","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Guidance for 3D traction force microscopy today and in the next decade
Pub Date: 2025-12-29 | DOI: 10.1038/s41592-025-02934-6
Jorge Barrasa-Fano, Apeksha Shapeti, Alejandro Apolinar-Fernández, Laurens Kimps, Bart Smeets, José Antonio Sanz-Herrera, Hans Van Oosterwyck
The field of mechanobiology studies how mechanical forces influence cell behavior, relying on tools like traction force microscopy (TFM) to quantify the forces cells exert on the extracellular matrix. While TFM is well established for two-dimensional in vitro systems, its three-dimensional form, 3DTFM, remains underutilized despite notable technical advances. Here, we address common skepticism about 3DTFM, detailing current experimental and computational strategies to overcome its limitations. We describe how to integrate 3DTFM with biological readouts, focusing on its application in long-term experiments. We discuss metrics for data interpretation and how pairing these with optimal traction recovery methods can address specific biological questions. Finally, we outline future directions by proposing combinations with emerging technologies to address challenges like extracellular matrix heterogeneity and intracellular stress analysis within three-dimensional cell clusters. By addressing these critical gaps, this Perspective aims to advance the utility of 3DTFM, promote its broader adoption and guide future developments in mechanobiology.
{"title":"Guidance for 3D traction force microscopy today and in the next decade.","authors":"Jorge Barrasa-Fano, Apeksha Shapeti, Alejandro Apolinar-Fernández, Laurens Kimps, Bart Smeets, José Antonio Sanz-Herrera, Hans Van Oosterwyck","doi":"10.1038/s41592-025-02934-6","DOIUrl":"https://doi.org/10.1038/s41592-025-02934-6","url":null,"abstract":"<p><p>The field of mechanobiology studies how mechanical forces influence cell behavior, relying on tools like traction force microscopy (TFM) to quantify cell forces exerted on the extracellular matrix. While well established for two-dimensional in vitro systems, its three-dimensional form, 3DTFM, remains underutilized despite notable technical advancements. Here, we outline common skepticism about 3DTFM, detailing current experimental and computational strategies to address its limitations. We describe how to integrate 3DTFM with biological readouts, focusing on its application in long-term experiments. We discuss metrics for data interpretation and how pairing these with optimal traction recovery methods can address specific biological questions. Finally, we outline future directions by proposing combinations with emerging technologies to address challenges like extracellular matrix heterogeneity and intracellular stress analysis within three-dimensional cell clusters. By addressing these critical gaps, this Perspective aims to advance 3DTFM's utility, promote its broader adoption and guide future developments in mechanobiology.</p>","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":" ","pages":""},"PeriodicalIF":32.1,"publicationDate":"2025-12-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145857272","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
DynamicAtlas: a morphodynamic atlas for Drosophila development
Pub Date: 2025-12-24 | DOI: 10.1038/s41592-025-02897-8
Matthew F. Lefebvre, Vishank Jain-Sharma, Nikolas Claussen, Noah P. Mitchell, Marion K. Raich, Hannah J. Gustafson, Friederike E. Streichan, Andreas R. Bausch, Sebastian J. Streichan
Living organisms develop their shape through the interplay of gene expression and mechanics. While atlases of static samples characterize cell fates and gene regulation, understanding dynamic shape changes requires live imaging. Here we present DynamicAtlas: a ‘morphodynamic atlas’ of live and static datasets from 500 Drosophila melanogaster embryos (wild type and 18 mutants), aligned to a common morphological timeline. Surprisingly, characterizing wild-type surface tissue flows reveals distinct ‘morphodynamic modules’—time periods in which the global pattern of motion is stationary—corresponding to key developmental stages. Mutant analysis shows stationary flow patterns depend on genes that break spatial symmetry along the dorsal–ventral axis. Temperature perturbations indicate that morphodynamic modules change in response to accumulated tissue deformation, rather than elapsed time. Extending our approach to the embryonic Drosophila midgut, we find modules in covariant measures of the dynamic three-dimensional surface. DynamicAtlas provides a high-resolution framework for studying shape formation across living systems. DynamicAtlas integrates fixed and live imaging data to generate a morphodynamic atlas of Drosophila development.
{"title":"DynamicAtlas: a morphodynamic atlas for Drosophila development","authors":"Matthew F. Lefebvre, Vishank Jain-Sharma, Nikolas Claussen, Noah P. Mitchell, Marion K. Raich, Hannah J. Gustafson, Friederike E. Streichan, Andreas R. Bausch, Sebastian J. Streichan","doi":"10.1038/s41592-025-02897-8","DOIUrl":"10.1038/s41592-025-02897-8","url":null,"abstract":"Living organisms develop their shape through the interplay of gene expression and mechanics. While atlases of static samples characterize cell fates and gene regulation, understanding dynamic shape changes requires live imaging. Here we present DynamicAtlas: a ‘morphodynamic atlas’ of live and static datasets from 500 Drosophila melanogaster embryos (wild type and 18 mutants), aligned to a common morphological timeline. Surprisingly, characterizing wild-type surface tissue flows reveals distinct ‘morphodynamic modules’—time periods in which the global pattern of motion is stationary—corresponding to key developmental stages. Mutant analysis shows stationary flow patterns depend on genes that break spatial symmetry along the dorsal–ventral axis. Temperature perturbations indicate that morphodynamic modules change in response to accumulated tissue deformation, rather than elapsed time. Extending our approach to the embryonic Drosophila midgut, we find modules in covariant measures of the dynamic three-dimensional surface. DynamicAtlas provides a high-resolution framework for studying shape formation across living systems. DynamicAtlas integrates fixed and live imaging data to generate a morphodynamic atlas of Drosophila development.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 1","pages":"260-270"},"PeriodicalIF":32.1,"publicationDate":"2025-12-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.comhttps://www.nature.com/articles/s41592-025-02897-8.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145828024","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tissue maps in motion
Pub Date: 2025-12-24 | DOI: 10.1038/s41592-025-02950-6
Miriam Osterfield
DynamicAtlas is a new open-source tool for incorporating gene expression and tissue shape changes into a single atlas with a continuous developmental timeline.
{"title":"Tissue maps in motion","authors":"Miriam Osterfield","doi":"10.1038/s41592-025-02950-6","DOIUrl":"10.1038/s41592-025-02950-6","url":null,"abstract":"DynamicAtlas is a new open-source tool for incorporating gene expression and tissue shape changes into a single atlas with a continuous developmental timeline.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 1","pages":"22-23"},"PeriodicalIF":32.1,"publicationDate":"2025-12-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145828068","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Taking advantage of non-steady-state imaging to increase temporal SNR for fMRI
Pub Date: 2025-12-23 | DOI: 10.1038/s41592-025-02992-w
Alexander J. S. Beckett, David A. Feinberg
A technique called SASS increases temporal signal-to-noise ratio for functional MRI by taking advantage of the time it takes to reach steady state when collecting functional images.
{"title":"Taking advantage of non-steady-state imaging to increase temporal SNR for fMRI","authors":"Alexander J. S. Beckett, David A. Feinberg","doi":"10.1038/s41592-025-02992-w","DOIUrl":"10.1038/s41592-025-02992-w","url":null,"abstract":"A technique called SASS increases temporal signal-to-noise ratio for functional MRI by taking advantage of the time it takes to reach steady state when collecting functional images.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 1","pages":"20-21"},"PeriodicalIF":32.1,"publicationDate":"2025-12-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145820357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Glutamate indicators with increased sensitivity and tailored deactivation rates
Pub Date: 2025-12-23 | DOI: 10.1038/s41592-025-02965-z
Abhi Aggarwal, Adrian Negrean, Yang Chen, Rishyashring Iyer, Daniel Reep, Anyi Liu, Anirudh Palutla, Michael E. Xie, Bryan J. MacLennan, Kenta M. Hagihara, Lucas W. Kinsey, Julianna L. Sun, Pantong Yao, Jihong Zheng, Arthur Tsang, Getahun Tsegaye, Yonghai Zhang, Ronak H. Patel, Benjamin J. Arthur, Julien Hiblot, Philipp Leippe, Miroslaw Tarnawski, Jonathan S. Marvin, Jason D. Vevea, Srinivas C. Turaga, Alison G. Tebo, Matteo Carandini, L. Federico Rossi, David Kleinfeld, Arthur Konnerth, Karel Svoboda, Glenn C. Turner, Jeremy P. Hasseman, Kaspar Podgorski
Understanding how neurons integrate signals from thousands of input synapses requires methods to monitor neurotransmission across many sites simultaneously. The fluorescent protein glutamate indicator iGluSnFR enables visualization of synaptic signaling, but the sensitivity, scale and speed of such measurements are limited by existing variants. Here we developed two highly sensitive fourth-generation iGluSnFR variants with fast activation and tailored deactivation rates: iGluSnFR4f for tracking rapid dynamics, and iGluSnFR4s for recording from large populations of synapses. These indicators detect glutamate with high spatial specificity and single-vesicle sensitivity in vivo. We used them to record natural patterns of synaptic transmission across multiple experimental contexts in mice, including two-photon imaging in cortical layers 1–4 and hippocampal CA1, and photometry in the midbrain. The iGluSnFR4 variants extend the speed, sensitivity and scalability of glutamate imaging, enabling direct observation of information flow through neural networks in the intact brain. iGluSnFR4f and iGluSnFR4s are the latest generation of genetically encoded glutamate sensors. They are advantageous for detecting rapid dynamics and large population activity, respectively, as demonstrated in a variety of applications in the mouse brain.
{"title":"Glutamate indicators with increased sensitivity and tailored deactivation rates","authors":"Abhi Aggarwal, Adrian Negrean, Yang Chen, Rishyashring Iyer, Daniel Reep, Anyi Liu, Anirudh Palutla, Michael E. Xie, Bryan J. MacLennan, Kenta M. Hagihara, Lucas W. Kinsey, Julianna L. Sun, Pantong Yao, Jihong Zheng, Arthur Tsang, Getahun Tsegaye, Yonghai Zhang, Ronak H. Patel, Benjamin J. Arthur, Julien Hiblot, Philipp Leippe, Miroslaw Tarnawski, Jonathan S. Marvin, Jason D. Vevea, Srinivas C. Turaga, Alison G. Tebo, Matteo Carandini, L. Federico Rossi, David Kleinfeld, Arthur Konnerth, Karel Svoboda, Glenn C. Turner, Jeremy P. Hasseman, Kaspar Podgorski","doi":"10.1038/s41592-025-02965-z","DOIUrl":"10.1038/s41592-025-02965-z","url":null,"abstract":"Understanding how neurons integrate signals from thousands of input synapses requires methods to monitor neurotransmission across many sites simultaneously. The fluorescent protein glutamate indicator iGluSnFR enables visualization of synaptic signaling, but the sensitivity, scale and speed of such measurements are limited by existing variants. Here we developed two highly sensitive fourth-generation iGluSnFR variants with fast activation and tailored deactivation rates: iGluSnFR4f for tracking rapid dynamics, and iGluSnFR4s for recording from large populations of synapses. These indicators detect glutamate with high spatial specificity and single-vesicle sensitivity in vivo. We used them to record natural patterns of synaptic transmission across multiple experimental contexts in mice, including two-photon imaging in cortical layers 1–4 and hippocampal CA1, and photometry in the midbrain. The iGluSnFR4 variants extend the speed, sensitivity and scalability of glutamate imaging, enabling direct observation of information flow through neural networks in the intact brain. iGluSnFR4f and iGluSnFR4s are the latest generation of genetically encoded glutamate sensors. They are advantageous for detecting rapid dynamics and large population activity, respectively, as demonstrated in a variety of applications in the mouse brain.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 2","pages":"417-425"},"PeriodicalIF":32.1,"publicationDate":"2025-12-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.comhttps://www.nature.com/articles/s41592-025-02965-z.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145820382","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tracking cell ancestry and spatial gene expression with high resolution
Pub Date: 2025-12-22 | DOI: 10.1038/s41592-025-02964-0
We developed SpaceBar, a method that uses DNA barcodes to label both individual cells and their progeny and integrates seamlessly with high-resolution, imaging-based spatial transcriptomics technologies. This approach makes it possible to elucidate how a cell’s location and ancestry jointly inform its function and gene expression in complex tissues.
{"title":"Tracking cell ancestry and spatial gene expression with high resolution","authors":"","doi":"10.1038/s41592-025-02964-0","DOIUrl":"10.1038/s41592-025-02964-0","url":null,"abstract":"We developed SpaceBar, a method that uses DNA barcodes to label both individual cells and their progeny and seamlessly integrates with high-resolution imaging-based spatial transcriptomics technologies. This new approach enables the elucidation of how a cell’s location and ancestry jointly inform its function and gene expression in complex tissues.","PeriodicalId":18981,"journal":{"name":"Nature Methods","volume":"23 2","pages":"291-292"},"PeriodicalIF":32.1,"publicationDate":"2025-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145810659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}