Advances in single-cell technology have enabled the measurement of cell-resolved molecular states across a variety of cell lines and tissues under a plethora of genetic, chemical, environmental or disease perturbations. Current methods either focus on differential comparisons or address only a single task in multi-condition settings, typically from a purely statistical perspective. The quickly growing number, size and complexity of such studies require a scalable analysis framework that takes existing biological context into account. Here we present pertpy, a Python-based modular framework for the analysis of large-scale single-cell perturbation experiments. Pertpy provides access to harmonized perturbation datasets and metadata databases along with numerous fast and user-friendly implementations of both established and novel methods, such as automatic metadata annotation or perturbation distances, to efficiently analyze perturbation data. As part of the scverse ecosystem, pertpy interoperates with existing single-cell analysis libraries and is designed to be easily extended.
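To illustrate the kind of workflow pertpy supports, the following minimal sketch computes pairwise perturbation distances (E-distance) between perturbation groups in PCA space. It is not a prescribed pipeline: the dataset loader, the `perturbation` column in `.obs`, and the exact `Distance` signature are assumptions based on pertpy's scverse-style interface and may differ between versions.

```python
import pertpy as pt   # scverse-compatible; operates on AnnData objects
import scanpy as sc

# Load a harmonized perturbation dataset shipped with pertpy.
# (Loader name is an assumption -- any AnnData with a perturbation label
#  column in .obs works the same way.)
adata = pt.dt.norman_2019()

# Standard scanpy preprocessing to obtain a PCA representation.
sc.pp.normalize_total(adata)
sc.pp.log1p(adata)
sc.pp.pca(adata, n_comps=50)

# Perturbation distances: E-distance between perturbation groups,
# computed in PCA space (signature assumed from pertpy's documentation).
distance = pt.tl.Distance(metric="edistance", obsm_key="X_pca")

# "perturbation" is an assumed .obs column name; substitute whichever
# column labels the perturbations in your dataset.
pairwise = distance.pairwise(adata, groupby="perturbation")
print(pairwise.head())
```

The same `Distance` object can be reused with other metrics (for example MMD or Wasserstein-type distances) to compare perturbation effects under a common interface.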
Spatial transcriptomics (ST) has revolutionized our understanding of tissue architecture, yet constructing comprehensive three-dimensional (3D) cell atlases remains challenging due to technical limitations and high cost. Current approaches typically capture only sparsely sampled two-dimensional sections, leaving substantial gaps that limit our understanding of continuous organ organization. Here, we present SpatialZ, a computational framework that bridges these gaps by generating virtual slices between experimentally measured sections, enabling the construction of dense 3D cell atlases from planar ST data. SpatialZ is designed to operate at single-cell resolution and function independently of gene coverage limitations inherent to specific spatial technologies. Comprehensive validation demonstrates that SpatialZ accurately preserves cell identities, gene expression patterns and spatial relationships. Leveraging the BRAIN Initiative Cell Census Network data, we constructed a 3D hemisphere atlas comprising over 38 million cells. This dense atlas enables new capabilities, including in silico sectioning at arbitrary angles, explorations of gene expression across both 3D volumes and surfaces, 3D mapping of query tissue sections, and discovery of 3D spatial molecular architectures through newly synthesized views. To demonstrate its extensibility beyond transcriptomics, we applied SpatialZ to imaging mass cytometry data from human breast cancer, successfully deciphering 3D spatial gradients within the tumor microenvironment. Our approach generates cell atlases that provide previously unattainable 3D resolution of spatial molecular landscapes.
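To make the idea of a virtual slice concrete, the toy sketch below linearly interpolates cell positions and expression between two measured sections using nearest-neighbor matching. It is purely illustrative and is not the SpatialZ algorithm; the function name and the synthetic data are invented for this example.

```python
import numpy as np
from scipy.spatial import cKDTree

def toy_virtual_slice(xy_a, expr_a, xy_b, expr_b, z_a, z_b, z_new):
    """Toy illustration of a 'virtual slice': for each cell in section A,
    find its nearest spatial neighbor in section B and linearly interpolate
    position and expression at depth z_new. NOT the SpatialZ method, only a
    minimal sketch of the general idea of filling gaps between sections."""
    t = (z_new - z_a) / (z_b - z_a)                # interpolation weight in [0, 1]
    nn = cKDTree(xy_b).query(xy_a, k=1)[1]         # nearest neighbor index in section B
    xy_new = (1 - t) * xy_a + t * xy_b[nn]         # interpolated in-plane positions
    expr_new = (1 - t) * expr_a + t * expr_b[nn]   # interpolated expression profiles
    z_col = np.full((xy_new.shape[0], 1), z_new)   # constant depth of the virtual slice
    return np.hstack([xy_new, z_col]), expr_new

# Minimal synthetic example: two sections 100 µm apart, virtual slice at 50 µm.
rng = np.random.default_rng(0)
xy_a = rng.uniform(0, 1000, (200, 2))
xy_b = rng.uniform(0, 1000, (200, 2))
expr_a = rng.poisson(2.0, (200, 50)).astype(float)
expr_b = rng.poisson(2.0, (200, 50)).astype(float)
coords, expr = toy_virtual_slice(xy_a, expr_a, xy_b, expr_b, z_a=0.0, z_b=100.0, z_new=50.0)
print(coords.shape, expr.shape)  # (200, 3) (200, 50)
```

A method operating at single-cell resolution, as described for SpatialZ, would additionally have to preserve cell identities and local tissue organization rather than relying on naive nearest-neighbor averaging as in this sketch.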
Super-resolution microscopy (SRM) has revolutionized nanoscale cellular imaging, providing detailed insights into cellular architecture, organelle organization, molecular interactions and subcellular dynamics. Artificial intelligence (AI) has shown its transformative potential for improving SRM to advance our understanding of complex cellular structures and dynamics. This Review begins by offering a comprehensive overview of AI techniques in computer vision, focusing on their application to SRM. Additionally, this Review provides a thorough summary of publicly available code and datasets that can support the development and evaluation of AI-empowered SRM. Notably, many AI techniques in the domain of computer vision remain underexplored in SRM. The ongoing evolution of AI promises to unlock new potential in SRM, and the integration of cutting-edge AI technologies is poised to pioneer breakthroughs in nanoscale cellular imaging.

