James L. Evans, Matthew T. Bramlet, Connor Davey, Eliot Bethke, Aaron T. Anderson, Graham Huesmann, Yogatheesan Varatharajah, Andres Maldonado, Jennifer R. Amos, Bradley P. Sutton
SEEG4D: a tool for 4D visualization of stereoelectroencephalography data
Journal: Frontiers in Neuroinformatics (JCR Q2, Mathematical & Computational Biology)
DOI: 10.3389/fninf.2024.1465231
Published: 2024-09-03
Citations: 0
Abstract
Epilepsy is a prevalent and serious neurological condition that affects millions of people worldwide. Stereoelectroencephalography (sEEG) is used in cases of drug-resistant epilepsy to aid in surgical resection planning due to its high spatial resolution and ability to visualize seizure onset zones. For accurate localization of the seizure focus, sEEG studies combine pre-implantation magnetic resonance imaging, post-implant computed tomography to visualize electrodes, and temporally recorded sEEG electrophysiological data. Many tools exist to assist in merging multimodal spatial information; however, few allow for an integrated spatiotemporal view of the electrical activity. In the current work, we present SEEG4D, an automated tool to merge spatial and temporal data into a complete, four-dimensional virtual reality (VR) object with temporal electrophysiology that enables the simultaneous viewing of anatomy and seizure activity for seizure localization and presurgical planning. We developed an automated, containerized pipeline to segment tissues and electrode contacts. Contacts are aligned with electrical activity and then animated based on relative power. SEEG4D generates models that can be loaded into VR platforms for viewing and planning with the surgical team. Automated contact segmentation locations are within 1 mm of those identified by trained raters, and the generated models show signal propagation along electrodes. Critically, the spatiotemporal information communicated through our models in a VR space has the potential to enhance sEEG presurgical planning.
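The abstract describes animating electrode contacts "based on relative power." The paper does not specify the computation here, but a minimal sketch of one plausible approach is to window each contact's trace, compute mean squared amplitude per window, and normalize each contact to its own peak so the values can drive per-frame animation intensity. The function name `relative_power`, the windowing scheme, and the synthetic signals below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def relative_power(signals, fs, window_s=1.0):
    """Per-contact relative power in non-overlapping sliding windows.

    signals: (n_contacts, n_samples) array of sEEG traces.
    Returns an (n_contacts, n_windows) array in [0, 1], where each
    contact is normalized to its own peak-power window -- a value
    that could drive per-frame animation brightness. (Hypothetical
    sketch; not the SEEG4D implementation.)
    """
    win = int(window_s * fs)
    n_contacts, n_samples = signals.shape
    n_windows = n_samples // win
    # Trim to a whole number of windows, then reshape to (contact, window, sample)
    trimmed = signals[:, : n_windows * win].reshape(n_contacts, n_windows, win)
    power = np.mean(trimmed ** 2, axis=2)      # mean squared amplitude per window
    peak = power.max(axis=1, keepdims=True)    # each contact's strongest window
    return power / np.where(peak > 0, peak, 1.0)

# Example: two synthetic contacts; the second has an amplitude burst
# (a crude stand-in for seizure onset) in its last two seconds.
fs = 256
t = np.arange(0, 4, 1 / fs)
quiet = 0.1 * np.sin(2 * np.pi * 10 * t)
burst = quiet.copy()
burst[2 * fs:] *= 10                           # 10x amplitude -> 100x power
rp = relative_power(np.vstack([quiet, burst]), fs)
```

Normalizing per contact (rather than globally) makes weak and strong channels equally animated at their own peaks; a global normalization would instead preserve absolute amplitude differences across contacts, and which is preferable depends on the visualization goal.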
Journal description:
Frontiers in Neuroinformatics publishes rigorously peer-reviewed research on the development and implementation of numerical/computational models and analytical tools used to share, integrate, and analyze experimental data and advance theories of nervous system function. Specialty Chief Editors Jan G. Bjaalie at the University of Oslo and Sean L. Hill at the École Polytechnique Fédérale de Lausanne are supported by an outstanding Editorial Board of international experts. This multidisciplinary open-access journal is at the forefront of disseminating and communicating scientific knowledge and impactful discoveries to researchers, academics, and the public worldwide.
Neuroscience is being propelled into the information age as the volume of information explodes, demanding organization and synthesis. Novel synthesis approaches are opening up a new dimension for exploring brain elements and systems and the vast number of variables that underlie their functions. Neural data are highly heterogeneous, with complex inter-relations across multiple levels, driving the need for innovative organizing and synthesizing approaches from genes to cognition, covering a range of species and disease states.
Frontiers in Neuroinformatics therefore welcomes submissions on existing neuroscience databases, development of data and knowledge bases for all levels of neuroscience, applications and technologies that can facilitate data sharing (interoperability, formats, terminologies, and ontologies), and novel tools for data acquisition, analyses, visualization, and dissemination of nervous system data. Our journal welcomes submissions on new tools (software and hardware) that support brain modeling, and the merging of neuroscience databases with brain models used for simulation and visualization.