A. García, S. Bourov, A. Hammad, T. Jejkal, Jens C. Otte, S. Pfeiffer, T. Schenker, Christian Schmidt, J. V. Wezel, Bernhard Neumair, A. Streit
2011 Sixth International Conference on Digital Information Management, December 2011. DOI: 10.1109/ICDIM.2011.6093357
Data management and analysis at the Large Scale Data Facility
The Large Scale Data Facility (LSDF) was started at the Karlsruhe Institute of Technology (KIT) at the end of 2009 to address the growing need for value-added storage services for its data-intensive experiments. The main focus of the project is to provide scientific communities producing data collections in the tera- to petabyte range with the necessary hardware infrastructure, as well as with adequate value-added services and support for data management, processing, and preservation. In this work we describe the project's infrastructure and services design, as well as its metadata handling. Community-specific metadata schemes, a metadata repository, an application programming interface, and a graphical tool for accessing the resources were developed to further support the processing workflows of the partner scientific communities. The analysis workflow of high-throughput microscopy images for studying biomedical processes is described in detail.
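The abstract describes community-specific metadata schemes backed by a central metadata repository. A minimal sketch of what such a record might look like is given below; all names (the schema identifier, the `lsdf://` handle, the field keys) are illustrative assumptions, not the actual LSDF API.

```python
import json

def make_record(schema: str, dataset_id: str, fields: dict) -> str:
    """Serialize a metadata record against a named community schema.

    Hypothetical sketch: each partner community defines its own schema,
    and records referencing stored datasets are kept in a repository.
    """
    record = {
        "schema": schema,       # community-specific schema identifier
        "dataset": dataset_id,  # handle of the stored data collection
        "metadata": fields,     # key/value pairs defined by the schema
    }
    return json.dumps(record, sort_keys=True)

# Illustrative example: a high-throughput microscopy image set, the
# use case the abstract describes in detail.
rec = make_record(
    "microscopy-v1",
    "lsdf://experiments/htm/plate-0042",
    {"magnification": "20x", "channels": 3, "acquired": "2011-06-14"},
)
print(rec)
```

Keeping the schema name inside each record lets the repository validate and index records per community without imposing one global schema on all experiments.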