{"title":"Performance and Security Challenges in Science Workflows","authors":"D. Ghosal","doi":"10.1145/3322798.3329260","DOIUrl":null,"url":null,"abstract":"Scientific workflows are complex, often generating large amounts of data that need to be processed in multiple stages. The data often generated at remote locations must be transferred from the source and between the distributed HPC nodes interconnected by high-speed networks that carry other background traffic. Increasingly, many of these scientific workflows require processing to be completed within a deadline, which, in turn, imposes deadline on the network data transfer. A recent example of a deadline-driven workflow occurred when LIGO and Virgo detectors observed a gravitational wave signal associated with the merger of two neutron stars. The merger, known as a kilonova, occurred in a galaxy 130 million light-years from Earth in the southern constellation of Hydra. The data from this initial observation had to be processed in a timely manner and sent to astronomers around the world so that they could aim their instruments to the right section of the sky to image the source of the signal.","PeriodicalId":365009,"journal":{"name":"Proceedings of the ACM Workshop on Systems and Network Telemetry and Analytics","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Workshop on Systems and Network Telemetry and Analytics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3322798.3329260","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Scientific workflows are complex, often generating large amounts of data that must be processed in multiple stages. The data, often generated at remote locations, must be transferred from the source to and between distributed HPC nodes interconnected by high-speed networks that also carry background traffic. Increasingly, many of these scientific workflows require processing to be completed within a deadline, which, in turn, imposes a deadline on the network data transfer. A recent example of a deadline-driven workflow occurred when the LIGO and Virgo detectors observed a gravitational wave signal associated with the merger of two neutron stars. The merger, which produced a kilonova, occurred in a galaxy about 130 million light-years from Earth in the southern constellation Hydra. The data from this initial observation had to be processed in a timely manner and sent to astronomers around the world so that they could aim their instruments at the right section of the sky to image the source of the signal.
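To make the deadline argument concrete, the sketch below (not taken from the paper; the function name and all numbers are hypothetical) shows how an end-to-end workflow deadline, minus the time budgeted for downstream processing, implies a deadline on the network transfer and hence a minimum sustained transfer rate.

```python
# Illustrative sketch: a workflow deadline implies a transfer deadline.
# All parameter values below are hypothetical, chosen only for illustration.

def min_transfer_rate_gbps(data_volume_gb: float,
                           workflow_deadline_s: float,
                           processing_time_s: float) -> float:
    """Minimum sustained rate (Gb/s) so that processing can still finish on time."""
    transfer_budget_s = workflow_deadline_s - processing_time_s
    if transfer_budget_s <= 0:
        raise ValueError("Processing alone already exceeds the workflow deadline")
    # Convert GB to Gb (x8) and spread over the remaining transfer budget.
    return (data_volume_gb * 8) / transfer_budget_s


if __name__ == "__main__":
    # Example: 500 GB of detector data, a 30-minute end-to-end deadline,
    # and 20 minutes of processing at the destination HPC site.
    rate = min_transfer_rate_gbps(500, 30 * 60, 20 * 60)
    print(f"Required sustained transfer rate: {rate:.2f} Gb/s")
```

With these assumed numbers the transfer must sustain roughly 6.7 Gb/s; any background traffic on the shared network tightens this requirement further, which is the performance challenge the abstract highlights.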