President's Page: 2023 is off to a good start!
Ellie Ardakani
The Leading Edge, January 2023, https://doi.org/10.1190/tle42010006.1

SEG Innovation Advisor Ellie Ardakani writes about the new SEG Community — an online space to discover opportunities, connect with others who share a passion for geophysics, and create a meaningful impact on the future of SEG.

Spectral extrapolation principles and application: Mindoro Island, Philippines, seismic data
C. Puryear, R. Tharimela, D. Ray, V. Egorov, Graeme Baille, A. Hernandianto
The Leading Edge, January 2023, https://doi.org/10.1190/tle42010044.1

Spectral extrapolation is a bandwidth extension technique that we implement by combining spectral inversion with constraints, time-variant wavelet extraction, and targeted broadband filtering. We explain the principles of spectral extrapolation as a valid and effective bandwidth extension method and demonstrate its application to a 2D onshore Philippines legacy seismic data set using time-variant wavelet extraction, resulting in a tripling of the frequency range of the spectrum. The results indicate significant potential for mapping complex stratigraphy and geomorphological features not evident on the input seismic data images, yielding information about reservoir distribution and connectivity that is often critical for optimal well placement.

Superior resolution through multiparameter FWI imaging: A new philosophy in seismic processing and imaging
J. McLeman, T. Rayment, T. Burgess, K. Dancer, G. Hampson, A. Pauli
The Leading Edge, January 2023, https://doi.org/10.1190/tle42010034.1

Seismic processing and imaging workflows have been refined over many decades to attenuate aspects of the recorded wavefield that would be improperly mapped into the image domain by legacy migration algorithms such as Kirchhoff prestack depth migration. These workflows, which include techniques such as deghosting, designature, demultiple, and regularization, have become increasingly complex and time-consuming due to the sequential fashion in which they must be tested and applied. The single-scattering (primary-only) preprocessed data are then migrated and used in extensive model building workflows, including reflection residual moveout tomography, to refine low-frequency subsurface models. Obtaining optimal results at each stage requires subjective assessment of a wide range of parameter tests. Results can be highly variable, with different decisions resulting in very different outcomes. Such workflows mean that projects may take many months or even years. Full-waveform inversion (FWI) imaging offers an alternative philosophy to this conventional approach. FWI imaging is a least-squares multiscattering algorithm that uses the raw field data (transmitted and reflected arrivals as well as their multiples and ghosts) to determine many different subsurface parameters, including reflectivity. Because this approach uses the full wavefield, the subsurface is sampled more completely during the inversion. Here, we demonstrate the application of a novel multiparameter FWI imaging technique to generate high-resolution amplitude-variation-with-angle reflectivity simultaneously with other model parameters, such as velocity and anisotropy, directly from the raw field data. Because these results are obtained faster than with the conventional workflow, and with higher resolution, improved illumination, and reduced noise, we highlight the potential of multiparameter FWI imaging to supersede the conventional workflow.

Workshop Review: Virtual workshop on AI and machine learning in geophysics draws global audience
S. Brown
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120872.1

A recent SEG workshop enabled discussion among participants from around the world on the application of machine learning and artificial intelligence (AI) to a range of geophysical methods and applications and to geophysical data at various scales. Applications of Machine Learning and AI in Geophysics was organized by SEG's Eurasia Regional Advisory Committee and took place virtually from 10 to 13 May 2022.

President's Page: A perspective from ‘the future’
Samara Omar
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120812.1

Why can't we seem to keep young geophysicists engaged? How many times have we circled back to this question when discussing and planning for “the future of geophysics” as a professional society or at a company level?

Getting the most out of a large data set: A case study for a large 3D seismic interpretation project in the Carnarvon Basin, NW Australia
J. Shadlow, D. Christiansen, Meshari Al-Houli, A. Paxton, Thomas Wilson
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120857.1

A case study is presented for the seismic interpretation of a 3D seismic reprocessing project covering approximately 7200 km2 within a rift basin setting on the Northwest Shelf of Australia. The area includes two main petroleum plays: the Cretaceous Barrow Group Delta and the fluvio-deltaic Triassic Mungaroo Formation. Multiple 3D surveys of varying vintages were reprocessed to provide a unified continuous data set over the area. Amplitude-variation-with-offset seismic inversions were conducted in the time and depth domains to produce acoustic impedance and VP/VS volumes. Depth-domain inversion enabled more accurate inversion products to be developed over a large lateral and vertical zone of interest to assist in prospectivity assessments. Project time and cost constraints indicated that a traditional seismic interpretation process would be ineffective and inefficient. The workflows applied included optimization of the initial horizon interpretation to improve efficiency, machine learning (ML)-based automatic fault interpretation to save time, and bulk horizon interpretation for time savings and rapid stratal slicing. Utilizing ML and automated interpretation processes in conjunction with seismic inversion products enabled a full prospectivity assessment to be developed within six months. In addition to completing the work within the available time, the applied workflows allowed significantly more time to be spent on prospectivity assessment rather than on structural and stratigraphic interpretation.

Inclusion of seismic attributes in reservoir ensemble coverage analysis
E. Lie, T. Bhakta, I. Sandø
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120848.1

Quality-control (QC) methods are lacking to ensure that measured seismic data lie within the span of modeled seismic data in the context of ensemble-based seismic history matching of reservoir models. The dimensionality of seismic data makes it difficult to visualize the data and to compare them efficiently to the large number of models in the ensemble. Two attributes, called coverage and importance, are introduced to capture the key elements of reviewing an ensemble. The coverage attribute delineates where the set of models replicates the measured data, and the importance attribute identifies where it is important to fit the data above the noise threshold. The two attributes are then combined to highlight the spatial areas where our reservoir model ensemble appropriately models the data and where a significant discrepancy exists between our ensemble of models and the measured data. The attributes are closely connected to noise because coverage must always be analyzed relative to the noise level. Although the data may not be explicitly corrected for noise, the methodology corrects the attributes for the assessed noise level. The method is applied to two field seismic data examples: a 4D absolute amplitude-difference map and a 4D relative impedance-difference cube. The first example shows how changing the oil-water contact of the ensemble can improve the coverage without any history matching, and the second shows that it is more difficult to obtain good coverage with 3D seismic attribute cubes than with 2D maps of seismic data. The proposed QC attributes provide tools to better manage coverage of seismic data in the ensemble.

Integration of seismic data in reservoir modeling through seismically constrained facies models
M. Amaru, Lewis Li, Aigul Tyshkanbayeva
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120815.1

Seismic data are an important source of information to guide and constrain reservoir modeling because they sample the subsurface in 3D away from wells. Seismic interpretations are used to constrain the structure of reservoir models. Different seismic attributes can support the identification and definition of stratigraphic features, and seismic inversion products can help constrain the rock properties. Different methods exist for integration of seismic data in the modeling process. Here, we present two new methods. The first method constrains facies definition and modeling with seismic data through a geobody earth modeling approach. The second method updates existing facies models with new seismic data using a Bayesian approach. Both methods are applied to a case study with good-quality seismic data. The results show that the reservoir model becomes more consistent with the observed field seismic data when these fast and repeatable methods are applied (compared to not integrating seismic constraints or using time-intensive manual integration approaches), thus enabling more robust reservoir models and forecasts.

4D petroelastic model calibration using time-lapse seismic signal
D. Rappin, P. Trinh
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120824.1

In the last two decades, 4D seismic monitoring has become a widely used technique for monitoring oil and gas field production. Modeling studies are standard practice for defining reservoir monitoring plans, optimizing survey design, and justifying the expense of data acquisition. Discrepancies between 4D seismic data and synthetic results can be analyzed through petroelastic modeling of reservoir simulations. However, even when a history match is available and the reservoir model and fluid-flow simulation results can be trusted, characterization of pressure and fluid changes in the field remains challenging. A workflow is proposed to adjust the 4D petroelastic model (PEM) to better fit 4D seismic attributes to the dynamic behavior of the reservoir. The input data for 4D inversion consist of multiple broadband, 4D-compliant processed base and monitor surveys recorded in a highly depleted clastic field offshore Africa. The broadband inversion results greatly reduce the background noise level, enhance the signal-to-noise ratio, and improve the definition of 4D signals. Because production effects vary across the field, a new global calibration workflow is proposed to speed up adjustment of the 4D PEM. The combination of good 4D seismic inversions and a well-calibrated PEM is expected to have a significant impact on reservoir monitoring. During the calibration process, discrepancies between the reservoir model and 4D seismic attributes can be identified, suggesting updates to the reservoir model. In addition, when further monitors are considered, the calibrated 4D PEM provides more reliable predictions.

Seismic Soundoff: Integrating digital transformation to improve business processes
A. Geary
The Leading Edge, December 2022, https://doi.org/10.1190/tle41120884.1

In this episode, Steve Darnell, the president and CEO of Katalyst Data Management, discusses how digital transformation improves business processes. He describes the importance of cybersecurity, how to start the digitalization process, and the common obstacles companies face when embracing digital transformation. He also comments on the common misperceptions and the hidden benefits of embracing digital advancements. This conversation connects to all parts of the oil and gas workflow and showcases the value proposition for companies. Hear the full episode at https://seg.org/podcast/post/15881.