Improved Information Retrieval From Well Related Documents Using Supervised Learning
Glenn Miers, M. Czernuszenko, Brian Hughes. DOI: 10.2118/210146-ms

We introduce a system for rapid retrieval of relevant well-related information from a corpus of over 20 million documents, allowing exploration workers to retrieve important business data more quickly. Tracking down all of the information required to make complex business decisions is a time-consuming and error-prone process, and it poses a direct risk of expensive miscalculations and missed opportunities. A first version of this system is currently undergoing tests with select users, and because this is the first version, improvements are expected. The system can be used at enterprise scale to help searches yield usable information to workers more easily.
The system uses a supervised learning model to identify well-related documents across several categories, including (but not limited to) formation evaluation and well completion reports. A machine learning model was trained to classify documents based on input from a well-document expert, provided as a set of labeled documents compiled by that expert. The model was then applied to over 20 million documents deemed relevant to the exploration process, and the inferred classification for each document was stored in a search engine to facilitate retrieval of documents by each of these labels. The benefits of the system are twofold. First, it reduces the number of documents returned by a given search of a large corpus. Second, it allows users without technical experience in well-related work to find documents more easily.
Improving Banyu Urip Acid Gas Removal Unit (AGRU) and Acid Gas Enrichment (AGE) System Performance and Reliability by Implementing an Effective Wetted Surface Air Cooler (WSAC) Chemical Treatment Program
Nurania Saubryani, S. Kaswan, M. Gough, Rifky Akbar. DOI: 10.2118/210018-ms
The Banyu Urip production facility, located in East Java, Indonesia, currently produces ca. 30% of the country's daily oil production. Field fluids are sour, with high H2S (1.6%) and CO2 (45%) in the gas, which is conditioned prior to its use as fuel, for Sulphur Recovery, or for reinjection. Gas conditioning takes place in two amine units, the Acid Gas Removal Unit (AGRU) and the Acid Gas Enrichment Unit (AGE). Both units use aqueous MDEA as the amine solvent, with Wetted Surface Air Coolers (WSAC) used to cool hot lean amine off the regenerator columns.
In early operation, both conditioning units operated at the design case. In the period 2018–2020, however, the WSACs became progressively fouled with scale and algae, which led to a decrease in thermal transfer efficiency and a consequent decline in plant performance and reliability. SOx emissions were also negatively impacted.
To resolve the fouling and its detrimental consequences, a chemical treatment program was developed and implemented. The program involved laboratory qualification of candidate chemicals, including evaluation in a novel pilot skid that accurately simulated WSAC field conditions, followed by extended field trials. System performance was evaluated, which verified the pilot skid test results, and the program was implemented on a continuous basis. Extensive surveillance of multiple chemical and operational parameters was performed, and through critical evaluation of the derived data sets, improvements in operational practices were implemented and unit performance gains realized.
Implementation of the program has improved the reliability of the Fuel Gas Compressors (FGC), reducing monthly Gas Turbine Generator (GTG) diesel consumption rates by a factor of more than 6. In addition, AGE operational improvements reduced net SOx emissions for the facility by ca. 70% (2019 vs. 2021) through a reduction in Thermal Oxidizer feed gas H2S content and lower LP flaring.
{"title":"Improving Banyu Urip Acid Gas Removal Unit (AGRU) and Acid Gas Enrichment (AGE) System Performance and Reliability by Implementing an Effective Wetted Surface Air Cooler (WSAC) Chemical Treatment Program","authors":"Nurania Saubryani, S. Kaswan, M. Gough, Rifky Akbar","doi":"10.2118/210018-ms","DOIUrl":"https://doi.org/10.2118/210018-ms","url":null,"abstract":"\u0000 The Banyu Urip production facility located in East Java, Indonesia; currently produces ca. 30% of the country's daily oil production. Field fluids are sour with high H2S (1.6%) and CO2 (45%) in the gas, which is conditioned prior to it's use as fuel, for Sulphur Recovery, or for reinjection. Gas conditioning takes place in two amine units, the Acid Gas Recovery Unit (AGRU) and the Acid Gas Enrichment Unit (AGE). Both units use aqueous MDEA as the amine solvent, with Wetted Surface Air Coolers (WSAC) used to cool hot lean amine off the regenerator columns.\u0000 In early operation both conditioning units operated at design case. In the period 2018-2020 however, the WSACs became progressively fouled with scale and algae which led to a decrease in thermal transfer efficiency and a consequential decline in plant performance and reliability. SOx emissions were also impacted negatively.\u0000 To resolve fouling and its detrimental consequences, a chemical treatment program was developed and implemented. The program involved laboratory qualification of candidate chemicals, including evaluation in a novel pilot skid that accurately simulated WSAC field conditions; followed by extended field trials. System performance was evaluated, which verified the pilot skid test results, and the program was implemented on a continuous basis. Extensive surveillance of multiple chemical and operational parameters was performed, and with critical evaluation of these derived data sets, improvements in operational practices were implemented, and unit performance gains realized.\u0000 Implementation of the program has improved the reliability of the Fuel Gas Compressors (FGC) reducing monthly Gas Turbine Generator (GTG) diesel consumption rates by a factor of > 6. Secondly, AGE operational improvements reduced net SOx emissions for the facility by ca. 70% (2019 vs 2021) through a reduction in Thermal Oxidizer feed gas H2S content, and in lowering LP flaring.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123267223","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Regulatory Framework Effects in the Recovery Factor, A New Approach from the Competition Concentration Analysis O&G Sector, Comparative Case Norway and Colombia 2000-2016
Eusebio Jose Orozco Cera, Felipe Romero Consuegra. DOI: 10.2118/210348-ms

This study shows that measures to reduce concentration in hydrocarbon markets should be considered, and that regulatory options must be addressed to provide incentives and to avoid and/or reduce the impact of concentration-related competitiveness deficiencies on the long-run efficiency of oil field recovery. The authors approach the theme of the natural resource curse and the effects of the regulatory framework on the hydrocarbon recovery factor and its economic incentives for Norway and Colombia, considering particularities of tax burden, legal framework, and sector structure. The analysis compares measures of hydrocarbon market concentration across the downstream, midstream, and upstream segments of both countries, with emphasis on the particularities of regulating access to the pipeline system, and derives results, observations, conclusions, and recommendations for Colombia, Norway, and, more generally, for economies exposed to hydrocarbon sectors with national state companies.
{"title":"Regulatory Framework Effects in the Recovery Factor, A New Approach from the Competition Concentration Analysis O&G Sector, Comparative Case Norway and Colombia 2000-2016","authors":"Eusebio Jose Orozco Cera, Felipe Romero Consuegra","doi":"10.2118/210348-ms","DOIUrl":"https://doi.org/10.2118/210348-ms","url":null,"abstract":"\u0000 This study shows that measures to reduce concentration in hydrocarbon markets should be considered and added and regulatory options must be addressed to provide incentives and avoid and / or reduce the impact of the concentration on competitiveness deficiencies for the efficiency of recovery of oil fields in the long-run.\u0000 The authors approach the theme of the curse of natural resources and the Regulatory Framework Effects in the Recovery Factor from the hydrocarbons and its economic incentives for Norway and Colombia with some of the particularities of Tax burden, Legal Framework and Structure of the Sector, addressing an analysis and comparison of measures of hydrocarbons market concentration for its Downstream, Midstream and Upstream of both countries emphasizing the particularities in the Regulation of access to the Pipeline System obtaining Results, Observations, Conclusions and Recommendations for Colombia, Norway and in general to economies exposed to hydrocarbon sectors with national state companies.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129804394","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Evaluating Performance and Energy Efficiency of Hybrid Cyclic Steam Stimulation Technologies with a Novel Experimental Setup
H. García, Romel Perez, Hector Rodríguez, B. Sequera-Dalton, M. Ursenbach, S. Mehta, R. G. Moore, D. Gutiérrez, E. Manrique. DOI: 10.2118/210459-ms
An experimental program has been designed and executed to evaluate the performance of hybrid Cyclic Steam Stimulation (CSS) recovery methods. The overarching goal is to improve the energy efficiency and reduce the carbon footprint of CSS in Colombian heavy oil fields. Specifically, this work compares the impact that adding solvent or flue gas to cyclic steam injection has on the recovery of a recombined heavy live oil at laboratory scale.
A novel experimental setup was designed to evaluate hybrid CSS methods; it uses a ballast system to allow fluids to be displaced out of the core during injection cycles and returned to the core during soaking and production periods. A CSS baseline test and two hybrid CSS tests were performed at reservoir conditions (RC) with recombined live oil and core material from a Colombian heavy oil field. Each test consisted of four cycles with the same amount of steam injection. The hybrid CSS tests consisted of a steam-solvent test and a steam-flue gas test.
The CSS baseline and hybrid CSS tests were successfully performed in the core pack with the injection of 0.12 pore volume CWE (Cold Water Equivalent) of steam per cycle, at a core pressure near 680 psig and an initial core temperature of 45°C. In addition, the steam-solvent and steam-flue gas hybrid tests injected near 0.01 and 0.05 PV (CWE) of solvent and flue gas per cycle, respectively. The steam front location during each cycle was identified from temperature profiles recorded along the core during the tests. Core pressures and fluid volumes displaced to and from the ballast were also recorded. Post-test core analyses allowed the residual liquid saturations after each test to be estimated. The addition of solvent or flue gas did not hinder the CSS oil recovery process, which was on the order of 40% for all tests. The recovery, energy efficiency, and carbon footprint of the hybrid CSS tests are compared to the CSS baseline case. Although a small amount of hydrogen sulphide (H2S) was detected at the end of the CSS baseline test, H2S was not detected in the produced gas of the hybrid tests.
The experimental program enhanced the understanding of hybrid cyclic steam methods and the impact of solvent and flue gas addition on the recovery, energy efficiency, and carbon footprint reduction of heavy oil CSS recovery processes. These results assist in improving CSS performance and provide key data for tuning numerical models. This novel experimental apparatus is one of a kind in that it captures the cyclic nature of fluid movement during CSS.
{"title":"Evaluating Performance and Energy Efficiency of Hybrid Cyclic Steam Stimulation Technologies with a Novel Experimental Setup","authors":"H. García, Romel Perez, Hector Rodríguez, B. Sequera-Dalton, M. Ursenbach, S. Mehta, R. G. Moore, D. Gutiérrez, E. Manrique","doi":"10.2118/210459-ms","DOIUrl":"https://doi.org/10.2118/210459-ms","url":null,"abstract":"\u0000 An experimental program has been designed and executed to evaluate the performance of hybrid Cyclic Steam Stimulation (CSS) recovery methods. The overarching goal is to improve the energy efficiency and reduce the carbon footprint of CSS in Colombian heavy oil fields. Specifically, this work compares the impact that adding solvent or flue gas to cyclic steam injection has on the recovery of a recombined heavy live oil at a laboratory scale.\u0000 A novel experimental setup was designed to evaluate hybrid CSS methods, which allows displacement of fluids out of the core during injection cycles and the return of those fluids to the core during soaking and production periods, by the use of a ballast system. A CSS baseline test and two hybrid CSS tests were performed at reservoir conditions (RC) with recombined live oil and core material from a Colombian heavy oil field. Each test consisted of four cycles with the same amount of steam injection. The hybrid CSS tests consisted of a steam-solvent and a steam-flue gas hybrid test.\u0000 The CSS baseline and the hybrid CSS tests were successfully performed in the core pack with the injection of 0.12 pore volume CWE (Cold Water Equivalent) of steam per cycle, at core pressure near 680 psig and an initial core temperature of 45°C. In addition, steam-solvent and steam-flue gas hybrid tests injected near 0.01 and 0.05 PV (CWE) of solvent and flue gas per cycle, respectively. The steam front location during each cycle was identified with temperature profiles recorded along the core during the tests. Core pressures and fluid volumes displaced to and from the ballast were also recorded. Post-test core analyses allowed to estimate residual liquid saturations after each test. The addition of solvent or flue gas did not hinder the CSS oil recovery process which was in the order of 40% for all tests. The recovery, energy efficiency and carbon footprint of the hybrid CSS tests are compared to the CSS baseline case. Although a small amount of hydrogen sulphide (H2S) was detected at the end of the CSS baseline test, H2S was not detected in the produced gas of the hybrid tests.\u0000 The experimental program enhanced the understanding of hybrid steam cyclic methods and the impact of solvent and flue gas addition on the recovery, energy efficiency and carbon footprint reduction of heavy oil CSS recovery processes. These results assist in the quest of improving CSS performance and provide key data for tuning numerical models. 
This novel experimental apparatus is one of a kind as it captures the cyclic nature of fluid movement during CSS.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129308618","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
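One common way to express the energy efficiency being compared across the baseline and hybrid tests is a cumulative steam-oil ratio built from the injected cold-water-equivalent steam and the produced oil per cycle. The sketch below shows that bookkeeping; only the 0.12 PV CWE per cycle figure comes from the abstract, while the pore volume and per-cycle oil volumes are hypothetical placeholders.

```python
# Cumulative steam-oil ratio (SOR) bookkeeping for a four-cycle CSS test.
# The 0.12 PV CWE of steam per cycle is from the abstract; the pore volume and
# per-cycle produced-oil volumes are hypothetical placeholders, not test data.
pore_volume_cm3 = 1000.0
steam_per_cycle_pv = 0.12                    # cold water equivalent, fraction of pore volume
oil_produced_cm3 = [18.0, 14.0, 10.0, 8.0]   # hypothetical production per cycle

steam_injected_cm3 = steam_per_cycle_pv * pore_volume_cm3 * len(oil_produced_cm3)
cumulative_sor = steam_injected_cm3 / sum(oil_produced_cm3)  # lower SOR means better energy efficiency
print(f"cumulative SOR = {cumulative_sor:.1f} cm3 steam (CWE) per cm3 oil")  # 9.6
```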
Reservoir Pressure Gradient Trend Prediction for the Potash Area of Delaware Basin Using Artificial Neural Network and Geophysical Log Cross Sections
Olabode Ajibola, J. Sheng, P. McElroy, Christopher Armistead, James Rutley, J. Smitherman. DOI: 10.2118/210031-ms
Historically, there have been controversies between the oil & gas companies and potash miners in the Secretarial Order Potash Area (SOPA) of the Delaware Basin. These disputes mostly stem from high-pressure-related operational failures in the area. To reduce these operational anxieties, it is vital to calculate the reservoir pressures, verify the pressures with machine learning predictions, and use the verified pressures to build pressure trend profiles using geophysical log cross-sections.
To fulfill these objectives, the methodology starts with the calculation of reservoir pressures for the area using drilling data. The calculated pressures are then verified against Artificial Neural Network (ANN) machine learning predictions that utilize well logs and drilling parameters. The verified reservoir pressures are then used to build pressure trend profiles using geophysical log cross-sections. Parameters used in building the ANN include deep, medium, and shallow laterolog resistivity logs, gamma ray log, neutron and density porosity logs, calculated overburden stress, cable tension log, well, caliper log, depth, lithology, mud weight, photoelectric cross-section log, calculated average porosity, calculated water saturation, corrected bulk density log, and bulk density log.
Potash is mined in a limited area in the southeastern portion of the state of New Mexico. This "potash area" has been afforded special status by the Department of the Interior through several Orders authored by the then Secretary of the Interior; in this work, it is referred to as the Secretarial Order Potash Area, or SOPA. The reservoir pressure gradients were calculated from the hydrostatic gradients of over 229 selected wells drilled and completed within the SOPA. The ANN model was built in three steps: data manipulation, analysis, and deployment. The reservoir pressures were predicted by the ANN with high accuracy: the correlation coefficients (R) for training, validation, and testing are 0.978, 0.985, and 0.976, respectively; the Mean Square Error (MSE) was 2.9129 after 136 epochs, the optimum number of iterations; and the overall correlation coefficient is greater than 0.979. These results show that the ANN model predicted the measured reservoir pressures accurately for the potash area. Next, geophysical log cross-sections were created in 2-dimensional and 3-dimensional profiles with the verified reservoir pressures using Petra, Matlab, IHS Kingdom, and the R language. Three west-to-east cross-sections were created for the three portions of the area, namely the Back-reef, Reef, and Basin, and a fourth cross-section was created from the North (Back-reef) to the South (Basin) through the Reef. The cross-sections are displayed showing formation strata, depths, and pressure trends. The information gained from this study will be used to optimize the economic recovery of oil and gas.
{"title":"Reservoir Pressure Gradient Trend Prediction for the Potash Area of Delaware Basin Using Artificial Neural Network and Geophysical Log Cross Sections","authors":"Olabode Ajibola, J. Sheng, P. McElroy, Christopher Armistead, James Rutley, J. Smitherman","doi":"10.2118/210031-ms","DOIUrl":"https://doi.org/10.2118/210031-ms","url":null,"abstract":"\u0000 Historically, there has been controversies between the oil & gas companies and potash miners in the Secretarial Order Potash Area (SOPA) of Delaware basin. Mostly, these disputes are based on high pressure related operational failures in the area. To reduce these operational anxieties, it is vital to calculate the reservoir pressures, verify the pressures with machine learning predictions, and use the verified pressures to build pressure trend profiles using geophysical log cross-sections.\u0000 To fulfil the above-mentioned objectives, the methodology used in the process starts with the calculation of reservoir pressures for the area using drilling data. The calculated pressures are then verified with Artificial Neural Network (ANN) machine learning model predictions utilizing well logs and drilling parameters. The verified reservoir pressures are then used to build pressure trend profiles using geophysical log cross-sections. Parameters used in building the ANN include deep, medium, & shallow laterolog resistivity logs, gamma ray log, neutron & density porosity logs, calculated overburden stress, cable tension log, well, caliper log, depth, lithology, mud weight, photoelectric cross-section log, calculated average porosity, calculated water saturation, corrected bulk density log, and bulk density log.\u0000 Potash is mined in a limited area in the southeast portion of the state of New Mexico. This \"potash area\" has been afforded special status through the Department of the Interior through several Orders authored by the then Secretary of the Interior. In this work, this \"potash area\" will be known as the Secretarial Order Potash Area or SOPA. The reservoir pressure gradients were calculated according to the hydrostatic gradients of over 229 selected wells drilled and completed within the SOPA. The ANN model was built using 3 steps including data manipulation, analysis, and deployment. The reservoir pressures were predicted by the Artificial Neural Network (ANN) with high accuracy. The correlation coefficient, R for the training, validation, and testing are 0.978, 0.985, and 0.976, respectively. The Mean Square Error (MSE) was 2.9129 after 136 epochs optimum number of iterations. The overall correlation coefficient (R) is greater than 0.979. These results show that ANN models predicted the measured reservoir pressures accurately for the potash area. Next, the geophysical log cross-sections were created in 2-Dimensional and 3-Dimensional profiles with the verified reservoir pressures using Petra, Matlab, IHS Kingdom, and R machine language. Three west to east cross-sections were created for the three portions of the area namely Back-reef, Reef, and Basin separately. The fourth cross-section was created from the North (Back-Reef) to the South (Basin) through the Reef. 
The cross sections are displayed showing formation strata, depths, and pressure trends.\u0000 The information gained from this study will be used to optimize the economic recovery of oil and gas ","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127707351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
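The abstract does not tie the ANN to a particular library, so the sketch below is only a generic illustration of the reported workflow: scale the log-derived and drilling inputs, fit a small feed-forward network, and report the correlation coefficient R on held-out data. The synthetic features and target stand in for the real well-log parameters and measured reservoir pressures.

```python
# Generic sketch of the ANN regression workflow described above (synthetic data;
# the study used real well-log and drilling inputs and its own tooling).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))      # stand-ins for resistivity, gamma ray, porosity, depth, mud weight, ...
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)   # stand-in for reservoir pressure

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
model.fit(X_train, y_train)

# Correlation coefficient R between predicted and measured values, the accuracy
# metric quoted in the abstract (training/validation/testing R of ~0.98).
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(f"test R = {r:.3f}")
```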
Monitoring Hole-Cleaning during Drilling Operations: Case Studies with a Real-Time Transient Model
Pedro J. Arévalo, M. Forshaw, A. Starostin, Roger Aragall, S. Grymalyuk. DOI: 10.2118/210244-ms
Steady-state hole-cleaning models used to monitor cuttings during well construction rely on static parameters that portray specific drilling scenarios disconnected from each other. This paper presents the integration of transient hole-cleaning models, validated in the field, into a digital twin of the wellbore deployed while drilling. This enables monitoring of the evolution of cuttings, which reduces uncertainty around the state of hole-cleaning and minimizes the associated risk.
A digital twin of the wellbore equipped with physics-based transient models is prepared in the planning phase and later deployed to a real-time environment. While drilling, smart triggering algorithms constantly monitor drilling parameters at surface and downhole to automatically update the digital twin and refine simulation results. The physics-based transient model continuously estimates cuttings suspended in the drilling mud and cuttings deposited as stationary beds, which enables evaluation of cuttings distributions along the wellbore in real time. Automation systems consume the predicted results via an aggregation layer to refine fit-for-purpose hole-cleaning monitoring applications deployed at the rig.
The transient hole-cleaning model has been integrated into digital twins used during pre-job planning as well as in real-time environments. The system deployed in real time successfully tracks the state of cuttings concentration in the wellbore during all operations (drilling, tripping, off-bottom circulation, connections), considering the effects of high temperature and high pressure on the drilling fluid. Moreover, since the model uses previous results as the starting point for the next estimation cycle, it creates a dynamic prediction of how the cuttings evolve while drilling. Fit-for-purpose automation and monitoring services predict drilling issues related to hole-cleaning, downhole pressure, and other factors. Drillers and drilling optimization personnel receive actionable information to mitigate hole-cleaning issues and avoid detrimental effects on operations. The user interface (UI) presents how the cuttings distribution changes with the evolution of input parameters (rate of penetration, string rotation, and flow rate).
A set of case studies confirms the effectiveness of the approach and illustrates its benefits. One case study from the North Sea illustrates the reaction of the model to changing operational parameters, while another combines along-string measurements of density with the cuttings predictions to confirm the trend established by the predicted cuttings concentration.
{"title":"Monitoring Hole-Cleaning during Drilling Operations: Case Studies with a Real-Time Transient Model","authors":"Pedro J. Arévalo, M. Forshaw, A. Starostin, Roger Aragall, S. Grymalyuk","doi":"10.2118/210244-ms","DOIUrl":"https://doi.org/10.2118/210244-ms","url":null,"abstract":"\u0000 Steady-state hole-cleaning models used to monitor cuttings during well construction rely on static parameters that portrait specific drilling scenarios disconnected from each other. This paper presents the integration of transient hole-cleaning models validated in the field into a digital twin of the wellbore deployed while drilling. Thus, enabling the monitoring of the evolution of cuttings, which reduces uncertainty around the state of hole-cleaning procedures and minimizes the associated risk.\u0000 A digital twin of the wellbore equipped with physics-based transient models is prepared in the planning phase, and later deployed to a real-time environment. While drilling, smart triggering algorithms constantly monitor drilling parameters at surface and downhole to automatically update the digital twin and refine simulation results. The physics-based transient model continuously estimates cuttings suspended in the drilling mud and cuttings deposited as stationary beds, which enables evaluation of cuttings distributions along the wellbore in real time. Automation systems consume the predicted results via an aggregation layer to refine fit-for-purpose hole-cleaning monitoring applications deployed at the rig.\u0000 The transient hole-cleaning model has been integrated into digital twins used during pre-job planning as well as in real-time environments. The system deployed in real-time successfully tracks the state of cuttings concentration in the wellbore during all operations (drilling, tripping, off-bottom circulation, connections) considering the effects of high-temperature and high-pressure on the drilling fluid. Moreover, since the model uses previous results as starting point for the next estimation cycle, it creates a dynamic prediction of how the cuttings evolve while drilling. Fit-for-purpose automation and monitoring services predict drilling issues related to hole-cleaning, downhole pressure, among others. Drillers and drilling optimization personnel receive actionable information to mitigate hole-cleaning issues and avoid detrimental effects for operations. The user interface (UI) presents how the cuttings distribution change with evolution of input parameters (rate of penetration, string rotation, and flow rate).\u0000 A set of case studies confirm the effectiveness of the approach and illustrate its benefits. One case study from the North Sea illustrates the reaction of the model to changing operational parameters, while another combines along-string-measurements of density with the cuttings predictions to confirm the trend established by the predicted cuttings concentration.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125446328","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Investigation into Non-Reservoir Components that Undermine Reservoir Responses in Transient-Test Data
N. Rahman, S. Sarac. DOI: 10.2118/210366-ms

Transient-test data can be significantly affected by subtle non-reservoir noise generated by natural or operational factors. The impact of such noise on pressure-transient analysis depends on the signal-to-noise ratio achieved during the transient test, and a misleading interpretation of the reservoir characteristics can often result. This study quantifies such effects on the interpretation results by evaluating the signal-to-noise ratio achieved in wireline testing operations, deep transient testing, drill-stem testing, and production testing.
The effects of non-reservoir factors can be difficult to identify and often lead to misrepresentations or misinterpretations. Analytical and numerical reservoir simulations are used to illustrate quantitative criteria for defining acceptable operating conditions and preferable techniques for pressure-transient tests, depending on the reservoir characteristics. The convoluted effects of noise, drift, gauge resolution, and periodic tides have been quantitatively evaluated to demonstrate the situations in which the reservoir signal is too weak to achieve meaningful characterization. Different pressure-transient techniques are evaluated with a focus on the signal-to-noise ratio.
Certain disruptive behaviors of equipment and nature tend to distort the measurements performed during such tests, and depending on the amount of disruption, there are situations in which the test objective may not be achieved at all. Failure to create a dominant reservoir response can result from an insufficient signal-to-noise ratio for the achieved rate of production and pressure drawdown, which is a function of formation and fluid properties and/or the mechanical environment. A minimum rate of production is needed to create a signal-to-noise ratio large enough to interpret the reservoir response correctly. The paper helps determine the minimum rate of production and the duration of flow needed to establish the presence of deep heterogeneities or boundaries with a reasonable level of certainty. If a test is run at a rate lower than this critical value, the data will be biased by hardware or natural factors that are unrelated to the reservoir signals. Illustrative examples are also presented to show how misleading characteristics of the reservoir and the well can be deduced without a sufficient signal-to-noise ratio.
By quantifying the non-reservoir factors through the corresponding signal-to-noise ratios, the study provides a practical guide for selecting a proper testing method from a quantitative point of view.
{"title":"An Investigation into Non-Reservoir Components that Undermine Reservoir Responses in Transient-Test Data","authors":"N. Rahman, S. Sarac","doi":"10.2118/210366-ms","DOIUrl":"https://doi.org/10.2118/210366-ms","url":null,"abstract":"\u0000 Transient-test data can be significantly affected by subtle non-reservoir noise generated by natural or operational factors. The impact of such noise on the pressure-transient analysis depends on the signal-to-noise ratio achieved during transient tests. Often, a misleading interpretation of the reservoir characteristics can result. This study will quantify such effects on the interpretation results by evaluating the signal-to-noise ratio as in wireline testing operations, deep transient testing, drill-stem testing and production testing.\u0000 Effects of non-reservoir factors can be difficult to identify, and often may lead to misrepresentations or misinterpretations. Analytical and numerical reservoir simulations will be used to illustrate quantitative criteria of defining the acceptable operating conditions and preferable techniques for pressure-transient-tests, depending on the reservoir characteristics. Convoluted effects of noise, drift, resolution, periodic tides have been quantitatively evaluated to demonstrate the situations when the reservoir signal is too weak to achieve meaningful characterization. Different pressure-transient techniques will be evaluated with a focus on the signal-to-noise ratio.\u0000 Certain disruptive behaviors of equipment and nature tend to distort the measurements performed during such tests. Depending on the amount of disruption caused in the measurements during the tests, there are situations when the test objective may not be achieved at all. Failure to create dominant reservoir responses can result from an insufficient signal-to-noise ratio with the rate of production and pressure drawdown. It is a function of formation and fluid properties and/or mechanical environment. A minimum rate of production is needed for creating a necessary magnitude of signal-to-noise ratio to interpret correctly the reservoir response. The paper will help determine the minimum rate of production and the duration of flow needed to obtain the presence of deep heterogeneities or boundaries with a reasonable level of certainty. If a test is run with a rate lower than the critical value, for example, the data will be biased by other hardware or natural factors that are unrelated to the reservoir signals. Illustrative examples will also be presented to show how misleading characteristics of the reservoir and the well can be deduced without sufficient signal-to-noise ratios.\u0000 This study will quantify the non-reservoir factors by evaluating the corresponding signal-to-noise ratios. As a result, a practical guide will be created for selecting a proper testing method from a quantitative point of view.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131867080","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Passive Flow Control Nozzle for Water Choking Application
Dachuan Zhu, M. Soroush, G. Rosi, R. Scott. DOI: 10.2118/210411-ms

Reducing water production is a primary problem in the oil and gas industry. A few flow control technologies with moving parts are available on the market to choke back water; however, the main issue with those technologies is the potential for plugging and scaling. In this paper, we introduce a novel passive flow control nozzle that has no moving parts inside: all of the choking is implemented through its internal geometry, so the risk of plugging and scaling is significantly mitigated.
A passive flow control nozzle designed specifically for water choking is presented, and its design philosophy in fluid mechanics is introduced in detail. The results of Computational Fluid Dynamics (CFD) and physical flow-loop testing are shown to evaluate the performance of the technology. It is shown that a passive choking nozzle can choke back more than 40% of water compared to an orifice while maintaining oil production rates. We also perform simulation case studies to compare conventional slotted liner completions with completions equipped with a passive choking nozzle (PCN), and show that the nozzle can effectively choke back water and promote oil production in a long horizontal well. Finally, we briefly discuss how the passive nozzle can mitigate well-known issues such as scaling and plugging.
{"title":"A Passive Flow Control Nozzle for Water Choking Application","authors":"Dachuan Zhu, M. Soroush, G. Rosi, R. Scott","doi":"10.2118/210411-ms","DOIUrl":"https://doi.org/10.2118/210411-ms","url":null,"abstract":"\u0000 Reducing water production is the primary problem in the oil and gas industry. There are a few flow control technologies with moving parts available on the market to choke back water. However, the main issue with those technologies is the potential of plugging and scaling. In this paper, we will introduce a novel passive flow control nozzle, which has no moving part inside. All the choking is implemented through its internal geometry. Therefore, the risk of plugging and scaling will be significantly mitigated.\u0000 In this paper, a passive flow control nozzle designed specifically for water choking will be presented. Design philosophy in fluid mechanics will be introduced in detail. The results of Computational Fluid Dynamics (CFD) and physical flow loop testing will be shown to evaluate the performance of the technology. It is shown that a passive choking nozzle can choke back more than 40% of water compared to an orifice while maintaining oil production rates. We will also perform simulation case studies to compare conventional slotted liner completions with the completions equipped with a passive choking nozzle (PCN). We will show that the nozzle can effectively choke back water and promote oil production in a long horizontal well. Finally, we will briefly discuss how the passive nozzle can mitigate well-known issues such as scaling and plugging.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115325208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Innovative Solution for Any Wet Gas and Multiphase Flowmeters with in situ Flow Validation
B. Pinguet. DOI: 10.2118/209982-ms

Accurately measuring oil, water, and gas flow rates is a significant challenge for the oil and gas sector. Multiphase flow meters and wet gas flow meters (MPFMs) have opened the door to the development of marginal assets and promote more efficient production of large fields with continuous monitoring. However, an MPFM must be calibrated, and a correct uncertainty assessment is necessary, particularly for the allocation method. The new paradigm is to calibrate them in the field, as well as to achieve in situ validation, to significantly cut OPEX. Indeed, MPFM manufacturers frequently charge a monthly fee to ensure the technology's performance over time without being able to independently assert the MPFM's performance, leaving the end user to conduct their own tests to determine the MPFM's true field performance.
How do we address in situ validation of MPFM performance? There are two methods. The first is to take the manufacturer's statement, the literature, and the laboratory's knowledge to establish the performance at line conditions and to ensure that the estimations are accurate. Monte Carlo simulation analysis is one way to do this: it makes it possible to define the key parameters to monitor in order to determine whether the MPFM is still in a healthy condition and within its sweet range. But in this extremely conventional and conservative business, this strategy is sometimes viewed as too computationally driven and not as strong as the second method, which is to perform an MPFM performance review at the well site, either by remote witnessing or by a physical third-party service.
This process is typically followed when there are uncertainties about the MPFM's performance, but it requires supplementary equipment for verification. Third-party experts are frequently consulted at an early stage to advise on what might be required as the best metering solution to define and use as a reference, bearing in mind that space, timing, and measurement principles must be simple to comprehend in order to establish or confirm the performance of the so-called reference flowmeter.
Our research has established that reported MPFM performance is, on average, too optimistic when based on the manufacturer's claims alone. It was demonstrated that manufacturers rarely disclose the predicted output specification (i.e., uncertainty) of oil, water, and gas flow rates to the end user; rather, they provide a mixture of various output parameters at line conditions. Given the lack of competencies in fluid behavior (i.e., PVT) necessary to convert flow rates to standard conditions, there is no way to establish a correct performance statement for the end user. This leaves the end user to translate, calculate, and convert any stated numbers to the expected parameters and associated values by themselves, and only sometimes have the manufacturers provided enough relevant data or information to achieve this.
Finally, there are no standard requirements that can be applied directly because of the complexity and multiphase nature of the measurement.
{"title":"An Innovative Solution for Any Wet Gas and Multiphase Flowmeters with in situ Flow Validation","authors":"B. Pinguet","doi":"10.2118/209982-ms","DOIUrl":"https://doi.org/10.2118/209982-ms","url":null,"abstract":"\u0000 Accurately measuring oil, water, and gas flow rates is a significant difficulty for the oil and gas sector. Multiphase flow meters or wet gas flow meters (i.e. MPFM) have opened the door to the development of marginal assets and promote more efficient production of a large field with continuous monitoring. However, this MPFM must be calibrated, and a correct uncertainty assessment is necessary, particularly for the allocation method. The new paradigm is to calibrate them in the field, as well as to achieve in situ validation, to significantly cut OPEX. Indeed, MPFM's manufacturers frequently charge a monthly fee to ensure the technology's performance over time without being able to independently assert the MPFM's performance, leaving the end-user to conduct their tests to determine the MPFM's true field performance.\u0000 How do we address the in situ of MFPM performance?\u0000 There are two methods. The first is to take the manufacturer's statement, literature, and the laboratory's knowledge to establish the performance at line conditions, to ensure that the estimations are accurate. Monte Carlo simulation analysis is a way to do it. It is possible to define the key parameters to monitor to determine whether the MPFM is still in a healthy condition and within the sweet range. But in this extremely conventional and conservative business, this strategy is sometimes viewed as too much data computational driven and not as strong as the second method which is to do an MPFM performance review at the well site, either by a remote witnessing or a physical third party service.\u0000 This process is typically done if there are any uncertainties about the MPFM's performance but requires supplementary equipment to verify it. Third-party experts are frequently consulted at an early stage to advise on what might be required as the best metering solution to define and use as a reference, bearing in mind that space, timing, and measurement principles must be simple to comprehend to establish or confirm the performance of the so-called reference flowmeter.\u0000 Our research has established that reported MPFM performance is, on average, too optimistic, based on the manufacturer's claims only. It was demonstrated that manufacturers rarely disclose the predicted output specification (i.e. uncertainty) of oil, water, and gas flow rates to the end-user. Rather than that, they provide a mixture of various output parameters at line conditions. And to the lack of competencies in fluid behavior (i.e. PVT) necessary to convert flow rate to standard conditions, there is no way to establish a correct performance statement for the end-user.\u0000 This leaves the end-user to translate/calculate/convert any stated numbers to the expected parameters and associated values by themselves. Sometimes, the manufacturers have provided them with enough relevant data or information to achieve this. 
Finally, there are no standard requirements that can be applied directly because of the complexity and multipha","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"94 14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124251094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
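The first in-situ method described above leans on Monte Carlo simulation to establish what the meter's line-condition uncertainties imply for the reported rates. The sketch below shows that propagation in its simplest form; the uncertainty figures are hypothetical, and the PVT conversion to standard conditions discussed in the abstract is deliberately left out.

```python
# Minimal Monte Carlo propagation of MPFM line-condition uncertainties to the
# oil rate. Uncertainty figures are hypothetical, not a manufacturer's spec.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
liquid_rate = rng.normal(1000.0, 1000.0 * 0.03, n)  # m3/d, 3% relative uncertainty (1 sigma)
water_cut = rng.normal(0.60, 0.02, n)               # water-liquid ratio, +/- 2 points absolute

oil_rate = liquid_rate * (1.0 - water_cut)          # oil rate at line conditions
mean_rate = oil_rate.mean()
p5, p95 = np.percentile(oil_rate, [5, 95])
print(f"oil rate ~ {mean_rate:.0f} m3/d, 90% interval [{p5:.0f}, {p95:.0f}]")
```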
An Engineered Microparticles-Based Slurry Pumped in Over 10,000 Stages Provided Notable Operational and Production Improvements in Challenging Formations
A. Radwan, R. Ramanathan, Igor B. Ivanishin, A. Ibrahim. DOI: 10.2118/210364-ms
In some shale plays, insufficient formation breakdown and the presence of near-wellbore tortuosity make it challenging to reach the designed pumping rate and lead to premature screen-outs. Screen-outs during a fracturing operation are a tremendous burden for operators, as they diminish the well's total production and add the cost of a wellbore cleanout. In some cases, these issues can cause suboptimal perforation cluster efficiency and production loss. There is a critical need for an easy-to-implement solution that can help operators achieve their desired fracture designs. This paper presents field case studies of a new microparticles-based slurry (MPS) technology that demonstrates ease of operations and an improvement in production across four different US shale basins.
The non-hazardous, water-based slurry contains engineered glass microparticles with a median size of 550–625 mesh. It was implemented in the Rockies, Powder River, Permian, and SCOOP/STACK, with over 10,000 stages stimulated so far. The slurry was usually deployed as an additive to the pad or as a pill before pumping the proppant-laden slurries, and it is compatible with commonly used fracturing fluids. The MPS technology helps scour the perforations and lessen fracture entry restrictions, which results in better fracture initiation and lowers the screen-out potential. The technology also widens fracture openings, restricts fracture complexity, reduces near-wellbore tortuosity, and increases reservoir connectivity. The slurry can be used as a far-field diverter pill as well.
Field studies in multiple challenging formations, involving alternating stages between the microparticle slurry and the standard control, showed a 12–25% reduction in pump time due to significant pressure relief. On another pad, the MPS reduced screen-outs by more than six-fold. Production data showed up to a 19% uplift within a 15-month period against control wells; the production improvement analysis is a subject of further study. Oil and water tracer tests confirmed the production improvement in stages that had the microparticle slurry. Overall, the success rate of the technology has been unprecedented, and it has been gaining significant ground over the past year.
Realizing a treatment design is a critical step in maximizing the rate of return on a well. This new chemical slurry offers operators a simple, cost-effective, and field-proven solution to alleviate operational issues and potentially be more aggressive in completion designs. The diverse case studies in this paper prove the efficacy of this innovative technology in solving the major day-to-day fracturing challenges faced by completion engineers.
{"title":"An Engineered Microparticles-Based Slurry Pumped in Over 10,000 Stages Provided Notable Operational and Production Improvements in Challenging Formations","authors":"A. Radwan, R. Ramanathan, Igor B. Ivanishin, A. Ibrahim","doi":"10.2118/210364-ms","DOIUrl":"https://doi.org/10.2118/210364-ms","url":null,"abstract":"\u0000 In some shale plays, insufficient formation breakdown and presence of near-wellbore tortuosity make it challenging to reach the designed pumping rate and lead to premature screen-outs. Screen-outs during a fracturing operation are a tremendous burden for operators as they diminish the well's total production and add cost to do a wellbore cleanout. In some cases, these issues could cause suboptimal perforation cluster efficiency and production loss. There is a critical need for an easy-to-implement solution that can help operators in achieving their desired fracture designs. This paper presents field case studies of a new microparticles-based slurry (MPS) technology that proves ease of operations and an improvement in production across four different US shale basins.\u0000 Non-hazardous water-based slurry contains engineered glass microparticles with a median size of 550–625 mesh. It was implemented in the Rockies, Powder River, Permian, and SCOOP/STACK with over 10,000 stages stimulated so far. The slurry was usually deployed as an additive to the pad or as a pill before pumping the proppant-laden slurries. It is compatible with commonly used fracturing fluids. The MPS technology helps in scouring the perforations and lessening fracture entry restrictions. This results in better fracture initiation and lowers the screen-out potential. The technology also widens fracture openings, restricts fracture complexity, reduces near-wellbore tortuosity, and increases reservoir connectivity. The slurry can be used as a far-field diverter pill as well.\u0000 Field studies in multiple challenging formations involving alternating stages between the microparticle slurry and the standard control showed a 12–25% reduction in pump time due to significant pressure relief. In another pad, the MPS reduced the screen-outs by over 6 folds. Production data showed up to 19% uplift within a 15-month period against control wells. The production improvement analysis is a subject of further study. Oil and water tracer tests confirmed the production improvement in stages that had the microparticle slurry. Overall, the success rate of the technology has been unprecedented and has been gaining significant ground over the past year.\u0000 Realizing a treatment design is a critical step in maximizing the rate of return on a well. This new chemical slurry offers operators a simple, cost-effective, and field proven solution to alleviate operational issues and potentially be more aggressive in completion designs. 
The diverse case studies in this paper prove the efficacy of this innovative technology in solving the major day-to-day fracturing challenges faced by completion engineers.","PeriodicalId":113697,"journal":{"name":"Day 2 Tue, October 04, 2022","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114641989","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}