Hydrocarbon production commonly involves the dispersed flow of two or more immiscible phases, from the porous media through to surface facilities. In dispersed flow, one phase is carried as droplets within a dominating continuous phase. Accurate prediction of the droplet size distribution of the dispersed phase is therefore critical to characterizing complex flow behavior in pipes. In the first part of this paper, we analyze open-source experimental data on the maximum droplet size in gas-liquid annular flow, evaluate the existing theoretical models, and suggest an improvement, based on those analyses, for predicting the maximum size of entrained liquid droplets. In the second part, we combine open-source literature data with in-house experimental results to build a general understanding of droplet formation, evaluate the existing predictive models, and present a new modeling approach for determining the maximum stable droplet size of the dispersed phase in liquid-liquid dispersed flow under turbulent conditions.
{"title":"Modeling Maximum Droplet Size In Gas-Liquid Annular Flow and Liquid–Liquid Dispersed Flow","authors":"Kanat Karatayev, Yilin Fan","doi":"10.2118/206081-ms","DOIUrl":"https://doi.org/10.2118/206081-ms","url":null,"abstract":"\u0000 Hydrocarbon production is commonly associated as the dispersed flow of two and more immiscible phases starting from porous media to surface facilities. In the dispersed flow, one phase is usually dispersed into another dominating phase in terms of droplets. Accurate prediction of the droplet size distribution of a dispersed phase is critical in characterizing complex flow behavior in pipe flows. In the first part of this paper, we provide the analyses of open-source experimental data on the maximum droplet size in gas-liquid annular flow and evaluate the existing theoretical models and suggest an improvement based on the experimental data analyses to predict the maximum droplet size of the entrained liquid droplets in gas-liquid annular flow. In the second part of this paper, we cover the experimental results from the open-source literature data and in-house experimental data to give the general understanding on droplet formation concepts and evaluate the existing predictive models and present a new modeling approach to determine a maximum stable droplet size of the dispersed phase in the liquid-liquid dispersed flow under turbulent flow conditions.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"61 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74959222","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
With the increase in measurement and instrumentation across upstream oil and gas production infrastructure, in the wellbore, in subsea systems, and in surface processing facilities, data from all of these sources can be integrated more effectively to produce consistent and robust production profiles. The proposed data integration methodology aims at identifying the sources of measurement and process errors and removing them from the system. This ensures quasi error-free data for driving critical applications such as well rate determination from virtual and multiphase meters and production allocation schemes, to name a few. Confidence in the data is further enhanced by quantifying the uncertainty of each measured and unmeasured variable. Advanced Data Validation and Reconciliation (DVR) methodology uses data redundancy to correct measurements. As more data are ingested into the modeling system, the statistics attached to each measurement become an important source of information for further improving its precision. DVR is an equation-based calculation process. It combines data redundancy and conservation laws to correct measurements and convert them into accurate and reliable information. The methodology is used in upstream oil and gas, refineries, gas plants, petrochemical plants, and power plants, including nuclear plants. DVR detects faulty sensors and identifies degradation of equipment performance. As such, it provides more robust inputs to operations, simulation, and automation processes. The DVR methodology is presented using field data from a producing offshore field. The discussion details the design and implementation of a DVR system to integrate all available field data from the wellbore and surface facilities. The integrated data in this end-to-end evaluation include reservoir productivity parameters, downhole and wellhead measurements, tuned vertical lift models, artificial lift devices, fluid sample analyses and thermodynamic models, and topside facility process measurements. The automated DVR iterative runs solve all conservation equations simultaneously when determining the "true values" of the production flow rates and their uncertainties. The DVR field application is successfully used in real time to ensure data consistency across a number of production tasks, including continual surveillance of the critical components of the production facility, evaluation and validation of well tests using multiphase flow metering, virtual flow metering of each well, modeling of fluid phase behavior in the well and in the multistage separation facility, and back allocation from sales meters to individual wells.
{"title":"The Application of Data Validation and Reconciliation to Upstream Production Measurement Integration and Surveillance – Field Study","authors":"V. Bent, A. Amin, Timothy Jadot","doi":"10.2118/205934-ms","DOIUrl":"https://doi.org/10.2118/205934-ms","url":null,"abstract":"\u0000 With the advent of increased measurements and instrumentation in oil and gas upstream production infrastructure; in the wellbore, in subsea and on surface processing facilities, data integration from all sources can be used more effectively in producing consistent and robust production profiles. The proposed data integration methodology aims at identifying the sources of measurement and process errors and removing them from the system. This ensures quasi error-free data when driving critical applications such as well rate determination from virtual and multiphase meters, and production allocation schemes, to name few. Confidence in the data is further enhanced by quantifying the uncertainty of each measured and unmeasured variable.\u0000 Advanced Data Validation and Reconciliation (DVR) methodology uses data redundancy to correct measurements. As more data is ingested in a modeling system the statistical aspect attached to each measurement becomes an important source of information to further improve its precision. DVR is an equation-based calculation process. It combines data redundancy and conservation laws to correct measurements and convert them into accurate and reliable information. The methodology is used in upstream oil & gas, refineries and gas plants, petrochemical plants as well as power plants including nuclear. DVR detects faulty sensors and identifies degradation of equipment performance. As such, it provides more robust inputs to operations, simulation, and automation processes.\u0000 The DVR methodology is presented using field data from a producing offshore field. The discussion details the design and implementation of a DVR system to integrate all available field data from the wellbore and surface facilities. The integrated data in this end-to-end evaluation includes reservoir productivity parameters, downhole and wellhead measurements, tuned vertical lift models, artificial lift devices, fluid sample analysis and thermodynamic models, and top facility process measurements. The automated DVR iterative runs solve all conservation equations simultaneously when determining the production flowrates \"true values\" and their uncertainties. 
The DVR field application is successfully used in real-time to ensure data consistency across a number of production tasks including the continual surveillance of the critical components of the production facility, the evaluation and validation of well tests using multiphase flow metering, the virtual flow metering of each well, the modeling of fluid phase behavior in the well and in the multistage separation facility, and performing the back allocation from sales meters to individual wells.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88594624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
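The core of any DVR engine is a constrained weighted least-squares problem: adjust the measurements as little as possible, in units of their standard deviations, while forcing the conservation laws to hold exactly. The sketch below shows this for a single linear mass balance around a hypothetical three-stream junction; the commercial tool described in the abstract handles nonlinear thermodynamic and multiphase models, which this toy example does not.

```python
# Toy linear data reconciliation: minimize sum(((x - m)/sigma)^2)  s.t.  A x = 0.
# Closed-form solution for linear constraints:
#   x_hat = m - S A^T (A S A^T)^-1 A m,   with S = diag(sigma^2)
import numpy as np

def reconcile(m, sigma, A):
    S = np.diag(np.asarray(sigma, float) ** 2)
    A = np.atleast_2d(A)
    correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, A @ m)
    return m - correction

# Hypothetical junction: stream 1 + stream 2 = stream 3 (mass balance).
m = np.array([100.0, 52.0, 149.0])   # measured rates, inconsistent by 3 units
sigma = np.array([1.0, 0.5, 2.0])    # measurement standard deviations
A = np.array([[1.0, 1.0, -1.0]])     # conservation law: x1 + x2 - x3 = 0

x_hat = reconcile(m, sigma, A)
print(x_hat, "balance residual:", A @ x_hat)   # residual is ~0 after reconciliation
```

The least reliable measurement (largest sigma) absorbs most of the imbalance, which is exactly the redundancy argument the abstract makes.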
Sidewall coring is a cost-effective process that complements conventional fullbore coring. Because sidewall cores target exact depth points, verification of the sidewall core recovery depth is required. We present an automated, fast workflow that performs this depth verification using borehole images, thereby providing consistent results. An application example using a typical dataset is used to showcase the workflow. A novel automated approach based on image analysis techniques and Bayesian statistical analysis is developed to verify sidewall core recovery depth using borehole image logs. A complete workflow is presented covering: 1) utilization of reference logs, e.g., gamma ray, to correct image log depth using cross correlation and/or dynamic time warping, 2) automated identification of sidewall core cavities in the borehole image log using the circle Hough transform, and 3) estimation of confidence in the identification using Bayesian statistics and specialized metrics. The workflow is applied to a typical dataset containing tens of sidewall core cavities of varying quality. Results are comparable to the manual interpretation of an experienced engineer. A number of observations are made. First, the use of reference logs to correct the image log allows the exact well log values where the sidewall core was sampled to be determined and compared to the initial target well log values. This increases the confidence that the target lithofacies was sampled as planned. Second, the circle Hough transform is suitable for this problem because it provides stable solutions for the partially imaged sidewall core cavities typical of pad-based borehole images. Third, the use of Bayesian statistics and metrics specialized for the problem, such as the average and standard deviation of borehole image intensity in the cavity, provides the flexibility to work with multiple types of borehole images and with varying initial depth-guess uncertainties. Overall, the use of a fast, automated methodology for depth verification opens up avenues for near real-time combined sidewall coring, imaging, and verification workflows. The novelty in this study lies in using a combination of image processing techniques and statistical analysis to automate an established manual workflow. The automated workflow provides consistent results in minutes rather than hours. Results also incorporate a confidence index estimation.
{"title":"Automated Verification of Sidewall Core Recovery Depth using Borehole Image Logs","authors":"M. A. Ibrahim, V. Torlov, M. Mezghani","doi":"10.2118/206145-ms","DOIUrl":"https://doi.org/10.2118/206145-ms","url":null,"abstract":"\u0000 Sidewall coring is a cost-effective process to complement conventional fullbore coring. Because sidewall cores target exact depth points, verification of the sidewall core recovery depth is required. We present an automated, fast workflow to perform the depth verification using borehole images, thereby providing consistent results. An application example using a typical dataset is used to showcase the workflow. A novel automated approach based on image analysis techniques and Bayesian statistical analysis is developed to verify sidewall core recovery depth using borehole image logs. A complete workflow is presented covering: 1) utilization of reference logs, e.g., gamma ray, to correct image log depth using cross correlation and/or dynamic time warping, 2) automated identification of sidewall core cavity in borehole image log using the circle Hough transform, and 3) estimation of confidence in the identification using Bayesian statistics and specialized metrics. The workflow is applied on a typical dataset containing tens of sidewall core cavities with varying quality. Results are comparable to the manual interpretation from an experienced engineer. A number of observations are made. First, the use of reference logs to correct the image log allows for determining the exact well logs values where the sidewall core was sampled, which is then compared to the initial target well logs values. This increases the confidence that the target lithofacies was sampled as planned. Second, the circle Hough Transform is suitable for this problem because it provides stable solutions for partially imaged sidewall core cavities typical in pad-based borehole images. Third, the use of Bayesian statistics and specialized metrics for the problem, such as average and standard deviation borehole image intensity in the cavity, provides customizability to work with multiple types of borehole images and with varying initial depth guess uncertainties. Overall, the use of fast and automated methodology for depth verification opens up avenues for near real-time combined sidewall coring, imaging, and verification workflows. The novelty in this study lies in using a combination of image processing techniques and statistical analysis to automate an established manual workflow. The automated workflow provides consistent results in minutes rather than hours. Results also incorporate a confidence index estimation.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"31 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74411570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Accurate characterization of oilwell cement mechanical properties is a prerequisite for maintaining long-term wellbore integrity. The drawback of the most widely used technique is that it cannot measure mechanical properties under the in situ curing environment. We developed a high-pressure, high-temperature vessel that can hydrate cement under downhole conditions and directly measure its elastic modulus and Poisson's ratio at any time point of interest without cooling or depressurization. The equipment was validated with water, for which a reasonable bulk modulus of 2.37 GPa was captured. Neat Class G cement was hydrated in this equipment for seven days under an axial stress of 40 MPa, and an in situ measurement in the elastic range showed an elastic modulus of 37.3 GPa and a Poisson's ratio of 0.15. The specimen was then removed from the vessel and set up on a triaxial compression platform. Under a similar confining pressure, the elastic modulus was 23.6 GPa and the Poisson's ratio was 0.26. We also measured the properties of cement from the same batch of slurry but cured under ambient conditions: the elastic modulus was 1.63 GPa and the Poisson's ratio was 0.085. We therefore find that the curing condition has a significant effect on cement mechanical properties, and that the traditional cooling or depressurization approach can yield properties that differ substantially (by about 50%) from the in situ measurement.
{"title":"Measurement of Cement in Situ Stresses and Mechanical Properties Without Cooling or Depressurization","authors":"M. Meng, L. Frash, J. Carey, Wenfeng Li, N. Welch, Hongtao Zhang","doi":"10.2118/206139-ms","DOIUrl":"https://doi.org/10.2118/206139-ms","url":null,"abstract":"\u0000 Accurate characterization of oilwell cement mechanical properties is a prerequisite for maintaining long-term wellbore integrity. The drawback of the most widely used technique is unable to measure the mechanical property under in situ curing environment. We developed a high pressure and high temperature vessel that can hydrate cement under downhole conditions and directly measure its elastic modulus and Poisson's ratio at any interested time point without cooling or depressurization. The equipment has been validated by using water and a reasonable bulk modulus of 2.37 GPa was captured. Neat Class G cement was hydrated in this equipment for seven days under axial stress of 40 MPa, and an in situ measurement in the elastic range shows elastic modulus of 37.3 GPa and Poisson's ratio of 0.15. After that, the specimen was taken out from the vessel, and setted up in the triaxial compression platform. Under a similar confining pressure condition, elastic modulus was 23.6 GPa and Possion's ratio was 0.26. We also measured the properties of cement with the same batch of the slurry but cured under ambient conditions. The elastic modulus was 1.63 GPa, and Poisson's ratio was 0.085. Therefore, we found that the curing condition is significant to cement mechanical property, and the traditional cooling or depressurization method could provide mechanical properties that were quite different (50% difference) from the in situ measurement.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"26 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74385285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Most lab-scale acidizing experiments are performed on core samples at 100% water saturation and pore pressures around 1,100 psi. However, this is seldom the case in the field, where different saturation conditions exist together with high temperatures and pressures. Carbon dioxide (CO2), a by-product evolved during acidizing, has long been thought to behave inertly during the process. Recent investigations reveal that the presence of CO2 dynamically changes the behavior of wormhole patterns and acid efficiency. A compositional simulation technique was adopted to understand the process thoroughly. A validated compositional numerical model capable of replicating core-scale acidizing experiments in fully aqueous environments, as described in the published literature, was utilized in this study. The numerical model was extended to a three-phase environment and applied at the field scale to monitor and evaluate the impacts of evolved CO2 during carbonate acidizing. Lessons learned from the lab scale were tested in a field-scale scenario via a numerical model with radial coordinates. Contrary to popular belief, pore pressures of 1,000 psi and above are not sufficient to keep all the evolved CO2 in solution. The presence of CO2 as a separate phase hinders acid efficiency. The reach, or extent, of the evolved CO2 is shown to be limited to the vicinity of the damage zone and seldom penetrates the reservoir matrix. Based on the field-scale model's predictions, this study warrants conducting acidizing experiments at the laboratory level at pressure, temperature, and salinity conditions that closely match those in the near-wellbore region, and urges the application of compositional modeling techniques that account for CO2 evolution when studying and predicting matrix acidizing jobs.
{"title":"The Role of CO2 in Carbonate Acidizing at the Field Scale – A Multi-Phase Perspective","authors":"H. Kumar, Sajjaat Muhemmed, H. Nasr-El-Din","doi":"10.2118/206033-ms","DOIUrl":"https://doi.org/10.2118/206033-ms","url":null,"abstract":"\u0000 Most lab-scale acidizing experiments are performed in core samples with 100% water saturation conditions and at pore pressures around 1100 psi. However, this is seldom the case on the field, where different saturation conditions exist with high temperature and pressure conditions. Carbon-di-Oxide (CO2), a by-product evolved during the acidizing process, is long thought to behave inertly during the acidizing process. Recent investigations reveal that the presence of CO2 dynamically changes the behavior of wormhole patterns and acid efficiency.\u0000 A compositional simulation technique was adopted to understand the process thoroughly. A validated compositional numerical model capable of replicating acidizing experiments at the core-scale level, in fully aqueous environments described in published literature was utilized in this study. The numerical model was extended to a three-phase environment and applied at the field scale level to monitor and evaluate the impacts of evolved CO2 during the carbonate acidizing processes. Lessons learned from the lab-scale were tested at the field-scale scenario via a numerical model with radial coordinates.\u0000 Contrary to popular belief, high pore pressures of 1,000 psi and above are not sufficient to keep all the evolved CO2 in solution. The presence of CO2 as a separate phase hinders acid efficiency. The reach or extent of the evolved CO2 is shown to exist only near the damage zone and seldom penetrates the reservoir matrix. Based on the field scale model's predictions, this study warrants conducting acidizing experiments at the laboratory level, at precisely similar pressure, temperature, and salinity conditions faced in the near-wellbore region, and urges the application of compositional modeling techniques to account for CO2 evolution, while studying and predicting matrix acidizing jobs.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"17 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87519760","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A critical component of natural gas in organic-rich shales is the gas adsorbed within organic matter. Quantification of adsorbed gas is essential for reliable estimates of gas-in-place in shale reservoirs. However, conventional high-pressure volumetric adsorption measurements developed for coal are prone to error when applied to characterize sorption in shale-gas systems because of the limited adsorption capacity and finer pores of the shale matrix. An innovative laboratory apparatus and measurement procedure have been developed for accurate determination of the relatively small amount of adsorbed gas in a Marcellus shale sample. The custom-built volumetric apparatus is a differential unit composed of two identical single-sided units (one blank side and one adsorption side) connected by a differential pressure transducer. The range of the differential pressure transducer is ±50 psi, a hundred-fold smaller than that of the absolute pressure transducer measuring up to 5,000 psi, leading to a significant increase in the accuracy of the adsorption measurement. Methane adsorption isotherms on Marcellus shale are measured at 303, 313, 323, and 333 K with pressures up to 3,000 psi. A fugacity-based Dubinin-Astakhov (D-A) isotherm is implemented to correct for non-ideality and predict the temperature dependence of supercritical gas sorption. The Marcellus shale studied displays generally linear correlations between adsorption capacity and pressure over the range of temperature and pressure investigated, indicating the presence of a dissolved gas component. The condensed-phase gas storage exists both as gas adsorbed on the shale surface and as gas dissolved in kerogen, where the dissolved amount is proportional to the partial pressure of that gas above the solution. To the best of our knowledge, this is the first observation of the contribution of dissolved gas to total gas storage. With the adsorption potential modeled by a temperature-dependent expression, the D-A isotherm successfully describes supercritical gas sorption on shale at multiple temperatures. Adsorption capacity decreases markedly with temperature, which is attributed to the isosteric heat of adsorption. Lastly, the wide applicability of the proposed fugacity-based D-A model is also tested against literature adsorption data on Woodford, Barnett, and Devonian shales. Overall, the fugacity-based D-A isotherm provides precise representations of the temperature-dependent gas adsorption on the shales investigated in this work. The proposed adsorption model allows adsorption data at multiple temperatures to be predicted from adsorption data collected at a single temperature. This study lays the foundation for accurate evaluation of gas storage in shale.
{"title":"Quantification of Temperature-Dependent Sorption Kinetics in Shale Gas Reservoirs: Experiment and Theory","authors":"Yun Yang, Shimin Liu","doi":"10.2118/205897-ms","DOIUrl":"https://doi.org/10.2118/205897-ms","url":null,"abstract":"\u0000 A critical component of natural gas in organic-rich shales is adsorbed gas within organic matter. Quantification of adsorbed gas is essential for reliable estimates of gas-in-place in shale reservoirs. However, conventional high-pressure adsorption measurements for coal on the volumetric method are prone to error when applied to characterize sorption kinetics in shale-gas systems due to limited adsorption capacity and finer pores of shale matrix. An innovated laboratory apparatus and measurement procedures have been developed for accurate determination of the relatively small amount of adsorbed gas in the Marcellus shale sample.\u0000 The custom-built volumetric apparatus is a differential unit composed of two identical single-sided units (one blank and one adsorption side) connected with a differential pressure transducer. The scale of the differential pressure transducer is ± 50 psi, a hundred-fold smaller than the absolute pressure transducer measuring to 5000 psi, leading to a significant increase in the accuracy of adsorption measurement. Methane adsorption isotherms on Marcellus shale are measured at 303, 313, 323 and 333 K with pressure up to 3000 psi. A fugacity-based Dubinin-Astakhov (D-A) isotherm is implemented to correct for the non-ideality and predict the temperature-dependence of supercritical gas sorption.\u0000 The Marcellus shale studied displays generally linear correlations between adsorption capacity and pressure over the range of temperature and pressure investigated, indicating the presence of a solute gas component. It is noted that the condensed phase gas storage exists as the adsorbed gas on shale surface and dissolved gas in kerogen, where the solute gas amount is proportional to the partial pressure of that gas above the solution. To our best understanding, it is the first time to observe the contribution of dissolved gas to total gas storage. With adsorption potential being modeled by a temperature dependence expression, the D-A isotherm can successfully describe supercritical gas sorption for shale at multiple temperatures. Adsorption capacity remarkably decreases with temperature attributed to the isosteric heat of adsorption. Lastly, the wide applicability of the proposed fugacity-based D-A model is also tested for literature adsorption data on Woodford, Barnett, and Devonian shale. Overall, the fugacity-based D-A isotherm provides precise representations of the temperature-dependent gas adsorption on shales investigated in this work. The application of the proposed adsorption model allows predicting adsorption data at multiple temperatures based on the adsorption data collected at a single temperature. 
This study lays the foundation for accurate evaluation of gas storage in shale.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"33 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75212814","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
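The Dubinin-Astakhov form the authors adopt writes the adsorbed amount as W = W0 exp[-(A/E)^n], with the adsorption potential A = RT ln(f0/f) evaluated with fugacities rather than pressures in their fugacity-based variant. The sketch below fits the plain pressure-based D-A equation to synthetic isotherm points with scipy; the data, the pseudo-saturation pressure, and the fitted numbers are all illustrative, not the paper's.

```python
# Sketch: fit a Dubinin-Astakhov isotherm  W = W0 * exp(-(A/E)^n),
# with A = R T ln(p0/p), to synthetic adsorption data (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

R = 8.314    # J/(mol K)
T = 313.0    # K
P0 = 35.0    # MPa, assumed pseudo-saturation pressure for the supercritical gas

def dubinin_astakhov(p, w0, e, n):
    A = R * T * np.log(P0 / p)          # adsorption potential, J/mol
    return w0 * np.exp(-(A / e) ** n)

# Synthetic isotherm: a D-A curve with a little noise, so the fit is well posed.
p = np.linspace(1.0, 20.0, 8)                       # MPa
rng = np.random.default_rng(0)
w = dubinin_astakhov(p, 120.0, 6000.0, 1.3) * (1 + 0.02 * rng.standard_normal(p.size))

popt, _ = curve_fit(dubinin_astakhov, p, w,
                    p0=[100.0, 8000.0, 1.5],
                    bounds=([10.0, 1000.0, 0.5], [500.0, 20000.0, 3.0]))
w0_fit, e_fit, n_fit = popt
print(f"W0 = {w0_fit:.1f}, E = {e_fit:.0f} J/mol, n = {n_fit:.2f}")
```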
Acid fracturing is a preferred method of stimulating low-permeability limestone formations throughout the world. The treatment consists of pumping alternating cycles of viscous pad and acid to promote differential etching, thereby creating a conductive acid-etched fracture. The acid type, pad and acid volumes, and injection rates in the designed pump schedule are based on treatment objectives, rock types, and in-situ conditions such as temperature, in-situ stress, proximity to water-bearing layers, and others. During the acid fracturing treatment, the acid-rock interaction is often marked by signature pressure responses that are the combined outcome of acid reaction kinetics, changes in fluid viscosity and density, frictional pressure drop in narrow hydraulic fractures, and other such parameters. This paper focuses on the interpretation of bottomhole pressures during acid fracturing treatments to separate these individual effects and determine the effectiveness of the treatment. Unlike propped fracturing treatments, which mostly result in a net pressure gain, acid fracturing treatments seldom result in a net pressure increase at the end of the treatment, because the in-situ stresses are generally relieved by the rock dissolution and fracture width creation that result from acid-mineral reactions. Not only is the extent of stress relief evident from the difference between the instantaneous shut-in pressures at the start and at the end of the treatment, the loss of stress is also apparent during the treatment itself, especially in jobs where the treatment data are constantly monitored and evaluated in real time. The study reveals that the changes in pressure response with the onset of acid in the formation can be used to determine the effectiveness of the treatment design and can aid in making informed changes during the treatment. Better understanding of these responses can also lead to more effective treatment designs for future jobs. The interpretation developed in this study can be applied to most of the acid fracturing treatments pumped worldwide.
{"title":"Pressure Interpretations in Acid Fracturing Treatments","authors":"V. Pandey","doi":"10.2118/205990-ms","DOIUrl":"https://doi.org/10.2118/205990-ms","url":null,"abstract":"\u0000 Acid fracturing is a preferred method of stimulating low permeability limestone formations throughout the world. The treatment consists of pumping alternating cycles of viscous pad and acid to promote differential etching, thereby creating a conductive acid-etched fracture.\u0000 Acid-type, pad and acid volumes, and the injection rates in the designed pump schedule are based on treatment objectives, rock-types and in-situ conditions such as temperatures, in-situ stress, proximity to water-bearing layers, and others. During the acid fracturing treatment, the acid-rock interaction is often marked by signature pressure responses, that are a combined outcome of acid reaction kinetics, responses to changes in fluid viscosity and densities, fluid-frictional drop in narrow hydraulic fractures, and other such parameters. This paper focuses on interpretation of bottomhole pressures during acid fracturing treatment to separate these individual effects and determine the effectiveness of the treatment.\u0000 Unlike propped fracturing treatments where most fracturing treatments result in net pressure gain, acid fracturing treatments seldom result in net pressure increase at the end of the treatment because the in-situ stresses are generally relieved during the rock-dissolution and fracture width creation process that results from acid-mineral reactions. Not only is the extent of stress relief evident from the difference in the start and the end of the treatment instantaneous shut-in pressures, the loss of stresses is also apparent during the treatment itself, especially in jobs where the treatment data is constantly monitored and evaluated in real-time. The study reveals that the changes in pressure responses with the onset of acid in the formation can be successfully used to determine the effectiveness of treatment design and can aid in carrying out informed changes during the treatment. Better understanding of these responses can also lead to more effective treatment designs for future jobs.\u0000 The interpretation developed in the study can be applied to most of the acid fracturing treatments that are pumped worldwide.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"9 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84850929","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dew point pressure (DPP) is a key variable that may be needed to predict the condensate-to-gas ratio behavior of a reservoir, to address some production and completion related issues, and to calibrate and constrain EOS models for integrated modeling. However, DPP is a challenging property in terms of its predictability. Recognizing the complexities, we present a state-of-the-art method for DPP prediction using advanced machine learning (ML) techniques. We compare the outcomes of our methodology with those of published empirical correlation-based approaches on two small datasets with different inputs. Our ML method noticeably outperforms the correlation-based predictors while also showing its flexibility and robustness even with small training datasets, provided various classes of fluids are represented within the datasets. We collected condensate PVT data from public-domain resources and the GeoMark RFDBASE, containing dew point pressure (the target variable) along with the compositional data (mole percentage of each component), temperature, molecular weight (MW), and the MW and specific gravity (SG) of the heptane-plus fraction as input variables. Using domain knowledge, before embarking on the study we extensively checked the measurement quality and the outcomes using statistical techniques. We then apply advanced ML techniques to train predictive models with cross-validation to avoid overfitting the models to the small datasets. We compare our models against the best published DPP predictors based on empirical correlations. For fair comparison, the correlation-based predictors are also trained using the underlying datasets. To improve the outcomes and generalize the input data, pseudo-critical properties and artificial proxy features are also employed.
{"title":"A Comparative Analysis of the Prediction of Gas Condensate Dew Point Pressure Using Advanced Machine Learning Algorithms","authors":"Thitaree Lertliangchai, B. Dindoruk, Ligang Lu, Xi Yang","doi":"10.2118/205997-ms","DOIUrl":"https://doi.org/10.2118/205997-ms","url":null,"abstract":"\u0000 Dew point pressure (DPP) is a key variable that may be needed to predict the condensate to gas ratio behavior of a reservoir along with some production/completion related issues and calibrate/constrain the EOS models for integrated modeling. However, DPP is a challenging property in terms of its predictability. Recognizing the complexities, we present a state-of-the-art method for DPP prediction using advanced machine learning (ML) techniques. We compare the outcomes of our methodology with that of published empirical correlation-based approaches on two datasets with small sizes and different inputs. Our ML method noticeably outperforms the correlation-based predictors while also showing its flexibility and robustness even with small training datasets provided various classes of fluids are represented within the datasets. We have collected the condensate PVT data from public domain resources and GeoMark RFDBASE containing dew point pressure (the target variable), and the compositional data (mole percentage of each component), temperature, molecular weight (MW), MW and specific gravity (SG) of heptane plus as input variables. Using domain knowledge, before embarking the study, we have extensively checked the measurement quality and the outcomes using statistical techniques. We then apply advanced ML techniques to train predictive models with cross-validation to avoid overfitting the models to the small datasets. We compare our models against the best published DDP predictors with empirical correlation-based techniques. For fair comparisons, the correlation-based predictors are also trained using the underlying datasets. In order to improve the outcomes and using the generalized input data, pseudo-critical properties and artificial proxy features are also employed.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"52 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82543890","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper develops a mathematical model for rate transient analysis in multi-stage fractured horizontal wells that accounts for weak fluid supply. A new concept, an additional skin factor, is introduced in the proposed model to characterize the fluid supply. The mathematical model is solved using the perturbation transformation, the point source integration method, the Laplace transform, and numerical inversion, while the fracture flow equations are solved by fracture discretization and the superposition principle. First, the flow regimes of multi-stage fractured horizontal wells with weak fluid supply are identified from the rate transient behavior, including wellbore storage and skin effect, bilinear flow, linear flow, pseudo-radial flow in the fractured zone, interface skin effect, pseudo-radial flow in the original zone, and boundary-dominated flow. The additional interface skin makes the log-log production rate curve exhibit an abrupt "overlap". The sensitivity study shows that this abrupt "overlap" becomes more pronounced as the fracture conductivity, fracture number, and stress sensitivity coefficient increase, and especially as the interface skin increases. Finally, the proposed mathematical model is used to perform a case study on production data from actual tight-gas wells in the Ordos Basin. The interface skin factor, fracture half-length, fracture conductivity, and boundary radius are evaluated. Through the proposed model, the characteristics of weak fluid supply in tight gas reservoirs are more fully understood.
{"title":"Mathematical Model for Rate Transient Analysis with Additional Interface Skin for Fractured Horizontal Well With Weak Fluid Supply","authors":"Jiali Zhang, X. Liao, Nai Cao","doi":"10.2118/206169-ms","DOIUrl":"https://doi.org/10.2118/206169-ms","url":null,"abstract":"This paper develops a mathematical model for rate transient analysis in multi-stage fractured horizontal wells with considering weak fluid supply. A new concept of additional skin factor is introduced in the proposed model to characterize the fluid supply. Then, the mathematical model are solved by using the perturbation transformation, point source integration method, Laplace transform, and numerical inversion, while the fracture flow equations are solved by fracture discretization and superposition principle. First, the flow regimes of multi-stage fractured horizontal wells with considering weak fluid supply are identified based on the rate transient behaviors, including wellbore storage and skin effect, bilinear flow, linear flow, pseudo-radial flow in the fractured zone, interface skin effect, pseudo-radial flow in the original zone, and boundary-dominated flow. The effect of additional interface skin makes the double logarithmic curve of production rate appear an abrupt \"overlap\". The results of the sensitivity study show that the abrupt \"overlap\" becomes more obvious with the increase of the fracture conductivity, fracture number, the stress sensitivity coefficient, especially the interface skin. Finally, the proposed mathematical model is used to perform a case study on the production data of actual tight-gas wells from the Ordos Basin. The interface skin factor, fracture half-length, fracture conductivity, and boundary radius are evaluated. Through the proposed model, the characteristics of weak fluid supply in tight gas reservoirs are fully understood.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"3 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82759157","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The concept of integrated modeling and digital transformation has grown within the oil and gas industry over the past decade, and every such digital transformation has its own set of challenges from which significant learnings can be derived to enhance the knowledge base of the industry. This paper covers the journey of the UAE's first end-to-end, standardized, workflow-based digital transformation in a giant gas producing asset and summarizes several key challenges and learnings that originated from this unique project for a giant gas-condensate asset. The involvement of multiple business stakeholders, such as the planning, engineering, operations, and performance teams, was imperative to establishing a collaborative working philosophy and a detailed specification document; the end-to-end solution and its functional and non-functional requirements were captured and aligned with end-user needs. First, a detailed offline phase, with focused effort on understanding data quality and establishing representative base models, was key to enhancing the benefit realization of the integrated platform. Second, the online implementation helped achieve significant process-efficiency improvement, as built-in data validation features significantly improved confidence in the output. The diagnostic workflows replaced the conventional spreadsheet-based approach. The digital platform serves as a common reference of "truth" for everyone across the organization. It helped produce several business KPIs that assist engineers in focusing on problem areas, for example through improved well test planning.
{"title":"UAE's First End to End Standardized Workflow-Based Digital Transformation in a Giant Gas Producing Asset - Lessons Learned and Way Forward","authors":"A. Alsaeedi, M. Elabrashy, M. Alzeyoudi, M. Albadi, Sandeep Soni, Jose Isambertt, Deepak Tripathi","doi":"10.2118/205851-ms","DOIUrl":"https://doi.org/10.2118/205851-ms","url":null,"abstract":"\u0000 The concept of integrated modeling and digital transformation has grown within the oil and gas industry over the past decade and every such digital transformation has its own set of challenges from which significant learnings can be derived to enhance the knowledge base of the industry. This paper encompasses the successful achievement journey from the UAE's first end to end standardized workflow- based digital transformation in a giant gas producing asset, where several key challenges and learnings have been summarized that are originated from a unique project for a giant gas-condensate asset.\u0000 The role and importance from multiple business stakeholders such as the planning, engineering, operations and performance teams was imperative to establish a collaborative working philosophy and a detailed specification document, the end-to-end solution, functional and non-functional requirements were captured and aligned with end-user needs.\u0000 Firstly, a detailed offline phase along with focused efforts in understanding data-quality and establishing representative base-models, was key to enhance the benefit-realization of the integrated platform. Secondly, the online implementation helped in achieving significant process efficiency improvement as inbuilt data validation features significantly improved the confidence of the output.\u0000 The diagnostic workflows replaced the conventional spreadsheet-based approach. The digital platform works as a common reference of \"truth\" for everyone across the organization. It helped to produce several the business KPIs to assist the engineers in emphasizing on the problem area, such as improved well test planning.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"11 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87829692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}