Addressing wellbore integrity through cement evaluation is an evergreen topic that frequently catches major operators by surprise when premature water or gas breakthrough causes low production attainability from their wells. Managing idle well strings arising from integrity issues is a further challenge throughout the production period. Remedial solutions do not come conveniently and carry a high cost during late-life well intervention, which often erodes the well's economic limit. Cement integrity evaluation, a critical element of the wellbore barrier, is proposed to be uplifted and given a new perspective to define success criteria for producer wells to achieve targeted reserves addition and production recovery. This paper highlights the integrated factors affecting cement bond quality, their impact on well production, potential remedies for poor cement bond leveraging an enhanced workflow and new technology, and a way forward to proactively prevent these unwanted circumstances at the first opportunity. A set of recommendations and prioritization criteria for future cement improvement is also presented.

Several wells logged with different cement bond evaluation tools are re-assessed in depth to trace the root causes of the unsatisfactory cement bond quality observed. The assessment covers reservoir characteristics, anomalies during drilling and cementing operations, the cement recipe used, the log processing parameters applied, and the best practices to be observed during cementing to improve quality. New and emerging cement evaluation technologies, including radioactive-based logging to meet specific well objectives, are also briefly discussed in terms of their differences and technical deliverables.

Looking at each spectrum, the results show several interdependent factors contributing to the poor cement bond quality observed.
Accurate understanding of formation behavior, a fit-for-purpose cement recipe, and adequate planning of the cementing operation on a well-by-well basis are among the top-notch approaches for achieving acceptable cement bond quality and placement. Statistics show that wells with good cement quality achieve 27% to 64% production attainability within the first 3 months of production, increasing to 85% to 98% by 7 months, while wells with adverse cement quality issues achieve only 12%. In another well, a water cut as high as 47% was observed from the first day of production and continued to increase thereafter.

In a nutshell, the cement evaluation exercise should not be treated in a vacuum; it requires an integrated foundation and close collaboration to materialize the desired outcomes. Arresting the issue with the right approach in the first place is the enabler for optimum well performance and productivity to exceed the recovery target.
{"title":"Cement Conundrum: Valuable Lessons Learned for Sustaining Production","authors":"S. Zulkipli","doi":"10.2118/207405-ms","DOIUrl":"https://doi.org/10.2118/207405-ms","url":null,"abstract":"\u0000 Addressing wellbore integrity through cement evaluation has been an evergreen topic which frequently catches major operators by surprise due to premature water or gas breakthrough causing low production attainability from the wells. Managing idle well strings arising from integrity issues is also a challenge throughout the production period. The remedial solutions to these issues do not come conveniently and require high cost during late life well intervention which often erodes the well economic limit. A critical element of wellbore barrier which is cement integrity evaluation is proposed to be uplifted and given a new perspective to define success criteria for producer wells to achieve certain reserves addition and production recovery. This paper will highlight integrated factors affecting cement bond quality, impact to well production, potential remedies for poor cement bond observed leveraging on the enhanced workflow and new technology and way forward to proactively prevent the unwanted circumstances in the first opportunity taken. A set of recommendations and prioritization criteria for future cement improvement will be also highlighted.\u0000 Several case specific wells logged with variable cement bond evaluation tools are re-assessed and deep-dived to trace the root causes for unsatisfactory cement bond quality observed which include reservoir characteristics, understanding anomalies during drilling and cementing operation, identifying cement recipe used, log processing parameters applied and observing best practices during cementing operation to improve the quality. 
New and emerging cement evaluation technology inclusive of radioactive-based logging to meet specific well objectives will be also briefly discussed in terms of differences and technical deliverables.\u0000 Looking at each spectrum, results show that there are several interdependent factors contributing to poor cement bond quality observed. Accurate understanding of formation behavior, designing fit-for-purpose cement recipe and adequate planning for cementing operation on well-by-well basis are among the top- notch approaches to be applied for an acceptable cement bond quality and placement. Statistics show that 27% to 64% of production attainability is achieved by wells with good cement quality within the first 3 months of production and this increases to 85% to 98% up until 7 months of production period, while only 12% production attainability achieved for those wells with adverse cement quality issue. In another well, water cut as high as 47% since the first day of production is observed which keeps increasing up to 40% thereafter.\u0000 In a nutshell, cement evaluation exercise shall not be treated as vacuum, instead it requires an integrated foundation and close collaboration to materialize the desired outcomes. Arresting the issue with the right approach in the first place will be the enabler for optimum well performance and productivity to exceed the recovery target.","PeriodicalId":10959,"journal":{"name":"Day 3 Wed, November 17, 2021","volume":"17 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82409965","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Muhammad Tayab, Aaesha Hashem, Shaikha Al Hamoudi, Farrukh Qureshi, Safdar Khan
Over the last decade, Oil & Gas operations have come under tremendous pressure from increasing production demands and ventures into harsher environmental conditions, increasing the health risks to crew members with underlying medical conditions. Although strict medical fitness requirements are in place to reduce the vulnerability of crew members, an increasing number of Non-Accidental Deaths (NADs) has challenged Oil & Gas operations. NAD risks are often linked with medical assessment and fitness to work, training, and medical emergency response; NADs question the adequacy of management controls at work locations, especially remote ones. ADNOC Group Companies adopt very high HSE standards to protect workers, the environment, and assets; however, the risks of aggravating underlying medical conditions, illnesses, or disorders often materialize and result in NADs. An extended analysis of historical NAD events was performed and the strength of the NAD barriers (Tayab et al., 2012) was assessed. Based on the review, the NAD barriers were redefined as follows: adequacy of pre-employment medical assessment; alert of underlying medical conditions; follow-up on chronic medical conditions; alert for abnormal behaviours; and awareness and training.

It was found that over 70% of NAD cases were triggered by aggravation of chronic illnesses, and approximately 50% of NAD cases were triggered during the first year of employment; 77% of NAD cases were due to cardiovascular illnesses, 18% to suicides, and 13% were attributed to COVID and other factors. Additional NAD barriers were identified to update the barrier analysis: alert for abnormal behaviour; readiness to manage medical emergencies; and welfare and counselling.
{"title":"Integration of Health Risk Management Techniques to Address Increasing Numbers and Prevention of Non-Accident Deaths NAD in Oil & Gas Operations","authors":"Muhammad Tayab, Aaesha Hashem, Shaikha Al Hamoudi, Farrukh Qureshi, Safdar Khan","doi":"10.2118/208203-ms","DOIUrl":"https://doi.org/10.2118/208203-ms","url":null,"abstract":"\u0000 Over the last decade, Oil & Gas operations have come under tremendous pressures due to increasing production demands and venturing into harsher environmental conditions, increasing the health risks to crew with underlying medical conditions. Although there are strict medical fitness, requirements in place to reduce the vulnerability of crewmembers, increasing number Non Accidental Deaths (NAD) have challenged the Oil & Gas operations. NAD risks are often linked with medical assessment/fitness to work, training and medical emergency response, NAD questions the adequacy of management controls at work locations, especially in remote locations. ADNOC Group Companies adopt very HSE high standards to protect the workers, environment and assets; however, the risks of aggravating underlying medical conditions, illnesses or disorders often materialize and result in NADs. An extended analysis of over historical NAD events was performed and strengths of NAD barriers (Tayab et al, 2012) was assessed. Based on the review NAD Barriers were further redefined as follow:Adequacy of pre-employment medical assessmentAlert of underlying medical conditionsFollow up on chronic medical conditionsAlert for abnormal behavioursAwareness & Training\u0000 It was found that over 70% of NAD cases were triggered due to aggravation of chronic illnesses, approximately 50 % of NAD cases were triggered during the first year of employment, 77% of NAD cases were due to cardiovascular illnesses and 18% were due to suicides and 13% were attributed to COVID & other factors. 
Additional NAD barriers were identified to update the barrier analysis as follows:Alert for abnormal behaviorReadiness to manage Medical EmergenciesWelfare & Counselling","PeriodicalId":10959,"journal":{"name":"Day 3 Wed, November 17, 2021","volume":"59 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83908762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The objective of this paper is to showcase successful and innovative troubleshooting data analysis techniques used to operate a TEG dehydration system optimally, reduce glycol loss, and meet product specifications in one of the gas dehydration systems at an upstream gas oil separation plant (GOSP). The gas dehydration system using Triethylene Glycol (TEG) is the most widely used and reliable gas dehydration system in upstream operations. These proven data analysis techniques were used to tackle major and chronic issues associated with gas dehydration system operation that lead to excessive glycol losses, glycol degradation, and off-specification products. Glycol loss is the most important operating problem in a gas dehydration system and is a concern for operations personnel. Most dehydration units are designed for a loss of less than 1 pound of glycol per million standard cubic feet of natural gas treated, depending on the TEG contactor operating temperature.

In this paper, a comprehensive data analysis of the potential root causes that aggravate undesired glycol losses, degradation, and off-specification products is discussed, along with solutions to minimize the expected impact. For example, operating the absorption vessel (contactor) or still column at high temperature increases glycol loss by vaporization. Glycol losses in the glycol regeneration section are usually caused by excessive reboiler temperature, which causes vaporization or thermal decomposition of the TEG. In addition, an excessive top temperature in the still column allows vaporized glycol to escape with the water vapor. An excessive contactor operating temperature can result from a malfunctioning glycol cooler or a high TEG flow rate. The paper focuses on a detailed case study of one of the running TEG systems at a gas-oil separation plant.
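The design figure quoted above (less than 1 lbm of glycol lost per MMscf treated) translates directly into a daily make-up volume. A minimal worked example, with an assumed gas rate and a typical TEG density of about 9.3 lbm/gal (neither figure is from the paper):

```python
# Rough TEG carry-over estimate from a design loss specification.
# Assumed inputs (illustrative, not from the paper): 100 MMscf/d gas rate.
GAS_RATE_MMSCFD = 100.0      # treated gas rate, MMscf/d (assumed)
LOSS_SPEC = 1.0              # design glycol loss, lbm per MMscf (from the text)
TEG_DENSITY = 9.3            # lbm/gal, typical for TEG (SG ~1.12)

loss_lb_per_day = GAS_RATE_MMSCFD * LOSS_SPEC
loss_gal_per_day = loss_lb_per_day / TEG_DENSITY

print(f"Glycol loss: {loss_lb_per_day:.0f} lbm/d, about {loss_gal_per_day:.1f} gal/d")
```

A sustained make-up rate well above this kind of baseline is what flags the vaporization and degradation mechanisms discussed in the paper.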
{"title":"Troubleshooting Gas Dehydration Systems Using Data Analysis","authors":"A. Al-Aiderous","doi":"10.2118/207390-ms","DOIUrl":"https://doi.org/10.2118/207390-ms","url":null,"abstract":"\u0000 The objective of this paper is to showcase the successful and innovative troubleshooting data analysis techniques to operate a TEG dehydration system optimally and reduce glycol loss and to meet the product specifications in one of the gas dehydration systems in an upstream gas oil separation plant (GOSP). The gas dehydration system using Triethylene Glycol (TEG) is the most widely used and reliable gas dehydration system in upstream operation. These proven data analysis techniques were used to tackle major and chronic issues associated with gas dehydration system operation that lead to excessive glycol losses, glycol degradation, and off-specification products. Glycol loss is the most important operating problem in the gas dehydration system and it represents a concern to the operation personnel. Most dehydration units are designed for a loss of less than 1 pound of glycol per million standard cubic feet of natural gas treated, depending on the TEG contactor operating temperature.\u0000 In this paper, comprehensive data analysis of the potential root causes that aggravate undesired glycol losses degradation and off-specification products will be discussed along with solutions to minimize the expected impact. For example, operating the absorption vessel (contactor) or still column at high temperature will increase the glycol loss by vaporization. Also, the glycol losses occurring in the glycol regenerator section are usually caused by excessive reboiler temperature, which causes vaporization or thermal decomposition of glycol (TEG). In addition, excessive top temperature in the still column allows vaporized glycol to escape from the still column with the water vapor. 
Excessive contactor operating temperature could be the result of malfunction glycol cooler or high TEG flow rate. This paper will focus on a detailed case study in one of the running TEG systems at a gas-oil separation plant.","PeriodicalId":10959,"journal":{"name":"Day 3 Wed, November 17, 2021","volume":"8 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82671893","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The maximum use of existing surface produced water treatment (PWT) facilities is a prerequisite for economic chemical enhanced oil recovery (cEOR) in mature fields, as the erection of additional dedicated polymer treatment facilities can seriously harm a project's business case. These existing facilities often exhibit a reliable design but do not necessarily fulfill the requirements of treating back-produced polymer. Optimizing installed facilities based on a prior assessment of their limitations is a way to upgrade them for future EOR operations.

Since its start-up in 2015, the main PWT plant has comprised three separation stages: corrugated plate interceptors (CPIs), dissolved gas flotation (DGF) units, and nutshell filters (NSFs). The plant processes up to 1,200 m³/h of conventional produced water at the Matzen field in Austria. A polymer injection pilot was initiated in 2009, continuous polymer injection started in 2012, and the field now produces a segregated water stream containing back-produced polymer. Prior field tests with a pilot-scale water treatment plant indicated operational issues with the existing set-up of facilities and the flotation chemicals used as polymer concentrations increased. At the end of 2018, severe injectivity issues were observed at injectors supplied with commingled conventional and polymer-containing produced water. These were caused by a chemical interaction between the partially hydrolyzed polyacrylamide (HPAM) and the alumina-based water clarifiers applied in the dissolved gas flotation, finally leading to a loss of production.

Therefore, a strict segregation of polymer and conventional streams in the common well network was developed and established, allowing the separated streams to be injected into different parts of the injection system without any issues. This experience pointed out the future risks and hurdles of an economic cEOR full-field roll-out, in which up to 200 ppm of back-produced polymer is expected at all surface treatment facilities. Several studies were performed to identify alternative technologies able to treat polymer-containing water. A business-case-driven option was to initiate an optimization program to develop smart upgrades and ensure maximum use of the existing PWT facilities. The main task was to substitute or stop the poly-aluminum chloride-based coagulant currently dosed at 40 to 60 ppm in the DGF, due to its unfavorable interactions with the back-produced HPAM. A technology assessment, comprehensive measures, and economic retrofits of the installed gas-dissolving units, the circulation cycle, and the bubble injection points resulted in a 200% higher flotation bubble-bed density.

Thanks to these improvements, the dosage of water clarifiers could be stopped while achieving similar or even better PWT performance. In addition to the operational savings, the existing treatment plant can now be used to treat cEOR fluids.
{"title":"Smart Upgrades to Maximize the Use of Existing Produced Water Treatment Facilities for CEOR","authors":"S. Grottendorfer, R. Kadnar, Günter Staudigl","doi":"10.2118/207345-ms","DOIUrl":"https://doi.org/10.2118/207345-ms","url":null,"abstract":"\u0000 The maximum use of existing surface produced water treatment (PWT) facilities is a prerequisite for an economic chemical enhanced oil recovery (cEOR) in mature fields, as the erection of additional dedicated polymer treatment facilities can seriously harm the project's business case. These existing facilities often exhibit a reliable design, but do not necessarily fulfill the requirements of treating back-produced polymer. An optimization of installed facilities based on prior assessment of limitations is a way to upgrade facilities with regard to future EOR operations.\u0000 Since its start-up in 2015, the main PWT plant comprised three separation stages: corrugated plate interceptors (CPIs), dissolved gas flotations (DGFs) and nutshell filters (NSFs). The plant processes up to 1,200 m3/h of conventional produced water at the Matzen field in Austria. Additionally, in 2009 a polymer injection pilot was initiated, with continuous polymer injection started in 2012, and now produces a segregated water stream containing back-produced polymer. Prior field tests with a pilot scale water treatment plant indicated operational issues with the existing set-up of facilities and the flotation chemicals used, with increasing polymer concentrations. At the end of 2018, severe injectivity issues were observed at injectors which were supplied with commingled conventional and polymer containing produced water. 
These were caused by a chemical interaction between the partially hydrolyzed polyacrylamide (HPAM) and alumina-based water clarifiers, which were applied in the dissolved gas flotation, finally leading to a loss of production.\u0000 Therefore, a strict segregation of polymer and conventional streams at the common well network has been developed and established, where the separated streams could be injected into different parts of the injection system without any issues. This experience pointed out the future risks and hurdles of an economic cEOR full field roll-out where up to 200 ppm back-produced polymer at all surface treatment facilities is expected. Several studies were performed to identify alternative technologies able to treat polymer containing water. A business case driven option was to initiate an optimization program to develop smart upgrades and ensure maximum use of the existing PWT facilities. The main task was to substitute or stop the current poly-aluminum chloride-based coagulant in the DGF with a dosage of 40 to 60 ppm due to its unfavorable interactions with the back-produced HPAM. A technology assessment, comprehensive measures and economic retrofits of the installed gas dissolving units, the circulation cycle and bubble injection points resulted in a 200% higher flotation bubble bed density.\u0000 Thanks to these improvements, the dosage of water clarifiers could be stopped, accomplishing similar or even better PWT performance values. 
In addition to the operational savings achieved, the existing treatment plant can now be used to treat cEOR fluids,","PeriodicalId":10959,"journal":{"name":"Day 3 Wed, November 17, 2021","volume":"36 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79114071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Akram Younis, Mohammed Alshehhi, Haitham Al Braik, H. Uematsu, Mohamed El-Sayed, Muhammad Abrar Manzar, M. Ismail, Manjiri A. More
Production logging analysis is essential to understand and evaluate reservoir performance throughout the lifetime of an oil well. Data acquisition and analysis are known to be challenging in modern extended-reach horizontal wells due to factors such as conveyance difficulties, fluid segregation, debris, or open-hole washouts. An advanced compact multiple-array production logging tool (APLT) is proposed to minimize the uncertainties related to these challenges.

The proposed sensor deployment method provides comprehensive borehole coverage, maximizing the amount of subsurface information collected to evaluate the production performance of a horizontal well. The essential measurements are combined on six individual arms. Each arm is independently deployed, which guarantees the best borehole coverage in a variety of borehole conditions. The robust mechanical arm design minimizes damage, tolerates decentralization, and provides greater confidence in determining sensor locations. Each arm carries two fluid holdup sensors (resistance and optical) and one velocity sensor (micro-spinner). Co-locating the sensors minimizes the uncertainty related to sensor spacing compared with the previous generation of APLT.

The new sensor deployment method and analysis results are discussed, showing the added value in barefoot completions as well as advanced ICD completions. The holdup sensor response of the previous-generation APLT is compared with that of the advanced tool, and the difference is related to the latter's better borehole coverage. The results also illustrate the use of high-frequency optical probes for phase holdup determination. In addition, the optical probes are used to confirm the bubble point pressure at in-situ conditions by confidently detecting the first gas indication in the tubular. The results clearly show how a compact APLT maximizes borehole coverage in highly deviated and horizontal wells.

This is critical for collecting representative data on all segregated fluids, enabling more accurate interpretation of the flow profile in the well and a better understanding of reservoir performance. The novelty of the new instrument is its ability to maximize the amount of subsurface production logging information collected with low uncertainty and minimum operational risk.
{"title":"Overcoming Production Logging Challenges in Evaluating Extended Reach Horizontal Wells with Advanced Completions","authors":"Akram Younis, Mohammed Alshehhi, Haitham Al Braik, H. Uematsu, Mohamed El-Sayed, Muhammad Abrar Manzar, M. Ismail, Manjiri A. More","doi":"10.2118/207531-ms","DOIUrl":"https://doi.org/10.2118/207531-ms","url":null,"abstract":"\u0000 \u0000 \u0000 Production logging analysis is essential to understand and evaluate reservoir performance throughout the lifetime of an oil well. Data acquisition and analysis is known to be challenging in modern extended reach horizontal wells due to multiple factors such as conveyance difficulties, fluid segregation, debris, or open hole washouts. Advanced compact multiple array production logging tool (APLT) is proposed to minimize the uncertainties related to these challenges.\u0000 \u0000 \u0000 \u0000 The proposed sensor deployment method provides a comprehensive borehole coverage, thus maximizing the amount of subsurface information collected to evaluate the production performance of a horizontal well. Essential measurements are combined on six individual arms. Each arm is independently deployed which guarantees the best borehole coverage in a variety of borehole condition. Robust mechanical arm design minimizes damage, allows tolerance to decentralization, and provides greater confidence in determining the sensor locations. Each arm utilizes two fluid holdup sensors (Resistance, Optical) and one velocity sensor (Micro-Spinner). Co-location of the sensors minimizes the uncertainty related to sensor spacing when compared with previous generation of APLT.\u0000 \u0000 \u0000 \u0000 The new sensor deployment method and analysis results are discussed showing the added value in barefoot completion as well as advanced ICD completion. The holdup sensors response from previous generation APLT is compared to the advanced tool and how it relates to better borehole coverage. 
The results also illustrate use of high frequency optical probes for phase holdup determination. In addition, the optical probes are used to confirm bubble point pressure at in situ conditions by confidently detecting the first gas indication in the tubular. The results clearly show how a compact APLT maximizes the borehole coverage in highly deviated and horizontal wells. This is critical in collecting representative data of all segregated fluids which enables more accurate interpretation of the flow profile in the well and better understanding of reservoir performance.\u0000 \u0000 \u0000 \u0000 The novelty of the new instrument is the ability to maximize the amount of subsurface production logging information collected with low uncertainty and minimum operational risk.\u0000","PeriodicalId":10959,"journal":{"name":"Day 3 Wed, November 17, 2021","volume":"3 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87390748","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
G. Fighera, Ernesto Della Rossa, P. Anastasi, Mohammed Amr Aly, T. Diamanti
Improvements in reservoir simulation run time thanks to GPU-based simulators, together with the increasing computational power of modern HPC systems, are paving the way for large-scale use of Ensemble History Matching (EHM) techniques, which are intrinsically parallel. Here we present the results of a comparative study between a newly developed EHM tool that leverages GPU parallelism and a commercial third-party EHM software used as a benchmark. Both are tested on a real case.

The reservoir chosen for the comparison has a production history of 3 years and 15 wells comprising oil producers and water and gas injectors. The EHM algorithm used is the Ensemble Smoother with Multiple Data Assimilations (ESMDA), and both tools have access to the same computational resources. The EHM problem was stated in the same way for both tools. The objective function considers well oil production, water cut, bottom-hole pressure, and gas-oil ratio. Porosity and horizontal permeability are used as 3D grid parameters in the update algorithm, along with nine scalar parameters for anisotropy ratios, Corey exponents, and fault transmissibility multipliers.

Both the presented tool and the benchmark obtained a satisfactory history match quality. The benchmark tool took around 11.2 hours to complete, while the proposed tool took only 1.5 hours. The two tools performed similar updates on the scalar parameters, with only minor discrepancies. Updates on the 3D grid properties, however, show significant local differences. The benchmark's updated ensemble reached extreme values of porosity and permeability, distributed in a heterogeneous way; these distributions are quite unlikely in some model regions given the initial geological characterization of the reservoir. The updated ensemble of the presented tool did not reach extreme values in either porosity or permeability.

The resulting property distributions are not far from those of the initial ensemble, so we conclude that we successfully updated the ensemble while preserving the geological characterization of the reservoir. Analysis suggests that the discrepancy is due to the different way our EHM code considers inactive cells in the grid update calculations compared to the benchmark, highlighting that statistics involving inactive cells must be carefully managed to preserve the geological distribution represented in the initial ensemble.

The presented EHM tool was developed from scratch to be fully parallel and to leverage the abundantly available computational resources. Moreover, the ESMDA implementation was tweaked to improve the reservoir update by carefully managing inactive cells. A comparison against a benchmark showed that the proposed EHM tool achieved similar history match quality while improving both the computation time and the geological realism of the updated ensemble.
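As a generic illustration of the ESMDA algorithm both tools implement — not the authors' GPU code — the update can be sketched on a toy linear forward model. All dimensions, the forward operator `G`, and the constant inflation schedule are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward model d = G @ m, standing in for the reservoir simulator;
# the sizes below are illustrative, not values from the paper.
n_m, n_d, n_ens = 10, 5, 200
G = rng.normal(size=(n_d, n_m))
m_true = rng.normal(size=n_m)
obs_std = 0.1
d_obs = G @ m_true + rng.normal(scale=obs_std, size=n_d)
C_d = obs_std**2 * np.eye(n_d)

def esmda(prior, n_assim=4):
    """ESMDA with constant inflation alpha = n_assim, so sum(1/alpha) = 1."""
    ens = prior.copy()
    alpha = float(n_assim)
    for _ in range(n_assim):
        d_f = G @ ens                                # forecast data (n_d, n_ens)
        dm = ens - ens.mean(axis=1, keepdims=True)   # parameter anomalies
        dd = d_f - d_f.mean(axis=1, keepdims=True)   # data anomalies
        C_md = dm @ dd.T / (n_ens - 1)
        C_dd = dd @ dd.T / (n_ens - 1)
        # Perturb observations with inflated noise, then apply the update.
        pert = rng.multivariate_normal(np.zeros(n_d), alpha * C_d, size=n_ens).T
        gain = C_md @ np.linalg.inv(C_dd + alpha * C_d)
        ens = ens + gain @ (d_obs[:, None] + pert - d_f)
    return ens

prior = rng.normal(size=(n_m, n_ens))
posterior = esmda(prior)

mis_prior = np.linalg.norm(G @ prior.mean(axis=1) - d_obs)
mis_post = np.linalg.norm(G @ posterior.mean(axis=1) - d_obs)
print(f"data mismatch: prior {mis_prior:.3f} -> posterior {mis_post:.3f}")
```

The ensemble covariances `C_md` and `C_dd` are where the paper's inactive-cell caveat bites: in a real grid, statistics computed over arrays that include inactive cells can distort these matrices and push the update toward geologically implausible values.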
"Unlocking Ensemble History Matching Potential with Parallelism and Careful Data Management," G. Fighera, Ernesto Della Rossa, P. Anastasi, Mohammed Amr Aly, T. Diamanti. Day 3 Wed, November 17, 2021. doi:10.2118/207606-ms
Nasser M. Al-Hajri, Akram R. Barghouti, Sulaiman T. Ureiga
Gas deviation factor (z-factor) and other gas reservoir fluid properties, such as formation volume factor, density, and viscosity, are normally obtained from pressure-volume-temperature (PVT) experimental analysis. This process of reservoir fluid characterization usually requires collecting pressurized fluid samples from the wellbore for the experimental work. This paper provides an alternative methodology for obtaining the z-factor. An IR 4.0 tool that relies heavily on software coding was developed. The tool uses the novel apparent-molecular-weight profiling concept to achieve the paper's objective in a timely and accurate manner. The developed tool calculates gas properties using downhole gradient pressure and temperature data as inputs, and the methodology is applicable to dry, wet, or condensate gas wells. The gas equation of state is modified to solve numerically for the z-factor using the gradient-survey pressure and temperature data. The numerical solution is obtained by applying an iterative computation scheme:

1. A gas apparent molecular weight value is initialized, and the gas mixture specific gravity and pseudo-critical properties are calculated.
2. Gas mixture pseudo-reduced properties are calculated from the measured pressure and temperature at the reservoir depth.
3. A first z-factor value is determined as a function of the pseudo-reduced gas properties.
4. The gas pressure gradient at the reservoir depth is obtained from the survey and used to back-calculate a second z-factor value by applying the modified gas equation of state.
5. The relative error between the two z-factor values is calculated and compared against a low predefined tolerance.
6. The steps above are repeated with different assumed gas apparent molecular weight values until the predefined tolerance is achieved.
This numerical approach is computerized to perform the highest practical number of iterations and then select the z-factor value corresponding to the minimum error among all iterations. The proposed workflow was applied to literature data with known reservoir gas properties from PVT analysis and showed excellent prediction performance, with less than 5% error relative to the laboratory results.
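The iterative scheme can be sketched as follows. The paper does not state which correlations it uses, so this illustration substitutes Sutton's pseudo-critical correlations and the Papay z-factor correlation as placeholders, and the survey inputs (pressure, temperature, gas gradient) are invented for the example; it is a sketch of the search loop, not the authors' tool.

```python
import numpy as np

R = 10.732      # universal gas constant, psia*ft3/(lbmol*degR)
M_AIR = 28.97   # molecular weight of air, lbm/lbmol

def z_from_correlation(p, t_rankine, mw):
    """First z-factor: specific gravity -> pseudo-criticals (Sutton) -> Papay."""
    gg = mw / M_AIR                               # gas specific gravity
    ppc = 756.8 - 131.0 * gg - 3.6 * gg**2        # psia
    tpc = 169.2 + 349.5 * gg - 74.0 * gg**2       # degR
    ppr, tpr = p / ppc, t_rankine / tpc
    return 1 - 3.52 * ppr / 10**(0.9813 * tpr) + 0.274 * ppr**2 / 10**(0.8157 * tpr)

def z_from_gradient(p, t_rankine, mw, grad_psi_per_ft):
    """Second z-factor: back-calculated from the measured gas pressure gradient
    via rho = p*M/(z*R*T), with rho in lbm/ft3 = 144 * (psi/ft)."""
    rho = 144.0 * grad_psi_per_ft
    return p * mw / (rho * R * t_rankine)

def solve_z(p, t_rankine, grad, mw_range=np.linspace(16.0, 40.0, 2401)):
    """Scan assumed apparent molecular weights; keep the one where the two
    independently computed z-factors agree (minimum relative error)."""
    errs = [abs(z_from_correlation(p, t_rankine, mw)
                - z_from_gradient(p, t_rankine, mw, grad)) for mw in mw_range]
    best = int(np.argmin(errs))
    return mw_range[best], z_from_correlation(p, t_rankine, mw_range[best])

mw, z = solve_z(p=3000.0, t_rankine=660.0, grad=0.08)
print(mw, z)
```

In the real workflow the pressure, temperature, and gradient come from the downhole gradient survey at each depth station, and the tolerance check replaces the brute-force scan's argmin.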
"Gas Deviation Factor Calculation Made Easy and Accurate Using an IR 4.0 Tool," Nasser M. Al-Hajri, Akram R. Barghouti, Sulaiman T. Ureiga. Day 3 Wed, November 17, 2021. doi:10.2118/207999-ms
T. Narwal, Kamlesh Kumar, Z. Alias, P. Agrawal, Zahir Abri, A. Al Hadhrami, Abdulbaqi Al Kindi, John White, Azlan Asidin
In Southern Oman, PDO produces from several high-pressure (500-1000 bar), deep (3-5 km), sour (1-10 mol% H2S) fields. Over time, wells in one field (S A3) began to suffer asphaltene deposition in the wellbore. Recently, the impact on production became severe, resulting in high deferment, increased HSE exposure from plugging, and high intervention costs. The asset team kicked off an asphaltene management project to tackle this problem, one initiative being a field trial of a new technology, the Magnetic Fluid Conditioner (MFC), to avoid or delay asphaltene plugging in the wellbore. This paper discusses the asphaltene management strategy and the field trial results from this new tool deployed to prevent or delay asphaltene deposition.
"Asphaltene Management Leading to Significant Reduction of Production Deferment Through New Technology Trial MFC Tool," T. Narwal, Kamlesh Kumar, Z. Alias, P. Agrawal, Zahir Abri, A. Al Hadhrami, Abdulbaqi Al Kindi, John White, Azlan Asidin. Day 3 Wed, November 17, 2021. doi:10.2118/208012-ms
Pouyan Khalili, A. Saasen, M. Khalifeh, B. Aase, G. O. Ånesbug
Magnetic contamination of drilling fluid can impact the accuracy of a directional survey by shielding the magnetic field. Additionally, such contamination, for example swarf or finer magnetic particles, can agglomerate on the downhole tool or BOP and, in the worst case, cause tool failure. It is therefore necessary to measure the magnetic content of drilling fluid; however, there is no recommended practice in API or ISO for this purpose. A simple experimental setup and measurement system was developed that can be easily deployed at the rig site to measure the magnetic contamination of drilling fluid. Forty-seven drilling fluid samples were collected from a multilateral production well drilled with a semi-submersible rig in a North Sea field. The magnetic content of these samples was measured using the established method, and the microstructure of the collected material was analyzed using scanning electron microscopy (SEM) and X-ray diffraction (XRD). Ditch magnets are commonly installed in the flowline on the rig to remove swarf and finer magnetic particles, provided the design is optimized. Ditch-magnet measurement data for the well from which the drilling fluid samples were collected are presented, and operational details and common factors that might increase the generation of magnetic content were also investigated. By comparing the measured magnetic contamination of the drilling fluid samples against the ditch-magnet measurement data, it was possible to evaluate the efficiency of the ditch-magnet system.
"Measuring and Analyzing the Magnetic Content of Drilling Fluid," Pouyan Khalili, A. Saasen, M. Khalifeh, B. Aase, G. O. Ånesbug. Day 3 Wed, November 17, 2021. doi:10.2118/207240-ms
Objectives/Scope: Over a period of two years, the difference between the sum of the daily oil flow rates measured at each production well by multiphase flow meter (MPFM) and the cumulative daily oil production rate measured by the custody transfer meter grew to 5% overall. For some wells, the inaccuracy of the MPFM liquid rate measurement could reach 30-50%. The main goal of this research was to improve the accuracy of MPFM production rate measurements.

Methods, Procedures, Process: More than 80 oil production wells were involved in the research, and more than 100 flow rate tests were carried out. Supervised machine learning methods (linear and nonlinear regression, gradient descent, a finite-difference algorithm, etc.) were applied, coupled with integrated production modelling tools such as PROSPER and OpenServer, to develop a function correlating MPFM parameters with the flow rate error.

Results, Observations, Conclusions: The difference between the cumulative daily oil production rate measured by the custody transfer meter and the multiphase flow meters decreased to 0.5%. The solution has been officially deployed at the oil field and saved the company USD 500K. The reliability of the function was subsequently confirmed by the MPFM vendor.

Novel/Additive Information: For the first time, machine learning algorithms coupled with integrated production modelling tools have been used to improve the accuracy of MPFM production rate measurements.
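The core regression idea, a function from MPFM operating parameters to the flow-rate error, can be sketched with synthetic data. All feature names, the error model, and the data below are illustrative assumptions; the actual study couples such a fit with PROSPER/OpenServer well models and real well-test references.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
gvf = rng.uniform(0.3, 0.9, n)          # gas volume fraction at the meter
p_line = rng.uniform(10.0, 40.0, n)     # line pressure, bar
q_true = rng.uniform(50.0, 400.0, n)    # reference (well-test) liquid rate, m3/d

# Synthetic MPFM error that grows with gas volume fraction and rate
err = 0.3 * q_true * (gvf - 0.3) + rng.normal(0.0, 2.0, n)
q_mpfm = q_true + err

# Linear least-squares regression of the error on MPFM parameters
X = np.column_stack([np.ones(n), gvf, p_line, q_mpfm])
coef, *_ = np.linalg.lstsq(X, err, rcond=None)

# Corrected rate: subtract the predicted error from the raw MPFM reading
q_corrected = q_mpfm - X @ coef
print(np.mean(np.abs(q_mpfm - q_true)), np.mean(np.abs(q_corrected - q_true)))
```

Even this plain linear fit removes most of the systematic bias; the study's nonlinear regressions and model-based features would capture the remaining interaction effects (here, the rate-GVF product) that a linear model cannot.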
"Application of Machine Learning Algorithms and Integrated Production Modelling to Improve Accuracy of Liquid Production Rate Measurements Using Multiphase Flow Meters," M. Nazarenko, A. Zolotukhin. Day 3 Wed, November 17, 2021. doi:10.2118/207674-ms