There are two main types of solids in oilfield waters: suspended solids (SS) and total dissolved solids (TDS). While SS are easy to remove from water, removal of TDS requires advanced filtration techniques such as reverse osmosis or ultrafiltration. Because these techniques cannot handle the high volumes of oilfield water with high TDS content, produced waters originating from hydraulic fracturing activities cannot be treated with these advanced technologies. Thus, in this study we concentrated on the pretreatment of these waters.
We investigated the feasibility of the coagulation, flocculation, and sedimentation (CFS) process as a pretreatment method to reduce mainly SS in produced water (PW) samples. We collected samples from 14 different wells in the Permian Basin. First, we characterized the water samples in terms of pH, SS, TDS, zeta potential (ZP), turbidity, organic matter presence, and ion concentrations. We then tested varying doses of several organic and inorganic chemicals, and on the treated water samples we measured pH, TDS, SS, turbidity, ZP, and ions. Finally, we compared the results with the initial PW characterizations to determine the best-performing chemicals and their optimal dosage (OD) for effective contaminant removal.
The cation and anion analyses of the initial water samples showed that the TDS is caused mainly by dissolved sodium and chloride ions. The ZP results indicated that the SS are mainly negatively charged particles, with absolute values around 20 mV on average. Among the tested coagulants, the best SS reduction was achieved with ferric sulfate, which reduced SS by around 86%. To further lessen SS, we tested several organic flocculants, which improved the reduction slightly. We concluded that while high TDS in the Permian Basin does not pose a substantial risk of fracture conductivity reduction, SS does. Our study showed that, depending on the components of the initial PW, reuse of the pretreated water for fracturing may minimize fracture conductivity damage.
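The dose-screening step described above can be sketched as a small calculation. This is a minimal illustration, assuming a standard jar-test workflow; the dose levels, SS readings, and helper names below are invented, not data from the study:

```python
# Hypothetical jar-test helper: given suspended-solids readings (mg/L) for a
# produced-water sample before and after treatment at several coagulant doses,
# compute the removal efficiency and pick the optimal dosage (OD).
# All numbers below are illustrative, not measurements from the paper.

def removal_efficiency(ss_initial, ss_treated):
    """Percent of suspended solids removed in a single jar test."""
    return 100.0 * (ss_initial - ss_treated) / ss_initial

def optimal_dosage(ss_initial, results):
    """results maps dose (mg/L) -> SS after treatment; return (dose, efficiency)."""
    best = max(results, key=lambda d: removal_efficiency(ss_initial, results[d]))
    return best, removal_efficiency(ss_initial, results[best])

# Illustrative ferric sulfate jar tests on one PW sample (SS in mg/L):
jar_tests = {50: 120.0, 100: 60.0, 150: 42.0, 200: 55.0}
dose, eff = optimal_dosage(300.0, jar_tests)
print(dose, round(eff, 1))  # overdosing (200 mg/L) performs worse than the optimum
```

Note how removal efficiency drops past the optimum dose, which is why a dose sweep rather than a single test is needed.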
— Pretreatment of Produced Waters Containing High Total Dissolved Solids. Damir Kaishentayev, B. Hascakir. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/206371-ms
Ethar H. K. Alkamil, A. A. Mutlag, Haider W. Alsaffar, Mustafa H. Sabah
Recently, the oil and gas industry has faced several crucial challenges affecting the global energy market, including the Covid-19 outbreak, oil price fluctuations with considerable uncertainty, dramatically increased environmental regulation, and digital cybersecurity challenges. The industrial internet of things (IIoT) may therefore provide the needed hybrid cloud and fog computing to analyze huge amounts of sensitive data from sensors and actuators, to monitor oil rigs and wells closely, and thereby to better control global oil production. Improved quality of service (QoS) is possible with fog computing: to alleviate challenges that a standard isolated cloud cannot handle, an extended cloud located near the underlying nodes is being developed. The cloud computing paradigm is not sufficient to meet the needs of already extensively utilized IIoT (i.e., edge) applications (e.g., low latency and jitter, context awareness, and mobility support) in a variety of domains (e.g., health care and sensor networks). Several paradigms, such as mobile edge computing, fog computing, and mobile cloud computing, have arisen recently to meet these criteria. Fog computing helps optimize services and create better user experiences, such as faster responses for critical, time-sensitive needs. At the same time, it also invites problems such as overload, underload, and disparities in resource usage, affecting latency, response times, throughput, etc. The comprehensive review presented in this work shows that fog devices operate in highly constrained environments with limited hardware capabilities. The existing cloud computing infrastructure is not capable of processing all data in a centralized manner because of network bandwidth costs and response latency requirements.
Therefore fog computing, referred to as "the enabling technologies allowing computation to be performed at the edge of the network, on downstream data on behalf of cloud services and upstream data on behalf of IIoT services" (Shi et al., 2016), is more effective than edge computing for data processing when data sources are close together. A review of the fog and cloud computing literature suggests that fog is preferable because it performs time-dependent computations better than the cloud. Since it is accessed over the internet, the cloud is inefficient for latency-sensitive multimedia services and other time-sensitive applications, such as the real-time monitoring, automation, and optimization of petroleum industry operations. As a result, a growing number of IIoT projects are dispersing fog computing capacity throughout the edge network as well as through data centers and the public cloud. A comprehensive review of fog computing features is presented here, along with the potential for using fog computing in the petroleum industry. Fog computing can provide rapid responses for applications by preprocessing and filtering data; the trimmed data can then be transmitted to the cloud for additional analysis and better service delivery.
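The edge-side preprocessing argument above can be illustrated with a minimal sketch. The `fog_preprocess` helper, its thresholds, and the sensor readings are hypothetical, not from any system described in the paper:

```python
# Minimal sketch of the fog-node preprocessing idea: raw sensor readings are
# filtered and aggregated at the edge, and only a compact summary (plus any
# out-of-range alarms needing immediate local action) leaves the fog node.
# Thresholds and values are illustrative assumptions.

def fog_preprocess(readings, low, high):
    """Filter raw readings at the fog node; return (summary, alarms)."""
    alarms = [r for r in readings if not (low <= r <= high)]  # time-critical, handled at the edge
    normal = [r for r in readings if low <= r <= high]
    summary = {
        "count": len(normal),
        "mean": sum(normal) / len(normal) if normal else None,
        "min": min(normal) if normal else None,
        "max": max(normal) if normal else None,
    }
    return summary, alarms  # only the small summary goes upstream to the cloud

# Six raw wellhead-pressure readings shrink to one four-field summary plus one alarm:
summary, alarms = fog_preprocess([102.0, 98.5, 101.2, 250.0, 99.8, 100.5], 90.0, 110.0)
print(summary, alarms)
```

The point is the bandwidth asymmetry: the cloud receives a fixed-size summary regardless of the raw sampling rate, while the latency-critical alarm is handled where it was measured.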
— The Role of Hybrid IoT with Cloud Computing and Fog Computing to Help the Oil and Gas Industry Recover from Covid-19 and Face Future Challenges. Ethar H. K. Alkamil, A. A. Mutlag, Haider W. Alsaffar, Mustafa H. Sabah. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/206067-ms
This paper considers the problem of steady-state optimal resource allocation in an industrial symbiotic oil production network or, more generally, in a large-scale oil production system network where different organizations share common resources. These allocation problems are typically solved in a distributed optimization framework, where the optimization problem is decomposed into smaller subproblems and a central coordinator is used to coordinate them. However, the use of a central coordinator may introduce additional practical challenges, such as impartiality issues or additional operating costs, which are undesirable even in the technology selection phase. To eliminate the need for a central coordinator, this paper proposes a consensus-based optimal resource allocation scheme, where each subproblem, or organization, is locally optimized and the coupling constraints are negotiated among the different organizations over a fixed communication network with limited information exchange. The proposed approach is applied to a large-scale subsea oil production system where the different wells are operated by different organizations. Simulation results show that the proposed approach can optimally allocate the shared resources.
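As a rough illustration of the coordinator-free idea (not the paper's actual algorithm), the toy below runs dual decomposition where every agent keeps its own copy of the resource price and the copies are averaged with neighbors on a fixed ring network. The quadratic profit functions, step size, and the assumption that each agent measures the exact total usage are all simplifications:

```python
# Toy consensus-based resource allocation via dual decomposition.
# Each organization i maximizes a private concave profit a_i*u_i - u_i**2
# for its share u_i of a common resource, coupled by sum(u_i) <= U_MAX.
# No central coordinator: each agent holds a local dual variable (price),
# updated locally and averaged with ring neighbors (limited info exchange).

U_MAX = 4.0
A = [3.0, 5.0, 4.0]        # private profit slopes (illustrative)
N = len(A)
prices = [0.0] * N         # each agent's local copy of the resource price
STEP = 0.05

for _ in range(2000):
    # Local primal step: best response u_i = argmax a_i*u - u^2 - price*u.
    u = [max(0.0, (A[i] - prices[i]) / 2.0) for i in range(N)]
    # Local dual (price) step, driven by the coupling-constraint violation.
    total = sum(u)         # in practice estimated via gossip; assumed exact here
    prices = [max(0.0, prices[i] + STEP * (total - U_MAX)) for i in range(N)]
    # Consensus step: average each price copy with its ring neighbors.
    prices = [(prices[i - 1] + prices[i] + prices[(i + 1) % N]) / 3.0 for i in range(N)]

# At convergence all price copies agree and the shared resource is fully used.
print([round(x, 2) for x in u], round(sum(u), 2))
```

The negotiated price plays the role the central coordinator would otherwise play: agents with higher marginal profit end up with larger shares, and the coupling constraint binds at the optimum.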
— Real-Time Optimal Resource Allocation and Constraint Negotiation Applied to A Subsea Oil Production Network. R. Dirza, S. Skogestad, D. Krishnamoorthy. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/206102-ms
The development of automatable, high-sensitivity analytical methods for tracer detection has been one of the central challenges in realizing ubiquitous full-field tracer deployment to study reservoirs with many cross-communicating injector and producer wells. Herein we report a tracer analysis approach, inspired by strategies commonly utilized in the biotechnology industry, that directly addresses key limitations in the process throughput, detection sensitivity, and automation potential of state-of-the-art technologies. A two-dimensional high-performance liquid chromatography (2D-HPLC) method was developed for the rapid fluorescence detection and simultaneous identification of a class of novel barcoded tracers in produced water down to ultra-trace concentrations (<1 ppb), matching the sensitivity of tracer technologies currently used in the oil industry. The sample preparation throughput was significantly intensified by judicious adaptation of off-the-shelf biopharma automation solutions. The optical detection sensitivity was further improved by the time-resolved luminescence of the novel tracer materials, which allows residual background signals from the produced water to be rejected.
To showcase the potential, we applied this separation and detection methodology to field samples from two recent field validations of a novel class of optically detectable tracers, in which two novel tracers were injected along with a conventional fluorobenzoic acid (FBA)-based tracer for benchmarking. The enhanced resolving power of the 2D chromatographic separation drastically suppressed the background signal, enabling the optical detection of a tracer species injected at 10x lower concentration. Further, we orthogonally confirmed the detection of this tracer species by the industry-standard high-resolution accurate mass spectrometry (HRAM) technique, demonstrating comparable limits of detection. The tracer detection profiles indicated that the transport behavior of the novel optical tracers through the highly saline and retentive reservoir was similar to that of the FBAs, validating the performance of this new class of tracers. Promising steps toward complete automation of the tracer separation and detection procedure have drastically reduced manual interventions and decreased the analysis cycle time, laying a solid foundation for full-field deployment of tracers for better reservoir characterization to inform production optimization decisions. This paper outlines the automatable tracer detection methodology, developed for robustness and simplicity, so that the resulting high-resolution tracer data can be used efficiently to improve production strategy via intelligent and active rate adjustments.
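As a hedged aside on how detection sensitivity is commonly quantified, the sketch below estimates a limit of detection (LOD) from a linear calibration curve using the widely used 3·sigma/slope rule. The concentrations and signal counts are invented; this is not the paper's validation procedure, whose sub-ppb sensitivity comes from 2D-HPLC with time-resolved luminescence:

```python
# Estimate a tracer's limit of detection from a linear fluorescence calibration:
# LOD = 3 * (residual standard deviation) / slope. All data are illustrative.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def limit_of_detection(xs, ys):
    """3*sigma/slope rule, with sigma from the calibration residuals (n-2 dof)."""
    slope, intercept = linear_fit(xs, ys)
    residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = (sum(r * r for r in residuals) / (len(xs) - 2)) ** 0.5
    return 3.0 * sigma / slope

conc = [0.5, 1.0, 2.0, 5.0, 10.0]            # ppb, illustrative standards
signal = [52.0, 101.0, 198.0, 505.0, 998.0]  # detector counts, illustrative
print(round(limit_of_detection(conc, signal), 3))  # sub-ppb for this toy curve
```

Lowering the background (as the 2D separation and time-gated detection do) shrinks the residual sigma, which is exactly what pushes the LOD down in this formula.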
— Automatable High Sensitivity Tracer Detection: Toward Tracer Data Enriched Production Management of Hydrocarbon Reservoirs. Hooisweng Ow, Sehoon Chang, Gawain Thomas, Wei Wang, Afnan Mashat, Hussein Shateeb. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/206338-ms
Matrix acidizing is a common but complex stimulation treatment that can significantly improve production/injection rates, particularly in carbonate reservoirs. However, such an operation may not achieve the desired improvement in all zones of the well due to existing and/or developing reservoir heterogeneity. This paper describes how a new flow control device (FCD) previously used to control water injection in long horizontal wells can also be used to improve the conformance of acid stimulation in carbonate reservoirs.
Acid stimulation of a carbonate reservoir is a positive feedback process. Acid preferentially takes the least resistant path: an area with higher permeability or lower skin. Once acid reacts with the formation, the injectivity in that zone increases, resulting in further preferential injection into the already stimulated zone. Over-treating a high-permeability zone results in poor distribution of acid to low-permeability zones. Mechanical, chemical, or foam diversion has been used to improve stimulation conformance along the wellbore; however, these methods may fail in carbonate reservoirs with natural fractures, where fracture injectivity dominates the stimulation process. A new FCD has been developed to autonomously control flow and provide mechanical diversion during matrix stimulation. Once a predefined upper limit flow rate is reached in a zone, the valve autonomously closes. This eliminates the impact of thief zones on acid injection conformance and maintains a prescribed acid distribution. Like other FCDs, the device is installed in several compartments in the well. The device has two operating conditions: one, as a passive outflow control valve, and two, as a barrier when the flow rate through the valve exceeds a designed limit, analogous to an electrical circuit breaker. Once a zone has been sufficiently stimulated by the acid and the injection rate in that zone exceeds the device trip point, the device in that zone closes and restricts further stimulation. Acid can then flow to and stimulate other zones. This process can be repeated later in well life to re-stimulate zones.
This behavior enables operators to minimise the impact of high-permeability zones on acid conformance and to react autonomously to dynamic changes in reservoir properties, specifically the growth of wormholes. The device can be installed as part of the lower completion in both injection and production wells. It can be retrofitted in existing completions or used in a retrievable completion. This technology allows repeat stimulation of carbonate reservoirs, providing mechanical diversion without the need for coiled tubing or other complex intervention. This paper briefly presents an overview of the device performance, flow loop testing, and some results from numerical modelling. The paper also discusses the completion design workflow in carbonate reservoirs.
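The circuit-breaker behavior described above can be sketched as a toy simulation. The zone injectivities, trip point, and injectivity-growth rule below are illustrative assumptions, not device specifications:

```python
# Toy sketch of the FCD "circuit breaker": acid splits across open zones in
# proportion to injectivity; a zone whose rate exceeds the trip point latches
# closed, diverting subsequent acid to under-stimulated zones. Numbers are
# illustrative assumptions only.

def simulate_acid_diversion(injectivity, trip_point, total_rate, steps, growth=1.5):
    """Return the open/closed state of each zone's valve after the treatment."""
    open_zones = [True] * len(injectivity)
    for _ in range(steps):
        weights = [inj if is_open else 0.0 for inj, is_open in zip(injectivity, open_zones)]
        if sum(weights) == 0:
            break
        rates = [total_rate * w / sum(weights) for w in weights]
        for i, rate in enumerate(rates):
            if rate > trip_point:
                open_zones[i] = False        # valve latches closed (breaker trips)
            elif open_zones[i]:
                injectivity[i] *= growth     # acid stimulation raises injectivity
    return open_zones

# Zone 0 is a thief zone: it takes most of the acid first, trips its device,
# and flow then diverts to the two lower-injectivity zones.
print(simulate_acid_diversion([10.0, 2.0, 1.0], trip_point=6.0, total_rate=9.0, steps=10))
```

Without the trip logic, the proportional split plus the positive-feedback growth rule would send ever more acid to the thief zone, which is the conformance problem the device addresses.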
— The New Flow Control Devices Autonomously Controlling the Performance of Matrix Acid Stimulation Operations in Carbonate Reservoirs. M. Moradi, M. Konopczynski. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/205975-ms
Marrie Ma, Jeremy D. Murphy, Nader Salman, Zhen Li, Crispin Chatar, Justin Chatagnier
One unique facet of digital technology is the merging of separate technologies into new workflows and products, and the energy industry, like others, is doing the same. This project automates the drill bit inspection process; the resulting system reduces labor costs, increases product quality, and improves bit performance. The innovation center is working on various aspects of the project, which aims to join automation technologies with robotic capabilities. Industrial robots are used extensively in traditional high-volume manufacturing applications, but the high-mix, low-volume nature of oil and gas manufacturing operations has impeded the deployment of automation solutions. Recent advances in sensors, computers, and machine learning now enable integrating robotics and automation technologies into these flexible manufacturing workflows. Driven by digital transformation, an automated inspection system for polycrystalline diamond compact (PDC) drill bits has been developed. The system uses high-resolution robotic 3D scanning, 2D imaging, and artificial intelligence to improve inspection efficiency and product quality. In our user-experience (UX)-focused approach, we streamlined the user interface (UI) research methods to develop the robotic inspection UI and successfully tested the design with end users. This paper introduces the inspection system and the improved workflows for PDC bits, illustrates the UX/UI development process and the targeted evaluation with end users, which is crucial before deploying the system in production, and concludes with recommended improvements to guide future work.
— Digitization of Drill Bit Inspections; User-Centered Design Methods to Automate Robotic Inspections. Marrie Ma, Jeremy D. Murphy, Nader Salman, Zhen Li, Crispin Chatar, Justin Chatagnier. Day 2 Wed, September 22, 2021; published 2021-09-15. https://doi.org/10.2118/206261-ms
N. Reddicharla, Subba Ramarao Rachapudi, Indra Utama, F. A. Khan, Prabhker Reddy Vanam, Saber Mubarak Al Nuimi, Mayada Ali Sultan Ali
Well testing is one of the vital processes in reservoir performance monitoring. As a field matures and the well stock grows, testing becomes a tedious job in terms of resources (MPFMs and test separators), and this affects delivery of the production quota. In addition, test data validation and approval follow a business process that can take up to 10 days before a well test is accepted or rejected. Almost 10,000 well tests were conducted, and statistically around 10 to 15% of them were rejected per year. The objective of this paper is to develop a methodology that reduces well test rejections and raises a timely flag for operator intervention to recommence the well test. This case study was applied in a mature field that has been producing for 40 years and for which a large volume of historical well test data is available. The paper discusses the development of a data-driven well test data analyzer and optimizer, supported by artificial intelligence (AI), for wells tested using MPFMs in a two-stage approach. The motivating idea is to ingest historical and real-time data together with the well model performance curve, assess the quality of the well test data, and flag it to the operator in real time. The ML predictions help testing operations and can reduce the test acceptance turnaround drastically, from 10 days to hours. In the second stage, an unsupervised model built on historical data identifies the parameters that drive well test rejection, such as test duration, choke size, and GOR. The outcome of the modeling will be incorporated into updates of the well test procedure and testing philosophy. The approach is under evaluation in one of the assets of ADNOC Onshore. The results are expected to reduce well test rejections by at least 5%, which further optimizes the resources required and improves the back-allocation process. 
Furthermore, real-time flagging of test quality will help reduce the validation cycle from 10 days to hours and improve the well testing cycle. This methodology improves integrated reservoir management compliance with well testing requirements in assets where resources are limited, and it is envisioned to be integrated with a full-field digital oilfield implementation. This is a novel application of machine learning and artificial intelligence to well testing: it maximizes the use of real-time data to create an advisory system that improves test data quality monitoring and enables timely decisions to reduce well test rejections.
{"title":"A Novel Well Test Data Analyzer and Process Optimizer Using Artificial Intelligence and Machine Learning Techniques","authors":"N. Reddicharla, Subba Ramarao Rachapudi, Indra Utama, F. A. Khan, Prabhker Reddy Vanam, Saber Mubarak Al Nuimi, Mayada Ali Sultan Ali","doi":"10.2118/206137-ms","DOIUrl":"https://doi.org/10.2118/206137-ms","url":null,"abstract":"\u0000 Well testing is one of the vital process as part of reservoir performance monitoring. As field matures with increase in number of well stock, testing becomes tedious job in terms of resources (MPFM and test separators) and this affect the production quota delivery. In addition, the test data validation and approval follow a business process that needs up to 10 days before to accept or reject the well tests. The volume of well tests conducted were almost 10,000 and out of them around 10 To 15 % of tests were rejected statistically per year. The objective of the paper is to develop a methodology to reduce well test rejections and timely raising the flag for operator intervention to recommence the well test.\u0000 This case study was applied in a mature field, which is producing for 40 years that has good volume of historical well test data is available. This paper discusses the development of a data driven Well test data analyzer and Optimizer supported by artificial intelligence (AI) for wells being tested using MPFM in two staged approach. The motivating idea is to ingest historical, real-time data, well model performance curve and prescribe the quality of the well test data to provide flag to operator on real time. The ML prediction results helps testing operations and can reduce the test acceptance turnaround timing drastically from 10 days to hours. In Second layer, an unsupervised model with historical data is helping to identify the parameters that affecting for rejection of the well test example duration of testing, choke size, GOR etc. 
The outcome from the modeling will be incorporated in updating the well test procedure and testing Philosophy. This approach is being under evaluation stage in one of the asset in ADNOC Onshore.\u0000 The results are expected to be reducing the well test rejection by at least 5 % that further optimize the resources required and improve the back allocation process. Furthermore, real time flagging of the test Quality will help in reduction of validation cycle from 10 days hours to improve the well testing cycle process. This methodology improves integrated reservoir management compliance of well testing requirements in asset where resources are limited. This methodology is envisioned to be integrated with full field digital oil field Implementation.\u0000 This is a novel approach to apply machine learning and artificial intelligence application to well testing. It maximizes the utilization of real-time data for creating advisory system that improve test data quality monitoring and timely decision-making to reduce the well test rejection.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"5 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83639014","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
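The real-time screening stage described in the abstract above can be illustrated with a toy sketch: flag an incoming well test against the well's historical trend immediately, instead of waiting out a multi-day validation cycle. The function name, the z-score rule, and all rates below are illustrative assumptions, not the authors' ML model.

```python
from statistics import mean, stdev

# Toy version of the first (real-time screening) stage: compare a new
# well-test oil rate against the well's historical accepted rates and
# flag outliers for operator review.

def flag_test(history, test_rate, z_limit=3.0):
    """Return 'accept' or 'review' for a newly measured test rate.

    history   : past accepted oil rates for this well (STB/d)
    test_rate : oil rate from the test being screened (STB/d)
    """
    mu, sigma = mean(history), stdev(history)
    z = abs(test_rate - mu) / sigma if sigma else 0.0
    return "review" if z > z_limit else "accept"

history = [1180, 1210, 1195, 1220, 1205, 1188]
print(flag_test(history, 1202))  # consistent with the trend -> accept
print(flag_test(history, 450))   # far off trend -> review, operator alerted
```

A production system would replace the simple z-score with a model trained on features such as test duration, choke size, and GOR, as the paper describes for its supervised and unsupervised stages.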
In an oil field, openhole multilateral maximum reservoir contact (MRC) wells are drilled. These wells are typically equipped with smart well completion technologies consisting of inflow control valves and permanent downhole monitoring systems. Conventional flowback techniques consist of flowing the well back to atmosphere while burning the hydrocarbons and drilling fluids brought to surface. In an age of economic, environmental, and safety consciousness, all practices in the petroleum industry are being examined closely, and the conventional method of flowing back wells is frowned upon from all aspects. This gives rise to the challenge of flowing back wells economically without compromising safety or the environment, all the while ensuring excellent well deliverability. By utilizing subsurface smart well completion inflow control valves, individual laterals are flowed to a separator system whereby solid drill cuttings are captured and discharged using a solids management system. Hydrocarbons are separated using a separation vessel and measured before being sent to the production line toward the field separation facility. Permanent downhole monitoring systems are used to monitor pressure drawdown and subsequently control the rate of flow to surface to ensure reservoir integrity. Following the completion of the solids and drilling fluid flowback from the wellbore, comprehensive multi-rate measurements at different choke settings are obtained to quantify well performance. This paper examines the economic and environmental improvements of the adopted zero-flaring cleanup technology and smart well completion flowback techniques compared with conventional flowback methods, ensuring that oil is recovered during well flowback and that lateral contribution to overall flow in multilateral wells is quantified. 
In addition, it highlights the lessons learned and key best practices implemented during the cleanup operation to complete the job safely and efficiently. This technique sets a roadmap for better well flowback that satisfies economic constraints and protects the environment.
{"title":"Well Cleanup Utilizing Smart Well Completion and Zero Flaring Technology","authors":"M. Alkhalifah, Rabih Younes","doi":"10.2118/206246-ms","DOIUrl":"https://doi.org/10.2118/206246-ms","url":null,"abstract":"\u0000 In an oil field, openhole multilateral maximum reservoir contact (MRC) wells are drilled. These wells are typically equipped with smart well completion technologies consisting of inflow control valves and permanent downhole monitoring systems. Conventional flowback techniques consisted of flowing back the well to atmosphere while burning the hydrocarbon and drilling fluids brought to surface. In an age of economic, environmental and safety consciousness, all practices in the petroleum industry are being examined closely. As such, the conventional method of flowing back wells is frowned upon from all aspects. This gives rise to the challenge of flowing back wells in an economic manner without compromising safety and the environment; all the while ensuring excellent well deliverability.\u0000 By utilizing subsurface smart well completion inflow control valves, individual laterals are flowed to a separator system whereby solid drill cuttings are captured and discharged using a solids management system. Hydrocarbons are separated using a separation vessel and measured before being sent to the production line toward the field separation facility. Permanent downhole monitoring systems are used to monitor pressure drawdown and subsequently control the rate of flow to surface to ensure reservoir integrity. Following the completion of the solids and drilling fluid flowback from the wellbore, comprehensive multi-rate measurements at different choke settings are obtained to quantify the well performance.\u0000 This paper looks at the economic and environmental improvements of the adopted zero flaring cleanup technology and smart well completions flowback techniques in comparison to conventional flowback methods. 
This ensures that oil is being recovered during well flowback and lateral contribution to overall flow in multilateral wells. In addition, it highlights the lessons learned and key best practices implemented during the cleanup operation to complete the job in a safe and efficient manner.\u0000 This technique tends to set a roadmap for a better well flowback that fulfills economic constrains and protects the environment.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"8 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91080163","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Solvent flooding is a well-established method of enhanced oil recovery (EOR), with carbon dioxide (CO2) being the most often used solvent. As CO2 is both less viscous and less dense than the fluids it displaces, the displacement suffers from poor sweep efficiency caused by an unfavorable mobility ratio and an unfavorable gravity number. Creating in-situ CO2 foam improves the sweep efficiency of CO2 floods. Another application is the injection of CO2 foam into saline aquifers for carbon capture and storage (CCS). The goal of the core flood experiments in this paper was to study the effectiveness of surface-coated silica nanoparticles as an in-situ CO2 foaming agent. In each experiment, the pressure drop was measured across five separate sections of the core, as well as along the whole core. In addition, the saturation distribution in the core was calculated periodically using computed tomography (CT) scanning measurements. The experiments consisted of vertical core floods in which liquid CO2 displaced brine from the top to the bottom of the core. A flood with surface-coated silica nanoparticles suspended in the brine was performed in the same core and under the same conditions as a flood with no nanoparticles, and the results from these floods were compared. In these experiments, breakthrough occurred 45% later with foamed CO2, and the final CO2 saturation was also 45% greater than with the unfoamed CO2. The study shows how nanoparticles stabilize the CO2 front. The results provide quantitative information on, as well as a graphical representation of, the behavior of the CO2 foam front as it advances through the core. These data can be used to upscale the observed behavior and calculated properties from the core scale to the reservoir scale to improve field applications of CO2 flooding.
{"title":"Enhanced Experimental Carbon Dioxide Sweep Using Surface Coated Silica Nanoparticles as a Foaming Agent","authors":"Ahmad Alfakher, D. DiCarlo","doi":"10.2118/206278-ms","DOIUrl":"https://doi.org/10.2118/206278-ms","url":null,"abstract":"\u0000 Solvent flooding is a well-established method of enhanced oil recovery (EOR), with carbon dioxide (CO2) being the most-often used solvent. As CO2 is both less viscous and less dense than the fluids it displaces, the displacement suffers from poor sweep efficiency caused by an unfavorable mobility ratio and unfavorable gravity number. Creating in-situ CO2 foam improves the sweep efficiency of CO2 floods. Another application is the injection of CO2 foam into saline aquifers for carbon capture and storage (CCS).\u0000 The goal of the core flood experiments in this paper was to study the effectiveness of surface coated silica nanoparticles as an in-situ CO2 foaming agent. In each experiment, the pressure drop was measured across five separate sections in the core, as well as along the whole core. In addition, the saturation distribution in the core was calculated periodically using computed tomography (CT) scanning measurements. The experiments consisted of vertical core floods where liquid CO2 displaced brine from the top to the bottom of the core. A flood with surface coated silica nanoparticles suspended in the brine is performed in the same core and at the same conditions to a flood with no nanoparticles, and results from these floods are compared. In these experiments, breakthrough occurred 45% later with foamed CO2, and the final CO2 saturation was also 45% greater than with the unfoamed CO2.\u0000 The study shows how nanoparticles stabilize the CO2 front. The results provide quantitative information on, as well as a graphical representation of, the behavior of the CO2 foam front as it advances through the core. 
This data can be used to upscale the behavior observed and properties calculated from the core-scale to the reservoir-scale to improve field applications of CO2 flooding.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"37 6 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91299694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
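As a quick sanity check on the reported numbers, the ~45% breakthrough delay and ~45% saturation gain from the abstract above can be reproduced from hypothetical baselines; the pore-volume and saturation figures below are illustrative assumptions, not measurements from the study.

```python
# Back-of-the-envelope comparison of the foamed vs. unfoamed floods.
# Baseline values (pore volumes injected at breakthrough, final CO2
# saturation) are illustrative assumptions, not data from the paper.

def improvement(baseline, improved):
    """Fractional improvement of `improved` over `baseline`."""
    return (improved - baseline) / baseline

bt_unfoamed, bt_foamed = 0.40, 0.58  # PV injected at CO2 breakthrough
s_unfoamed, s_foamed = 0.31, 0.45    # final CO2 saturation (fraction)

print(f"breakthrough delay: {improvement(bt_unfoamed, bt_foamed):.0%}")  # 45%
print(f"saturation gain:    {improvement(s_unfoamed, s_foamed):.0%}")    # 45%
```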
Jimmy Price, C. M. Jones, Bin Dai, Darren Gascooke, M. Myrick
Digital fluid sampling is a technique that utilizes downhole sensors to measure formation fluid properties without collecting a physical sample. Unfortunately, sensors are prone to drift over time due to the harsh downhole environmental conditions, so constant sensor evaluation and calibration are required to ensure the quality of analysis. A new technique utilizes a virtual sensor as a digital twin, which provides a calibration that can be utilized by the physical twin. Digital twin technology enables the end user to operate and collaborate remotely, rapidly simulate different scenarios, and provide improved accuracy via enhanced, up-to-date calibrations. With respect to downhole fluid identification, the contribution of harsh environmental conditions and sensor drift can also be mitigated by realizing a virtual implementation of the fluid behavior and the individual sensor components. Historically, the virtual behavior of a digital twin has been constructed by a combination of complex multiphysics and empirical modeling. More recently, access to large datasets and historical results has enabled the use of machine learning neural networks to successfully create digital twin sensors. In this paper, we explore the efficacy of constructing a digital twin of a single downhole optical fluid identification sensor using both the machine learning nonlinear neural network approach and the complex multiphysics-based modeling approach, and we discuss in detail the advantages and lessons learned from each method. In doing so, we found a hybrid approach to be most effective in constraining the problem and preventing overfitting while also yielding a more accurate calibration. In addition, the new hybrid digital twin evaluation and calibration method is extended to encompass an entire fleet of similar downhole sensors simultaneously. Digital twin technology is not new to the petroleum industry, 
yet there is significant room for improvement in identifying how it can best be implemented to decrease costs and improve reliability. This paper examines two separate methods that scientists and engineers employ to enable digital twin technology and concludes that a hybrid approach combining machine learning and physics-based modeling prevails.
{"title":"Characterizing Downhole Fluid Analysis Sensors As Digital Twins: Lessons of the Machine Learning Approach, The Physics Approach and the Integrated Hybrid Approach","authors":"Jimmy Price, C. M. Jones, Bin Dai, Darren Gascooke, M. Myrick","doi":"10.2118/206291-ms","DOIUrl":"https://doi.org/10.2118/206291-ms","url":null,"abstract":"\u0000 Digital fluid sampling is a technique utilizing downhole sensors to measure formation fluid properties without collecting a physical sample. Unfortunately, sensors are prone to drift over time due to the harsh downhole environmental conditions. Therefore, constant sensor evaluation and calibration is required to ensure the quality of analysis. A new technique utilizes a virtual sensor as a digital twin which provides a calibration that can be utilized by the physical twin. Digital twin technology enables the end-user to operate and collaborate remotely, rapidly simulate different scenarios, and provide improved accuracy via enhanced up-to-date calibrations. With respect to downhole fluid identification, the contribution of harsh environmental conditions and sensor drift can also be mitigated by realizing a virtual implementation of the fluid behavior and the individual sensor components. Historically, the virtual behavior of a digital twin has been constructed by a combination of complex multi-physics and empirical modeling. More recently, access to large datasets and historical results has enabled the use of machine learning neural networks to successfully create digital twin sensors. In this paper, we explore the efficacy of constructing a digital twin on a single downhole optical fluid identification sensor using both the machine learning nonlinear neural network and the complex, multi-physics' based modeling approaches. Advantages and lessons to be learned from each individual method will be discussed in detail. 
In doing so, we have found a hybrid approach to be most effective in constraining the problem and preventing over-fitting while also yielding a more accurate calibration. In addition, the new hybrid digital twin evaluation and calibration method is extended to encompass an entire fleet of similar downhole sensors simultaneously. The introduction of digital twin technology is not new to the petroleum industry. Yet there is significant room for improvement in order to identify how the technology can be implemented best in order to decrease costs and improve reliability. This paper looks at two separate methods that scientists and engineers employ to enable digital twin technology and ultimately identify that a hybrid approach between machine learning and empirical physics'-based modeling prevails.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"19 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75333273","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}