The recent digital era has seen the emergence of seemingly unlimited, on-demand compute power. For subsurface characterization, this offers the capacity to generate many more realizations of a geologic hypothesis, exploring a wider spread of the domain of possibilities in a fraction of the time. This has the potential to drastically shorten field development planning cycles, which is much needed in an increasingly volatile market. Such potential can only be realized if adequate analytical tools accompany petroleum engineers, enabling them to properly assess and investigate the hundreds or thousands of cases (static or dynamic).
A new generation of reservoir engineering tools and workflows is entering the market, such as an expert-guided machine learning well placement solution for the identification of connected, high-saturation oil volumes. The technique works on ensembles of reservoir models for robust field development optimization, under subsurface uncertainty and in a highly automated fashion. The methodology is embedded in a structured workflow designed to improve a baseline well location design.
This work suggests an iterative improvement of well location designs using probabilistic well ranking to identify low-performing wells, probability maps to understand reservoir performance, and analytics-based optimization steps targeting large connected, high-saturation oil volumes. The methodology is described, and application results are presented for a full optimization loop. The structured approach highlights the value of novel learning techniques in providing an efficient and manageable solution for optimizing a well location design under subsurface uncertainty.
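The probabilistic well-ranking step can be illustrated with a deliberately simplified sketch. It is not the described solution's implementation: the ensemble values, the economic threshold, and the `rank_wells` helper are all hypothetical, but the idea is the same, namely ranking candidate wells across an ensemble of reservoir-model realizations and flagging those likely to underperform.

```python
import statistics

# Hypothetical example: each well has one simulated cumulative oil
# value (MMbbl) per ensemble realization. Names and numbers are
# illustrative only.
ensemble = {
    "W-1": [4.1, 3.8, 4.5, 2.9, 4.0],
    "W-2": [1.2, 0.9, 1.5, 1.1, 1.0],
    "W-3": [3.0, 3.4, 2.7, 3.1, 3.3],
}
threshold = 2.0  # assumed economic cutoff (MMbbl)

def rank_wells(ensemble, threshold):
    """Rank wells by mean recovery; flag probable low performers."""
    rows = []
    for well, values in ensemble.items():
        mean = statistics.mean(values)
        # Probability of underperforming = fraction of realizations
        # below the economic threshold.
        p_low = sum(v < threshold for v in values) / len(values)
        rows.append((well, mean, p_low))
    return sorted(rows, key=lambda r: r[1], reverse=True)

for well, mean, p_low in rank_wells(ensemble, threshold):
    print(f"{well}: mean={mean:.2f} MMbbl, P(underperform)={p_low:.0%}")
```

Wells with a high probability of underperforming become the candidates for relocation in the next optimization iteration.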
The major technological disruptors that have reshaped the way we work in our industry in the past few years are mostly digital. People, management, and investors all have extremely high hopes and expectations. The evolution of R&D spending over the past few years is a good indicator of this shift. The pace at which people must adopt and master new technologies can be daunting, and this is exacerbated by the exponential growth of collected data as well as the expectation of faster business returns.
Times have been tough in our industry lately. Though we all appreciate that the future opens toward new challenges in which our pool of talent plays an essential role, some indicators suggest that additional pressure will come from a deficit of resources. This increases the need to transform the way we work by providing tools and solutions that will enable our workforce to absorb the additional load.
Geoscientists and engineers need to adapt to this new reality, and to achieve that transformation they need new tools: tools that give them the capability to manipulate data, slice and dice it, and script their way through it in the language they feel comfortable with; solutions to easily visualize, analyze, and monitor their ever-increasing amount of data; and, finally, the capacity to infer new data, build relationships, and find new ways to further automate tedious and repetitive tasks.
The democratization of statistical learning methods and analytical tools will increase individual autonomy in a very short time frame. This is achieved by effectively coupling an AI platform with BI analytical tools, all integrated on top of a versatile data ecosystem, as delivered by the DELFI Data Science profile. It equips engineers with the solutions they need to effectively automate time-consuming activities and shift their focus away from monitoring tasks toward forward-looking planning activities.
In recent years there has been significant interest in using machine learning (ML) techniques in reservoir description. Perhaps the most common application to date has been classification of geological features, such as faults from seismic data.
Another strand of activity has been the use of generative adversarial networks (GANs), trained on large analogue data sets, for property modeling. There has been less activity in leveraging ML methods in conjunction with geostatistical techniques for property modeling. These techniques provide estimates and simulated realizations of reservoir properties such as porosity and permeability. While they have been very successful and have become the standard way of producing models of reservoir properties, they do require some workarounds in real-world situations.
The Ember algorithm combines classic geostatistical estimation and simulation techniques with ML to provide robust and fast estimates of reservoir properties. It is compared with classical Gaussian co-simulation on a synthetic reservoir model. The Ember model performs as well as the classic algorithm when the required conditions for the Gaussian model are met, and outperforms it when the Gaussian model is a poor fit (heteroscedasticity and incorrect variogram). When additional relevant data are available, or when the fit of the classic geostatistical algorithm is poor (as in realistic situations where many aspects of the available data are poorly modeled under stationarity hypotheses), such innovative hybrid modeling solutions can substantially outperform classical methods. The algorithm can be used with minimal preparatory work and reduced exposure to expert geostatistical settings. It enables a step change in speed and reliability for reservoir model building and, by extension, field development planning.
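The general hybrid idea, learning a trend from secondary data and then spatially interpolating the residuals so the estimate still honors the well observations, can be sketched in a toy form. This is emphatically not the Ember algorithm, which is far more sophisticated; here a least-squares line stands in for the learned trend and inverse-distance weighting stands in for the geostatistical residual interpolation, and all data values are made up.

```python
# Toy 1D example: wells with a seismic attribute (secondary data) and
# a measured porosity (primary data). All numbers are illustrative.
wells = [  # (location, seismic_attribute, measured_porosity)
    (0.0, 0.10, 0.08),
    (2.0, 0.30, 0.15),
    (5.0, 0.50, 0.22),
    (9.0, 0.70, 0.27),
]

def fit_trend(samples):
    """Least-squares line porosity = a*attribute + b (ML stand-in)."""
    n = len(samples)
    sx = sum(s[1] for s in samples)
    sy = sum(s[2] for s in samples)
    sxx = sum(s[1] ** 2 for s in samples)
    sxy = sum(s[1] * s[2] for s in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - a * sx) / n
    return a, b

def estimate(x, attr, samples, power=2.0):
    """Trend prediction plus IDW-interpolated residual at location x."""
    a, b = fit_trend(samples)
    num = den = 0.0
    for loc, s_attr, poro in samples:
        resid = poro - (a * s_attr + b)
        if loc == x:            # exactly at a well: honor the data
            return poro
        w = 1.0 / abs(x - loc) ** power
        num += w * resid
        den += w
    return a * attr + b + num / den

print(estimate(3.0, 0.40, wells))
```

Because residuals are interpolated back in, the estimator reproduces the well data exactly at well locations while letting the secondary attribute drive the estimate between wells.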
Asset teams are tasked with proposing well production enhancement opportunities on a regular basis. While this process is crucial to sustaining asset production, the identification of potential workover candidates is limited by budgetary constraints and has historically been a lengthy, manual process consuming several weeks of engineers’ time in every iteration. Moreover, best practices and lessons learned from past interventions are rarely captured, which prevents systematic improvement of the candidate selection methodology and the choice of intervention.
Using artificial intelligence (AI) integrated with a production operations solution, a standardized and automated approach was proposed to rapidly and repeatably screen and rank large well-count assets (hundreds or thousands of wells) in a fraction of the time. The solution enabled proactive management of existing wells by keeping the production enhancement opportunity pipeline full and expediting potential candidates through the opportunity maturation process (OMP) in an integrated and collaborative framework. Engineers confirmed and validated the system-generated opportunities before escalating them for approval.
The application of the solution led to workover and intervention candidate evaluations being routinely performed on a weekly basis (compared with the bi-yearly approval reviews previously held) across almost 200 well completions. In its early evaluation, the solution provided 89% time savings in both the identification and review of intervention candidates. Additionally, 88% cost savings resulted from the elimination of manual work.
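The screening step behind such a workflow can be sketched in simplified form. This is not the deployed solution's logic: the well records, the `potential_bopd` field, and the 30% gap threshold are illustrative assumptions, but the pattern, flagging wells producing well below an estimated technical potential and ranking them by the size of the gap, is the core of automated candidate identification.

```python
# Hypothetical well records; rates and potentials are made up.
wells = [
    {"name": "A-01", "rate_bopd": 450, "potential_bopd": 900},
    {"name": "A-02", "rate_bopd": 800, "potential_bopd": 850},
    {"name": "A-03", "rate_bopd": 120, "potential_bopd": 600},
]

def screen_candidates(wells, min_gap_fraction=0.3):
    """Flag wells producing below (1 - min_gap_fraction) of potential,
    ranked by absolute gap so the biggest opportunities come first."""
    flagged = []
    for w in wells:
        gap = w["potential_bopd"] - w["rate_bopd"]
        if gap / w["potential_bopd"] >= min_gap_fraction:
            flagged.append({**w, "gap_bopd": gap})
    return sorted(flagged, key=lambda w: w["gap_bopd"], reverse=True)

for c in screen_candidates(wells):
    print(f'{c["name"]}: gap {c["gap_bopd"]} bopd')
```

Running such a screen automatically on every surveillance cycle is what keeps the opportunity pipeline full; engineers then validate the flagged candidates rather than building the list by hand.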
Seismic interpretation has been a cornerstone of E&P activities for several decades; however, the interpretation and extraction of value from such datasets still poses a major operational bottleneck in the overall geology and geophysics (G&G) workflow. Highly manual and subjective workflows, with a high degree of repetition, combined with vast quantities of data, result in a time-consuming process that frequently yields inconsistent results.
The application of machine learning (ML) to fault identification and interpretation, and as a feeder to the reservoir modeling workflow, is optimizing reservoir development. A pretrained ML fault-prediction model yielded a moderately accurate prediction of the total structures within the input seismic cube. This fault prediction provides a basis for initial fault extraction and structural modeling, while also highlighting areas where the prediction could be improved.
Extending the application to a user-driven ML model, with user-defined fault labels, yielded a significant increase in the accuracy of the fault prediction. This set of faults was validated by the geophysicist and used to inform a second structural model realization.
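The fault-extraction step that follows prediction can be illustrated with a toy post-processing sketch. The actual workflow uses a deep network's output over a full seismic volume; here a tiny 2D probability section, the threshold value, and the `extract_faults` helper are all hypothetical, showing only the generic idea of thresholding a fault-probability field and grouping connected high-probability samples into discrete fault candidates.

```python
# Made-up fault-probability section (rows x columns); values in [0, 1].
prob = [
    [0.1, 0.9, 0.1, 0.1],
    [0.1, 0.8, 0.2, 0.1],
    [0.1, 0.7, 0.1, 0.9],
    [0.1, 0.1, 0.1, 0.8],
]

def extract_faults(prob, threshold=0.5):
    """Flood-fill connected cells above threshold into labeled groups."""
    rows, cols = len(prob), len(prob[0])
    seen, faults = set(), []
    for r in range(rows):
        for c in range(cols):
            if prob[r][c] >= threshold and (r, c) not in seen:
                stack, group = [(r, c)], []
                seen.add((r, c))
                while stack:
                    i, j = stack.pop()
                    group.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and prob[ni][nj] >= threshold
                                and (ni, nj) not in seen):
                            seen.add((ni, nj))
                            stack.append((ni, nj))
                faults.append(group)
    return faults

print(len(extract_faults(prob)))  # → 2 separate fault candidates
```

Each connected group becomes a candidate fault surface for the geophysicist to validate, and low-confidence regions left unlabeled are exactly the areas where additional user-defined labels pay off.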
Using an augmented approach to fault identification significantly improved the domain-science-driven workflows and delivered substantial value to the interpretation loop: an average 80% time reduction in the overall structural interpretation was achieved for this project.
Digital oilfield projects integrate a large variety of data for production surveillance and technical management. The typical data includes well and flowline operational parameters, production rate estimations, reservoir models, and sporadic events such as well intervention and well tests. All of these are housed within industry standard historians, SCADA systems and proprietary databases.
Access to high-frequency records and sporadic data, each with its respective timestamp, enabled the application of data analytics and machine learning solutions. During any incident, the SCADA system provides the required data about operational conditions, such as wellhead pressures and temperatures, well status, and valve positions; this enables the use of AI to measure the response of the well during a specific operational event.
Using an AI solution, we apply pattern recognition, machine learning, and data-driven analytics to common cases of petroleum production operations for naturally flowing wells. These cases include:
The automated identification of the reservoir environment (fractures, matrix, or dual porosity) based on bottom hole or wellhead pressure tendencies, which can identify radial or linear transient flow.
The diagnosis of water-cut instabilities based on changes of the pressure drop through the wellbore, altering the well head pressure declination rate.
The formation of hydrates based on surveillance of the flowline pressure variations.
The identification of uncalibrated choke valves in the surface equipment.
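The first case above, distinguishing radial from linear transient flow, rests on a classical diagnostic: the log-log behavior of the Bourdet-style pressure derivative (t·dΔp/dt), which is flat (slope ≈ 0) in radial flow and follows a half slope (≈ 0.5) in linear flow. The sketch below is an illustrative stand-in, not the deployed solution: the synthetic data, the slope tolerance, and the `classify` helper are assumptions.

```python
import math

def log_derivative(times, dp):
    """t * d(dp)/dt via central differences in log time."""
    out = []
    for i in range(1, len(times) - 1):
        out.append((dp[i + 1] - dp[i - 1])
                   / (math.log(times[i + 1]) - math.log(times[i - 1])))
    return out

def classify(times, dp, tol=0.15):
    """Classify flow regime from the log-log slope of the derivative."""
    deriv = log_derivative(times, dp)
    t_mid = times[1:-1]
    s = ((math.log(deriv[-1]) - math.log(deriv[0]))
         / (math.log(t_mid[-1]) - math.log(t_mid[0])))
    if abs(s) < tol:
        return "radial"       # flat derivative
    if abs(s - 0.5) < tol:
        return "linear"       # half-slope derivative
    return "unidentified"

# Synthetic drawdowns with the textbook signatures (values made up):
times = [10 * 1.5 ** k for k in range(12)]
radial = [5.0 * math.log(t) for t in times]   # dp proportional to ln t
linear = [2.0 * math.sqrt(t) for t in times]  # dp proportional to sqrt(t)

print(classify(times, radial), classify(times, linear))
```

Running such diagnostics continuously on streamed pressure data is what lets the system interpret events around the clock rather than waiting for an engineer's manual analysis.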
The solution automatically interprets field data (24/7) and provides valuable information to the technical team; it has become a trusted assistant for the petroleum engineers. It enhances supervision and interpretation, which has led to increased efficiency, improved HSE performance, and several economic benefits.