Managing the Uncertainty Challenge in Reservoir Simulation


By: Irele Philip Evbomoen, P.Eng, Schlumberger

Uncertainty Challenge

A significant part of any reservoir simulation or production enhancement study involves adequately quantifying reservoir uncertainty, particularly when limited information is available – as is the case for most oil and gas fields worldwide. The inherent uncertainties begin to manifest in the hydrocarbon exploration phase and, although subsequent field development activities improve our understanding of the subsurface rock and fluid characteristics, considerable uncertainty remains. This challenges our confidence in using a field’s acquired data to build models that predict how the reservoir will behave in the future and, thereby, to optimize hydrocarbon production and recovery.

Traditional methods of reservoir simulation and production analysis work only for a static problem. They cannot account for the dynamic changes that occur over time in the connected system of reservoirs and wellbores during multiphase flow. Moreover, the large number of uncertainties, from reservoir to wellbore, cannot be dealt with accurately using traditional approaches. As a result, many models are not predictive, and a continuous “model reconstruction” loop becomes necessary – one with only marginal impact on the final model predictability, since further changes in uncertain parameters can quickly undermine confidence in the model. The consequence of this traditional looped approach is reflected not only in the capital investment lost in unprofitable projects but also in the non-productive time spent on continuous model reconstruction.

With today’s digital capabilities – computing power, cloud computing, and recent software technology developments in stochastic data analysis and statistical optimization – it has become possible to quickly and exhaustively encompass all relevant uncertain parameters in a given model. Reservoir uncertainty analysis software, now commercially available, enables the characterization of uncertainties in reservoir modeling and simulation, as well as in production optimization. Some of these tools use experimental design concepts, fully integrated with reservoir simulation software, to investigate the effect of uncertain parameters in a rigorous statistical framework.

Constructing the Proxy Model

The process begins with the construction of a Proxy Model. 

Some measured data or uncertain parameters may have only a limited influence on the Objective Function (for instance, Cumulative Well or Reservoir Oil Production), or may not deliver conclusive information for the objective at hand – optimizing reservoir performance or back-allocating production to a Well, for instance. Those parameters must be identified and eliminated from the analysis to increase computational efficiency.

A program based on experimental design techniques is used to obtain maximum information at the lowest experimental cost by varying all uncertain parameters simultaneously. Each experimental design corresponds to a small set of optimally chosen reservoir simulation runs that encompass the uncertainty domain. A response surface – a polynomial function with linear and quadratic terms – is generated that reflects the solution space. The weight of each term allows the uncertainties to be ranked and those with very limited influence on the response surface to be eliminated.
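
As an illustration of this step – not the commercial tool’s implementation – the sketch below substitutes a simple space-filling design for the formal experimental design, stands a placeholder function in for the reservoir simulator, and fits a quadratic response surface with scikit-learn, ranking terms by coefficient size. All parameter names, ranges and coefficients are hypothetical.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical uncertain parameters and their ranges (illustrative only)
names = ["porosity", "perm_md", "owc_ft"]
lower = np.array([0.10,  50.0, 7200.0])
upper = np.array([0.25, 500.0, 7350.0])

# Small space-filling design over the uncertainty domain (stands in for a
# formal experimental design); each row is one reservoir simulation run
sampler = qmc.LatinHypercube(d=len(names), seed=42)
design = qmc.scale(sampler.random(n=15), lower, upper)

def run_simulation(x):
    """Placeholder for a full reservoir simulation run that returns the
    Objective Function (e.g. cumulative oil, MMstb)."""
    poro, perm, owc = x
    return 40.0 * poro + 0.012 * perm - 0.009 * (owc - 7200.0) - 55.0 * poro**2

y = np.array([run_simulation(x) for x in design])

# Work in coded units (-1 to +1) so coefficient size reflects influence
coded = 2.0 * (design - lower) / (upper - lower) - 1.0

# Quadratic response surface: linear, quadratic, and interaction terms
poly = PolynomialFeatures(degree=2, include_bias=False)
surface = LinearRegression().fit(poly.fit_transform(coded), y)

# Rank terms by absolute coefficient to flag low-influence parameters
ranking = sorted(zip(poly.get_feature_names_out(names), np.abs(surface.coef_)),
                 key=lambda t: -t[1])
for term, weight in ranking:
    print(f"{term:20s} {weight:10.4g}")
```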

This response surface can reproduce the probabilistic flow of each Well or Completion, or of the entire Reservoir, as a function of the input parameters and input uncertainties (Static and Dynamic reservoir uncertainties such as Porosity, Permeability, Oil-Water Contact and Gas-Oil Contact; or Wellbore uncertainties such as Intelligent Completion (IC) sensor measurements and flow control valve settings). In risk analysis, it is imperative to apply a distribution to each uncertain parameter identified. When little or nothing is known about a parameter, the distribution is assumed uniform, giving equal probability between the minimum and maximum values. When a reasonable number of measurements defines the boundaries and an average parameter value is known, a triangular or normal distribution is assumed to represent the uncertainty. Certainty is then defined as a single point.

In summary, the experimental design process involves using the key uncertainty ranges in a detailed risk analysis exercise and transforming the resulting distribution of the objective function into a proxy model, which can then be used in a Monte Carlo optimization.
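
A minimal sketch of the resulting Monte Carlo step is shown below, assuming a response-surface proxy has already been fitted from the experimental-design runs; the hand-written proxy, parameter names, distributions and ranges are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Distributions reflect what is known about each uncertain parameter:
# uniform when little is known, triangular or normal when measurements
# bound the parameter (all values here are illustrative).
porosity = rng.uniform(0.10, 0.25, n)                # little known
perm_md  = rng.triangular(50.0, 180.0, 500.0, n)     # min / mode / max
owc_ft   = rng.normal(7275.0, 20.0, n)               # mean / std dev

def proxy(poro, perm, owc):
    """Stand-in response surface (linear + quadratic terms), in MMstb."""
    return (40.0 * poro + 0.012 * perm - 0.009 * (owc - 7200.0)
            - 55.0 * poro**2)

# Distribution of the Objective Function over all sampled uncertainties
outcomes = proxy(porosity, perm_md, owc_ft)
print("mean cumulative oil (MMstb):", outcomes.mean())
```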

Training the Proxy Model

A neural network is trained on the response surface to capture the stochastic character of the problem through a Monte Carlo simulation: at each step, the network learns a particular input-output relation. Trained over all the simulations, the neural network can predict the completion flow stochastically.
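
A sketch of what such training could look like is given below, using a small scikit-learn network fitted to Monte Carlo samples drawn through a stand-in response surface; the architecture, distributions and ranges are assumptions for illustration, not the vendor’s implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000

def response_surface(X):
    """Stand-in quadratic proxy; columns = porosity, perm (mD), OWC (ft)."""
    poro, perm, owc = X[:, 0], X[:, 1], X[:, 2]
    return 40.0 * poro + 0.012 * perm - 0.009 * (owc - 7200.0) - 55.0 * poro**2

# Monte Carlo samples over the uncertainty ranges form the training set
X = np.column_stack([rng.uniform(0.10, 0.25, n),
                     rng.triangular(50.0, 180.0, 500.0, n),
                     rng.normal(7275.0, 20.0, n)])
y = response_surface(X)

# Small feed-forward network; inputs are standardized before training
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32),
                                 max_iter=2000, random_state=0))
net.fit(X, y)

# The trained network now stands in for the simulator in later steps
print(net.predict(np.array([[0.18, 250.0, 7280.0]])))
```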

Predicting Performance

In predictive mode, the model is used with a full factorial optimization algorithm that allows stochastic model predictions by selecting samples from the distributions of the uncertain parameters and combining them to derive an envelope of possible outcomes of the Objective Function. An iterative optimization technique, the downhill simplex, is applied over all defined uncertainty parameters and furnishes optimum values for the Objective Function as well as its maximum and minimum values. Since this optimization method is exact, it can provide a good starting point for production optimization in a real-time problem.
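
SciPy’s Nelder-Mead routine implements a downhill simplex and can illustrate this step: the sketch below finds the maximum and minimum of a stand-in proxy over assumed parameter bounds. The proxy coefficients and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def proxy(x):
    """Stand-in proxy for the Objective Function (cumulative oil, MMstb)."""
    poro, perm, owc = x
    return 40.0 * poro + 0.012 * perm - 0.009 * (owc - 7200.0) - 55.0 * poro**2

bounds = [(0.10, 0.25), (50.0, 500.0), (7200.0, 7350.0)]
x0 = np.array([0.18, 250.0, 7275.0])          # mid-range starting point

# Maximize the Objective Function by minimizing its negative (downhill simplex)
best = minimize(lambda x: -proxy(x), x0, method="Nelder-Mead", bounds=bounds)
print("optimum parameters:", best.x)
print("maximum objective (MMstb):", -best.fun)

# The minimum case uses the same routine with the sign restored
worst = minimize(proxy, x0, method="Nelder-Mead", bounds=bounds)
print("minimum objective (MMstb):", worst.fun)
```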

The predicted performance of the field in a stochastic approach yields the Objective Function as a probabilistic distribution that represents the possible outcomes after all possible ranges of the uncertain variables have been taken into account. The P10, P50 and P90 scenarios from this distribution feed into the economic analysis to generate the low, mid and high cases for the ROI or another economic parameter. Furthermore, an understanding of the uncertainty factors that most impact the Objective Function guides the multidisciplinary team toward the key data to acquire in order to improve model predictability, thereby increasing the confidence factor in field development decisions and economic success.
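
Extracting the low, mid and high cases from the outcome distribution reduces to a percentile calculation, sketched below with stand-in samples; note that whether “P10” labels the low or the high case depends on the exceedance convention in use.

```python
import numpy as np

rng = np.random.default_rng(1)
outcomes = rng.normal(7.5, 1.2, 10_000)   # stand-in Objective Function samples (MMstb)

# Low / mid / high cases for the economic analysis
p_low, p_mid, p_high = np.percentile(outcomes, [10, 50, 90])
print(f"low case  (10th percentile): {p_low:.2f} MMstb")
print(f"mid case  (50th percentile): {p_mid:.2f} MMstb")
print(f"high case (90th percentile): {p_high:.2f} MMstb")
```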

Experimental design methods reduce computation time by generating a response surface from trials of the actual simulation model, deriving a proxy over which the optimizer traverses. Optimization algorithms, however, use a different technique in which the optimizer traverses the actual solution surface rather than a proxy. It should be noted that proxy models are, by definition, approximations only; however, they are extremely fast. There is, naturally, a trade-off between accuracy and speed of solution. In this regard, experimental design and optimization algorithms occupy different places in any rigorous risk analysis workflow and are complementary in nature.

Conclusion

The stochastic approach discussed above enables the multidisciplinary petrotechnical team to:

  • Determine, with greater certainty, the range of an asset’s value
  • Account for uncertainties in reservoir development economics
  • Rank the influence of uncertain parameters on a simulation model
  • Optimize data acquisition campaigns to focus on the key data with maximum impact on uncertainty
  • Optimize production schemes for probabilistic outcomes in a risk-prone environment
  • Increase the chances of economic success in an eventual field development plan
