www.spe.org/events/12fus2

Thirty Years of Innovative Thought and Accelerated Results

Novel Techniques for Reservoir Management

4–9 November 2012

Santa Fe, New Mexico, USA

Session I: The Present as the Key to the Future: Are We Limited to Our Current Tools?

Session Managers: Reza Fassihi and Bimal Parekh

Material balance, analytical, and decline-curve models are the tools of choice for many applications, including reserves booking, because of their supposed consistency, simplicity, and ease of use for certain applications. In addition, these tools complement numerical modeling efforts for purposes like reservoir management and field development planning. Other approaches, such as data mining, have also gained traction in creating models in data-rich environments, and probabilistic approaches have been accepted as the model of choice for certain applications. (A worked material-balance example appears after the question list below.) In this session, we will address what lies ahead in terms of the future applicability of currently popular tools. In particular, we will explore the following:

• Will operators and regulators continue to rely on decline curve analysis for reserves reporting?  Perhaps for primary production.  Decline curves are not otherwise applicable.  (A minimal decline-curve sketch follows this list.)

• Will analytical models continue to be the only reliable technology in the future?  Do you mean numerical models?  For conventional and unconventional post-primary applications, analytical models in general aren't reliable at all because they can't solve any real flow problems!  They are extremely limited in the questions they can answer, and they are unreliable for prediction and optimization of control variables except perhaps in the simplest of cases, which are of no general interest.  Numerical models are absolutely reliable where properly used by a qualified expert, as described in Brian Coats' comments in the Simtig discussion "The Role and Benefit of Simulation ..." and many others.  For questions that are properly asked and answered by modeling experts today, there is virtually no uncertainty in the answer.  Non-experts have many doubts and usually have great difficulty putting together valid models to answer any question.  Our numerical models are also applicable to tight unconventionals; see Brian Coats' description of a coupled fracturing/production numerical model in the referenced discussion.

• What is the trend in the architecture of future numerical models? Serial, and as simple, fast and efficient with as few unknowns as possible.

• At what point can numerical modeling be the tool of choice?  It has been for decades, depending on the question.  See comments by Brian Coats in the Simtig discussion  "The Role and Benefit of Simulation ..." for details of tool selection and proper use.

• Can we overcome the inherent data-intensive nature?  Only by minimizing the size of our upscaled models.

• Will we need new workflows to make them fit-for-purpose?  As described by Brian Coats in his Simtig (and LinkedIn) post "The Solution (beyond production optimization) is in the Workflow", it is simply automation of the four key, virtually identical optimization problems (geological modeling, upscaling, history matching, and predictive optimization) in a continuous workflow generating as many history matches and predictions as possible.  (A sketch of such a loop follows this list.)

• What would be the role of data mining in our future “reliable technology” toolkit?
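The material-balance example promised above: for a volumetric gas reservoir, the material balance reduces to the familiar p/z relation (standard reservoir engineering, stated here for reference rather than taken from the forum program):

\[ \frac{p}{z} = \frac{p_i}{z_i}\left(1 - \frac{G_p}{G}\right) \]

where p is average reservoir pressure, z the gas deviation factor, G_p cumulative gas produced, and G the original gas in place. Extrapolating a straight line of p/z versus G_p to p/z = 0 yields G. The simplicity is the appeal; the silence on well placement, rates, and heterogeneity is exactly the limitation discussed in the bullets above.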
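The decline-curve analysis referred to in the first bullet is, at its core, the Arps family of rate-time relations. Here is a minimal sketch; the function names and sample parameters are our own illustrative assumptions:

    # Minimal Arps decline-curve sketch (illustrative parameters only).
    # Hyperbolic: q(t) = qi / (1 + b*Di*t)**(1/b); exponential is the b -> 0 limit.
    import numpy as np

    def arps_rate(t, qi, Di, b):
        """Rate at time t (t in the same units as 1/Di)."""
        if b == 0.0:
            return qi * np.exp(-Di * t)
        return qi / (1.0 + b * Di * t) ** (1.0 / b)

    def arps_cum(t, qi, Di, b):
        """Cumulative production to time t (rate units times time units)."""
        if b == 0.0:
            return (qi / Di) * (1.0 - np.exp(-Di * t))
        if b == 1.0:  # harmonic
            return (qi / Di) * np.log(1.0 + Di * t)
        return qi / ((1.0 - b) * Di) * (1.0 - (1.0 + b * Di * t) ** (1.0 - 1.0 / b))

    # Example: qi = 1000 STB/d, Di = 0.8/yr, b = 0.5, 10 STB/d economic limit.
    t = np.linspace(0.0, 30.0, 301)            # years
    q = arps_rate(t, 1000.0, 0.8, 0.5)
    t_limit = t[q >= 10.0][-1]                 # last time above the economic rate
    eur_stb = arps_cum(t_limit, 1000.0, 0.8, 0.5) * 365.0   # STB/d * yr -> STB
    print(f"time to limit: {t_limit:.1f} yr, EUR: {eur_stb / 1000.0:.0f} MSTB")

Note that nothing in these relations knows about interference, changing boundary conditions, or operating changes, which is why their applicability is limited to primary production as stated above.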
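And this is the shape of the continuous workflow referred to in the fit-for-purpose bullet above: four stages posed as one repeated optimization problem. All function names and the stub objectives below are hypothetical placeholders for real geostatistics, upscaling, and simulation codes, shown only to illustrate the loop structure:

    # Hypothetical sketch of the continuous workflow idea: four stages, each
    # posed as the same optimization problem (minimize a misfit over parameters),
    # repeated as new data arrive. All names and stub objectives are illustrative.
    from scipy.optimize import minimize
    import numpy as np

    def make_stage(misfit):
        """Each stage is the same problem: find parameters minimizing a misfit."""
        def run(x0, *args):
            return minimize(misfit, x0, args=args, method="Nelder-Mead").x
        return run

    # Stub objectives standing in for real geomodeling/upscaling/simulation codes.
    geo_fit           = make_stage(lambda p, d: np.sum((p - d) ** 2))        # static-data misfit
    upscale           = make_stage(lambda p, g: np.sum((p - g.mean()) ** 2)) # coarse vs fine response
    hist_match        = make_stage(lambda p, u, d: np.sum((p - u - d) ** 2)) # dynamic-data misfit
    optimize_controls = make_stage(lambda p, h: np.sum((p - 2.0 * h) ** 2))  # toy negative-NPV proxy

    # One cycle of the continuous loop; in practice this repeats as data arrive.
    static_data, dynamic_data = np.array([1.0, 2.0]), np.array([0.5, 0.5])
    g = geo_fit(np.zeros(2), static_data)            # 1. geological modeling
    u = upscale(np.zeros(2), g)                      # 2. upscaling
    h = hist_match(np.zeros(2), u, dynamic_data)     # 3. history matching
    c = optimize_controls(np.zeros(2), h)            # 4. predictive optimization
    print("optimized controls:", c)

The point of the sketch is that the four problems have identical structure, which is what makes their automation in a single continuous loop practical.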

Session II: Review of Current Tools—A Look into our Tool Chest

Session Managers: Basak Kurtoglu and Dilhan Ilk

Increased ability to produce from complex conventional reservoirs and tight/ultra-tight unconventional resources poses significant challenges to the analysis, modeling and forecasting of well/reservoir behavior. Conventional analytical methods still form the backbone of our core tools.  Although these methods are straightforward and easy to use, their limitations are obvious with increasing complexity in well geometry and reservoir description. The primary objective of this session is to facilitate discussion centered on the possible shortcomings of the present techniques and to lay the groundwork for achieving best practices to analyze and model well/reservoir behavior in the future. Further, the sufficiency of available data and data quality in the application of current techniques will be discussed to deliver a general understanding of the critical data needs for future methodologies.

"What is needed here is for some fine company to provide their single well shale simulation model with fine-scale frac and (partial) production bhp/rate history data for demonstration of existing solutions" predicting the rest of history, one of which is Sensor and is described in Brian Coats' Simtig post.

Session III: Tools for Modeling Primary Recovery

Session Managers: Mohamed Soliman and Chih Chen

Various analytical models have been developed to study conventional reservoirs successfully over the years. With some modifications, these analytical models have been extended to cover some of the nonlinear behaviors in unconventional reservoirs, such as gas flow, non-Darcy effects, and reservoir compaction.
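For reference, the standard devices behind those extensions (stated from general reservoir engineering, not from the forum program) are the real-gas pseudopressure, which linearizes the gas-flow equation,

\[ m(p) = 2 \int_{p_0}^{p} \frac{p'}{\mu(p')\, z(p')} \, dp' \]

and the Forchheimer correction for non-Darcy flow, which adds an inertial term to Darcy's law:

\[ -\nabla p = \frac{\mu}{k}\, v + \beta \rho\, |v|\, v \]

where m(p) is pseudopressure, \mu viscosity, z the gas deviation factor, k permeability, v velocity, \rho density, and \beta the non-Darcy coefficient.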

Several issues merit serious attention:

• Can all the modeling tools for conventional reservoirs be readily adapted to unconventional and hydrate reservoirs? Numerical models can.

• Should future analytical tools be developed to account for the leading factors, such as diffusion physics, in unconventional reservoirs?  Again, there is no such thing as an analytical tool that can rigorously solve or optimize any real flow problem.  You must mean numerical models?  Our current models include diffusion where it is or can be significant.

• To what extent are various decline-curve analyses valid?  In primary production, where there are no possible well interference effects and no changes in boundary conditions.

• How can geomechanical changes in the formations affect primary recovery, and how can they be modeled with analytic models?  They can't, or at least nobody has been able to demonstrate that they can, and we don't think anyone ever will.  Coupled representation of geomechanics and flow is required in a numerical model, such as the one described in Brian Coats' Simtig post.  (A sketch of such a coupling loop follows this list.)

• Is it possible to develop new modeling tools with novel ideas of data collection in wells to improve modeling of reservoir behavior?  Of course it's possible!  We will see when someone develops and substantiates them.  All that is needed to determine that is a reproducible model problem and an improved solution with which others can compare or attempt theirs, as we have stated publicly many, many times.  This is not difficult.  When claimants do not willingly provide such evidence, generally the improvement or need does not exist.

This session will address all these issues to explore the novel techniques in developing future tools for modeling primary recovery in various types of unconventional systems.
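A minimal sketch of the kind of coupling meant above: a sequential flow/geomechanics iteration on a single cell. The compressibility and stress relations here are toy stand-ins, purely to show the loop structure; this is our illustration, not the model referenced in the post:

    # Toy sequential (iteratively coupled) flow/geomechanics step on one cell.
    # Flow updates pressure at fixed porosity; geomechanics updates porosity
    # from the new pressure; iterate until the pair is self-consistent.
    def flow_step(p0, phi, q, dt, ct=1e-5, V=1.0):
        # Single-cell material balance (toy): V*phi*ct*(p - p0)/dt = -q
        return p0 - q * dt / (V * phi * ct)

    def geomech_update(phi0, p, p0, c_phi=5e-6):
        # Porosity responds linearly to pressure change (toy rock model).
        return phi0 * (1.0 + c_phi * (p - p0))

    p0, phi0 = 5000.0, 0.10          # initial pressure (psi) and porosity
    p, phi = p0, phi0
    for it in range(50):             # fixed-point iteration within one timestep
        p_new = flow_step(p0, phi, q=1e-3, dt=1.0)
        phi_new = geomech_update(phi0, p_new, p0)
        converged = abs(phi_new - phi) < 1e-12
        p, phi = p_new, phi_new
        if converged:
            break
    print(f"converged in {it + 1} iterations: p = {p:.1f} psi, phi = {phi:.6f}")

A real coupled model solves full flow and stress equations in place of these stubs, but the sequential-iteration structure is the same.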

Session IV: Emerging Tools for Modeling Secondary Recovery

Session Managers: Harun Ates and Sheldon Gorell

The emphasis of this session will be on novel solutions and emerging tools to analyze the performance of reservoirs under waterflood, with a focus on exploring the following items:

• What are some of these new solutions?

• What are the premises and roles of these emerging tools in reservoir modeling?  We'll see when they are properly substantiated.

• Can they predict performance at well, pattern, and asset level?  Same as above.

• Will they address inherent issues such as inaccuracies in measurements, uncertainties in data and other operating variables to make reliable predictions?  Same as above.

• How will they impact the way we manage waterflooding?  Same as above.

• Could they even predict events and enable proactive flood management?  Same as above.

• How can we validate the solutions from the emerging tools?  Those making claims must substantiate them by providing the simplest possible reproducible example problem that demonstrates the claimed improvement.  Unfortunately, our publishers no longer require such evidence, or even the willingness to provide it, which is what leads to this question.

• By data-driven methods, such as matching of field results?  Same as above.

• By reconciling with the traditional methods, such as grid-based flow simulations?  Same as above.

Session V: Developing Efficient and Reliable Tools for Modeling Tertiary Recovery

Session Managers: Dave Merchant and Reza Fassihi

Tertiary recovery processes may encompass CO2 injection, polymers, surfactants, and other technologies. For the past 40 years, CO2 injection has been the most utilized tertiary technology for enhanced oil recovery (EOR). It has evolved from a partially understood process to one based on proven technology and experience. In the 21st century, CO2 from anthropogenic sources may enable global expansion of this technology into basins that contain oil fields with EOR potential but lack a CO2 source to make the tertiary recovery process economically attractive. This session will discuss the capability for new, fast tools to predict and manage the complex physics of tertiary recovery.

• How do we speed history matching for mature assets with decades of production history of dubious data quality and a large number of wells?  See the solution given by Brian involving automation of the four key, virtually identical optimization problems in "The Role and Benefit of Simulation ..."

• What solution models can represent the complex physics, such as capacitance-resistance, streamline, and surrogate models?  What is "capacitance resistance"?  Our numerical models of course model both capacitance and resistance.  Streamline and surrogate models are both EXTREME SIMPLIFICATIONS OR APPROXIMATIONS of complex physics.  (The usual capacitance-resistance model equation is given after this list.)

• Can responses be managed with artificial intelligence relationships? We'll see when someone substantiates that ability.  We sincerely hope that someone claiming to do so can provide some simple demonstration that others can test, but this is supposed to be, or at least was, what publications are for.
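For readers unfamiliar with the term questioned above: "capacitance-resistance models" (CRMs) are reduced-order input-output models that treat injection and production as a signal-matching problem. A commonly cited single-tank form, stated here from the CRM literature rather than from the forum program, is

\[ q(t_k) = q(t_{k-1})\, e^{-\Delta t_k/\tau} + \left(1 - e^{-\Delta t_k/\tau}\right)\left( \sum_i f_i\, I_i(t_k) - \tau\, J\, \frac{\Delta p_{wf}}{\Delta t_k} \right) \]

where \tau is a drainage-volume time constant (the "capacitance"), J a productivity index (the "resistance"), f_i injector-producer connectivities, and I_i injection rates. It is exactly the kind of extreme simplification of the flow equations referred to above: a handful of fitted constants in place of a gridded description of the reservoir.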

Session VI: Unconventional Tools for Unconventional Reservoirs

Session Managers: Li Fan and Jackson Bi

Unconventional reservoir development has ushered in new challenges in predicting oil and gas recovery. Conventional pressure buildup data are unavailable, and the geometries and conductivities of multiple, complex hydraulic fractures are not predicted accurately enough for performance predictions. In shale reservoirs, the complex physics of gas desorption and of oil flow from matrix into fractures is not understood to the extent that it can be replicated by current numerical models. The session will explore the current use of pragmatic modeling tools for unconventional reservoir exploration and development to establish production drivers and well performance measures such as initial production rate (IP), decline rate, and estimated ultimate recovery (EUR). The session will also examine challenges facing the industry today:

The answers here are the same as those in the other session on unconventionals:

"What is needed here is for some fine company to provide their single-well shale simulation model with fine-scale frac description and partial production BHP/rate history data for demonstration" of existing solutions predicting the rest of the history, one of which is described in Brian Coats' Simtig post.

 

• Predicting well performance from complex stimulations (complex fractures in complex formations)

• Whether “quick look” tools can model nanoDarcy and naturally fractured formations

• Integrating data gathered during stimulation and flowback monitoring into models

Session VII: Future for Surrogate Reservoir Modeling

Session Managers: Eduardo Gildin and Benoit Couet

New technologies that rapidly and accurately simulate various and more sophisticated recovery processes are needed in our industry. Artificial intelligence, data mining, proxy and model reduction methods are being used to overcome some of our challenges. However, many questions still remain in developing surrogate models and data mining techniques. Indeed, the lack of historical applications using these techniques prevents us from determining their efficiency. In this session, we will discuss the path to the future applications of surrogate and data mining techniques and some of the daunting open questions:

New technologies can only be said to be needed if they can be shown to have value.  If that's the case for any of those technologies, then that's wonderful and we'll use them, but the greatest needs by far are workflow capacity improvements, regardless of the type of model used.  See the future modeling workflow (which is completely independent of the type of model used) described in one of Brian Coats' posts in the Simtig discussion "The Role and Benefit of Simulation ...".
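For concreteness, this is what the simplest class of surrogate (proxy) model looks like: a response surface fitted to a handful of full-simulation runs and then interrogated in place of the simulator. Everything here, from the quadratic form to the toy "simulator", is a generic illustration rather than any specific published method:

    # Generic proxy-model sketch: fit a quadratic response surface to a few
    # expensive simulator runs, then evaluate the cheap surrogate instead.
    import numpy as np

    def simulator(x):
        """Stand-in for an expensive reservoir simulation: x -> recovery factor."""
        return 0.35 + 0.1 * x[0] - 0.05 * x[1] - 0.08 * x[0] ** 2 + 0.03 * x[0] * x[1]

    def features(x):
        """Quadratic basis: [1, x1, x2, x1^2, x2^2, x1*x2]."""
        return np.array([1.0, x[0], x[1], x[0] ** 2, x[1] ** 2, x[0] * x[1]])

    # Design: a small grid of training runs (in practice, a space-filling design).
    X = np.array([[a, b] for a in (-1.0, 0.0, 1.0) for b in (-1.0, 0.0, 1.0)])
    y = np.array([simulator(x) for x in X])

    # Least-squares fit of the surface, then a cheap prediction at a new point.
    A = np.array([features(x) for x in X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    x_new = np.array([0.3, -0.7])
    print("proxy:", features(x_new) @ coef, " simulator:", simulator(x_new))

The open questions below amount to asking whether such fitted surfaces can be trusted outside the range of the training runs, which is exactly the substantiation issue raised throughout this page.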

We can avoid these questions entirely by simply requiring substantiation of claims of improvement in published work.  The lack of that requirement has led to a flood of unsubstantiated published claims.  The consequences are escalating even faster than the U.S. national debt.  The inclusion of, or at least the willingness to provide, the simplest possible reproducible example problem for which an improved solution can be given should be an absolute requirement for publication, and is an absolute requirement of validation.  By choosing not to require it, our publishers have made our literature almost totally unreliable and have actually made us incompetent when unsubstantiated claims are believed and acted upon.  No claims of improvement or superiority should ever be believed without such evidence.  Doing so leads to an incredible waste of time, money, research, and talk.

• Are rapid solutions based on artificial intelligence, data mining, proxy, and model reduction techniques viable and more desirable than grid-based modeling? We'll see when someone substantiates those abilities.

• How will surrogate models address our need to handle a large number of wells and complex well gathering systems?  We don't think they can.  Maybe someone will be able to substantiate them on the simplest example?

• Can we vet the results with high-frequency real-time data to gain confidence?  Same as above.

• Can this approach be combined with other analytic tools?  Same as above.

Session VIII: Modeling in the Future—Integration and Hybridization

Session Managers: Sanjay Srinivasan and Scott Meddaugh

At present, there are a variety of reservoir modeling approaches, workflows and tools. Many are specific to the type of study performed or to the recovery mechanism. Some are even currently specific to a particular data type or reservoir. This session will focus on the future of hybrid techniques, specifically for tertiary recovery applications and for integrated reservoir modeling. Topics to be addressed in this session include:

• What differences exist between modeling “green field” reservoirs with limited though generally high quality data, and “brown field” reservoirs with abundant data of varying quality? The level of uncertainty.  Proper use of reservoir models as described by us is not otherwise affected.

• How can real-time reservoir data be effectively incorporated within a fully integrated reservoir modeling/forecasting environment to facilitate efficient “real time” decision-making?  See the solution given by Brian.

• Will sufficient integration across all levels and disciplines involved in reservoir modeling as it is known today enable “management by exception” in the future?  What does that mean?

Session IX: Path to Adoption

Session Managers: Shah Kabir and Stan Cullick

This session synthesizes nuggets from the preceding sessions. In particular, we will explore how reservoir modeling for hydrocarbon exploitation can be used more efficiently in the future than it is practiced today. We will review the obstacles and challenges and, above all, explore ways to make a business case for the use of fit-for-purpose reservoir modeling tools in assets of various economic environments.

 


© 2000 - 2022 Coats Engineering, Inc.