Prospect Generation
Prospect generation in petroleum geology is the process of identifying and evaluating potential areas for oil and gas exploration and development. This process involves a combination of geological and geophysical data analysis, as well as engineering and economic assessments to determine the potential for hydrocarbon accumulation and the economic viability of developing these resources.
Frontier vs. Mature Basins
There are two general exploration situations that influence prospect generation. The first involves frontier, or less mature, basins, where the presence of hydrocarbons has not been established. Here, our main concern is whether hydrocarbons have matured and been expelled from a source rock. The second involves more mature basins where we know a hydrocarbon source exists. Here, our main concerns include determining migration pathways, the types and occurrences of reservoirs, and the effectiveness of seals. In either situation, we must consider the impact of the types, quantity, and quality of data available for answering these questions.
Expert systems are designed to mimic, quantify and analyze the objective and subjective decision-making processes of humans. Analyzing decision-making processes allows us to determine, quantitatively and qualitatively, how explorationists generate prospects. Analyzing how explorationists evaluate prospects enables us to predict the potential impact of basin models on decisions about whether to drill a given exploration concept. Newly developed quantitative stratigraphic models illustrate how strongly such models can affect the outcome of the decision-making process.
Estimating geologic processes, configurations and lithologic distributions is essential in prospect generation, and predicting lithology is perhaps the weakest link. Predicting lithology is especially difficult in many structural plays and in stratigraphic plays that lack an associated seismic anomaly. Stratigraphic and basin-fill models will significantly affect prospect generation because they offer more accurate technologies for predicting subsurface lithologies.
Impact of Information
Before generating a prospect, explorationists commonly believe that the prospect exists. This prior belief may be based on drilling history, statistical analysis or analog reasoning. Explorationists modify their prior belief based on the amount and quality of available data relating to the prospect. They may further modify their belief as new data become available. For example, a new seismic line may give favorable, but not definitive, information about the existence of an anticline (Figure 1, Modification of a prior belief after new information is introduced. In Bayes Theorem, the final odds that the prospect exists given the seismic information equal the likelihood ratio of that seismic evidence multiplied by the prior odds that the prospect exists.). We express probabilities as subjective degrees of belief. At any given time, there is a probability that an event will occur. As new evidence is introduced, we perceive the original probability to increase, decrease, or remain the same.
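In the standard odds form of Bayes Theorem (written here with generic symbols of our own, not the figure's notation), with H the hypothesis that the prospect exists and E the new seismic evidence:

O(H | E) = [ P(E | H) / P(E | not H) ] × O(H)

The bracketed term is the likelihood ratio of the evidence: values greater than one raise the odds that the prospect exists, values less than one lower them, and a value of one leaves the prior odds unchanged.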
The extent to which we modify our prior belief in the existence of a particular prospect depends on the predictiveness and the favorableness of evidence for the prospect. Predictiveness relates to the direction and magnitude of the change in belief that occurs when new evidence is introduced; it is an abstract property of a type of evidence in general, assuming the evidence certainly exists or is optimally developed. Favorableness depends on predictiveness and relates to the degree of certainty that the evidence actually exists, or how well it is developed, in a specific situation.
The rule-based expert system GEORISK™, developed at the Colorado School of Mines, uses Bayes Theorem to quantify the predictiveness and favorableness of geological, geophysical and geochemical information for prospect evaluation (Lessenger, 1988a, 1988b). The system was built by quantifying the decision-making processes of three exploration managers. Rules are in the form of hypothesis updates that have premise, action and certainty factors:
Premise IF Evidence
Action THEN Hypotheses (Certainty Factors)
Within GEORISK™, hypotheses are predictions of causal processes and configurations, or assessments of information that we can use to infer causal processes and configurations. Evidence can be observed data or inferences from data. Certainty factors quantify the predictiveness of positive and negative evidence for the hypotheses, and can range from -100 to +100. For example, consider the following GEORISK™ rule:
Premise IF Thermal history is (is not) well-constrained
Action THEN Thermal model is accurate +90 (-90)
A prior belief that the thermal model is accurate is modified with a certainty factor of +90 if the thermal history is perfectly constrained and -90 if it is not constrained at all. We assign intermediate values between -90 and +90 according to the degree to which we consider the thermal history constrained.
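As a minimal numerical sketch of how such a rule could be applied (the linear interpolation below is our assumption for illustration, not GEORISK™'s internal update scheme):

# Map the degree to which the thermal history is constrained (0 = not at all
# established, 1 = perfectly constrained) onto the rule's certainty-factor range.
# The linear mapping is an illustrative assumption only.
def rule_certainty(degree_constrained, cf_true=90, cf_false=-90):
    return cf_false + degree_constrained * (cf_true - cf_false)

print(rule_certainty(1.0))    # +90: thermal history perfectly constrained
print(rule_certainty(0.0))    # -90: thermal history not at all established
print(rule_certainty(0.75))   # +45: partially constrained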
According to Bayesian analysis, the greater the predictiveness of evidence, the more the prior belief will be modified to produce a new belief or outcome (Tversky and Kahneman, 1982). The magnitudes of certainty factors control how much a prior belief is modified. We should therefore look at the spread in certainty factors for favorable and unfavorable evidence to gauge the perceived predictiveness of evidence.
The impact of favorable evidence is termed the sufficiency, and the impact of unfavorable evidence is termed the necessity, of that evidence for a hypothesis (Figure 2, The necessity and sufficiency of evidence for a hypothesis. The spread in the necessity and sufficiency of evidence controls the spread in potential certainty values for the hypothesis. This is a measure of the predictiveness of evidence for a hypothesis.). To prove a hypothesis, both the sufficiency and the necessity must be high enough to increase confidence that the hypothesis is true. For example, the predictiveness of thermal history constraints for an accurate thermal model is qualitatively high (Δ=180); thermal history constraints are both highly necessary and highly sufficient to ensure thermal model accuracy. If the sufficiency and necessity have magnitudes approaching zero, our belief in the hypothesis is modified very little by either positive or negative evidence, and we cannot prove the hypothesis.
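The spread is simply the sufficiency of the favorable evidence minus the necessity of the unfavorable evidence. A minimal sketch, using certainty factors quoted in this section (the thermal values above, and the structural and stratigraphic values discussed later):

# Spread (delta) = sufficiency of favorable evidence minus necessity of
# unfavorable evidence, using certainty factors quoted in this section.
spreads = {
    "thermal history constraints": (+90, -90),               # (sufficiency, necessity)
    "structural information": (+50, -50),
    "stratigraphic (lithologic) information": (+20, -50),
}
for evidence, (sufficiency, necessity) in spreads.items():
    print(evidence, "delta =", sufficiency - necessity)      # 180, 100, 70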
Many sources of information are available for prospect generation, including seismic data, well logs, maturation and geodynamic models, core analyses, facies models, analog field studies, etc. There are three types of information: 1) observational, 2) inferential and 3) quantitative. Observational information requires only objective measurement; interpretation and inference are minimal (e.g., TOC values). Inferential information requires a conceptual interpretation of data (e.g., depositional environment). Quantitative information requires manipulation of data based on descriptions of the data (e.g., an isopach map) or on a quantitative model (e.g., a McKenzie stretching model) rather than a conceptual model (e.g., a facies model). Seismic data supply explorationists with all three types of information: delineating seismic faults is observational; interpreting seismic facies based on reflectors within a depositional sequence is inferential; and converting seismic times to depth based on a velocity model is quantitative.
Different types of information are perceived as having different degrees of predictiveness, even for the same source of information or for predicting the same causal process. Information perceived as highly predictive has a greater impact on modifying the prior belief than does information perceived as less predictive. Quantitative methodologies are perceived as most predictive, inferential methodologies as least predictive, and observational methodologies as intermediate in predictiveness. Predictiveness also increases when multiple types of information mutually predict the same causal process or favorable condition.
For example, to predict a favorable structural configuration, quantitative depth conversion (Δ=120) is more predictive than observational seismic data quality (Δ=110) (Figure 3, Perceived relative predictiveness of different types of information used to assess structural configuration.).
Both are more predictive than a structural cross section (Δ=80), an interpretational inference from well logs, even though well logs imply drilling, and therefore larger amounts of, and potentially more reliable, information near the prospect. Intuitively, we might expect well logs to provide more direct, and thus more reliable, information for predicting the presence and magnitude of faulting than seismic data. But because well logs lack lateral information, and lateral changes in structural configuration can be dramatic, we actually perceive seismic data as more reliable for predicting faulting than well logs. The reliability of information depends on how we use it. Consequently, indirect information may be considered more reliable than direct information, contrary to intuition.
Maturation and source potential analyses have large spreads in necessity and sufficiency (Δ=180), and may be perceived as very predictive of the presence of hydrocarbons on a basin-wide scale (Figure 4, Perceived relative power and predictiveness of different types of information, representing three major elements used to assess a prospect.).
These analyses are highly quantitative, involving, for example, thermal maturation modeling and geohistory estimates. The evidence and methodologies assessed are greater in both number and variety than those available for other analyses (e.g., estimating the distribution of reservoir lithology).
Structural analyses depend primarily on the observation of data, particularly seismic data. There is a moderate spread in the necessity and sufficiency (Δ=100) of structural information for predicting the presence of a hydrocarbon trap. The number and variety of available analyses are smaller than for maturation studies. While structural and stratigraphic analyses are equally necessary for predicting the existence of a prospect (-50), structural analyses are far more sufficient than stratigraphic analyses (+50 compared with +20).
Methods for predicting lithology, in contrast, rely on inferential data types, unless there is a seismic anomaly associated with reservoir development. The presence of lithologic conditions for predicting a hydrocarbon reservoir and trap has a relatively low necessity and sufficiency spread (Δ=70).
The methodologies we use to predict lithology depend on specialized conditions. If we have a nearby well (i.e., within one mile) that indicates the presence of the target lithology, predictiveness increases. If there is a seismic anomaly, predictiveness increases further, and if additional sources of information, such as well ties to seismic data and models, are available, predictiveness is maximized. As with structural analyses, there are different degrees of direct information for predicting lithology: core data are most direct, well logs less direct and seismic anomalies least direct. But seismic data contain lateral information and are therefore more predictive of lithology than either core or well logs, even though both are more direct measurements of lithology. As with structural analyses, direct measurements are not necessarily the most predictive.
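The way successive, mutually supporting evidence raises our belief in a lithology prediction can be sketched as follows. The certainty values are hypothetical, and the combination rule is the classic MYCIN-style formula for positive certainty factors, used here only as an illustration rather than as GEORISK™'s actual scheme:

# Hypothetical certainties (0-1 scale) for successive evidence types; the
# values are invented for illustration.
evidence_cfs = {
    "nearby well shows target lithology": 0.4,
    "associated seismic anomaly": 0.5,
    "well ties to seismic data and models": 0.3,
}

belief = 0.0    # neutral prior belief in the presence of the target lithology
for name, cf in evidence_cfs.items():
    belief = belief + cf * (1.0 - belief)    # MYCIN-style combination of positive evidence
    print(f"after {name}: belief = {belief:.2f}")
# Each additional, mutually supporting evidence type raises the combined belief,
# with diminishing returns as belief approaches 1.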
We often do not have a nearby well or seismic anomaly. Our lithologic analyses are reduced to evaluations of regional or subregional trends that, when combined with the low certainty levels in trend data, can substantially reduce the effective necessity and sufficiency of predicting lithologic conditions. Without seismic anomalies and nearby wells, we have only very weak methods for predicting the lateral distribution of specific lithologies from well and core information. Currently, these weak methods are derived from concepts or empirical constructions, such as facies models, that are non-quantitative, weakly predictive, and that provide no estimate of the accuracy or confidence level of their prediction.
At present, there is a large imbalance in the amounts and types of information available for predicting the necessary conditions for a hydrocarbon accumulation. Seismic information is widely available and is highly predictive of structural configurations. Consequently, predicting structural traps has a greater certainty than predicting stratigraphic traps or reservoir facies distribution. A variety of information sources exist for predicting source, maturation and migration. Although source-maturation-migration technologies are not completely predictive of hydrocarbon availability, their use has greatly increased predictiveness.
The amounts, types and perceived qualities of information available for predicting the distribution of specific lithologies are limited, and our ability to predict lithology is low. In the common situation where we have neither a stratigraphic seismic anomaly nor nearby wells, predicting the distribution of lithologies is the weakest link in prospect generation. The situation is somewhat analogous to the prediction of hydrocarbon availability before source-maturation-migration technologies were introduced and we relied primarily on play-specific, empirical, non-quantitative models. Without seismic anomalies, predicting lithology currently relies on empirical trends and conceptual facies models.
Impact of Stratigraphic Models
Stratigraphic and basin-fill models will significantly affect prospect generation decisions because they simulate what is perceived as reliable information and because of their potential for increasing our ability to predict lithology.
Three conditions affect the reliability of information for prospect generation. First, multiple information types that support the same predictions are more reliable than a single information type, even if that single type is a strong one. Due to the sparsity of information types currently used in lithology prediction, the future introduction of new quantitative methods, such as those developed through the use of 3-D seismic data, will have a significant impact. In addition to adding a quantitative information type, other analyses necessary for modeling — such as correlation theory, sequence analysis, volumetric facies partitioning, stratigraphic simulation inversion, geohistory and geodynamics — will also add to the types and sources of information available for constraining predictions of lithology.
Second, reliable information must be consistent with other types of information. Stratigraphic models require coherent information from many different types and sources because they incorporate many different kinds and scales of controlling processes. For example, a simulation model must be consistent with seismic data and well logs. Separating eustatic from subsidence variations requires knowledge of synchronous stratigraphic sequences in different geographic locations. We can constrain subsidence variations with modeling of regional geodynamic processes and plate tectonic configurations. Estimating sediment supply and dispersal may require knowledge of gross climatic conditions, basin configurations and sedimentary processes. Because we run basin-fill models in time and not depth, methods for extracting and constraining temporal relationships, such as biostratigraphy, age-dating and physical sequence stratigraphy, are critical. These separate evaluations must all be coherent and geologically reasonable. Coherent, multiple information types increase our confidence in the model results and positively modify our prior belief.
Third, we perceive quantitative models as more predictive than conceptual models because they allow us to quantitatively relate processes to responses and to state assumptions and initial and boundary conditions explicitly, which lets us establish degrees of confidence in predictions. Quantitative models are necessary for understanding the sensitivity of the stratigraphic response to changes in processes. Geologic assumptions and biases are explicit in quantitative models. By building, testing and using quantitative models, we commonly uncover unreasonable assumptions that were previously not obvious. Sensitivity analyses and explicit statements of assumptions and biases allow us to estimate degrees of non-uniqueness and confidence in model predictions. In contrast, non-quantitative models require only conceptual inferences of responses. Their assumptions and boundary and initial conditions are not explicitly stated. Therefore, we cannot reliably estimate degrees of non-uniqueness, nor our confidence in model predictions.
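A sketch of what a sensitivity analysis around a quantitative model looks like; the forward model, parameters and response below are invented stand-ins, not any published stratigraphic simulator:

# Toy one-at-a-time sensitivity analysis. forward_model() is a stand-in for a
# quantitative stratigraphic simulator; its parameters and response are invented.
def forward_model(subsidence_rate, sediment_supply, eustatic_amplitude):
    # Returns a single proxy response, e.g. a notional sand thickness.
    return 100.0 * sediment_supply / (subsidence_rate + eustatic_amplitude)

base = {"subsidence_rate": 0.5, "sediment_supply": 1.0, "eustatic_amplitude": 0.2}
base_response = forward_model(**base)

for name, value in base.items():
    perturbed = dict(base, **{name: value * 1.10})    # perturb one process by +10%
    change = forward_model(**perturbed) - base_response
    print(f"{name}: response change = {change:+.1f}")

The size of each response change indicates which controlling processes the prediction is most sensitive to, and therefore where explicit assumptions and constraints matter most.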
We use only two sources, and two types, of information in predicting lithology. One source, well logs, is inferential, so we perceive it as having low predictiveness for estimating the distributions of specific lithologies. The other source, seismic data, is primarily inferential, except when we use quantitative seismic models to increase our confidence in the geologic significance of waveform anomalies. If, within a particular play, seismic anomalies prove reliable, we perceive them as highly predictive. Unfortunately, seismic anomalies suitable for predicting lithology are rare. Incorporating stratigraphic models will increase the perceived value of lithology predictions and our ability to predict lithologies (i.e., to alter a prior belief), at least to a level similar to that of predictions of structure and hydrocarbon presence.
Because we may perceive stratigraphic models as highly predictive, we may also expect highly accurate predictions. Stratigraphic models are in their infancy, however, and are only now being built and tested. We need to calibrate models against stratigraphic data to determine the conditions under which they are applicable, and to estimate the confidence limits on predictions, given the approximations and measurement errors inherent in the models and the sparse sampling of stratigraphic data. At this stage of model development, we need an iterative process of model building, testing and rebuilding to gain confidence in model predictions.