# Decision Support

Suppose that we have obtained a subset of measurement outcomes **m**^0, yielding a distribution P(**v** | **m**^0). One may subsequently ask the question: *which tool t should be deployed next in order to gain as much information as possible?*

When asking this question, one is often interested in a specific subset of minerals and fluids. Here we assume this interest is in one specific component *u*. The question then reduces to selecting the most informative tool(s) *t* for a given mineral *u*.

We define the informativeness of a tool as the expected decrease of uncertainty in the distribution of v_u after obtaining a measurement with that tool. Usually, entropy is taken as a measure of uncertainty [22], so a measure of informativeness is the expected entropy of the distribution of v_u after measurement with tool t,

$$
\langle H \rangle_{t,u} = -\int \mathrm{d}m_t \, P(m_t \mid \mathbf{m}^0) \int \mathrm{d}v_u \, P(v_u \mid m_t, \mathbf{m}^0) \log P(v_u \mid m_t, \mathbf{m}^0). \tag{34}
$$

Note that the informativeness of a tool depends on the earlier measurement results, since the probabilities in (34) are conditioned on **m**^0.

The most informative tool for mineral *u* is now identified as the tool t* which yields, in expectation, the lowest entropy in the posterior distribution of v_u:

$$
t^{*} = \arg\min_{t} \, \langle H \rangle_{t,u}.
$$

In order to compute the expected conditional entropy using HMC sampling methods, we first rewrite the expected conditional entropy (34) in terms of quantities that are conditioned only on the measurement outcomes **m**^0. Using the identity P(v_u | m_t, **m**^0) P(m_t | **m**^0) = P(v_u, m_t | **m**^0), we obtain

$$
\langle H \rangle_{t,u} = -\int \mathrm{d}m_t \int \mathrm{d}v_u \, P(v_u, m_t \mid \mathbf{m}^0) \log \frac{P(v_u, m_t \mid \mathbf{m}^0)}{P(m_t \mid \mathbf{m}^0)}. \tag{35}
$$

Now the HMC run yields a set V = {**v**^1, **v**^2, ..., **v**^J} of compositional samples (conditioned on **m**^0). We augment these by a set M = {**m**^1, ..., **m**^J} of synthetic tool values, with components m_z^j = f_z(**v**^j) + ξ_z, generated from these samples (which are indexed by j) by applying equation (31). Subsequently, discretized joint probabilities P(v_u, m_t | **m**^0) are obtained via a two-dimensional binning procedure over v_u and m_t for each of the potential tools t. The binned versions of P(v_u, m_t | **m**^0) (and P(m_t | **m**^0)) can be directly used to approximate the expected conditional entropy using a discretized version of equation (35).

The outcome of our implementation of the decision support tool is a ranking of tools according to the expected entropies of their posterior distributions. In this way, the user can select a tool based on a trade-off between expected information and other factors, such as deployment costs and feasibility.
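The ranking step itself is then a simple sort of the per-tool entropy estimates. In this minimal sketch, the tool names and entropy values are invented purely for illustration:

```python
# Hypothetical per-tool expected conditional entropies (illustrative values only).
entropies = {"density": 1.32, "sonic": 1.78, "neutron": 1.05}

# Rank tools from most to least informative (lowest expected entropy first);
# the final choice is left to the user, who may also weigh cost and feasibility.
ranking = sorted(entropies, key=entropies.get)
print(ranking)  # → ['neutron', 'density', 'sonic']
```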

# The Application

The application is implemented in C++ as a stand-alone program with a graphical user interface, running on a Windows PC. The application has been validated by petrophysical domain experts from Shell E&P. The further use of this application by Shell is beyond the scope of this chapter.

# Summary

This chapter described a Bayesian network application for petrophysical decision support. The observation models are based on the physics of the measurement tools. Because the physical variables in this application are continuous-valued, a naive Bayesian network approach with discretized variables would fail; we therefore remained in the continuous domain and used the hybrid Monte Carlo algorithm for inference.