The weight of argument in the light of the theory of decision under uncertainty

To assess in depth the practical relevance of the Keynesian concept of weight of argument, it is sensible to take into account the remarkable advances of decision theory under uncertainty (henceforth DTU) since the publication of TP (see Camerer and Weber, 1992).

DTU reaches a stage of maturity with von Neumann and Morgenstern (1944), who succeed in providing sound foundations by axiomatizing it from the point of view of objective probabilities. Its empirical scope, however, is limited to what we have called soft uncertainty. Probabilities are considered by the DM as 'known', that is, as fully reliable. In this case the weight of argument does not have any practical role, since by assumption it is always equal to 1.

A few years later Savage (1954), building on ideas put forward by Ramsey (1931) and de Finetti (1937), suggests a different axiomatized DTU that purports to be applicable to any situation characterized by uncertainty. In this subjectivist theory, often called Bayesian, probabilities are conceived of as epistemic weights that assure the consistency of the decisions of a rational agent. De Finetti and Savage believe that the distinction between different modalities of uncertainty, and thus also concepts such as the weight of argument that presuppose it, is inconsistent with rationality. The main argument was put forward by de Finetti, who developed, in the form of a theorem, intuitions put forward by Ramsey. He shows that, if the beliefs of the DM are not represented in the form of a unique distribution of additive probabilities, as in Bayesian theory, the DM is liable to accept a 'Dutch book', that is, a system of bets whose acceptance is irrational since it cannot yield a positive pay-off under any outcome. The assumption that beliefs are represented by a unique distribution of additive probabilities implies that the DM has complete relevant knowledge, so that his uncertainty is soft. Therefore, in this view only soft uncertainty is consistent with rationality, and this precludes any normative role for the weight of argument. Savage reinforces this conclusion by observing that the introduction of a second-order measure of uncertainty would trigger an infinite regress that in his opinion would be unacceptable from a logical point of view (Savage, 1954).
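The mechanics of the Dutch book argument can be made concrete with a small numerical sketch (the function name and the betting-quotient framing are illustrative assumptions, not part of the original texts): if a DM posts betting quotients on an event A and its complement that do not sum to one, an opponent can buy or sell bets against both and secure a profit whatever happens.

```python
def dutch_book_loss(q_a: float, q_not_a: float, stake: float = 1.0) -> float:
    """Guaranteed loss an opponent can extract from a DM whose betting
    quotients on an event A and its complement do not sum to 1.

    If q_a + q_not_a > 1, the opponent sells the DM unit bets on both A
    and not-A: the DM pays q_a + q_not_a and collects exactly 1,
    whichever event occurs. If the sum is < 1, the opponent buys both
    bets instead. Either way the DM's sure loss is |q_a + q_not_a - 1|.
    """
    return abs(q_a + q_not_a - 1.0) * stake

# Coherent (additive) quotients: no Dutch book is possible.
print(dutch_book_loss(0.75, 0.25))  # 0.0
# Incoherent quotients: a sure loss of 0.25 per unit stake.
print(dutch_book_loss(0.75, 0.5))   # 0.25
```

Note that the sure loss measures only incoherence between the posted quotients; as argued below, a DM facing hard uncertainty may rationally refuse to post binding quotients at all.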

The state-of-the-art textbook exposition of decision theory under uncertainty makes a basic distinction between 'known' and 'unknown' probabilities to articulate a simplistic division of labour between objectivist and subjectivist theories of decision (Vercelli, 1999). According to this approach, when the probabilities are 'known' (as in the case of a 'roulette game') the use of the objectivist theory introduced by von Neumann and Morgenstern is prescribed, while when the probabilities are 'unknown' (as in the case of a 'horse race') the use of the subjectivist theory introduced by Savage is prescribed. This widespread eclectic view seems to introduce a distinction between two different modalities of uncertainty, providing an opportunity for the use of the weight of argument seen as the degree of knowledge of probabilities. However, a deeper analysis shows that the distinction between known and unknown probabilities is confined to their source (stable frequencies in the objectivist approach and coherent subjective assessment in Bayesian theory), and does not affect the modality of uncertainty, which in both cases is represented by a unique distribution of additive probabilities (Vercelli, 1999). Moreover, the axioms of the two theories are expressed in different languages but are substantially equivalent. In particular, in both cases the axioms preclude the possibility that a rational agent makes systematic mistakes. It is assumed that the probability distribution is not modified by the choices of the agents and that its structural characteristics are perfectly known by them: the DM knows the complete list of possible states of the world, the complete list of available options and the consequences of each option or act in each possible state of the world. These assumptions presuppose that the world is closed and stationary and that the agent has fully adapted to such a 'world' (Vercelli, 2002; 2005). In this case, the weight of argument is maximal and its explicit introduction would be irrelevant.

This common approach of mainstream DTU explains why, up to the mid-1980s, most economists and decision theorists expressed sheer hostility towards the concept of weight of argument or any other concept presupposing different modalities of uncertainty. However, since the second half of the 1980s, a series of innovative contributions to DTU has progressively generated a climate of opinion that is more favourable to understanding the Keynesian insights on the weight of argument (Kelsey and Quiggin, 1992). First, the obstructive arguments by de Finetti and Savage proved to be weaker than they were originally believed to be. The Dutch book argument by Ramsey and de Finetti is based upon implicit assumptions that are quite implausible in situations in which the weight of argument has a role, that is, when uncertainty is hard, reflecting an open and non-stationary world. This is true in particular for the assumption that the DM is expected to bet for or against a certain event; this does not take account of the fact that a refusal to bet could be altogether rational when the weight of argument is far from its extreme values. In addition, Savage's argument about the infinite regress is not convincing, since the introduction of a second-order measure of uncertainty implies only the possibility of a higher-order measure, not its necessity: whether it is useful to introduce a measure of uncertainty of a higher order is a pragmatic question, not a logical one.

The opinion is now gaining ground that there are in principle no binding objections that preclude the analysis of different modalities of uncertainty. The use of the concept of weight of argument is thus fully legitimate. This shift of attitude is both cause and effect of the emergence of new DTUs that presuppose, or at least are consistent with, hard uncertainty and a weight of argument different from its extreme values, and that are no less rigorous than the classical ones mentioned above. Some of these DTUs assume that the beliefs of DMs are to be expressed through a plurality of probability distributions, none of which is considered fully reliable. This amounts to evaluating the probability of the occurrence of an event or a state of the world through an interval of probability. Other DTUs assume that the beliefs of DMs may be expressed through a unique distribution of non-additive probabilities. This expresses the awareness of the DM that his relevant knowledge is incomplete; in the sub-additive case it reveals that the list of possible states or events is not exhaustive (see Vercelli, 1999). The latter assumption may clarify the theoretical and empirical scope of the Keynesian theory of the weight of argument. It is possible to demonstrate that the measure of uncertainty aversion advanced by Dow and Werlang (1992) within this theory,

c(A) = 1 - P(A) - P(A^c),    (4)

where A is an event and A^c is its complement, is strictly related to the weight of argument as here defined. In fact, we may interpret equation (4) as a measure of relevant ignorance; this is true in general of measures of sub-additivity of the probability distribution. In this case, by utilizing the normalization mentioned in section 7.2, we obtain that the weight of argument is the complement to unity of the measure of uncertainty aversion suggested by Dow and Werlang:

w = 1 - c(A) = P(A) + P(A^c).
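The relation between sub-additivity and the weight of argument can be sketched numerically. In this illustrative sketch the function names and the treatment of beliefs as point values on A and its complement are assumptions introduced here, not notation from Dow and Werlang.

```python
def uncertainty_aversion(p_a: float, p_not_a: float) -> float:
    """Dow-Werlang coefficient c(A) = 1 - P(A) - P(A^c): the probability
    mass left unassigned by a sub-additive belief, read here as a
    measure of relevant ignorance about A."""
    return 1.0 - p_a - p_not_a

def weight_of_argument(p_a: float, p_not_a: float) -> float:
    """Normalized weight of argument as the complement to unity of
    c(A): equal to 1 under additive beliefs (soft uncertainty) and
    below 1 when beliefs are sub-additive (hard uncertainty)."""
    return 1.0 - uncertainty_aversion(p_a, p_not_a)

# Additive beliefs (soft uncertainty): the weight of argument is maximal.
print(weight_of_argument(0.75, 0.25))   # 1.0
# Sub-additive beliefs (hard uncertainty): part of the probability mass
# is "missing", reflecting a non-exhaustive list of possible states.
print(uncertainty_aversion(0.5, 0.25))  # 0.25
print(weight_of_argument(0.5, 0.25))    # 0.75
```

The sketch makes the complementarity explicit: the larger the unassigned mass c(A), the lower the weight of argument, mirroring the verbal claim that sub-additivity measures relevant ignorance.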

We can thus conclude that the recent advances of DTU are rediscovering, in the context of a different language and formalization, the importance of the ideas underlying the Keynesian concept of weight of argument.
