Ethical values and economic sciences

At the end of the last chapter, I stated that the refusal to consider ends in economic theory has to do with maintaining the value-free thesis in science. Ends often entail values, and if values are outside science, defining or deliberating about ends is not a task of economics. This chapter will present diverse arguments against the value-free thesis. In this way, it will not only contribute to validating the argument put forth in the previous chapter, but it will also prepare the ground for the next chapter on normative economics.

The history of normative economics goes back a long way. I recall that John Stuart Mill affirmed that there is an art that defines and proposes an end to itself (1882: 653). I also recall Mill’s (1882: 657) statement, quoted in the previous chapter, about the nature of this ‘art’: ‘There must be some standard by which to determine the goodness or badness, absolute or comparative, of ends, of objects of desire’. Neville Keynes treats normative economics as a part of Political Economy, ‘a normative or regulative science as a body of systematized knowledge relating to criteria of what ought to be, and concerned therefore with the ideal as distinguished from the actual’ ([1890] 1955: 34–35, italics in the original). He wonders whether or not we should place under Political Economy ‘a branch of ethics which may be called the ethics of political economy, and which seeks to determine economic ideals’ ([1890] 1955: 36), or the ideals with which normative economics is concerned.

Mill’s vision of the art of defining ends and Keynes’ normative economics are closely related to ethics. However, normativity is broader than ethical normativity: it is not necessarily ethical (see Hands 2012a, 2012b). Ethical judgements constitute a subset of normative judgements, and ethical values a subset of values more broadly. There is a technical normativity (if you want to achieve a result, you must employ specific means) that is only indirectly linked with ethics. The art of economics deals with this technical normativity. For Keynes, however, the art of economics cannot be isolated from ethics, ‘for no solution of a practical problem, related to human conduct, can be regarded as complete, until its ethical aspects have been considered’ ([1890] 1955: 60). That is, the two kinds of normativity operate together in the art of economics.

Values can be aesthetic, cultural, epistemic or theoretical, and ethical.1 When analysing normative economics, we can be interested in ascertaining its relation to ethical values: should economics be value neutral according to the so-called value-free ideal, or should it consider ethical values? According to Philippe Mongin (2006: 258–261), there are four possible positions or theses in this regard:

1 A strong neutrality position claims that economists qua economists should always refrain from making value judgements. Robbins, for example, distinguishes economics, the value-free science, from ‘political economy’, a ‘branch of intellectual activity’ that includes value judgements (1981: 9). For him (1981), welfare economics (a twentieth-century form of normative economics) includes value judgements and, consequently, he dismisses it as a science. Robbins holds that ethics has no place in economics. Maynard Keynes disagrees with him: ‘As against Robbins, Economics is essentially a moral science and not a natural science. That is to say, it employs introspection and judgment of value’ (Letter to Roy Harrod, 4 July 1938, 1973: 297).

In accordance with this strong neutrality position, if welfare economics is a part of science, it should be neutral. This is the position endorsed by G. C. Archibald (1959), for example. However, John Davis (2016) shows how the first fundamental theorem of welfare economics, that is, that every Walrasian equilibrium is Pareto efficient, involves at least four value judgements: first, Pareto judgements assume that all individuals’ preferences have the same weight; second, this implies that distributional issues are not relevant; third, Pareto efficiency ignores preference contents; and, finally, the Pareto principle defines well-being as preference satisfaction, which is just one possible interpretation (see also Kenneth Boulding 1969: 5). Davis concludes, ‘Economists consequently promote one ethical vision of the world, while claiming that economics is a positive value-neutral subject, and extol the positive-normative distinction while systematically violating it’ (2016: 213).
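Davis’s second point, that Pareto efficiency is silent on distribution, can be made concrete with a small numerical sketch (my own illustrative construction, not drawn from Davis): in a pure exchange setting with a fixed total of one good and self-interested agents, even the maximally unequal allocation is Pareto efficient, since any transfer makes the better-off agent worse off.

```python
# Illustrative sketch (mine, not from Davis 2016): a two-agent exchange
# economy with 10 indivisible units of one good; each agent's welfare is
# simply her own consumption. An allocation is Pareto efficient if no
# feasible reallocation makes one agent better off without making the
# other worse off.

TOTAL = 10

def allocations(total=TOTAL):
    """All feasible splits of the total between the two agents."""
    return [(a, total - a) for a in range(total + 1)]

def pareto_dominates(x, y):
    """x Pareto-dominates y: no one worse off, someone strictly better off."""
    return all(xi >= yi for xi, yi in zip(x, y)) and any(
        xi > yi for xi, yi in zip(x, y))

def is_pareto_efficient(x):
    return not any(pareto_dominates(y, x) for y in allocations())

# The maximally unequal split is Pareto efficient: any transfer to agent 2
# makes agent 1 worse off. Efficiency is silent on distribution.
print(is_pareto_efficient((10, 0)))  # True
print(is_pareto_efficient((5, 5)))   # True
```

Since the total is fixed, no feasible split Pareto-dominates any other, so every split, however unequal, passes the efficiency test, which is precisely the sense in which the Pareto criterion embeds, rather than avoids, a value judgement about distribution.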

As Tony Atkinson shows, since the 1970s, ‘welfare economics was sidelined’ (2009: 792, and see 2001). Faruk Gul and Wolfgang Pesendorfer state that ‘standard welfare economics functions as a part of positive economics’ (2008: 5): it consists in an uncommitted ‘external’ appraisal of the efficiency of institutions or lack thereof. However, Atkinson notes, ‘economists have not ceased to make welfare statements’ (2009: 793). These often include value judgements which are not scrutinized - for example, to consider efficiency as a goal is a value judgement in itself. Indeed, there are good things that are not efficient (for example, actions motivated by Sen’s 1977 and 2002 concept of commitment), and not all efficiencies are good.


2 A second, weaker neutrality position holds that there are specific, well-defined value judgements that economists might or even should make; for example, Pareto optimality and Pareto superiority. Pursuant to this thesis, positive economics would be value neutral, while normative economics would entail some specific value judgements. Mongin (2006: 260) regards Samuelson (1947) as an example of the latter. Samuelson asserts that ‘it is a legitimate exercise of economic analysis to examine the consequences of various value judgments, whether or not they are shared by the theorist’ (1947: 220).

3 A strong non-neutrality thesis denies that economists should refrain from making value judgements and maintains that such judgements should be made as openly as possible. Gunnar Myrdal (1958) and some philosophers of economics are good representatives of this position. Mongin (2006: 274) interprets Myrdal as holding that all economic predicates are evaluative.
4 Last, a weak non-neutrality thesis, defended by Mongin, asserts that we must distinguish between an evaluative statement and a value judgement. We should first ask whether a statement is evaluative and, if it is, whether the economist is responsible for the judgement associated with it: ‘the economist makes a judgment of value if the statement [for example, “X is a good policy”] is logically evaluative and the economist sincerely asserts it’ (2006: 263). I will come back to Mongin’s arguments for this thesis in the last section of this chapter.

From theses 3 and 4, it can be deduced that ethical values can be present not only in normative economics but also in positive economics. Depending on which thesis one upholds, ethical values are or are not present in every economic discipline. I will advance what I defend as an ethical normative economics against the so-called value-free ideal of science and more pragmatic visions of the positive-normative division. I will also seek to discern whether ethical value judgements can properly be involved not only in normative but also in positive economics. This task requires explaining why ethical values can legitimately intervene in science: that is, rejecting the so-called value-free ideal. This will be the topic of the next section.3 In the last section, I will come back to the relation between economics and ethics.

In the next chapter, having shown the relevance of ethics to economics, I will develop a proposal regarding the ends or ethical values, the ‘ideals’ that normative economics should define, and I will concisely assess the new normative current in economics called ‘libertarian paternalism’.

Values in science

In this section, I will proceed in the following way. First, I will review the main arguments against the value-free ideal. Then, I will briefly propose an alternative argument.

As Reiss and Sprenger (2014: 7) explain, what they call contextual values (personal, moral, or political values) may bear an impact on science at four points:

i) the choice of a scientific research problem; ii) the gathering of information; iii) the acceptance of a scientific hypothesis or theory as an adequate answer to the problem based on the evidence; and iv) the proliferation and application of scientific research results.

Most philosophers of science - and economists - concur on the influence of contextual values in i) and iv). Accordingly, Reiss and Sprenger define the value-free ideal as follows (2014: 9): ‘Scientists should strive to minimize the influence of contextual values on scientific reasoning, e.g., in gathering evidence and assessing/accepting scientific theories’, that is, in steps ii) and iii). Reiss and Sprenger also distinguish the ‘value-neutrality thesis’, which asserts that the value-free ideal is attainable, and its counterpart, the ‘value-laden thesis’, which argues that it is not possible to evade values in those steps. These last two theses are descriptive, while the first, the value-free ideal, is normative.4 What are the main arguments against the value-free ideal?

Mapping key arguments

Kincaid, Dupré, and Wylie (2007: 14; henceforth, ‘Kincaid et al.’) describe three types of arguments for the value-laden thesis (and thus against the value-free ideal): 1) arguments that deny the distinction between fact and value; 2) arguments based on underdetermination; and 3) arguments drawing from the social processes of science. My argument falls within the first category: basically, I will argue that values are natural facts within a broad concept of nature. Consequently, science cannot be accused of dealing with metaphysically absurd properties, as John Mackie (1977: 38ff.) alleges.

Arguments that deny the distinction between fact and value

To explain these arguments, first, we have to understand where this distinction comes from. The value-free ideal is connected with the idea that values are not facts and that science deals only with facts, leading to a ‘fact-value dichotomy’ that dates back to Hume’s dichotomy between ‘is’ and ‘ought’ judgements (Kincaid et al. 2007: 5). As Hilary Putnam (2002: 14ff.) explains, this dichotomy stems from Hume’s division between ‘matters of facts’ and ‘relations of ideas’, stating that ‘Hume’s metaphysics of “matters of facts” constitutes the whole ground of the underivability of “oughts” from “ises”’ (2002: 15). This ‘metaphysics’ is conditioned by what Putnam calls a ‘pictorial semantics’: ideas are pictorial. There are no matters of facts about virtue or vice because these cannot be envisioned like an apple, and, therefore, they are sentiments. Thus, values are not facts but something subjective that escapes science.

Putnam (2002: 2 and passim) argues that the fact-value dichotomy depends on the separation between analytic and synthetic propositions, and he relies on Willard van Orman Quine’s (1951) challenge to this distinction. Once that separation falls, facts provide not only verifiable propositions but also theoretical terms in the context of a scientific theory, a context in which values also matter.

Simultaneously, Putnam revisits classical pragmatists’ ideas regarding the links between facts and values; for them, value and normativity permeate all experience. Putnam (2017) refers to John Dewey’s position regarding facts and values and draws on Ruth Anna Putnam’s (2017) discussion of William James and C. S. Peirce on the same subject, returning to Dewey once again. In short, these authors regard fact and value statements as intertwined, and we can apply reason to values.

For Putnam, facts and values become entangled as a result of the use in science of notions dubbed ‘thick’ ethical concepts in the meta-ethical literature, that is, terms with both descriptive and normative content.5 He illustrates his point with the example of the word ‘cruel’. Dupré (2007), for his part, provides other examples, such as the concepts of rape in evolutionary psychology and inflation in economics, arguing that, in topics that especially concern us, the combined use of normative and factual notions proves unavoidable and necessary. The human or social relevance of some topics necessarily forces the consideration of values on us. For Dupré (2007), value-free situations appear in uninteresting cases, whereas hypotheses and conclusions that matter to us are not value free and include ‘thick’ ethical terms.

Anna Alexandrova’s (2017: Chapter 4) proposal for legitimately considering ‘mixed claims’ in the sciences (though she tries to avoid meta-ethical discussions such as those surrounding thick ethical concepts) could also be grouped here: mixed claims are causal or correlational claims with normative presuppositions. She argues that these value-laden claims do not go against objectivity, and she proposes three rules to guarantee it: unearthing the value presuppositions in constructs and measures, checking whether they are controversial and, if so, consulting the relevant parties (2017: 99–105).

While I regard Richard Rudner (1953), probably the most cited author when it comes to values in science, as belonging to the second category of arguments, his views on the presence of values in science also hinge on the importance of the subject under analysis. Also heeding the relevance of research topics, Carla Bagnoli’s (2017) recent ‘constructivist’ position revolves around the idea that emotions and reason shape facts, bringing what matters into view. She states that ‘[t]o some important extent, then, the facts are not fully separable from the concerns of the agents in their perspective’ (2017: 137).

Elizabeth Anderson can also be considered to belong to this category of arguments. Her ‘pragmatic account of how we can objectively justify our value judgments’ (1993: 91) proceeds by appealing to reasons; that is, there is no room for scepticism or subjectivism. What is valuable is not merely what one happens to like (take Facebook or Instagram) but the object of a rational argument. For Anderson, this does not mean that values can be everywhere in science, as ‘a bias in relation to the object of inquiry is inevitable’ (2004: 19). However, ‘a bias in relation to hypotheses is illegitimate. If a hypothesis is to be tested, the research design must leave open a fair possibility that evidence will disconfirm it’ (2004: 19). Values are thus facts and can be known, but they do not replace empirical evidence.

Constructivist approaches such as Christine Korsgaard’s can also be included in this group. While a Kantian, she also draws from Aristotle, Wittgenstein, and Rawls. Human beings’ reflexive nature drives their personal identity, which provides reasons for acting appropriately. She states:

A human being is an animal who needs a practical conception of her own identity, a conception of who she is that is normative for her. Otherwise she could have no reasons to act, and since she is reflective she needs reasons to act.

(1992: 92)

Human nature is a value in itself, and it is ‘open’ to different possible identities: ‘a human being is an animal whose nature it is to construct a practical identity that is normative for her’ (1992: 105). Practical identity, Korsgaard argues, ‘is better understood as a description under which you value yourself, a description under which you find your life worth living and your actions to be worth undertaking’ (1992: 83). Practical identity bears strong ties to other human beings and animals because, she states, ‘human beings are social animals in a deep way’ (1992: 101).

Summing up, most authors in this category refute the value-free ideal by arguing that values are facts that can be rationally known and, therefore, have a legitimate role in science, one that does not replace but, rather, complements data and theoretical reasoning.

Arguments from underdetermination

Kincaid et al. (2007: 15) take into account both the underdetermination of theory by data (multiple hypotheses can be compatible with the data) and theory-choice underdetermination based on epistemic values (different scientists may weigh the various epistemic values differently and, as a result, choose different hypotheses).

While Rudner’s (1953) argument does not come from underdetermination as defined above but from the impossibility of complete empirical induction, his above-mentioned paper proves useful here as well.6 For Rudner, accepting or rejecting a hypothesis always incurs a risk of error, since the hypothesis will never be completely verified, due to the intrinsic imperfection of inductive inferences. Hence, scientists must consider whether there is enough evidence to accept or reject a hypothesis, and this depends on how ethically serious the potential consequences of error are, which entails a value judgement (1953: 2–3). The acceptance or rejection of a hypothesis thus involves an ethical decision.
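Rudner’s point can be illustrated with a minimal decision-theoretic sketch (my own construction, not Rudner’s formalism): the probability threshold at which accepting a hypothesis minimizes expected loss depends on how costly each kind of error would be, and weighing those costs is an ethical rather than a purely epistemic matter.

```python
# Illustrative sketch (mine, not Rudner's own formalism): an expected-loss
# reading of inductive risk. Accepting a hypothesis H when it is false
# costs loss_accept_false; rejecting H when it is true costs
# loss_reject_true. Accepting H is the lower-loss option exactly when
# (1 - p) * loss_accept_false < p * loss_reject_true, i.e. when p exceeds
# the threshold below.

def acceptance_threshold(loss_accept_false, loss_reject_true):
    """Minimum probability of H at which accepting it yields a lower
    expected loss than rejecting it."""
    return loss_accept_false / (loss_accept_false + loss_reject_true)

# Symmetric stakes: accept H once it is more likely than not.
print(acceptance_threshold(1, 1))  # 0.5

# If wrongly accepting H is nine times as serious (say, approving a
# harmful drug), the same evidence no longer suffices: the bar rises.
print(acceptance_threshold(9, 1))  # 0.9
```

The formula makes Rudner’s conclusion vivid: how much evidence is ‘enough’ cannot be read off the data alone, because the threshold moves with a judgement about the seriousness of error.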

Heather Douglas (2000, 2009) also highlights the role of inductive risk in the context of the current authority of science in our culture (2000: 563). She notes that values are recognized as relevant at these steps: problem selection, knowledge utilization, and the limitations of methodology. Yet, she also points out, ‘where the weighing of inductive risk requires the consideration of non-epistemic consequences, non-epistemic values have a legitimate role to play in the internal stages of science’ (2000: 565). She holds that this role is indirect, as it involves considering the consequences of a possible error. When uncertainty is low, it is not necessary to consider these potential outcomes (2000: 577).

Douglas (2016) discusses three challenges to the value-free ideal. First, she considers the descriptive challenge, originally raised by feminist critics, who noted the presence of value-laden presuppositions even in the gathering of empirical evidence. Douglas’ argument rests on underdetermination: the available evidence allows for several plausible theories and assumptions, and there is a gap between theory and evidence that should be filled by value judgements. Second, the ‘boundary challenge’ stems from the lack of a clear distinction between epistemic and contextual values (an argument explored by Helen Longino, 1990; see the following subsection). Third comes the normative challenge: uncertainty brings about inductive gaps and risks, and it becomes necessary to establish what counts as sufficient evidence. Douglas distinguishes between the direct roles of values in science (the choice of topic, method, and application) and their indirect role (judging the sufficiency of evidence). Epistemic values reveal the degree of uncertainty at play, while non-epistemic values indirectly appraise whether the evidence suffices. However, she also points out that epistemic values, such as simplicity or elegance, sometimes do not apply to or suit our complex social world.

In addition, Douglas notes, values are embedded in the language we use in science, in the construction and testing of some models and in the use and dissemination of science. All these considerations highlight the need to take into account the impact of science on society (see also Douglas 2009). She states: ‘With values openly on the table as part of the scientific process, scientists and policy-makers can include both evidence and values, in their legitimate roles, as part of the public discussion’ (2014: 181). These considerations take Douglas’ arguments into the next category.

Mary Hesse (1980) carefully develops the argument that theories are underdetermined by the facts. In the case of the social sciences, she identifies a second type of value judgement that does not apply to the natural sciences (where the ‘pragmatic criterion’ of predictability can filter out a first type of ‘basic’ value judgements), one that establishes value goals as the criterion for overcoming underdetermination (1980: 195).

Longino also upholds the underdetermination argument, positing a gap:

This gap, created by the difference in descriptive terms used in the description of data and in the expression of hypotheses, means that evidential relations cannot be formally specified and that data cannot support one theory or hypothesis to the exclusion of all alternatives. Instead, such relations are mediated by background assumptions [...] the only check against the arbitrary dominance of subjective (metaphysical, political, aesthetic) preference in such cases is critical interaction among the members of the scientific community or among members of different communities.

(2015: 11)

Indeed, this places Longino in the next category as well.

Arguments from the social processes of science

Several arguments stem from the social processes of science. Longino questioned the division between epistemic and non-epistemic values because, as she sees it, epistemic values are embedded in social and political perspectives. Since these perspectives are themselves ingrained in scientific research, no distinction can be made between cognitive and non-cognitive elements in science (2004: 128). Nonetheless, Longino argues that this embeddedness does not imply denying objectivity but understanding it as built into a social context. Longino views objectivity as ‘a characteristic of a community’s practice of science’ (1990: 179). She states,

[S]cientific knowledge is, therefore, social knowledge. It is produced by processes that are intrinsically social, and once a theory, hypothesis, or set of data has been accepted by a community, it becomes a public resource. It is available to use in support of other theories and hypotheses and as a basis of action. Scientific knowledge is social both in the ways it is created and in the uses it serves.

(1990: 180)

Thus, individual values are sifted out, and values become good for science (2004: 127). Moreover, Longino stresses, ‘the objectives of the value-free ideal are better achieved if the constructive role of values is appreciated and the community structured to permit their critical examination’ (2004: 140). In a nutshell, she thinks that values do not prevent objectivity but secure it when they are socially discussed and set.

While holding that values do not affect the acceptance of data, hypotheses, and theories, Hugh Lacey identifies a moment in the scientific process in which they ‘often play indispensable roles’ (2003: 209): the ‘adoption of strategy’. He adds that most modern sciences adopt ‘materialist strategies’ that cast other possible strategies aside: for example, ‘agroecological strategies’ include ecological and social categories typically neglected by ordinary materialist strategies. He states, ‘the value judgments that are part of the grounds for adopting a strategy play a causal role in enabling the conditions under which factual judgments can be made, but they are not part of the evidence’ (2003: 217). He suggests relying on multiple strategies according to different values and researching them empirically (2002).

Philip Kitcher’s reflections on values and science are deeply associated with the need to link scientific topics and their development with social requirements and values established via public discussion. He introduces ‘an ideal of “well-ordered science”, intended to capture what inquiry is to aim at if it is to serve the collective good’ (2001: xii; also see Chapter 10). For Kitcher, the categories used to characterize reality are ‘consequential’, that is, they play a causal role. Science cannot turn a blind eye to its consequences: ‘because I believe no such conception can be found, I take moral and social values to be intrinsic to the practice of the science’ (2001: 65). The major issues facing society should be taken into account when designing these categories: ‘the aim of the sciences’, he states, ‘is to address the issues that are significant for people at a particular stage in the evolution of human culture’ (2001: 59). Hence, moral and social values are intrinsic to the practice of science (2001: 65). However, Kitcher also believes that the presence of values in science does not challenge the objectivity of reality (2001: 53 and 66). He argues that ‘value-judgments are deeply embedded in the practice of science’ (2011: 34), while he ‘resist[s] the suspicion that the incursion of values inevitably undermines scientific authority’ (2011: 40).
