Theorizing regulatory performances

Overall, the portrayal of financial regulation as a purely technical matter took a hard knock as a result of the crisis. If attempts to rehabilitate, reinstall and extend technical governance are to succeed, they will have need of new legitimizations. Candidate legitimizations can be found in two developments: a new emphasis on flexibility, sometimes referred to as a 'judgement-based approach', in place of rigid adherence to models and data; and the much more vigorous enforcement of financial conduct regulation, against insider trading, conflicts of interest and similar abuses. In the remainder of this chapter, we explore these developments.

The judgement-based approach in financial regulation (HM Treasury 2010) is an emergent phenomenon and, as such, is clearly not capable of being assessed in terms of past experience or data. To analyse its prospects we have need of theory, and here we draw first upon sociology, political economy and the social science of finance. Economic sociology posits an 'embeddedness' of actors and their thinking within middle-level social structures, networks and cultures, wherein 'reasonable expectations' emerge on a collective basis (Cetina and Preda 2004; Granovetter 2005; Swedberg 2011). Bringing in technical systems, MacKenzie (2009) and others have developed a social studies of finance, meaning attention not only to practices and mentalities in particular financial sectors, circuits and sites, but also to the development and wide take-up of technical tools by market participants. Such approaches have more commonly been applied to market behaviour than to regulation, although a related approach (Beunza and Garud 2007) has been applied to equity and bond analysts, whose outputs, alongside those of ratings agencies, flow into regulators' thinking. Nevertheless, since financial regulation itself involves both technical processes (systems and tools) and cultural aspects (episteme and herding), it does seem appropriate for analysis of regulation to be attentive to both of these (interrelated) aspects.

Tools can have intended and/or unintended consequences. The concept of 'performativity' (MacKenzie 2009: pp. 30-31; see also Callon 2007) implies that, when technical assumptions and beliefs become concretized in widely observed habits, in material practices of communities and in the technical systems on which they rely, then these assumptions can have self-fulfilling effects, at least in the medium term. Equally, institutional design and social and technical processes can have outcomes that are neither expected nor intended ('counter-performativity'). This can happen if and as the performances of communities of actors, enacting the same routines as guided by their shared (or very similar) toolkits, change the situation - so making the assumptions on which the tools are based no longer valid - yet the tools continue to be widely used. Although the situation so created may be rather serious, it can be comically summarized by reference to cartoons in which Mickey Mouse has run over a cliff edge and has yet to look down.

To take an example, insurance-type products such as Credit Default Swaps (CDS) may 'work' well enough when the market in them is small. The small number of contracts taken out does not materially change the assumptions on which the design of the instrument is based. However, as traders become familiar with such a product line, they increasingly take it as a point of reference. In effect, traders' thinking converges on the assumptions embedded in the CDS (an example of performativity). Yet if the market not only thrives but actually balloons - such that the value of all CDS in the market is several times greater than their reference contracts (the things 'insured') - then clearly the market is no longer the same as was assumed within the model. In 2006-2007, some more prescient traders, noticing that prices were increasingly out of line with expectations, tried to trade against the model. But they were hampered in doing so by the weight of conformity to the models, such that some had to relinquish their bearish positions - which eventually were proved correct but meanwhile were deviant and expensive to maintain (Lewis 2010). There may be a long period of market denial, within which any volatility is seen as a 'blip' and a buying opportunity: 'still dancing', as Citigroup's Chuck Prince memorably put it. Thus, the stage is set for sudden and violent reversal (counter-performativity), as the community finally recognizes multiple signs that the market is no longer in accord with model assumptions.

There are considerable and concerning implications for our understanding of financial regulation and regulatory knowledge. In particular, wariness about market-based models and quantitative modelling seems justified. Just as market participants employ technical models and information systems, so do regulators. These systems may be capable of aggregating, manipulating and summarizing huge amounts of data. Yet of course, all systems are critically reliant on the assumptions embedded in them, as well as on the quality of data inputs and the possibilities for interpreting the outputs. Generalizing from Carruthers and Kim (2011), we can say that the possibilities for interpretation are mediated by

  • (a) regulators' policy assumptions, defining the field of interpretive possibilities;
  • (b) their conceptual maps (means of interpretation); and
  • (c) their social distance from the points of production of the data.

The less distant interpreters are from those points of production, the better their ability to understand how the data was generated in practice and hence what it could mean. Taking a critical stance, Carruthers and Kim (op. cit.) link deregulation as a policy stance, 'market fundamentalism' as a corresponding mentality, and the replacement of local (direct, social, face-to-face) networks by global, large-scale and complex human-technical systems. Such a critique suggests how regulators' pre-crisis entrancement with data and models could give them only an illusion of analytical power.

At first glance, 'anti-model' critiques might be taken to support the move away from the pre-crisis reliance on models and data and, thus implicitly or indirectly, to offer support to (or at least not to undermine) the emergent judgement-based approach in financial market regulation. If we are to take seriously the proposition that tools and techniques can initially be performative (up to a point, because they induce collective behaviour), and subsequently counter-performative (when widespread adoption shifts reality away from model assumptions), then we may begin to wonder what assumptions, tools and techniques are represented in 'judgement'. On the one hand, 'judgement' sounds as if it might open the door to wider diversity in regulatory analysis and decision-making. On the other hand, were judgement to coalesce around globally shared cultural assumptions and shared techniques - as it might, in a context of international regulatory networking, peer review, benchmarking and commitment to convergence - then judgement might turn out to carry considerable potential for (short-term) performativity and (eventual) counter-performativity.
