Science and society in step - phase 1
In the post-Second World War years the scale of the challenge facing a world recovering from those terrible events led to broad recognition that bold decisions were needed in both technical and social spheres. Such decisions could be taken rapidly by the elected government of the day: from the late 1940s onwards Britain underwent one of the biggest social and economic revolutions in its history through the nationalisation of major tranches of the UK economy, including, in 1948, electricity supply. Many centralised power generation plants were built to serve growing demand, including the nine commercial Magnox nuclear power stations which began operating in the ten years between 1962 and 1971. There was an attitude among those in authority that members of the public, or pressure groups, who opposed a particular piece of scientific or technical ‘progress’ were either out of touch with the norms of society or would soon come round. The sense of shared purpose among people who had recently united to face an external enemy engendered a broadly utilitarian societal ethic, seeking ‘the greatest good for the greatest number’.
This meshed well with many of the tenets which underlie statistical science, in which wider principles are generally of more interest than individual cases. If we follow the best available scientific advice in pursuing a dozen technologically innovative routes then, in say one or two cases, events may subsequently show that the best available scientific interpretation underestimated the risks, and some individuals might suffer. Nonetheless, a thoroughgoing utilitarian might argue that society would still be significantly better off by following all twelve opportunities rather than by abandoning all of them at an early stage and bending to the prejudices of those motivated by more ideological, mystical or obscure sources of belief. (It is hard to find a form of words that does not make utilitarianism seem rather cold and insensitive, but utilitarians would argue that they are motivated purely by a desire to increase the total sum of human well-being - that is indeed the philosophical basis of utilitarianism - and that it is actually rather heartless to pursue the interests of those who shout loudest or are most photogenic at the expense of the unheard majority.) Jeremy Bentham, the founder of modern utilitarianism, proposed a ‘felicific calculus’ to determine mathematically the morality of various courses of action.3 The highly visible success of science and technology in delivering longer, healthier, more leisurely lives for many people added to the enthusiasm. That it did not do so for everyone could be accommodated within the tenets of utilitarianism.
When society is relatively stressed and pursuing a utilitarian ethic, politicians often seem happy to delegate decision-making to the technical community. The Eisenhower administration in the US in the early days of civil nuclear technology, for example, took the view that nuclear science was by its nature too difficult for laypeople, including politicians, to understand, so the responsibility not only for carrying out policy but also largely for forming it should lie with the Atomic Energy Commission (AEC). It was perhaps inevitable then that a technocratic mode of decision-making became dominant, to the detriment of dialogue with, and control by, the normal democratic structures. The secrecy associated with military uses of nuclear materials doubtless exacerbated this.
Enthusiasm for science did not entirely evaporate as austerity came to an end. For example, at the Labour Party conference in 1963, Harold Wilson, leader of the opposition, famously declared that a New Britain would be ‘forged in the white heat of [the scientific and technological] revolution’. Coming to power the next year, Wilson established a new Ministry of Technology which, in the words of its first minister, Tony Benn, would provide Britain with the role it had been searching for since the demise of the Empire.4
However, the lack of serious challenge to the technocrats in charge of policy, either from alternative technical viewpoints or from different social values, had a darker side. In at least some cases, such as the development of the two Windscale nuclear piles, the first of which caught fire in 1957, it can be argued that government pressure forced, or perhaps permitted, scientists and engineers to take dangerous risks in the pursuit of urgent political demands. Such decisions were inevitably subject to certain systematic biases in favour of the technical mindset or financial vested interests, or to error as potential critics were drowned out by the ‘experts’. The vast sums spent on nuclear research in the UK in the 1960s, resulting in a large number of apparently dead-end designs - from which, in the view of many commentators, government then chose the wrong one - do not now appear to have represented the best possible public policy. More recently, a significant contributor to the Chernobyl accident in 1986, in a society where challenge to the government and its institutions was even more limited, seems to have been the demands, both overt and assumed, placed on the operators by local politicians. The absence of a culture of challenge, either from society or from those working within the industry, did perhaps deliver some benefits from ‘pushing the envelope’ in terms of increased output. It also resulted in the plant being operated well outside its design parameters, with fatal consequences.5