Person and System Models: Getting the Balance Right

We have already discussed the weaknesses of the person model at some length, all of which relate to the 'human-as-hazard' perspective. The 'human-as-hero' view is quite another matter and will be considered extensively in the next part of this book.

Although the system models seem, on the face of it, to be far more appropriate ways of considering accident causation, both in terms of understanding the contributing factors and in their remedial implications, they too have their limitations when taken to extremes. This was first brought home to me by the brilliant essays of Dr Atul Gawande, a general surgeon at a large Boston hospital and a staff writer on science and medicine for the New Yorker.[1]

In an essay entitled 'When doctors make mistakes',[2] Dr Gawande recounts the many successes of American anaesthesiologists in reducing adverse events through various systemic measures involving the redesign and standardisation of anaesthetic machines, the use of pulse oximeters, carbon dioxide monitors and improved training measures employing high-fidelity anaesthesia simulators. In ten years, the death rate dropped to one-twentieth of what it had been. He concludes his discussion with the following very telling passage:

But there are distinct limitations to the industrial cure, however necessary its emphasis on systems and structures. It would be deadly for us, the individual actors, to give up our belief in human perfectibility. The statistics may say that someday I will sever someone's main bile duct [a recurrent error in laparoscopic cholecystectomy procedures], but each time I go into a gallbladder operation I believe that with enough will and effort I can beat the odds. This isn't just professional vanity. It's a necessary part of good medicine, even in superbly 'optimized' systems. Operations like the lap chole have taught me how easily error can occur, but they've also showed me something else: effort does matter; diligence and attention to the minutest details can save you.[3]

This brings us to the nub of the problem with an excessive reliance on system measures. People on the frontline of health care, or of any other hazardous enterprise, generally have little opportunity to bring about rapid system improvements or any kind of global change, but they can resolve to go the extra mile. Health care, in particular, has a one-to-one or few-to-one delivery: it is a hands-on and very personal business. Personal qualities do matter. To think otherwise is to fall prey to 'learned helplessness', saying to oneself: 'What can I do? It's the system.'

I also believe that we can train frontline people in the mental skills that will make them more 'error wise': that is, we can help them to 'read' situations so that they can identify circumstances with high error potential and act accordingly. This is a view of 'sharp-end' individuals that accentuates the positive, allowing us to exploit their 'human-as-hero' potential. It is what we mean by individual mindfulness: being aware of the hazards and having contingencies in place to deal with them; being respectful of the dangers and possessing a 'feral vigilance' in their presence. We will discuss these issues in detail in the final part of this book.


  • [1] Atul Gawande's articles for the New Yorker are collected in two wonderful books: Complications: A Surgeon's Notes on an Imperfect Science (New York: Metropolitan Books, 2002) and Better: A Surgeon's Notes on Performance (London: Profile Books, 2007).
  • [2] Gawande (2002), pp. 47-74.
  • [3] Ibid., p. 73.
 