Part of a Whole


Erik Hollnagel

“Because there nearly always is too little time and too much information relative to what needs to be done, it is inevitable that what we do will be a compromise between what we must do in order not to be left behind, and what we should do in order to avoid unnecessary risks. In other words, a compromise or trade-off between efficiency and thoroughness.” (p. 3)

“A common example is the difference between the explicit policy that ‘safety is the most important thing for us,’ and the implicit policy that production takes precedence when conflicts arise.” (p. 39)

“In order for the technology to keep working, humans (and organisations) must function as a buffer both between subsystems and between the system and its environment, as something that absorbs excessive variability when there is too much of it and provides variability when there is too little.” (p. 57)

“It would be unreasonable to assume that people, or organisations, behaved in one way when things went wrong and in another when they went right. ‘Right’ and ‘wrong’ are judgements made after the fact, and it would be miraculous, to say the least, if the ‘machinery of the mind’ could know the actual outcome of an action ahead of time and change its mode of functioning accordingly.” (p. 86)

“It means that the goal of safety must change from reducing the number of adverse events to enhancing the ability to succeed under varying conditions.” (p. 100)

“[W]ork at the blunt end often suffers from a lack of information, simply because managers often are removed – in time and in space – from the actual operations.” (p. 127)

“The risks of socio-technical systems can neither be analysed nor managed by considering only the system components and their failure probabilities.” (p. 128)