Karl Weick / Kathleen Sutcliffe

"Resilience is a combination of keeping errors small and of improvising workarounds that allow the system to keep functioning. Both of these pathways to resilience demand deep knowledge of the technology, the system, one’s coworkers, and most of all, oneself." (p. 14)

"Rigid hierarchies have their own special vulnerability to error. Errors at higher levels tend to pick up and combine with errors at lower levels, thereby making the resulting problem bigger, harder to comprehend, and more prone to escalation." (p. 16)

"[It] is impossible to manage any organization solely by means of mindless control systems that depend on rules, plans, routines, stable categories, and fixed criteria for correct performance. No one knows enough to design such a system so that it can cope with a dynamic environment." (p. 39)

"Let culture do the controlling. A strong culture, held together by consistent values and enforced by social pressure, is all the control you need. Most managers overdo control. They heap hierarchy on top of rules on top of routines on top of job descriptions on top of culture and then wonder why people feel constrained and put forth less than their best efforts." (p. 150)

"Don’t overdo lean, mean ideals. The lean, mean organization may sparkle in the short run, but it may also crash and burn at the first unexpected jolt because leanness strips the organization of resilience and flexibility. Realize that when managers eliminate “redundant” positions, they sacrifice experience and expertise. That loss can limit the repertoire of responses available to the organization." (p. 157)


James Reason

"The purpose (...) is to explore the human contribution to both the reliability and resilience of complex well-defended systems. The predominant mode of treating this topic is to consider the human as a hazard, a system component whose unsafe acts are implicated in the majority of catastrophic breakdowns." (back cover)

"But there is another perspective, one that has been relatively little studied in its own right, and that is the human as hero, a system element whose adaptations and compensations have brought troubled systems back from the brink of disaster on a significant number of occasions." (back cover)

"So why is the criminal justice system so set upon punishing errant professionals? (...) The fact is that 'honest errors' to which we are all prone are now considered criminal in circumstances where public safety is at stake. The law has yet to adopt a systems view of human error." (p. 92)

"A good safety culture has to be CEO-proof. CEOs are, by nature, birds of passage: changing jobs frequently is how they got to where they are today - and there is no reason to suppose that they are going to behave any differently in the future." (p. 274)


Nassim Taleb

"It is far easier to figure out if something is fragile than to predict the occurrence of an event that may harm it. Fragility can be measured; risk is not measurable (outside of casinos or the minds of people who call themselves “risk experts”)." (p. 4)

"A complex system, contrary to what people believe, does not require complicated systems and regulations and intricate policies. The simpler, the better. Complications lead to multiplicative chains of unanticipated effects." (p. 11)

"Every plane crash brings us closer to safety, improves the system, and makes the next flight safer - those who perish contribute to the overall safety of others. (…) these systems learn because they are antifragile and set up to exploit small errors; the same cannot be said of economic crashes, since the economic system is not antifragile the way it is presently built." (p. 72)

"Someone who predicts will be fragile to prediction errors. An overconfident pilot will eventually crash the plane. And numerical prediction leads people to take more risks." (p. 150)

"This brings us to the difference between doing and thinking. The point is hard to understand from the vantage point of intellectuals. As Yogi Berra said, 'In theory there is no difference between theory and practice; in practice there is.' ” (p. 213)

"[A] model is by its very nature a simplification. You just don’t want the simplification to distort the situation to the point of being harmful." (p. 296)

"[A]s if reality cared about opinions and narratives. There are secrets to our world that only practice can reveal, and no opinion or analysis will ever capture in full." (p. 335)

Sharing data
This website does not collect or share any data. However, emails from visitors may be checked by an automated spam-detection service, and when spam is detected the sender's email address may be added to a list of spam sources.

This website does not place any cookies, not even the commonly used _ga cookie from Google Analytics. The site is deliberately kept free of advertisements, and we do not want it to contribute to the collection of information for targeted advertising on other sites.

You are free to download, use, copy, embed, and distribute the DEGAS content on this site, including the DEGAS videos, but you may not alter or sell it.

Recognition of our intellectual ownership through the use of appropriate references would be appreciated.

Creative Commons license CC BY-NC-ND 4.0