What does the collapse of sub-prime lending have in common with a broken jackscrew in an airliner's tailplane? Or the oil spill disaster in the Gulf of Mexico with the burn-up of Space Shuttle Columbia? These were systems that drifted into failure. While pursuing success in a dynamic, complex environment with limited resources and multiple goal conflicts, a succession of small, everyday decisions eventually produced breakdowns on a massive scale. We have trouble grasping the complexity and normality that give rise to such large events. We hunt for broken parts, fixable properties, people we...
When faced with a human error problem, you may be tempted to ask 'Why didn't these people watch out better?' Or, 'How can I get my people more engaged in safety?' You might think you can solve your safety problems by telling your people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure and demanding compliance. These are all expressions of 'The Bad Apple Theory', in which you believe your system is basically safe if it were not for those few unreliable people in it. Building on its successful predecessors, the third edition of The Field Guide to Understanding...