The Field Guide to Understanding Human Error: Edition 2

CRC Press

When faced with a human error problem, you may be tempted to ask 'Why didn't they watch out better? How could they not have noticed?' You think you can solve your human error problem by telling people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure. These are all expressions of 'The Bad Apple Theory', where you believe your system is basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere. The new view, in contrast, understands that a human error problem is actually an organizational problem. Finding a 'human error' by any other name, or by any other human, is only the beginning of your journey, not a convenient conclusion. The new view recognizes that systems involve inherent trade-offs between safety and other pressures (for example, production). People need to create safety through practice, at all levels of an organization. Breaking new ground beyond its successful predecessor, The Field Guide to Understanding Human Error guides you through the traps and misconceptions of the old view. It explains how to avoid the hindsight bias, to zoom out from the people closest in time and place to the mishap, and to resist the temptation of counterfactual reasoning and judgmental language. But it also helps you look forward. It suggests how to apply the new view in building your safety department, handling questions about accountability, and constructing meaningful countermeasures. It even helps you get your organization to adopt the new view and improve its learning from failure. So if you are faced with a human error problem, abandon the fallacy of a quick fix. Read this book.

Reviews

5.0 (3 total)

Additional Information

Publisher: CRC Press
Published on: Apr 7, 2017
Pages: 252
ISBN: 9781351889759
Language: English
Genres: Business & Economics / Human Resources & Personnel Management; Political Science / Labor & Industrial Relations; Technology & Engineering / Industrial Health & Safety
Content Protection: This content is DRM protected.
Read Aloud: Available on Android devices
Eligible for Family Library

Sidney Dekker
What does the collapse of sub-prime lending have in common with a broken jackscrew in an airliner's tailplane? Or the oil spill disaster in the Gulf of Mexico with the burn-up of Space Shuttle Columbia? These were systems that drifted into failure. While pursuing success in a dynamic, complex environment with limited resources and multiple goal conflicts, a succession of small, everyday decisions eventually produced breakdowns on a massive scale. We have trouble grasping the complexity and normality that give rise to such large events. We hunt for broken parts, fixable properties, people we can hold accountable. Our analyses of complex system breakdowns remain depressingly linear, depressingly componential - imprisoned in the space of ideas once defined by Newton and Descartes. The growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have gotten ahead of our theories. We are able to build things - deep-sea oil rigs, jackscrews, collateralized debt obligations - whose properties we understand in isolation. But in competitive, regulated societies, their connections proliferate, their interactions and interdependencies multiply, their complexities mushroom. This book explores complexity theory and systems thinking to understand better how complex systems drift into failure. It studies sensitive dependence on initial conditions, unruly technology, tipping points, diversity - and finds that failure emerges opportunistically, non-randomly, from the very webs of relationships that breed success and that are supposed to protect organizations from disaster. It develops a vocabulary that allows us to harness complexity and find new ways of managing drift.
Sidney Dekker
The second edition of a bestseller, Safety Differently: Human Factors for a New Era is a complete update of Ten Questions About Human Error: A New View of Human Factors and System Safety. Today, the unrelenting pace of technology change and growth of complexity calls for a different kind of safety thinking. Automation and new technologies have resulted in new roles, decisions, and vulnerabilities whilst practitioners are also faced with new levels of complexity, adaptation, and constraints. It is becoming increasingly apparent that conventional approaches to safety and human factors are not equipped to cope with these challenges and that a new era in safety is necessary.

In addition to new material covering changes in the field during the past decade, the book takes a new approach to discussing safety. The previous edition looked critically at the answers human factors would typically provide and compared/contrasted them with current research and insights at that time. This edition explains how to turn safety from a bureaucratic accountability back into an ethical responsibility for those who do our dangerous work, and how to embrace the human factor not as a problem to control, but as a solution to harness.

See What’s in the New Edition:

New approach reflects changes in the field
Updated coverage of system safety and technology changes
Latest human factors/ergonomics research applicable to safety

Organizations, companies, and industries are faced with new demands and pressures resulting from the dynamics and nature of the modern marketplace and from the development and introduction of new technologies. This new era calls for a different kind of safety thinking, a thinking that sees people as the source of diversity, insight, creativity, and wisdom about safety, not as the source of risk that undermines an otherwise safe system. It calls for a kind of thinking that is quicker to trust people and mistrust bureaucracy, and that is more committed to actually preventing harm than to looking good. This book takes a forward-looking and assertively progressive view that prepares you to resolve current safety issues in any field.

Sidney Dekker
Increased concern for patient safety has put the issue at the top of the agenda of practitioners, hospitals, and even governments. The risks to patients are many and diverse, and the complexity of the healthcare system that delivers their care is huge. Yet the discourse is often oversimplified and underdeveloped. Written from a scientific, human factors perspective, Patient Safety: A Human Factors Approach delineates a method that can enlighten and clarify this discourse as well as put us on a better path to correcting the issues.

People often think, understandably, that safety lies mainly in the hands through which care ultimately flows to the patient—those who are closest to the patient, whose decisions can mean the difference between life and death, between health and morbidity. On that view, the sharp end is where we should intervene to make things safer: tighten practice, focus attention, remind people to be careful, impose rules and guidelines. The human factors approach, by contrast, refuses to lay the responsibility for safety and risk solely at the feet of people at the sharp end. The book defines an approach that looks relentlessly for sources of safety and risk everywhere in the system—the designs of devices; the teamwork and coordination between different practitioners; their communication across hierarchical and gender boundaries; the cognitive processes of individuals; the organization that surrounds, constrains, and empowers them; the economic and human resources offered; the technology available; the political landscape; and even the culture of the place.

The breadth of the human factors approach is itself testimony to the realization that there are no easy answers or silver bullets for resolving the issues in patient safety. A user-friendly introduction to the approach, this book takes the complexity of health care seriously and doesn't oversimplify the problem. It demonstrates what the approach does do: offer the substance and guidance to consider the issues in all their nuance and complexity.

Sidney Dekker
Building on the success of the 2007 original, Dekker revises, enhances and expands his view of just culture for this second edition, additionally tackling the key issue of how justice is created inside organizations. The goal remains the same: to create an environment where learning and accountability are fairly and constructively balanced. The First Edition of Sidney Dekker's Just Culture brought accident accountability and criminalization to a broader audience. It made people question, perhaps for the first time, the nature of personal culpability when organizational accidents occur. Having raised this awareness, the author then discovered that while many organizations saw the fairness and value of creating a just culture, they really struggled when it came to developing it: What should they do? How should they and their managers respond to incidents, errors, and failures that happen on their watch? The new book is structured quite differently. Chapter One asks, 'what is the right thing to do?' - the basic moral question underpinning the issue. Ensuing chapters demonstrate how determining the 'right thing' really depends on one's viewpoint, and that there is not one 'true story' but several. This naturally leads into the key issue of how justice is established inside organizations and the practical efforts needed to sustain it. The following chapters place just culture and criminalization in a societal context. Finally, the author reflects upon why we tend to blame individual people for systemic failures when in fact we bear collective responsibility. The changes to the text allow the author to explain the core elements of a just culture which he delineated so successfully in the First Edition and to explain how his original ideas have evolved. Dekker also introduces new material on ethics and on caring.
Sidney Dekker
When faced with a 'human error' problem, you may be tempted to ask 'Why didn't these people watch out better?' Or, 'How can I get my people more engaged in safety?' You might think you can solve your safety problems by telling your people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure and demanding compliance. These are all expressions of 'The Bad Apple Theory', where you believe your system is basically safe if it were not for those few unreliable people in it. Building on its successful predecessors, the third edition of The Field Guide to Understanding 'Human Error' will help you understand a new way of dealing with a perceived 'human error' problem in your organization. It will help you trace how your organization juggles inherent trade-offs between safety and other pressures and expectations, suggesting that you are not the custodian of an already safe system. It will encourage you to start looking more closely at the performance that others may still call 'human error', allowing you to discover how your people create safety through practice, at all levels of your organization, mostly successfully, under the pressure of resource constraints and multiple conflicting goals. The Field Guide to Understanding 'Human Error' will help you understand how to move beyond 'human error'; how to understand accidents; how to do better investigations; how to understand and improve your safety work. You will be invited to think creatively and differently about the safety issues you and your organization face. In each, you will find possibilities for a new language, for different concepts, and for new leverage points to influence your own thinking and practice, as well as that of your colleagues and organization. If you are faced with a 'human error' problem, abandon the fallacy of a quick fix. Read this book.
David D. Woods
Human error is cited over and over as a cause of incidents and accidents. The result is a widespread perception of a 'human error problem', and solutions are thought to lie in changing the people or their role in the system. For example, we should reduce the human role with more automation, or regiment human behavior by stricter monitoring, rules or procedures. But in practice, things have proved not to be this simple. The label 'human error' is prejudicial and hides much more than it reveals about how a system functions or malfunctions. This book takes you behind the human error label. Divided into five parts, it begins by summarising the most significant research results. Part 2 explores how systems thinking has radically changed our understanding of how accidents occur. Part 3 explains the role of cognitive system factors - bringing knowledge to bear, changing mindset as situations and priorities change, and managing goal conflicts - in operating safely at the sharp end of systems. Part 4 studies how the clumsy use of computer technology can increase the potential for erroneous actions and assessments in many different fields of practice. And Part 5 tells how the hindsight bias always enters into attributions of error, so that what we label human error actually is the result of a social and psychological judgment process by stakeholders in the system in question to focus on only a facet of a set of interacting contributors. If you think you have a human error problem, recognize that the label itself is no explanation and no guide to countermeasures. The potential for constructive change, for progress on safety, lies behind the human error label.