The story of how James Reason became an authority on the psychology of human error begins with a teapot.
It was the early 1970s. He was a professor at the University of Leicester in England, studying motion sickness, a pursuit that involved spinning his subjects around until, sometimes, they brought up what they had eaten for breakfast.
One afternoon, he was making a pot of tea in his kitchen when his cat, a brown Burmese named Rusky, came in demanding food. “I opened a can of cat food,” he later recalled. Instead of tea leaves, he spooned the cat food into the teapot.
After swearing at Rusky, Professor Reason berated himself: How could you do something so stupid?
That question proved more intellectually attractive than making people dizzy, so he abandoned motion sickness to study why humans make mistakes, especially in high-risk environments.
By analyzing hundreds of accidents in aviation, rail travel, medicine and nuclear power, Professor Reason concluded that human error is usually a by-product of circumstances.
That insight led him to the Swiss cheese model of failure, a metaphor for analyzing and preventing accidents in which multiple weaknesses in safety defenses (the holes in the cheese) line up to create a recipe for tragedy.
“Some scholars play an instrumental role in establishing an entire field of study: in psychology, Sigmund Freud; in linguistics, Noam Chomsky,” Robert L. Sumwalt, a former chairman of the National Transportation Safety Board, wrote in a 2018 blog post. “In the field of safety, Dr. James Reason has played such a role.”
Professor Reason died on February 4 in Slough, a town about 20 miles west of London. He was 86.
His death, in a hospital, was caused by pneumonia, his family said.
Professor Reason was a gifted storyteller who found vivid, resourceful ways to explain complex ideas. Consulting at conferences, on television news programs and with government safety officials around the world, he sometimes deployed slices of cheese as props.
In one instructional video, he sat at a dining room table that appeared set for a romantic dinner, with a bottle of wine, two glasses and a cutting board layered with cheese.
“In an ideal world, each defense would look like this,” he said, holding up a slice of cheese with no holes. “It's solid and intact.”
He then reached for another slice. “But in reality, each defense is more like this,” he said. “It has holes in it.”
The metaphor was easy to digest.
“Every defense has holes in it,” Professor Reason continued. “And sometimes the holes line up, creating a trajectory of accident opportunity.”
To explain how the holes occur, he placed them in two categories: active failures, or mistakes made by individuals (grabbing the cat food instead of the tea leaves, say), and latent conditions, or flaws built into procedures, written instructions or the design of a system (storing two easily scooped substances in the same cabinet, for example).
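For readers inclined to see the model mechanically, its arithmetic can be sketched in a few lines of code. What follows is a hypothetical illustration only, not anything drawn from Professor Reason's own work: the layer names and failure probabilities are invented, each defense is treated as an independent chance of having a hole “open” at a given moment, and an accident occurs only when every hole lines up.

import random

# Invented failure probabilities: the chance that each defensive
# layer has a hole "open" at any given moment.
LAYERS = {
    "training": 0.05,
    "procedures": 0.10,
    "supervision": 0.08,
    "equipment": 0.02,
}

def holes_line_up(layers):
    # An accident trajectory exists only if every layer fails at once.
    return all(random.random() < p for p in layers.values())

def accident_rate(layers, trials=1_000_000):
    # Estimate how often the holes align across many simulated moments.
    return sum(holes_line_up(layers) for _ in range(trials)) / trials

print(f"Independent layers: ~{accident_rate(LAYERS):.6f}")

# A latent condition (chronic understaffing, say) widens several
# holes at once, making alignments far more likely.
degraded = {name: min(1.0, p * 5) for name, p in LAYERS.items()}
print(f"Shared latent condition: ~{accident_rate(degraded):.6f}")

Multiplying the four small probabilities (0.05 × 0.10 × 0.08 × 0.02) gives about eight alignments per million moments; widening every hole fivefold raises that to roughly one in 200, which is the model's point about latent conditions.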
“Nearly every organizational accident involves a complex interaction between these two sets of factors,” he wrote in his autobiography, A Life in Error: From Little Slips to Big Disasters (2013).
In the Chernobyl nuclear disaster, he identified latent conditions that had existed for years: poor organizational management and inadequate training, procedures and supervision for frontline operators. Those conditions turned the operators' mistake of switching off several safety systems at once into a catastrophic explosion.
“Rather than being the main instigators of an accident, operators tend to be the inheritors of system defects,” he wrote in Human Error (1990). “Their part is usually that of adding the final garnish to a lethal brew.”
Professor Reason's model is widely used in healthcare.
“When I was in medical school, an error meant you had screwed up, and you were simply supposed to try harder not to screw up,” Robert Wachter, the chairman of the department of medicine at the University of California, San Francisco, said in an interview. “And if it was really bad, you'd probably get sued.”
In 1998, a doctor whom Dr. Wachter had recently hired for a fellowship said he wanted to specialize in patient safety strategies. His hospital, like most others, had no formal systems or methods for analyzing and preventing errors, but there was plenty of blame, most of it aimed at doctors and nurses.
The doctor had trained at Harvard Medical School, which had incorporated Professor Reason's ideas into its patient safety program. Dr. Wachter began reading Professor Reason's journal articles and books. The Swiss cheese model, he said, was an “epiphany,” “like putting on a new pair of glasses.”
He realized that a patient given the wrong dose of a drug might be the victim of a poorly designed syringe rather than of an inattentive nurse. Another patient might have died of cardiac arrest because the defibrillator that was normally stored in a nearby hallway had been moved to another floor and was malfunctioning.
“When an error occurs, our instinct is to look no further than the person at the end,” Dr. Wachter said.
But when you do look deeper, he added, “these layers of protection are quite porous, in ways that we couldn't understand until we looked.”
James Tootell was born on May 1, 1938, in Garston, a village in Hertfordshire, northwest of London. His father, Stanley Tootell, died in 1940, during World War II. His mother, Hilda (Reason) Tootell, died when he was a teenager.
His grandfather, Thomas Augustus Reason, raised James, who took his surname.
In 1962, he graduated from the University of Manchester with a degree in psychology. He received his Ph.D. from the University of Leicester in 1967 and taught and conducted research there before joining the faculty of the University of Manchester in 1977.
In 1964, he married Rhea Jali, an educational psychologist. She survives him, along with their daughters, Paula Reason and Helen Moss, and three grandchildren.
Throughout his career, Professor Reason's surname was a reliable source of levity.
“The word ‘reason’ is, of course, widely used in the English language, but it does not describe what Jim rightly is famous for, namely ‘error,’” Erik Hollnagel, an editor of an international journal, wrote in the introduction to Professor Reason's autobiography. “Indeed, ‘error’ is almost the opposite of ‘reason.’”
Still, the name fit.
“Jim has certainly brought reason to the study of error,” he wrote.