Mistakes often arise from vague thinking triggered by apparent similarities between different situations. We may believe our memory of a long-past event is accurate, but it frequently is not. Acting on such a flawed background, we misstep when judging a task or during the course of action.
Jens Rasmussen, a Danish engineer, classified behavior into three modes: skill-based, rule-based, and knowledge-based.
Skill-based behaviors are those performed with practiced expertise; an error in this mode is a slip. Rule-based behaviors are routines that follow a learned pattern or sequence of steps to complete a task. Errors here are mostly slips, such as unknowingly skipping a step. There is, however, the possibility of a wrong diagnosis, in which case the wrong rule is applied and a mistake occurs. Knowledge-based behaviors come into play in unfamiliar situations: planning a new task, testing, or responding to unexpected feedback all involve reasoning and problem solving.
Rule-based mistakes can lead to large-scale disasters, as when genuine danger is misjudged as harmless and necessary action is withheld. In industry, mistakes can occur even when the correct rule is chosen: rules formulated for ideal conditions can prove ineffective against messy real-world problems. Misinterpretation of rules can also produce unforeseen mistakes. As discussed earlier, vague similarities with past events may trigger the selection and application of a particular rule, even though the situation at hand is actually different. Such matching of events is biased, influenced by how recently the past event occurred, how frequently such events happen, and how distinct the current event is from the ones it resembles.
Rule-based mistakes also occur when a deluge of information hits the person who has to make important decisions. In other cases, minor issues compound over time, converging into an inescapable tsunami that seems to strike out of nowhere. As an example, consider a home where some light bulbs are loosely fitted and need a jiggle to switch on, a faucet leaks, and various small chores are postponed because none of them seems to need immediate attention.
In a manufacturing unit or another critical installation such as a power plant, a small deviation from normal does not warrant immediate attention. Workers allow themselves a certain range within which to operate comfortably; otherwise, they could not complete their everyday work. But when disaster is impending, small deviations can add up and confront the operators with a difficult puzzle, and under such enormous stress they can err. Such events are rare, but when they do occur, everyone is caught unaware.
Investigations might reveal that a very simple action on the operator's part could have made the difference. Yet placing ourselves in that person's shoes, in that situation, helps us empathize with those deemed guilty of failing to prevent a disaster. The lasting remedy is design change: presenting data in a form that is easy to interpret, so that such events can be mitigated in the future.
Knowledge-based behaviors come into play on untested ground. Skills and rules are behavioral in nature, making them largely subconscious and automatic; knowledge-based actions are slower and consciously undertaken. Good manuals can help us wade through uncharted territory on most days. An intelligent computing assistant, however, promises to offload high-level problem solving and aid us in making better decisions.
An interruption in the early stages of a task can make us forget its final goal. We may resume later, but working toward the wrong goal. This is the typical premise of a memory-lapse mistake. As with memory slips, the remedy is to display and communicate the relevant information so the user can easily recognize where they are.
An often overlooked contributor to design and workplace errors is social pressure. Although it might seem counterintuitive, even in design, social pressure must be understood before human error can be successfully addressed. Teams are sometimes constituted to overcome individual bias. However, if the diagnosis of the issue itself is flawed, then every step taken thereafter leads to a wrong conclusion. Taking breaks between discussions and introducing third parties into the analysis have both been observed to help greatly. The former is especially effective, as short breaks refresh a person and let them refocus with new vigor.
Commercial workplaces carry expectations that must be met. Many installations, such as power plants, have to be kept running at all times; even a brief shutdown has huge consequences. For a worker, then, priorities are arranged so that small deviations matter less than keeping the work flowing continuously. In other instances, seniority plays a critical role in the path that is followed. Before many well-known accidents, the junior person, though correct, eventually had to follow the flawed directions of the senior. This holds especially true in several airplane accidents, both on the ground and mid-flight.
An innovative solution is to incentivize safety above everything else. A worker who halts an operation to attend to a safety issue should be rewarded, and a culture should be built around such acts, in which it is not always necessary to complete a task. A notable example comes from scuba diving. Weights are normally used to help divers overcome buoyancy. Yet divers were often found holding on to their weights even when doing so obstructed their return to the surface. One reason is the high cost of the weights; another is a perceived social pressure not to appear physically weak. Faced with alarming safety statistics, one instructor decided to replace the weights of any diver who let go of them in order to reach the surface safe and secure. In summary, overcoming social and cultural pressure is among the hardest parts of preventing error, since it involves changes not just in design but in how a company itself operates.
This series is a summary of Chapter 5 ("Human Error? No, It's Bad Design") of the book 'The Design of Everyday Things' by Don Norman.
You can also listen to this insightful episode of NPR's Hidden Brain podcast to understand how our minds work under stress – https://n.pr/2KtFwLB