
Checklists, Error reporting and the fallacy of explanation (Design for Error – Part 4)

While checklists can help in minimizing mistakes, should they be handled by one person or two? Is there such a thing as a badly designed checklist? And why don't more companies adopt such healthy practices when doing so doesn't seem particularly difficult?


Check, check


Checklists are effective tools for increasing the accuracy of actions and preventing errors. They come in handy especially in situations where the worker is interrupted midway through completing a process. Experts opine that working through a checklist should be a task for two people, with one reading out the steps and the other completing them. But this is not what one observes in industry. Many a time, an operator completes the task first and another then checks that the steps were done. Leniency can set in pretty early in this arrangement. The same applies when multiple people share the responsibility of going through checklists and confirming proper completion of tasks: each person may assume the rest have already checked, and that they need not go through the list again.


It is the airline industry that uses checklists in their correct form. Both pilots work in tandem while completing tasks, following a checklist together, and this is seen as a critical component of safety. It is surprising, however, that other industries are yet to adopt this practice even though the results are clearly visible. The medical industry in particular could use more checklists, especially during surgical procedures and while administering medication.


Following the general trend of this series, let's see how good checklists can be designed. One mark of a bad checklist is a rigidly fixed order of steps. If a step has to be skipped with the intention of returning to it later, there is a risk the return never happens. Electronic checklists manage this better: one can complete the sub-tasks in a non-linear order and still be alerted about any step that was missed.
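
To make the idea concrete, here is a minimal Python sketch of such an electronic checklist. The class and step names are purely illustrative, not taken from any real cockpit or hospital system: steps can be completed in any order, and a final confirmation flags anything that was skipped.

# Minimal sketch of a non-linear electronic checklist (illustrative names only).

class Checklist:
    def __init__(self, steps):
        # Preserve the recommended order, but do not enforce it.
        self.steps = list(steps)
        self.completed = set()

    def complete(self, step):
        if step not in self.steps:
            raise ValueError(f"unknown step: {step}")
        self.completed.add(step)

    def missing(self):
        # Steps may be done in any order; this reports whatever is still open.
        return [s for s in self.steps if s not in self.completed]

    def confirm_done(self):
        # Alert the user about skipped steps instead of silently passing.
        remaining = self.missing()
        if remaining:
            raise RuntimeError(f"steps still pending: {remaining}")
        return True

preflight = Checklist(["fuel check", "flaps set", "instruments check"])
preflight.complete("flaps set")      # steps may be completed out of order
preflight.complete("fuel check")
print(preflight.missing())           # -> ['instruments check']

The point of confirm_done is the alert: unlike a paper list read top to bottom, the skipped step cannot quietly disappear.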


“Hi, I would like to report an Error”


Reporting an error, once it has been diagnosed, might sound like an easy task in itself. But remember social and cultural pressures? They ensure that in most cases errors aren't reported, and sometimes aren't even actively sought out. This is especially true when a worker fears backlash from co-workers or, worse, contempt for pointing out imperfections in the workplace. Workplaces in general dread appearing imperfect should they admit to mistakes, and most institutions work tirelessly to maintain an image that suggests otherwise. The only real solution is to admit the presence of errors, work diligently to get to the bottom of each one, determine the root causes, correct them and prevent future errors.


As discussed earlier in this series, the Toyota Production System is a leader in quality management. 'Jidoka', loosely translated as "automation with a human touch", is a philosophy that encourages workers to detect errors, report them and, in some cases, halt the workflow until the error is rectified. The 'five whys' method is used to trace the cause of an error so it can be fixed. Another system, 'Poka-Yoke', ensures the correct placement of parts during assembly: special jigs and fixtures are employed so that parts can only be fitted the proper way. Even a simple block of wood can be enough to enforce the correct angle of placement.
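
Poka-yoke has a natural analogy in software, where a strict input type plays the role of the jig. The sketch below is purely illustrative and not from the book: a hypothetical insert_part function accepts only a valid Orientation, the way a block of wood admits a part only at the correct angle.

# A software analogy of poka-yoke: make the wrong placement inexpressible.
# All names here are hypothetical.

from enum import Enum

class Orientation(Enum):
    UP = "up"
    DOWN = "down"

def insert_part(orientation):
    # Like the jig, this refuses anything but a valid Orientation.
    if not isinstance(orientation, Orientation):
        raise TypeError("part does not fit the jig")
    print(f"part seated, facing {orientation.value}")

insert_part(Orientation.UP)     # fits
# insert_part("sideways")       # would raise: the 'jig' rejects it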


NASA is known as the pioneer in space exploration and related technology. In one of the previous blogs here on Mind Brunch, we discussed its work in the field of Biocapsules, those smart devices that can be placed under the skin to maintain specific stats in the body. But what might be completely surprising is that NASA also runs a voluntary error reporting service for the airline industry, the Aviation Safety Reporting System. This has proved to be just the kind of facility pilots needed: given the way this particular industry is organized, reporting to one's employer carries real risks. So, enter NASA, a neutral organization that promises anonymity.


Here's how it works. When a pilot notices or commits a mistake, he or she reports it. NASA uses the attached contact information to gather more details. Once a case is made, the concerned parties are notified and the details of the pilot who reported are erased from the system. In this manner, pilots can report semi-anonymously yet bravely, the entire industry benefits from the service, and many future accidents are prevented.
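
For the programmatically inclined, the flow described above can be sketched as a tiny Python model. This is only an illustration of the idea with hypothetical names, in no way NASA's actual implementation: contact details exist only while follow-up is possible, and are erased when the case closes.

# Illustrative model of the semi-anonymous reporting flow described above.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    details: str
    contact: Optional[str]   # kept only while follow-up is still possible

def file_report(details, contact):
    # The pilot reports; contact info is retained temporarily for follow-up.
    return Report(details=details, contact=contact)

def follow_up(report, extra_details):
    # The agency may use the contact info to gather more details.
    assert report.contact is not None, "identity already erased"
    report.details += " | " + extra_details

def close_case(report):
    # Once the case is made and the concerned parties notified,
    # the reporter's identity is erased from the system.
    report.contact = None

r = file_report("altitude deviation on approach", "pilot@example.com")
follow_up(r, "confusing chart layout near the waypoint")
close_case(r)
print(r.contact)   # -> None: the report survives, the identity does not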


Do you smell that Error?


So an error has been committed. But how do we know that there has been one? Action slips are relatively easy to detect, since they involve a step being missed or wrongly performed. However, detecting them requires a feedback loop that makes the slip visible; if the output of an action cannot be observed, slips go unnoticed. Memory slips are harder to detect since they produce no visible signs, except when the resulting inaction leads to unexpected outcomes.


Mistakes are even more resistant to detection, as they involve setting the wrong goal in the first place. All the actions that follow lead correctly toward the intended goal; it is the goal itself that is flawed, so even careful monitoring of every step fails to reveal anything wrong. Worse, early feedback from those actions may reinforce the misdiagnosis that the goal will ultimately be reached. Memory-based mistakes are the hardest to detect, because what is forgotten is the entire plan, or a considerable portion of it.


Let me explain that Error to you


While detecting mistakes can take a while, explaining their cause can be a tough task, as in many cases personal biases intrude in place of objective judgment. This might mean offering explanations that simply feel more familiar, or discounting issues thought to be too trivial to have caused the mistake.


Let's take an example. Imagine you are driving to a city. While on the highway, you unknowingly take a wrong turn. As you continue on the changed route, you pass billboards and other visual cues indicating that you may be heading somewhere other than your intended destination. Many people in this situation come up with some logic to explain the cues away: a billboard for a different city might just be a commercial advertisement rather than an indication of where you actually are; unfamiliar buildings and scenery can be dismissed with the thought that things have changed since the last time you were there. Has this happened to you? How far did you travel before realizing your mistake?


Hindsight is 20/20


Baruch Fischhoff, a psychologist, showed through a series of experiments how readily people make complete sense of events once the outcome is known; in hindsight, the chain of events leading to the outcome seems obvious. Yet when given only the events, without the outcome, participants mostly failed to predict what would happen. This reveals the bias investigators face while studying accidents. It is imperative that the investigative team place themselves in the exact situation before the event happened, in the midst of the ongoing action, and this is one of the reasons some reports take a long time to complete: that situation is very difficult to simulate.


Notes


  • This series is a summary of Chapter 5 (Human Error? No, Bad Design) from the book ‘The Design of Everyday Things’ by Don Norman

  • You can also listen to this insightful episode of NPR’s Hidden Brain podcast to understand how our minds work under stress – https://n.pr/2KtFwLB
