Wellcome / EPSRC Centre for Interventional and Surgical Sciences


Examples of the need for Human Factors

People have a conceptual model of how they expect a system to work, based on prior experience with similar systems. When a system does not behave as expected, users make mistakes or fall back on unreliable workarounds that the designers did not anticipate. Errors can also stem from poor design, such as a complex user interface or button iconography that does not clearly convey the intended action. The following examples are medical devices that reached market without sufficient human factors input. In some cases the devices had not been properly tested, resulting in the loss of life; in others, the impact the device would have on the surgical team was not discovered until it was used in practice.

Example of the infusion pump

In 2006 Denise Melanson, who was receiving chemotherapy, went to hospital for more medication for her infusion pump. She subsequently died of an overdose of fluorouracil, one of the drugs used to treat her tumour: the hourly delivery rate was miscalculated because the number of hours per day was left out of the calculation, so instead of receiving 1.2 mL per hour she received 28.8 mL per hour, 24 times the intended rate. She returned to the hospital four hours later with an empty medication bag, rather than the intended four days later, but there was no way to mitigate the lethal dose that had already been administered. A local news article reported that investigators attributed the fatality to an overdose of fluorouracil, poor design of the chemotherapy protocol, and the inability to rectify the situation once the lethal dose had been administered. However, a human factors investigation (pg 57-63) replicated the scenario with five nurses from the same hospital using the same pump: three entered incorrect data, all five were confused by the setup or selection of mL/hr, two were confused by the programming of the device, and three were confused by the placement of the decimal point. The investigation also discovered eight similar incidents prior to Denise Melanson's, but the lessons learned from them were difficult to find or unavailable, so they never had the global impact of this investigation. A human factors study should have been carried out before the device was approved or marketed, and the device and/or clinical protocol should have been designed to minimise the risk.
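The arithmetic behind the error can be sketched as follows. This is an illustrative reconstruction, not the pump's actual software; the total volume of 115.2 mL is an assumption, chosen to be consistent with the rates reported above (1.2 mL/hr over four 24-hour days).

```python
# Illustrative sketch of the rate calculation behind the incident described above.
# The total volume is assumed (1.2 mL/hr * 24 hr * 4 days); not actual pump code.

TOTAL_VOLUME_ML = 115.2
DAYS = 4
HOURS_PER_DAY = 24

def rate_ml_per_hour(total_ml, days, hours_per_day):
    """Correct rate: spread the total volume over every hour of the infusion."""
    return total_ml / (days * hours_per_day)

def rate_missing_hours(total_ml, days):
    """The erroneous calculation: dividing by days only, omitting hours per day."""
    return total_ml / days

print(round(rate_ml_per_hour(TOTAL_VOLUME_ML, DAYS, HOURS_PER_DAY), 1))  # 1.2
print(round(rate_missing_hours(TOTAL_VOLUME_ML, DAYS), 1))               # 28.8
```

Because the calculation omits the hours-per-day factor, the result is exactly 24 times too high, which is why a four-day infusion emptied in four hours.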

Example of Therac-25

The Therac-25 was a medical device used to destroy residual tumour tissue, after the bulk had been removed through manual surgery, by firing electrons or x-rays at a targeted location. The machine was computer controlled and remotely operated. For shallow growths a low-power mode ('e') was used; for deep growths a high-power mode ('x') was used, in which a metal plate in the beam path transformed the electrons into x-rays. The Therac-25 also automated more of the safety features that in previous models had been manually operated. In 1986, Ray Cox went in for one of several follow-up treatments. The operator accidentally set the machine to 'x', but immediately realised the error and changed the setting from 'x' to 'e'. Because of this rapid change in settings, the metal plate retracted, yet the machine remained in high-power mode. The operator, in another room, delivered a dose, but the computer responded with an error. Drawing on prior experience, the operator believed this meant the machine had not delivered the dose, so they delivered it again. The error message appeared a second time, and the operator delivered a third dose. At that point Ray Cox removed himself from the machine, having received three painful blasts. Because of untested software, the absence of human-based safety checks, and the lack of hardware interlocks, Ray Cox died four months later of severe radiation burns. This was just one of many cases in which fatal levels of radiation were delivered to patients over the machine's lifespan. Had a human factors study been carried out, the risk of an operator quickly switching between modes and putting the machine into an inconsistent state should have been discovered, preventing the accidental loss of lives.
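The hazard described above can be modelled in a few lines. This is a deliberately simplified sketch of the class of failure (a fast edit leaving beam power and plate position inconsistent), not the Therac-25's actual software, and the names and structure are invented for illustration.

```python
# Simplified model of the mode/plate mismatch described above.
# Illustrative only; not the Therac-25's actual control software.

class Machine:
    def __init__(self):
        self.power = "low"     # beam power actually configured in hardware
        self.plate_in = False  # metal plate that converts electrons to x-rays

    def select_mode(self, mode, fast_edit=False):
        """Select 'e' (electrons) or 'x' (x-rays).

        fast_edit=True models the race: the plate tracks the new mode,
        but the power setting from the earlier entry is never refreshed.
        """
        self.plate_in = (mode == "x")
        if not fast_edit:
            self.power = "high" if mode == "x" else "low"

    def fire(self, interlock=False):
        inconsistent = (self.power == "high") != self.plate_in
        if interlock and inconsistent:
            # An interlock would refuse to fire in an inconsistent state.
            return "blocked: power and plate position disagree"
        if self.power == "high" and not self.plate_in:
            return "unshielded high-power dose"  # the fatal outcome
        return "dose delivered as configured"

m = Machine()
m.select_mode("x")                  # operator enters x-ray mode...
m.select_mode("e", fast_edit=True)  # ...then quickly corrects it to 'e'
print(m.fire())                     # unshielded high-power dose
print(m.fire(interlock=True))       # blocked: power and plate position disagree
```

The point of the sketch is that the unsafe state is mechanically detectable: a simple consistency check between commanded power and plate position, of the kind earlier hardware-interlocked models enforced, refuses to fire.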

Example of da Vinci robot

The da Vinci robot is a minimally invasive surgical system that supports a range of surgical procedures. The complete system comprises three main parts: the patient cart, the surgeon console, and the vision cart. The patient cart carries the moving arms of the robot, which can have multiple joints in motion at any given time as the surgeon controls them from the surgeon console, which is tethered to the system. The surgeon console is fully immersive, giving the surgeon better vision and control of the robotic arms. However, this setup has unintentionally created barriers between the surgeon and the rest of the surgical team: the team is no longer co-located beside the patient, which has led to communication challenges as well as a change in the distribution of tasks. In one example, a surgeon gave instructions to the scrub nurse and grew frustrated that they appeared not to be carried out. The nurse had in fact performed the action, but saw no need to confirm it verbally, since before the robot's introduction no such confirmation was required; a gulf of frustration developed between the two. It has also been noted that, with the introduction of the da Vinci robot, scrub nurses are no longer aware of the actions surgeons are carrying out at the console. Those positioned by the robot therefore have to 'dance' with it to ensure they aren't accidentally struck by a swinging arm. Conducting an 'in the wild' evaluation could have highlighted the communication issues the surgical team now faces with the introduction of the robot. Discovering these issues early would have allowed the development team to mitigate or resolve them, producing an uninhibited workflow between team members.