Technology Glitches and Patient Health

Mundane Data Breaches

Mistakes usually occur after a confluence of seemingly small, quotidian errors. Often no one seems to own the problem; it's simply a "glitch". In our technological world, we're quite accustomed to glitches and large data integrity losses. We stick the newly issued credit card into our wallet before even knowing (or caring) about the details of why it was replaced.

Technology "glitches" are not to be trifled with, though: they shut down metropolitan train systems, admit ~32,000 students instead of ~16,000, and compromise the most private data of 31,000 people, 100,000 people, 4 million people... Yet they're treated as boring news.

In medicine, repercussions from computer glitches make train outages seem trivial. From August 2008 through February 2009, a computer glitch in the Veterans Affairs record system tied patients to the wrong medical records, leading to incorrect dosing, delays in treatment, and other errors. In another case, a computer glitch incorrectly cleared women of breast cancer after mammogram screenings had actually shown tumors.

Bodily Injury and Death

Imagine the most unimaginable "glitch" and it's probably already happened. In one famous 1980s case (PDF), cancer patients undergoing radiation treatments from the Therac-25, manufactured by Atomic Energy of Canada Limited (AECL), intermittently received radiation doses roughly 100 times the prescribed dose. The resulting radiation could burn through the torso, leaving burn marks on the victim's back. The radiation trauma killed some patients within weeks.

An investigation of the Therac-25's history showed how multiple errors begot fatal injuries. The high doses occurred when a technician first entered an "X", incorrectly selecting a certain dose in the high-beam photon mode; then "cursor up"; then "E", correctly selecting a certain dose in the electron mode; then "Enter", all within eight seconds. This accidental series of keystrokes activated the high beam instead of the low beam, but the high-beam diffuser wasn't in place, so intense radiation burned ears, breasts, groins, clavicles.
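The failure described above can be sketched as a check-then-act race: the machine latches its power level when setup begins, while edits made during the setup window update only what the operator sees. The following is a minimal, hypothetical model for illustration only, not AECL's actual code; the class, method names, and "high"/"low" labels are all assumptions.

```python
class BeamSetup:
    """Toy model of the flaw: power is latched when setup starts,
    but later edits still move the display and the diffuser."""

    def __init__(self):
        self.entered = None   # mode currently shown on the operator's screen
        self.power = None     # power level latched at the start of setup
        self.diffuser = None  # diffuser is positioned only for x-ray mode

    def enter_mode(self, mode):
        # Editing the entry updates the screen and turntable position...
        self.entered = mode
        self.diffuser = (mode == "xray")

    def begin_setup(self):
        # ...but the flaw: power is latched ONCE, when the ~8-second
        # setup begins, and is never re-checked against later edits.
        self.power = "high" if self.entered == "xray" else "low"

    def fire(self):
        return (self.power, self.diffuser)


machine = BeamSetup()
machine.enter_mode("xray")      # operator mistypes "X" (photon mode)
machine.begin_setup()           # setup latches the high power level
machine.enter_mode("electron")  # correction to "E" within the window
print(machine.fire())           # ('high', False): high beam, no diffuser
```

The hazard is the combination in the last line: the stale high-power setting fires while the diffuser, tracking the corrected entry, is out of position.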

When it happened to one patient, the sound system between the treatment room and the operator wasn't functioning. He had been treated multiple times before, so he knew something was wrong when, as he lay on the table for treatment, he suddenly heard unusually loud noises, saw a flash of light, and felt a searing burn. A pause. Then it happened again. The technician learned something was wrong only when the patient pulled himself off the treatment table and began banging on the locked door.

Because the burns happened infrequently, because the error messages were imprecise or oblique, and because technicians, engineers, and managers couldn't believe the Therac-25 was malfunctioning, operators continued to injure patients until 1987. In a letter to one hospital physicist, AECL explained that its machines couldn't be malfunctioning because of modifications that improved the "hazard rate" by "at least five orders of magnitude! With such an improvement in safety (10,000,000%) we did not believe there could have been any accelerator malfunction."

A glitch -- an "accelerator malfunction"? Or errors attributable to people's actions?

Errors Upon Errors

The persistence of medical physicists at several hospitals quickened Therac-25 problem solving, but clumsy safety processes, a reluctant manufacturer, and slow FDA action impeded resolution. In the final analysis, a long list of hardware, software, quality assurance, and process issues such as these contributed to the injuries and fatalities:

  • The hardware and software were built by two different companies and tested together only when the system was installed in the hospital.
  • The code wasn't independently reviewed.
  • Some engineering errors permitted overrides after malfunctions; others allowed safety checks to be bypassed.
  • The FDA hadn't thoroughly tested the Therac-25 (a medical device) because previous models had a reasonable safety record. But the Therac-25 had undergone numerous changes; for instance, manual control systems had been replaced by software-controlled systems.
  • The company recalled the machines at one point, but because the first patients didn't die, the FDA under-classified the severity of the problem. Yet an intense radiation beam to the head could deliver a more lethal dose than one to another body part, so later incidents were fatal.
  • The medical physicists and the FDA made recommendations to AECL. The company complied with some safety directives but ignored others.
  • Technicians misdiagnosed issues; in one case, for instance, a problem was wrongly attributed to a switch. The company replaced the switches. The problem recurred.
  • AECL wrongly told some institutions that reported incidents that theirs was the first report, so each hospital thought its cases were unique.

Elusive Intangible Injury

In the Therac-25 case, each contributor -- a software programmer, an engineer, a technician, someone in quality assurance, a safety officer, staff at the FDA, a company executive -- made a small mistake. Lawsuits, FDA investigations, out-of-court settlements, and eventually national media exposure brought the case attention. The entire compendium of errors in the Therac-25 case is so classic and dramatic that it's used as a case study. But what about computer glitches where less harm is done to fewer patients over a shorter period of time? Or what if so many are hurt -- millions, say -- that the plight of any one individual gets diffused? What if the evidence is unclear -- there are no burn marks on the front and back of the body?

Can injured patients be made whole? In the Therac-25 cases, lawyers for the families of patients with terminal cancer argued that the patients died sooner and suffered more because of their Therac-25 injuries.

What if doctors delay cancer treatment and the person dies an early death from breast cancer, as in the case we mentioned above? What can lawyers prove, and how can victims be compensated? In the case where Veterans Administration patients were matched with the wrong records, the VA denied that there were any negative outcomes. No harm reported, no harm done?

What about still "lesser" glitches, everyday database breaches?

Patients: Students of Misfortune?

The US HIPAA laws protect a person's medical data, file, or record from being accessed by an unauthorized person. Therefore someone couldn't enter your doctor's office, grab your paper record from the thousands stuffed floor to ceiling, and forward it on. Sometimes the law seemed overly strict. In the name of HIPAA, unmarried lifelong partners of hospitalized patients were forbidden from learning about their loved one's health.

Although HIPAA has provisions for electronic records, today's larger, more frequent mishaps leave this regime seeming quaint. Consider the recent data breach at Stanford, where the emergency room records of 20,000 patients were posted online. A New York Times article details how it happened. One billing contractor dealt with one marketing contractor, who interviewed one potential employee who leaked the data. The marketing contractor received the data from Stanford Hospital, "converted it to a new spreadsheet and then forwarded it" to a job applicant, challenging the applicant to

"convert the spreadsheet -- which included names, admission dates, diagnosis codes and billing charges -- into a bar graph and charts. Not knowing that she had been given real patient data, the applicant posted it as an attachment to a request for help on [the site], which allows students to solicit paid assistance with their work. First posted on Sept. 9, 2010, the spreadsheet remained on the site until a patient discovered it on Aug. 22 and notified Stanford."

Would any of these patients know if they were harmed? What if they had some condition that an insurance company, employer, teacher, or someone else would use to disqualify them, as in this Stanford case? Will the class action lawsuit that's been filed make them whole? What if someone recognized the value of such data and stole it, as in a recent Orlando, Florida case where hospital employees forwarded emergency room data for over 2,000 accident victims to lawyers? In the old days, hauling 20,000 patient files out of a doctor's office unobtrusively would be a challenge. Not so with electronic data: all you need is a glitch.

HIPAA specifies that each responsible individual can be fined up to $250,000. Will the job applicant who outsourced her Excel worksheet problem be made to pay $250,000? The marketing contractor? The billing contractor? Stanford?

Often the public has no idea about medical injuries resulting from glitches, physical or otherwise, just as it didn't with the Therac-25. If someone dies, as in the Therac-25 case, perhaps the news will get out. But the more common the incidents, the more data is lost, and the more benign they are made to seem, the more harm is done and the less we learn about any particular incident.

You can read all this as a depressingly negative outlook on technology and health, but my view is different. Injuries and deaths due to vague "glitches" can be prevented by fixing small but very tangible errors. The outsourcing of everything has increased the number of contractors, and with all these people come looser interpretations of rules and diffuse culpability. But it's not just contractors. Many employees are also very cavalier about data. Walk into or call any major medical center and you will see glaring errors. Fixing such errors, attention to detail, and, yes, support for regulatory oversight can reduce harm.
