March 2, 2019
We all recently learned that RaDonda Vaught, a nurse working at Vanderbilt University Medical Center, has been charged with reckless homicide in the death of a patient, Charlene Murphey. To many in the safety community, it is a setback. How could the State of Tennessee be prosecuting a nurse for what the safety community calls a medication error?
Given the dialogue, I thought it important to speak to this prosecution through the lens of a just culture. Within a just culture, should there be a call to criminally prosecute RaDonda Vaught for her role in the death of Charlene Murphey?
The pertinent facts of the event are these: A patient at Vanderbilt University Medical Center, Charlene Murphey, was about to have a positron emission tomography (PET) scan. Like many undergoing a scan, Charlene became anxious. This prompted her physician to order an intravenous (IV) sedative, VERSED (midazolam), to calm Charlene’s nerves. Charlene’s primary nurse was busy, so a float nurse was sent to administer the drug. That float nurse, RaDonda Vaught, was working in the neurological intensive care unit (neuro-ICU), training a new nurse, about to head to the emergency department (ED) to conduct a swallow study. While in the neuro-ICU, she was notified to administer the IV VERSED to Charlene Murphey.
RaDonda Vaught attempted to retrieve the VERSED through an automated medication dispensing cabinet in the neuro-ICU under Charlene Murphey’s patient profile. Not finding VERSED on the profile (it was there but under its generic name), RaDonda Vaught overrode the patient’s profile to search the dispensing cabinet for the VERSED. In the override search, she typed in the first two letters “VE” and clicked on the first drug that appeared. That drug was not VERSED, but rather, VECURONIUM, a paralytic drug that would take Charlene Murphey’s life. These essential facts don’t seem to be in dispute. Vaught overrode the automated dispensing software to get at the VERSED, but unintentionally pulled the VECURONIUM (see the Institute for Safe Medication Practices’ (ISMP) discussion of just how frequently providers conduct overrides to obtain medications for administration). She did not confirm the drug at the time of removing it from the cabinet, she did not confirm the drug at bedside, and she did not stay with the patient to monitor her reaction to the drug. Further, she apparently did not recognize the warning sign provided by the physical difference between the two drugs: VECURONIUM is available in powder form (requiring reconstitution), unlike VERSED, which is only available in liquid form.
Now, before proceeding to a discussion of the criminal prosecution of RaDonda Vaught, we should all agree there were considerable system factors leading to Charlene Murphey’s death. RaDonda Vaught apparently worked in a hospital that required neither bedside barcode scanning of drugs at the PET scanner, nor a second set of eyes on drugs obtained via override. Additionally, she worked at a hospital that had likely failed in its oversight of critical safety behaviors. The system and management deficiencies were recognized, at least in part, by the Centers for Medicare and Medicaid Services (CMS) which put the hospital in jeopardy of losing CMS funding. Understanding that there were considerable system and managerial issues, we need not expend energy entertaining the typical post-event, faultfinding, employee v. system debate.
Instead, let’s focus on the justice of RaDonda Vaught’s criminal prosecution. Let’s thoughtfully and respectfully look at the good people of Tennessee who, through their criminal law, have sought to influence the conduct of their fellow citizens (and visitors) by creating expectations around the risks that citizens (and visitors) impose upon one another. Tennessee has numerous laws around the big “common law” crimes—murder, rape, arson, etc. None are applicable here. We’ll focus more sharply on three provisions of the Tennessee criminal code related to unintended outcomes: two related to recklessness, the third related to negligence.
The first law is Tennessee’s codification of the natural law principle of not putting fellow humans at imminent danger of serious bodily injury or death. It’s called reckless endangerment, and in Tennessee it is a misdemeanor, carrying with it a fine and prison sentence not greater than 11 months, 29 days. Reckless endangerment is a common provision across criminal codes, with modern society having moved beyond the frontier days when drunken cowboys could leave the saloon with guns firing into the air, merely for their own amusement. The second provision is similar, except that in this case, the reckless conduct causes death. Tennessee calls this reckless homicide, a felony carrying the penalty of a hefty fine and two to twelve years in prison. In a classic display of severity bias, the people of Tennessee say that if we are unlucky enough in our reckless conduct to actually kill someone, the maximum penalty will be enhanced from serving up to one year in prison to serving up to twelve years in prison. The third and final law is called criminally negligent homicide, a felony carrying a penalty of one to six years in prison. Negligence is a lesser crime than recklessness, as we’ll explore in a bit.
Through these laws, the people of Tennessee have put themselves (and their visitors) on notice that conduct endangering others will come with the threat of fines and prison sentences. Specifically, these laws were created “to prevent conduct that unjustifiably and inexcusably causes or threatens harm to individual, property, or public interest.” (Tennessee Code Title 39. Criminal Offenses § 39-11-101). Note that even the criminal law is not about retribution, retaliation, or vengeance. Those objectives are long gone from mature systems of law (at least in theory). In the context of the prosecution of RaDonda Vaught, Tennessee criminal prosecutors are now safety professionals, and their sword, the criminal law, is about ensuring the safety of the public by deterring unsafe acts. As do other states, Tennessee considers these laws to be an integral part of a modern, highly-reliable society. On the surface, these laws seem a logical approach to a safe and orderly society.
Now, what do the people of Tennessee mean by the terms recklessness and negligence? Recklessness, simply, is the conscious disregard of a substantial and unjustifiable risk. It is seeing, in the conscious part of our brain, a substantial and unjustifiable risk, and with that knowledge, choosing to follow through with our conduct. Negligence, in contrast, is the failure to see a substantial and unjustifiable risk that we should have seen. In recklessness, we ignore the risk we see; in negligence, we don’t see the risk that we should have seen.
The people of Tennessee are prosecuting RaDonda Vaught under the law of reckless homicide, deciding, based upon the evidence presented to them, that she had to have seen and disregarded a substantial and unjustifiable risk of harm to her patient. And because her alleged reckless conduct resulted in death, reckless homicide was the most appropriate charge.
How does this map into the Just Culture model? First, in our Outcome Engenuity Just Culture model, we focus more on the quality of one’s choices, less on the triumph or tragedy that those choices produce. Unlike Tennessee law (and that of many other states), we do not advocate for harsher penalties when the injury is greater, because doing so simply rewards the lucky and punishes the unlucky. Additionally, justice systems that hinge penalties on harm cause us, the people, to believe that our risky choices are somehow validated by the absence of harm.
Second, our model identifies five levels of intention toward harm. These are, in descending order of culpability: (1) purpose, (2) knowledge, (3) reckless, (4) at-risk, and (5) human error.
Purpose is the express goal of causing harm, while knowledge is knowing that harm is going to occur (think of posting compromising photos of your hated co-worker as a purpose to cause harm, whereas theft of your nifty work computer is knowingly causing harm). While purpose and knowledge are very similar, we split the two because knowingly causing harm is sometimes justified (breaking into a car to save an overheating baby), whereas a purpose to cause harm is never justified. These two levels of intent, purpose and knowledge, are there in the Tennessee criminal law; there’s just no reason to believe they are relevant to the case at hand.
Recklessness, again, is the conscious disregard of a substantial and unjustifiable risk (think drunk driving). In other just culture models, many crafted by non-lawyers, you will see the words reckless and negligence used interchangeably for intentional risk-taking. In fact, they are two very different terms in the law, with negligence more tied to simple human error than any kind of willful risk-taking behavior. In our Just Culture model, we abandon the term negligence because notions of negligence on the street are far more culpable than how the law defines it. In our model, we split negligence into two types of behavior: human error and at-risk behavior. At-risk behavior is a choice, but one where the risk is not seen, or is mistakenly believed to be justified (think of choosing to drive 9 mph over the speed limit). Human error is the unintended behavior, the slip, lapse, or mistake (think of the stop sign you did not see). In short, and much simplified, under the Just Culture model we propose the following actions: accept/console the human error, coach the at-risk behavior, and leave sanction/punishment for recklessness, knowledge, and purpose to cause harm.
Given these levels of intention toward harm, let’s parse RaDonda Vaught’s conduct. First, we need to separate the outcome and the error from the choices that RaDonda Vaught made. The outcome was Vaught giving Charlene Murphey the wrong drug; Murphey’s death is the tragic consequence. This outcome was clearly unintended by RaDonda and Vanderbilt as a whole. Further, we see only one human error within this scenario. That occurred when RaDonda Vaught overrode the dispensing cabinet, typed in “VE”, and selected the wrong drug to administer – VECURONIUM instead of VERSED. She made a mistake in believing the first drug on the screen was the drug that she had intended. For the outcome and the error, the Just Culture model proposes that we support and console the nurse.
For many in the safety community, this would be the end of the analysis. Unintended actions led to an unintended outcome: mourn the death, console the nurse, and fix the system. These three actions lead the safety community to wonder why RaDonda Vaught is being prosecuted for a medication error. How can the State of Tennessee prosecute a nurse for a mistake and outcome she did not intend? This is a flawed understanding of reckless homicide. RaDonda Vaught is not being prosecuted because she made a mistake. She is being prosecuted because the people of Tennessee believe they see a link between a nurse’s choices and a dead patient. RaDonda Vaught is facing prosecution for being what the prosecutor and grand jury see as the healthcare equivalent of the drunk driver who runs a red light, killing a helpless pedestrian in the crosswalk.
Thus, it is RaDonda Vaught’s choices that we need to evaluate: her choice to obtain the medication via override, her choice not to confirm the drug at the dispensing cabinet and at the point of administration, and her choice not to monitor the patient after administration of the drug. It is worth pointing out that some of Vaught’s sequential failures here might have been human error. Perhaps Vaught’s tunnel vision on the powdered form of the drug distracted her from her ordinary confirmation when she took the drug out of the dispensing cabinet. Sometimes events are indeed the rare sequence of statistically unlikely errors all happening at the same time. That said, it’s much more likely that one or two errors connected with a combination of latent behavioral norms. We’ll proceed in our analysis, in parallel to the prosecution, with the idea that the override, failed confirmations, and failure to remain with the patient were all choices Vaught made along the way.
To understand how Vaught’s choices fit within our five levels of intention, we have to explore in greater depth the difference between reckless and at-risk behavior. At-risk behavior is the risky choice we make, but with no conscious recognition of the unacceptability of the risk. At-risk behavior, usually in the form of a decision to deviate from a standard or rule, generally makes good sense to us at the time. Consider, as instructive, the scenario of driving down the freeway, going with the flow, at our standard 9 mph over the speed limit. We change lanes, but don’t signal our lane change, all while we’re deeply engrossed in the audio edition of our favorite John Grisham novel. At no time on our journey are we thinking we are taking risks with the lives of those around us. Our risk monitor, that little voice on our shoulder (see Dave’s Subs for further discussion of the risk monitor), is not knocking on the door of our conscious thoughts, screaming at us, letting us know that we’re endangering the lives of those around us. Our risk monitor is silent; our conscious thought is blissfully focused on our book. Yet, we are knowingly violating rules, from the speed limit to the requirement to signal lane changes. Suddenly, a Corvette comes weaving through traffic, 40 mph faster than all of us going 9 mph over the speed limit. Our risk monitor immediately fires, judgment brews: there, in front of us, pulling away, is a reckless driver. We’re both violating the rules of the road, yet we qualitatively see the Corvette driver as different from us. We, speeding, not signaling lane changes, listening to John Grisham, see ourselves as safe, but that Corvette driver is unacceptably dangerous. We see him as reckless.
Can we judge RaDonda Vaught’s choices through the same lens? Were RaDonda Vaught’s choices more similar to our driving 9 mph over the speed limit, or to the Corvette driver weaving between lanes, going 40 mph faster than the rest of us?
Was RaDonda Vaught just moving through a normal day? She was training a new nurse, on her way to the ED, helping out an anxious patient having a PET scan—all relatively routine activities for a float nurse. Why would she, in direct view of a nurse she was training, engage in choices she thought to be reckless? Could it be that, throughout this course of events, her risk monitor never fired? Could her decisions and practices on the day she cared for Charlene Murphey be consistent with what she did on many other days? Could it be her conscious brain never recognized the significant and unjustifiable risk she was taking? The answer here is yes, if you are open to recognizing our natural propensity to drift into at-risk behaviors.
I have often said that the Institute of Medicine got the title wrong in its seminal report “To Err is Human.” The title should have been, “To Drift is Human.” Our focus should not be on RaDonda Vaught’s error, but on her apparent drift—obtaining the medication via override, skipping a number of drug confirmations that could have made her error visible, and not remaining with the patient to monitor the drug’s effects after it was administered.
Granted, it’s easy in hindsight, with a bit of self-righteous indignation, to see RaDonda Vaught’s choices as other than at-risk. “Well-meaning humans cannot in good faith choose to violate rules,” we might say to ourselves. “It would gut the whole concept of a rule.” Yet, that is the collective “us” on the road where “we” kill 40,000 people per year. If we magically removed drivers who knowingly violate the rules, the roads would be nearly empty. The same is true in the typical hospital. If we criminally prosecuted every healthcare provider who has knowingly deviated from a safety protocol, a very large portion of our providers would be in jail rather than providing care. Now, before you skewer me for this assertion, consider a choice to skip two-patient identifiers and a choice to skip hand hygiene entering a patient’s room. Are these not overrides of basic safety protocols? The occasional decision to deviate from these two safety protocols alone would wipe out the majority of a typical hospital’s clinical staff. Now, imagine every nurse who ever overrode a patient’s medication profile being charged with reckless endangerment. Imagine every provider who chose not to wash her hands going into a patient’s room being charged with reckless endangerment. Imagine an empty hospital—and a very full prison.
The propensity to drift is part of our human nature. In our Just Culture model we call it at-risk behavior. Along with system design, we believe it should be the primary focus of a hospital’s patient safety program. The inescapable human error is less the issue; at-risk behavior is where our focus should be. That said, it requires some intellectual honesty about our propensity to drift. Honesty that the Tennessee jury, and our healthcare system, might be unwilling to entertain.
Twenty years ago, I had the opportunity, along with a few airlines, to speak with the Federal Aviation Administration’s (FAA) Office of Chief Counsel. We were there to talk about intellectual honesty within the FAA’s voluntary disclosure program. In the program, airlines are incentivized to self-report safety violations. In return, the FAA foregoes imposing any fine on the airline. One provision of the voluntary disclosure program requires that airlines report that the violation of a Federal Aviation Regulation (FAR) was “inadvertent.” Airlines were not allowed to report into the program if one of their employees chose to violate the rules. Seems reasonable. But here is where a seemingly reasonable requirement bumps into intellectual honesty. Our work with US airlines showed that roughly half of all maintenance mistakes involved a choice to violate the FARs. Those violations are the shortcuts that make the system work. It’s a common joke in aviation that if a labor union wants to shut down an airline during contract negotiations, it simply asks its members to work to the rules.
So here we were, in the interest of safety, confessing to the FAA’s Office of Chief Counsel that knowing violations of Federal Aviation Regulations were routine business in aviation, and that if we wanted to further improve safety, we had to stop lying about what was really happening. This led to a critical moment in the discussion. One astute attorney pointed out that it’s one thing to say to the public that pilots and technicians make mistakes, it’s wholly another to confess to the public that professional aviators choose to violate the rules. Sure, we’ll all do 9 mph over the speed limit on the freeway, but we cannot tell the public that commercial airline pilots and technicians do the same. To this day, the FAA still only accepts voluntary disclosures where the violation of the regulations is reported as inadvertent. (FAA Advisory Circular 00-58B, Voluntary Disclosure Reporting Program, 4/29/09.)
Our model of Just Culture puts considerable focus on our propensity to drift. Improving system design and managing at-risk behavior are the center of our work to build highly-reliable outcomes. That said, most of today’s purported “just culture models” reject the idea of at-risk behavior. It’s just too messy. Rather, it’s human error or reckless behavior. It’s “violators” versus well-meaning human beings who made a mistake. It’s good versus evil. There is no middle ground. Many who follow the work of Sidney Dekker or use James Reason’s unsafe acts algorithm have adopted this more rule-based approach. If it’s an inadvertent violation, it’s for the safety professional; if it’s a choice to violate, off to the human resources office (or prosecutor) you go.
The distinction between at-risk and reckless is at the heart of RaDonda Vaught’s fate. There is no reckless homicide if RaDonda Vaught’s choices were at-risk. RaDonda was focused on training another nurse and getting to the ED. Her immediate mission was to administer a drug to a patient undergoing a PET scan. Did she, by conscious choice (or developed habit), deviate from safety standards? Yes. In doing so, did her risk monitor fire? Did it creep into her conscious thought that she was taking a substantial and unjustifiable risk with her patient? Was she reckless? Likely not, if we are intellectually honest about how we operate as human beings.
That said, the Tennessee jury might go the way of the FAA’s Office of Chief Counsel, and the way of those just culture models that split human failure into only inadvertent errors and reckless violations. The prosecutor may easily convince the jury that Vaught’s practice deviations were choices rather than errors. “Ms. Vaught, did you choose to obtain the medication via override? Did you choose not to confirm the drug? Did you choose to walk away from the patient after administering the drug?” It will be easy to show the jury that these were not unintentional errors, but conscious choices. And with an overly simplistic and idealistic model of human behavior, reckless remains the only other option.
Adding to Vaught’s troubles, the hospital, to protect itself, may not be truthful about how often RaDonda Vaught’s choices were typical of other nurses at Vanderbilt. Likewise, peer nurses, to avoid self-incrimination, might testify that they always follow the rules, from the drug confirmations to remaining with the patient after drugs have been administered. Does Vaught, then, have a chance of acquittal when the entire system finds life easier to live in a state of denial? At-risk behavior is the dirty laundry, for Vaught’s fellow nurses, for the hospital, and for us as a society. And if there’s a place not to air our dirty laundry, a criminal court is probably near the top of the list.
It would be up to a jury of RaDonda Vaught’s peers, or in this case the criminal jury, to determine if she was reckless. Recklessness is a reflection of culture, not a proclamation from the king. I cannot proclaim that Vaught’s actions were at-risk because I do not live in her local culture, nor do I understand what risks she and her fellow nurses see in their local environment. Given that, let’s say the jury of her peers finds no recklessness, but instead finds one human error and four at-risk behaviors. If so, was RaDonda Vaught’s conduct still criminal? The answer is yes, unfortunately, according to Tennessee law. Had the prosecutor indicted RaDonda Vaught under the negligent homicide law, it would have only required a showing that RaDonda Vaught should have seen the substantial and unjustifiable risk she was taking. If this were the case, I think criminal negligence would be easy to prove. Did Vaught make a mistake? Yes. Should Vaught have known she was carrying the wrong drug in her hand? Yes. Like every other state, in the interest of safety, the good people of Tennessee have made both human error and at-risk behavior a crime.
We’ve been working for over 25 years to help create a more just culture; to reconsider the wisdom of punishing human error and at-risk behavior as legitimate public and human resource policy. We’ve also been working hard to end our collective reliance on “no harm, no foul” as a meaningful risk management strategy. Justice is about the quality of choices we make in life – you, me, and RaDonda Vaught. In our model of Just Culture, RaDonda Vaught would have likely been consoled around an error and coached around a series of at-risk behaviors. It is unlikely, given the facts that we know today, that RaDonda Vaught did anything reckless. More importantly, this event should have been prevented in the first place. In a Just Culture, we anticipate and design for the five behaviors (purpose, knowledge, reckless, at-risk, and human error), knowing that the latter two (human error and at-risk behavior) are particularly pervasive. When the severity of the potential consequence warrants it, managers have an obligation to design systems in anticipation of both our inescapable fallibility and our propensity to drift. They also have an obligation to monitor where drift is likely to occur and to coach and mentor their staff to safer choices. A manager who turns a blind eye to faulty systems and/or the risky behavioral norms of employees would simply not be a part of a Just Culture.
If, like many, you view Just Culture as a reactive process to be used only after harm has occurred, you have missed the point of a Just Culture entirely. In our model of Just Culture, 99.9% of the Just Culture work happens ahead of harm. And in the rare cases where harm does occur, we do exactly what we would have done if harm had not occurred. Just Culture is, first and foremost, about the prevention of the harm. It is about the managerial responsibility to design good systems, and help employees make good choices, knowing they are inescapably fallible creatures with a very strong propensity to drift. Just Culture is also about the relationships between employees on the floor – creating the environment and building the skills to coach each other, living out the idea that we, too, can be our brothers’ and sisters’ keepers.
Unfortunately, as a species, we have a tendency to turn a blind eye to both risky systems and risky choices. We think that everyone is safe as long as bad outcomes don’t occur. And then when a bad system and/or an unlucky nurse’s choices do cause harm, we rise to the occasion with termination of employment and a criminal prosecution. It appears that in the death of Charlene Murphey and the prosecution of RaDonda Vaught, we closely fit this pattern.