Music to my ears

Originally posted 6-9-2010:
One of the main themes throughout the NPSF Conference was culture change.  Of all the aspects of patient safety, solutions involving culture change will take the most time and energy to achieve.  Many session topics at the conference focused on healthcare organization, teamwork, driving change, and training and education. The fun and refreshing way the conference opened really caught my attention.

The opening session was titled The Music Paradigm, with Conductor Roger Nierenberg.  Walking into the hall, I noticed the seats for the audience were scattered among the different sections of a full orchestra.  The room was full of musicians of all varieties: brass, strings, percussion...even a harp.  My first thought as I sat down was, what does music have to do with patient safety?

Conductor Nierenberg uses music and the dynamic nature of an orchestra as a metaphor for well-run organizations.  For the first five minutes, the orchestra played absolutely beautiful music.  Then he got the audience thinking about how an entire orchestra, with so many different people playing different instruments, can work together so well.  That is when it hits you: there is a parallel between how an orchestra runs and how a hospital is run. The conductor of the hospital could be the C-suite or the Chief of Medicine. The section leaders are like the managers of the different teams and specialties, perhaps a nurse manager or DOP.  The rest of the orchestra is the rest of the staff, both clinical and non-clinical.  Every musician with their instrument is essential to creating the music that will stop someone in their tracks to listen, just as everyone in the hospital is needed to provide the safe, high-quality healthcare patients expect.

The audience heard the result of an orchestra that was well conducted and worked together as a team. What would happen if different parts of the orchestra did not work together well?  He asked that only the section leaders play, and the music still sounded good.  Then he told these select musicians to play the song in whatever style they felt like playing.  The music was off and flat.  He switched to asking only the strings to play, but he started waving his baton before they were ready.  The result was like hearing nails on a chalkboard.  In another example, Conductor Nierenberg was very casual in waving his baton and leading the orchestra.  You could see that he was not really listening to the orchestra and was in his own world on the stage. I'm pretty sure my old junior high orchestra sounded better; each section sounded as if it was on a different measure of the music.  Finally, he asked the section leaders to play however they wanted while the rest of the orchestra followed his lead.  The music actually sounded good, and no one would have guessed a few of the musicians were slacking off.  This was an interesting way of showing that even when everything sounds fine, there can still be issues lurking that need to be addressed.

I found this to be a fun way to discuss healthcare organization dynamics.  Achieving that musical perfection was the result of everyone listening to each other carefully and working together as a well-run team.  Each example that resulted in poor quality music could be compared to a healthcare situation where the culture and teamwork broke down.  When a team is not functioning well, it can lead to miscommunication, handoff errors, and frustrated staff - all of which can create a potentially unsafe culture for the patient.  Without a good conductor who promotes patient safety, the rest of the staff will fall flat in achieving it.  Every level and specialty has to work together smoothly to achieve the goal of safe patient care.  Healthcare organizations and teams should ask themselves: what is it going to take to make that music beautiful?

A Simulated Lesson - Part 2

Originally posted on 6-1-2010:
In my last entry I shared my experience with the IV pump simulation at the NPSF conference.  After learning some of the realities of nursing, I decided it would be worth attending another one of the simulations.  The next lab I went to was called the Patient Safety Risks Challenge. 

This setup was a patient simulator in a typical hospital bed setting.  The challenge presented to me was to find the 15 patient safety issues within five minutes.  The facilitator asked about my background before I started, and I explained what I did working as a Human Factors Engineer for a pharmacy IT vendor.  He wished me luck as he stated that I should have no problem finding the "pharmacy issues" within the setup. No pressure...

Instead of looking at the situation holistically, I became overly focused on the flimsy four-page paper chart in my hand, desperate to find the pharmacy issues he mentioned.  I flipped back and forth, back and forth, comparing the medications recorded upon arrival and the current prescriptions.  I did find the missing aspirin and another med whose name had been changed by accident.  I felt pretty good getting those out of the way.  I also noticed some obvious errors - the "bloody" bandage that needed replacing, the overflowing box of sharps on the ground, and the urine bag literally under the wheel of the bed (please tell me that is not based on a true story).  I ended up finding about 9 or 10 of the errors and figured my lack of clinical expertise was the reason I missed a few.

It turned out I did miss a few obvious safety issues.  The patient had a latex allergy but did not have the band on his wrist indicating this.  I also learned that a patient who is a fall risk should not have one of the bed rails down.  I will always wonder how I overlooked the pulse oximeter reading of 90 on the monitor.  The facilitator then pointed out that I had missed one of the pharmacy issues.  He simply stated, "You missed the dosing error."  I'm sure I had a stunned look on my face as I asked, "What dosing error?"  Flipping through the chart, he showed me one of the admitting medications with a dose of 12.5 mg.  What showed up on the next page was 125 mg.  I can't remember the name of the drug, but I would have to guess this "death by decimal" error would be a serious one in a real setting.

I spent a good two minutes just staring at those pages to find an error like this, and it still slipped by me.  It provided another moment of enlightenment: just how easy it is to miss something so simple, yet potentially deadly.  Once again I was in the shoes of the nurse (or pharmacist), and I realized what hard shoes they are to fill.  As the facilitators pointed out, many hospitals still use paper charts similar to the one given to me.  I would like to think technology could have prevented me from missing that decimal point; however, I have a feeling there is more to the root cause of my mistake.
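
To make that thought concrete, here is a minimal sketch in Python of the kind of dose plausibility check a pharmacy system might run against a transcribed order.  The drug name, dosing range, and function below are all invented for illustration; real systems pull their limits from a maintained clinical drug database.

```python
# Hypothetical usual dosing ranges, in mg. "drug_x" and its limits are
# invented for illustration only.
USUAL_DOSE_RANGE_MG = {
    "drug_x": (6.25, 25.0),
}

def check_dose(drug: str, dose_mg: float) -> str:
    """Flag any dose outside the usual range for pharmacist review."""
    low, high = USUAL_DOSE_RANGE_MG[drug]
    if not low <= dose_mg <= high:
        return f"ALERT: {dose_mg} mg of {drug} is outside the usual {low}-{high} mg range"
    return "OK"

# The tenfold jump from the chart: 12.5 mg on admission, 125 mg on the next page.
print(check_dose("drug_x", 12.5))   # OK
print(check_dose("drug_x", 125.0))  # ALERT: flagged for review
```

A check like this would catch the tenfold jump I missed, though as I noted above, an alert alone does not address the root cause of the error.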

I went into the conference already recognizing the value of simulation in healthcare.  These two sessions validated my feelings on the topic. Some thoughts I have from both experiences:

- New technology should be tested in a simulated environment to validate that it fits into clinicians' workflows in a safe, easy-to-use fashion.  Testing only the "sunny day" scenarios in a controlled environment will not prove the device will be safe and usable in a crisis situation.
- Simulation is a great way to train clinicians on new technology.  I would like to see hospitals go beyond a few hours of in-service training so staff really understand the tools they will be working with.  I have heard stories from friends in nursing who have had to use technology for the first time on real patients.  I wonder if a few experts on the technology could be on call for such situations.
- I also think simulation can go beyond technology in healthcare.  I can see the benefit in training staff on new procedures with no risk to a patient.  There was a third lab set up just to show clinicians the proper technique for central line insertions.
- I have gained even more respect for the clinicians on the front line of patient care.  I have a better understanding of what you are dealing with and how critical those situations are.  This all came from a few minutes of "playing" in a simulated environment.
- My final thought on the importance of simulators can be summed up like this: If you were flying on a plane, would you want the pilot to be completely unfamiliar with that particular plane?  They have those flight simulators for a reason.

I am curious to hear from those of you on the front lines.  Have you ever gone through clinical simulations such as this in your training?  If so, what were the scenarios you were trained in?  How are you typically trained when new technology is brought to your hospital?

A special thanks to the facilitators from the Center for Medical Simulation and the University of Miami - Jackson Health System for running these very educational simulation labs.

A Simulated Lesson

Originally posted on 5-25-2010:
Last week I attended the National Patient Safety Foundation (NPSF) Annual Congress.  It was a good experience, and I plan to share some of what I learned in the next few entries. The main reason I wanted to attend was to see the simulation labs the conference was hosting.  I have always found medical simulation intriguing, as it provides a safe environment to test clinical procedures.  It is also the ideal way to test how well new technology fits into clinical workflows. I like to think of it as the healthcare equivalent of training pilots in flight simulators.  I did not realize that a couple of the simulations were going to provide such an enlightening experience for me.

The first simulation I went to demonstrated how to user-test new medical devices.  The setup had a state-of-the-art patient simulator, a monitor, and an IV pump.  After sharing my human factors engineering background, the facilitators thought I would be an ideal candidate to run through the simulation.  My job was to act as the nurse taking care of "Mary", a pregnant woman who was anemic and had high blood pressure.  While keeping Mary at ease, I simply had to load and administer two units of blood and a blood pressure medication.  I think calling what happened next a disaster would be an understatement.

Everything was fine at first.  "Mary" was nervous, but I was able to explain what I was giving her and why.  I'll admit I thought it was silly to talk to a dummy that was talking back to me via one of the facilitators.  I knew it was part of the whole scenario, and I played along with a smile.  I turned to the pump to load the first unit of blood. Well...I would have loaded the first unit of blood if I could have figured out how to load the pump.  As I switched between staring at the pump helplessly and trying anything to load the IV bag, Mary started to wonder if I knew what I was doing.  Did you know some IV pumps have a Load button to open a port to run the medication?  Thankfully, after five minutes of sweating it out and listening to Mary complain, the facilitator pressed the button, which was not located near the loading port.

Okay, a slow start.  Surely I could figure out the programming piece and choose the right infusion rate for the unit of blood.  I looked through the list of drugs to find the selection for blood.  To my frustration, there was no selection for "Blood".  I made the selections that seemed correct to me and was ready to deliver the first unit of blood.  The facilitator intervened at this point to inform me I had made the wrong selection, while Mary was starting to get dizzy and nauseated. I had not realized the unit of blood was listed under an acronym, so I had missed it.  This was starting to go poorly.

The facilitators recruited a nurse who was watching the spectacle to help me out.  She admitted that she had not been in a clinical role for 15 years but would do her best to help get through the scenario.  The blood was finally being delivered, so we focused on the blood pressure medication.  While we were programming in the medication, the pump started to alarm.  Now what?  The alarm message stated, matter-of-factly, that air was in the line: the blood IV bag was already empty from the rapid infusion rate I had programmed in.  What a chaotic scene this turned into.  Other nurses were tending to Mary, who was complaining about something, the IV pump was blaring at me as I struggled to find the alarm silence button, and I was in a full sweat accomplishing nothing.  In my panic I wanted to yell at the pump and tell Mary to shut up...which thankfully I did not do. I was not exactly smiling at this point.

Mercifully, the facilitator pressed the alarm silence button, located on top of the pump, which I had mistaken for the power button.  He calmly helped us finish the scenario, and Mary ended up being fine.  I'm sure she never wanted to see me again. The group running the simulator thanked me for being a good sport and for letting them have some fun at my expense.  I'm sure it was funny for someone else to watch me struggle through the scenario, and I can admit to finding it humorous now.  I think a conference rep was snapping pictures, so there is probably documented evidence of my not-so-fine hour.

Ten minutes later the adrenaline was still pumping pretty hard.  I got so caught up in the frustration that it felt real.  I felt like I had really failed and put a patient at risk...even though I knew it was a simulated experience.  That's when it hit me.  Is this what nurses go through? Are these the same feelings and emotions that someone on the front lines feels when frustrated by technology they need for patient care?  What if I were a float nurse or new to a hospital and had never seen that pump before?  I walked in the shoes of a nurse for 20 minutes (I swear it felt like 2 hours), and it gave me a greater appreciation for what they deal with and what they do.  I also realized how important my role is in designing healthcare software so it does not cause confusion in a critical moment. The simulator provided a moment of enlightenment.  I had to be in that moment to really understand what a care giver is thinking and experiencing.  And yet, that scenario was only one of many tasks to be safely and successfully completed.

I attended another simulation session later in the day that was also eye opening.  I'll touch on that experience in my next entry.  Stay tuned...

The SEIPS Model - A Framework for Safety

Originally posted 5-17-2010:
Before I start getting too deep into this blog, I think it would be fair to share the framework behind my patient safety philosophy.

I mentioned in my first entry what I believe is the key point behind patient safety: Care givers are human, and humans make mistakes.  This means we must assume that mistakes can and will happen. In order to protect patients, a 'system' must be put in place to reduce the number of errors and to prevent an error from causing harm. What is this 'system'?

The 'system' covers everything and everyone the care giver interacts with.  I like to explain this with the University of Wisconsin's Systems Engineering Initiative for Patient Safety (SEIPS) model.

[Figure: diagram of the SEIPS work system model]

According to the model, the healthcare work system is broken into the following components:
  •  Person - The care giver is at the center of the model. The person component also covers interactions with other people, including the patient.
  •  Tools/Technology - The tools and technology the care giver requires.  This can range from paper and pencil to computers.
  •  Tasks - The tasks required by the care giver to treat the patient.  Documenting a lab result, transcribing a prescription, and talking with a patient are all examples of healthcare tasks.
  •  Environment - This piece covers the physical environment the care giver works in. Adequate lighting, the location of computer terminals, and protection from hazards all fall under environmental considerations.
  •  Organization - This looks at the rules and regulations handed down from management in the hospital.  The culture that the care giver works in also falls under organization.

When combined with the processes put in place for the care giver, the work system will affect patient outcomes. The theory is that every piece of the system is important in order for someone to do their job successfully.  If any component is inadequate or fails, the work system of the care giver will become unbalanced and increase the likelihood of harm.  I like to think of this as a puzzle where all the pieces need to fit together properly around the patient.  The SEIPS model is the framework I fall back on when considering how technology I help design fits into the healthcare system.
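
For those of us on the software side, it can help to treat the model as a checklist.  Here is a minimal sketch, assuming one wanted to encode the five SEIPS components as a simple audit structure; the component names come from the model, but the rating scale, threshold, and example scores are illustrative assumptions of my own.

```python
from dataclasses import dataclass

@dataclass
class WorkSystemAudit:
    """Rates each SEIPS component for a given task: 1 (inadequate) to 5 (strong)."""
    person: int
    tools_technology: int
    tasks: int
    environment: int
    organization: int

    def weak_points(self, threshold: int = 3) -> list:
        """Components rated below the threshold leave the work system unbalanced."""
        return [name for name, score in vars(self).items() if score < threshold]

# Example: capable staff and clear tasks, but a poor technology fit and
# a cramped, dim workspace.
audit = WorkSystemAudit(person=4, tools_technology=2, tasks=4,
                        environment=2, organization=4)
print(audit.weak_points())  # ['tools_technology', 'environment']
```

The real model is richer than any checklist, but even this simple framing makes the point: one weak component is enough to unbalance the whole system around the patient.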

A quote I read recently in "Why Hospitals Should Fly" drives home the importance of taking a systems approach to patient safety: "Every system is perfectly designed to get the results it consistently achieves." What would it take for the system to achieve zero patient injuries?

A Healthy Dose of Reality

Originally posted 5-6-2010:
98,000. I still remember being shocked by the number when I first heard it.

I decided to take a course on medical errors as part of my graduate work in Human Factors Engineering. It seemed like a good fit, as I was also working for a large medical device manufacturer. I figured understanding medical errors and how they occurred would help me to develop solutions using the products I was working on.  Little did I know how big of an issue medical errors were.

98,000? Was it really possible that an estimated 98,000 patients were dying every year due to medical errors? I have to admit, before hearing the number, I always figured the hospital was the safest place for someone who was sick. The next concept was just as intriguing: healthcare professionals are humans and humans make errors. It was so simple and made so much sense, yet the thought had never occurred to me.

That two-credit course ended up having a significant impact on my professional career. I wanted to know more. I started reading more on medical errors and the idea of how the system, not the caregiver, fails the patient.  I sought out articles and books related to patient safety, seeking to understand the solutions being presented. It became a passion. I started to ask myself how I could use this growing knowledge of patient safety issues to design medical software to protect the patient.

Ten years after the Institute of Medicine released the report containing that estimate of 98,000 patients, the numbers have not improved much.  There is still a lot of work to be done to make the healthcare system safer for patients.  On the other hand, a lot of great research has been done and solutions have been presented.

I want to share what I am learning. With each blog post I plan to feature and discuss an article, book, or news story related to patient safety. My goal is to raise patient safety awareness.  Reading this blog will help you learn more about the reality of the issues that are harming patients. My hope is that it may spark discussions on solutions.