MRSA infection rates improving

Originally posted 9-1-2010:
One of the hot topics in the patient safety world is Healthcare Associated Infections (HAIs). These are infections that occur as a result of treatment in a hospital or other healthcare setting.  Rates of HAIs peaked at 2.3 infections per 1,000 hospital visits in 2004 and 2005.  This may not sound like much, but it becomes a serious issue when an HAI can extend a hospital stay by up to 19 days and add nearly $43,000 in costs.   An infection acquired from a surgery can be especially devastating and dangerous; the cost of one Surgical Site Infection (SSI) can be up to $60,000, and it can significantly increase the chance.... Some of these infections are resistant to certain antibiotics, making them difficult to treat.  One particularly nasty infection is Methicillin-Resistant Staphylococcus aureus, or MRSA as it is commonly known.

The increased costs and risk to the patient are why various safety groups are looking at HAI reduction as a hot item. The significance can be seen in the Association for Professionals in Infection Control and Epidemiology (APIC) "Targeting Zero" campaign, whose premise is to reduce the number of HAIs to zero.  It is a bold and noble goal, but a lot of work needs to be done to get there.  There has been a lot of emphasis on solutions that include better education on handwashing techniques, use of checklists to ensure proper central line catheter insertion, and electronic surveillance software.

It appears that the emphasis on reducing HAIs may be having a positive effect.  A recent article published in the Journal of the American Medical Association showed that rates of healthcare-associated MRSA infections dropped 28% between 2005 and 2008.  Community-acquired MRSA infections dropped 17% during this period as well. This bit of good news was also featured on NPR recently.  It is very promising to see these results and to report on an area where improvements have been made.  This vigilance needs to continue to get closer to that target of zero.

It could happen to anyone...

Originally posted 8-23-2010:
Sometimes a real life story is the best way to drive home the importance of preventing medical error.  It puts a face on the patient safety issues and serves as a reminder that these errors have an effect on someone's life.  I have noticed that there is a stronger response when a medical error is looked at from the point of view of a patient or a patient's family.  This is why Dennis Quaid has been acting as a patient safety advocate for the last few years.  The story of his twin daughters being subjected to a dangerous heparin overdose is now one of the most well-known patient safety stories.  A story not only identifies a potential patient safety issue; it also gets an emotional response out of people that can drive the desire to make a change.

I recently noticed how the story behind a medical error can have a stronger impact when you know the person affected.  A friend told me last week that he realized a pharmacy had given him the wrong dose of an antibiotic.  The prescription was for a sinus infection he was suffering from.  He noticed the infection was not actually getting better; instead it lingered, causing him to miss a lot of work and continue to live in misery.  After some investigation, he found out the prescription was correct but an error had occurred in the pharmacy.  This was not a life threatening situation, but there was a temporary reduction in quality of life and productivity for him.  In addition, this is the sort of thing that can cause increased bacterial drug resistance.  I can only imagine what sort of effect this could have had if the medication had a more critical purpose.

I suggested to my friend that he go back to the pharmacy to let them know what happened; otherwise they would never become aware of the mistake and learn from it. The pharmacy needs to evaluate its system, find the root cause, and develop a solution to keep this sort of wrong-dose error from occurring again. This also shows how important it is for patients to keep track of their own care.  Patients can provide the final piece of feedback to make sure the proper treatment is given.

The dangerous side effects of EHRs

Originally posted 8-12-2010:
Technology has allowed healthcare to make leaps and bounds in patient care.  It has helped solve many problems for health care professionals in treating their patients.  After years of working in healthcare, I have learned that introducing new technology has side effects and can introduce dangerous problems into the system.  A recent article in this month's Health Data Management drives this point home.

The article focuses on the types of problems that EHRs can introduce into the healthcare environment, many of which can lead to potentially dangerous situations for the patients the system is trying to protect.  This quote from David Bates in the article sums it up well: "Your EHR may prevent 10 errors for every new one it causes..."  One of the main problems is the overriding of alarms due to alert fatigue. A lack of alert standardization is also part of the reason why "eighty to 90 percent of alerts are overridden."  Other dangerous issues mentioned include entering data for the wrong patient when too many windows are open on the screen, and look-alike drugs on the display. The article also points out that "problems with computerized systems" ranked 7th on the ECRI Institute's top 10 list of technology hazards in healthcare.

The article offers a few reasons why these issues are occurring with EHRs.  The one that stuck out most to me was "software design, or lack of it, is a common culprit".  That is a tough one to swallow from where I sit.  Unfortunately, I have heard many remarks from healthcare professionals about software they find tedious to use.  Even on my recent vacation, a lab worker who overheard me explaining my job spoke very negatively about an EHR system she used to work with.  I have mentioned it a few times now, but it is very important for IT vendors to really know their users when designing their products.  Understanding workflows and working closely with end users at implementation will lead to a smooth transition to the new system.  There is so much information to digest that there is no need to make it any harder for the caregivers to work with the technology. Software design is not the only issue, but good design is one of the steps toward making these systems safer for patients.

One final item from the article worth mentioning is the lack of healthcare IT standardization.  The article finishes with some discussion of the advantages and disadvantages of the FDA regulating EHRs.  I do believe there needs to be some standardization to promote patient safety in these products.  I understand regulations can bring a lot of red tape, but setting up some standards can improve the safety and quality of these products.  It has been done for medical devices; why shouldn't it be done for medical records and decision support tools?

I'm curious to know if any of you have run into a situation with an EHR that could have led to a dangerous situation.

Alarm fatigue

Originally posted 8-4-2010:
It is always interesting to find parallels between a significant current event and the healthcare world.  We have all seen the sad devastation from the oil spill in the Gulf of Mexico.  What makes it difficult for me to comprehend is that it could have been prevented.  Apparently an alarm that could have warned of the explosion was turned off.  According to an interesting article I ran across recently, turning off alarms occurs quite often as people get desensitized to them.  The article went on to say that instead of instilling a sense of urgency, obnoxious false alarms cause a "cry wolf" response.

This "alarm fatigue" happens often in healthcare.  The article highlights a study of healthcare professionals in which three-quarters admitted to becoming desensitized to important alarms.  How can it not happen?  An American Medical News brief today highlighted a 2005 study in which over 16,000 alarms sounded in an 18-day period.  With that many alarms it can be pretty easy to start tuning the noise out.

I used to do a lot of work around the Operating Room and Intensive Care Unit environments.  These are environments full of medical devices that beep and squawk.  When I used to interview anesthesiologists, one of the first things they would ask is "Can you reduce the amount of alarms?"  Because of medical device standards, the number of alarms would usually increase instead of decrease. During ventilator usability studies, I would consistently observe the participant pressing the alarm silence button immediately after the first alarm sounded.  In fact, when I was a participant trying to run an infusion pump I had the same reaction...I just wanted to shut the thing up. It sometimes made me wonder why bother with the alarms if they were going to be blindly silenced right away.  I have even seen this in an actual clinical setting.  During my grandmother's stay in an ICU, I saw nurses ignore a high lung-pressure ventilator alarm whenever she coughed.  She was coughing so often that they assumed it was a false alarm.

So what if one of those false alarms was signaling a critical problem that needed immediate attention?  That is what alarm fatigue does.  Alarms get turned off or ignored, which can lead to potentially dangerous and sometimes fatal situations.

It is a difficult problem to solve. You can't just get rid of the alarms.  The American Medical News article did show that a Johns Hopkins initiative reduced the number of critical alarms by 43%.  Setting appropriate default alarm levels and providing alarm management training seemed to have a positive effect.  I would also think making appropriate alarm level settings on a patient-by-patient basis would reduce the number of false alarms.  An appropriate level of training on new medical devices should also combat any confusion or improper alarm settings.  Finally, device manufacturers should continue to study and understand user tasks and workflows to help evaluate which alarms are necessary.
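To make the patient-by-patient idea concrete, here is a minimal sketch of how default versus patient-specific alarm limits might compare. All of the numbers, names, and the 20% margin are invented for illustration; this is not how any real monitor is configured.

```python
# Hypothetical sketch: per-patient alarm limits vs. one-size-fits-all defaults.
# All values here are invented for illustration only.

DEFAULT_LIMITS = {"heart_rate": (60, 100)}  # bpm, generic adult default

def make_limits(baseline_hr, margin=0.2):
    """Center alarm limits on the patient's own baseline heart rate."""
    return {"heart_rate": (baseline_hr * (1 - margin), baseline_hr * (1 + margin))}

def alarm(reading, limits):
    """Return True if the reading falls outside the alarm limits."""
    low, high = limits["heart_rate"]
    return not (low <= reading <= high)

# An athletic patient with a resting heart rate around 50 bpm:
readings = [48, 52, 55, 50, 49]

default_alarms = sum(alarm(r, DEFAULT_LIMITS) for r in readings)
patient_alarms = sum(alarm(r, make_limits(50)) for r in readings)

print(default_alarms)  # 5 -- every reading trips the generic 60-100 limit
print(patient_alarms)  # 0 -- none trip limits centered on the patient's baseline
```

The point of the sketch is simply that limits anchored to a patient's own baseline silence the "cry wolf" alarms without touching the truly abnormal ones.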

Poor Safe Care ratings for the US

Originally posted 7-21-2010:
As a soccer fan, I really enjoyed watching the World Cup earlier this summer.  The final game came down to a tough battle between Spain and the Netherlands, with Spain bringing home the Cup for the first time ever.  Sadly for the Netherlands, it was the third time they fell just short of winning the final.

Even though the Netherlands cannot claim to be number one in the world at soccer, they can claim to be number one in quality and high performance healthcare.  This is based on the results of the recently updated Commonwealth Fund survey titled Mirror, Mirror on the Wall.  The study surveyed and rated seven countries on attributes such as quality of care, access to care, and efficiency, and compared the results to health care expenditures per capita. The Netherlands received the top overall ranking as well as the top Safe Care ranking.

How did the US fare?  7th place...of the seven countries. In addition, the US had the highest health care expenditure per capita.  These are not exactly the kind of findings to brag about.  I want to highlight that in the category of Safe Care, the US also ended up last.  Some of the highlights (or lowlights) from the Safe Care section include:

  • "Among those who had a lab test in the previous two years, sicker adults in the U.S. were more likely to have been given incorrect medication or experience delays in being notified about abnormal results."
  • "Canada, Germany, and the U.S. lag in terms of using IT to receive computerized alerts or prompts about potential problems with drug doses or interactions"
  • 16 percent of the respondents believed a medical mistake was made during their treatment in the last two years.
The report did caution against relying only on patients' perceptions to rank safety, but even that perception carries a lot of weight.  It is very significant that the perception of unsafe care is coming from the patients themselves.  You have to wonder about the general public's level of trust in their healthcare. I also find it hard to swallow that patients in this country pay so much money for healthcare with such poor safety ratings.

The report highlights safety issues that are already known.  I don't think anything new has come to light about the quality and safety of healthcare in this country, but it represents a reality check, in report card format, on where we currently stand.  Needless to say, there is still a lot of work to be done.

I'll be on vacation next week.  Take some time to glance over the report and contemplate the results. I would be interested to hear your thoughts.

An example to learn from Aviation

Originally posted on 7-13-2010:
In my last post, I discussed how using Aviation as a safety model for Healthcare may have shortcomings.  I was not knocking the way Aviation and the FAA have kept air travel safe. I was questioning whether Healthcare may have relied too much on the model.

This week I'm highlighting an example of a safety initiative by the FAA that Healthcare leaders should look at closely. The FAA recently raised safety concerns when they noticed a spike in incidents in which planes violated minimum separation distances. The violation rate went from 2.44 per million flights to 3.28 per million in the last year (you read that right: per million).

I want to highlight the way the FAA is responding to this issue. They are holding a summit with employees, management, and safety experts next month to address this concern. They are also doing their research by "asking every air traffic controller, as well as other employees involved in air traffic operations, to tell them before the meeting what are the biggest safety problems they see. FAA officials are also fanning out to major airlines for meetings with their chief pilots."  The FAA is going to the pilots and air traffic controllers on the front lines of the issue to come up with solutions to fix the system. On top of that, this is the third time in four years they have held such a meeting to quickly address a safety concern.

This is the sort of response that needs to consistently happen in healthcare.  Some hospitals are already taking similar actions to address their safety issues, but it is still not happening everywhere.  There should be a quick response like this whenever a patient safety concern is raised.  The doctors, nurses, pharmacists, techs, and anyone else on the front lines need to be brought in to help find solutions.  There is no way to understand how the system is breaking down without understanding what the people on the front lines are dealing with.

I also want to point out the use of a reporting system for the controllers to disclose their mistakes.  The FAA receives 250-300 reports a week to spot any trends.  Can you imagine the safety trends we could spot in Healthcare with a similar reporting system?

I really like the final quote of the article: "People come to rely on the equipment and the collision warning systems, and that's bad."

Is the Aviation analogy falling short?

Originally posted 7-9-2010:
Healthcare has been looking at the aviation industry for safety inspiration for some time now.  And why not?  Over the last 30 years there has been a reduction in accident rates as the industry culture became safety focused.  (The mystery of where your luggage will end up is another story.)  Patient Safety initiatives such as the use of checklists, Crew Resource Management, and increased use of simulator training are all from aviation. Experts make comparisons to aviation when teaching Patient Safety science.  John Nance wrote an entire book, which I recommend reading, on what the ideal hospital would look like if it followed the same steps aviation did to improve the culture of safety. 

How effectively have these techniques and initiatives transferred to healthcare?  An article published on the American Medical News website provides a good answer.  The article is a good read as it provides an overview of the aviation techniques tried in healthcare, and discusses which ones have worked.  The article also does a good job summarizing where the aviation analogies are falling short in healthcare.

One of the key things that stood out was the reporting issue.  The article states, "many reporting systems in health care fail to replicate the aviation industry's no-blame model."  The healthcare culture needs to continue to move in a direction where people are comfortable reporting safety issues and mistakes.  How can we learn from our mistakes if no one is comfortable reporting them?

I also liked Dr. Pronovost's quote at the end - "A mistake health care made is to try to take lock, stock and barrel what's been done in aviation and plunk it down in medicine. Health care has to find its own way."  Perhaps healthcare has gotten to the point where it has learned all it can from aviation.  I don't think the lessons learned from aviation safety should be ignored, but is it time for the patient safety leaders to look within for the next set of initiatives?

The Color Changing Card Trick

Originally posted 7-1-2010:
I'm going to have a little fun with this post.  This card trick video was presented in the Safety and Quality in the Medication Use System course I took last year.  Don't read ahead until you watch it:




Did you catch all four of the color changes?  If you did, then you have an astounding attention to detail.  The first time I saw this I missed every one of the real "tricks" in the video.  I think I was just a little too focused on that 3 of diamonds.

So why would I show this?  First, I think it's fun and hopefully you thought so too.  More importantly, I find this video a great example of how easy it is to miss the "obvious".  With so much focus on the cards, the other changes in the video get lost.  It illustrates how easy it is for human error to occur.

Now imagine this sort of scenario happening in the healthcare setting.  The healthcare environment contains so much important patient related information in a background filled with distractions.  Something as simple as transposing numbers on a date can be potentially dangerous.  A pharmacist friend of mine told me a story where they wrote up a drug intervention based on a lab result from June 4th.  They came to find out later that the result was actually from April 6th and the intervention was unnecessary.  They read 6/4 when it was actually showing 4/6. (Cognitive psychologists call this a bias.) Luckily, the patient wasn't harmed as a result.

A good system around the clinician will catch these sorts of errors before they can reach the patient.  In this case, a simple system change could be writing out the months on lab report dates instead of using numbers.
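As a minimal sketch of that system change, here is how the same date renders in the two formats (the date is taken from the story above; the formatting calls are Python's standard library):

```python
# Sketch: spell out the month so 6/4 and 4/6 cannot be transposed.
from datetime import date

lab_result_date = date(2010, 4, 6)  # April 6th, as in the story above

ambiguous = lab_result_date.strftime("%m/%d")        # still invites a 4/6 vs 6/4 misread
unambiguous = lab_result_date.strftime("%b %d, %Y")  # month written out, no ambiguity

print(ambiguous)    # 04/06
print(unambiguous)  # Apr 06, 2010
```

The fix costs nothing at the software level; the date field simply stops depending on the reader to keep month and day in the right order.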

Too many medical scans?

Originally posted 6-22-2010:
I came across a few interesting articles in the last week about overtesting in healthcare.  The Wisconsin State Journal published a national and a local story about growing concerns that ER doctors are running too many tests on patients due to fear of malpractice lawsuits.  Lawsuits are one of the major concerns in the healthcare culture, and are one of the reasons many caregivers are wary of reporting medical errors and near misses. The low levels of reporting errors and near misses make it harder to learn from mistakes and fix the system to prevent a future occurrence.  The articles also hint at a few other issues, including patient demands, overworked ER personnel, and the lack of familiarity between the ER staff and the patient.

The quote "We just want to make sure someone doesn't have something that is serious or life-threatening" does summarize why there is so much caution and extra testing.  The fact that missed heart attacks in the ER have dropped from 5% to under 1% in the last couple of decades does show that the extra testing is producing positive results.

But what if all this testing is causing long term harm to the patient?  I read an article on MSNBC.com that suggests we are being exposed to too much radiation via medical imaging tests.  The article says the number of CT scans given in the US has increased significantly over the years.  The first article gave an example of a patient who had a CT scan just to make sure she did not have appendicitis.  It is a very interesting thought that showing extra caution could potentially cause long term harm to a patient.  The MSNBC article does a very good job of highlighting a few of the healthcare cultural issues that have led to this overtesting.  Patient pressure, malpractice fear, and healthcare chaos are repeated as causes.

So what's to be done?  Which way should we be leaning on this?  I think the FDA is off to the right start in trying to set standards on the radiation doses for imaging tests.  The idea of a "radiation medical record" is another great idea and points to the continuing need to digitize medical records.  I would also think setting standards or guidelines for treatment can help reduce the number of unnecessary scans; I got a sense from the articles that this is being explored as well. This thinking should apply not only to medical scans, but to standardizing other treatments too.  The success of the Keystone Project in standardizing central line insertions to reduce infections is an example of this.

One final thought.  I really liked the fact that the articles provided questions for patients to ask while being treated.  Patients need to be part of the feedback loop with their caregivers to help the safety and quality of their care.  Questions like "Do I need this?" and "Why are we doing this?" will ensure some double-checking on testing. John Nance touches on the importance of including patient feedback on their care in his book "Why Hospitals Should Fly".

Are there other areas of healthcare where you have seen overtreatment?  What do you think is causing it to happen?

Reforming Medical Education

Originally posted 6-17-2010:
One of the more interesting tidbits I heard at the NPSF Conference was the belief that medical schools are not teaching enough about patient safety.  The Lucian Leape Institute released a paper outlining five concepts that are fundamental to improving safety in the healthcare system.  The first of the five items the Institute addressed was medical education reform.  The group recently released a white paper with their recommendations on medical education reform, entitled "Unmet Needs: Teaching Physicians to Provide Safe Patient Care."

Members of the Lucian Leape Institute held a panel to discuss their recommendations at the conference.  I got the feeling from the discussion that medical schools have been very focused on the technical aspects of medicine without teaching much about safety issues.  Some of the recommendations presented included a need to place a higher priority on patient safety, as well as promoting teamwork and collaboration.  There was a lot of discussion around teaching and integrating patient safety science throughout the entire medical education experience.  The University of Central Florida has recently opened a new medical school, and they are trying to include these recommendations in their teaching philosophy.  For example, there has been a strong focus on the importance of handwashing in their Microbiology course.  The discussions gave me a sense that this philosophy has been well received by the UCF students, and it may promote a safety-focused cultural change.

I have seen patient safety education in Pharmacy school.  A little over a year ago, I attended the "Safety and Quality in the Medication Use System" course at the University of Wisconsin. The course was split between teaching the Human Factors principles behind patient safety and their application in pharmacy.  It addressed many of the major safety concerns currently in pharmacy, while providing tools and methodologies to develop solutions. Even though the first half was a lot of review for me, I found the course helpful for better understanding the safety issues of pharmacy.  I came to find out that I was in the minority among my classmates.

The course was required for all 3rd year pharmacy students before they started their clerkships.  The safety material came at the tail end of their classroom education, at a time when they were pretty burned out.  Most of the students I talked to felt there wasn't much value to the course and were more interested in learning about drug interactions and other technical aspects of their field.  Whenever we did group exercises that looked at the root cause analysis of a medication error, my teammates generally would quip, "maybe the lighting was bad."  They seemed to think bad lighting was always a safe answer to provide as a reason for an error occurring in a pharmacy. (To be fair, that is sometimes part of the reason.) I found it disheartening to see such a cynical attitude coming from the students.  A few of the students told me they did not quite understand why they were learning engineering principles in the class. Some could tell it was important, but just didn't want to deal with it as they really needed to focus on their other courses.  I actually approached the instructors to get their take.  I was pleased to learn that the instructors were always listening to the comments and suggestions to improve the course in the future.  They admitted that the timing of the safety principles may have come too late in the students' education, and they could relate to the overworked students.

That is why I was pleased to hear the recommendations at the conference. Not only do I hope it catches on with other medical schools, but I hope it is looked at by the other healthcare disciplines as well.  The sooner the students are educated, the sooner they can appreciate the science and research behind reducing medical errors.  I agree with the concept of integrating patient safety science into their traditional training. Having it as a separate course seems to create a bit of a disconnect between those principles and the everyday work the students will be engaging in.  I would imagine earlier and integrated lessons will build acceptance and promote a safer culture.

I'm curious to know if the Wisconsin course was unique or if there are others like it.  How was patient safety taught in your education, if at all?