Near misses and medical error – how a systems approach can change safety culture (short video)

This video explains the sort of research I have been involved in and why. It highlights that relatively small errors in medical device design can have big consequences; that, psychologically, people quickly piece together signals from their environment with their expectations; and that the default position seems to be to suspend and investigate individuals rather than to look at the arrangement and design of the broader system in which they work. James Reason (2000) says this about system changes: “We cannot change the human condition, but we can change the conditions under which humans work.” It also starts to get at the sticky subject of blame in healthcare…

CHI+MED blog

Annie’s Story: How A Systems Approach Can Change Safety Culture
from MedStar Health.

Here’s a short video that nicely illustrates how a systems approach can be more effective in improving patient safety.

A patient’s blood glucose level was actually extremely low, but the blood glucose meter the nurse was reading indicated the exact opposite. To try to bring the apparently ‘high’ level down the nurse gave the patient some insulin, which of course just lowered it further. The patient became unresponsive and was taken to intensive care, where the problem was spotted – fortunately both the patient and their blood glucose levels recovered.

When a second nurse experienced a similar problem with a blood glucose meter, the initial response was to place the nurse under disciplinary investigation with the threat of suspension. This shook the nurse’s confidence, yet didn’t seem to solve the problem.

Hospital staff asked a…



2013 in review

The stats helper monkeys prepared a 2013 annual report for this blog.

Here’s an excerpt:

A New York City subway train holds 1,200 people. This blog was viewed about 3,800 times in 2013. If it were a NYC subway train, it would take about 3 trips to carry that many people.

Click here to see the complete report.

What is interaction design?

Interaction design aims to make the things we interact with intuitive, so we can use them easily, efficiently, effectively and satisfyingly. Oddly, when interaction design works really well we don’t even notice it. So, to demonstrate what interaction design is, it is often effective to show when it breaks down – hopefully through these examples people ‘get it’. Getting it is not the same as reading and memorising a definition, but there is more information here if you want to do that: Wiki page.

Here is a breakdown I experienced this morning when ordering a product through the Amazon App on my iPhone. This first screenshot shows what displays when you try to register a new credit card:

AmazonApp photo-1

This second screenshot shows what displays when you start to input information. Can you see a problem with this?

AmazonApp photo-2

What happens when you enter all the information?

If you do nothing you just stay there. If you press the ‘return’ key nothing happens. If you go back, it forgets all your card details and you have to start again. What do you do?

The problem here is that the ‘save’ button is now hidden, so the user doesn’t know what to do. When we’re at the first screenshot we don’t plan how we’re going to interact with the whole screen; we just hit the ‘select card type’ field and get on with it. Then, when we get to the end, we’re stuck. We don’t even know that a ‘save’ button is hidden by the pop-up keyboard, because we didn’t think about it when it was in our field of vision in the first screenshot!

Once you work out that it is hidden, then comes the problem of trying to get to it – every field has a keyboard or similar input screen which pops up to obscure the save button, and once you start inputting you cannot simply dismiss the input screen. The ONLY field that lets you get at the ‘save’ button after you’ve finished inputting is ‘select card type’, which is really REALLY unintuitive because this is the first one that you fill in!

Interaction design is all about noticing these sorts of issues and solving them so users don’t notice them and have to solve them themselves.

If you like this you might also like Microwave Racing – this short, engaging video shows how interaction design can really affect even simple tasks that we do every day.

Reversing negative discourse on #ptsafety

I saw this on Twitter via @SteveAndrews3 and thought it was very clever. It particularly resonates with recent work we have been doing on understanding discourse around human error. This video is part of the Safe Care Campaign.

What ‘resilience strategies’ help medical professionals reduce error & improve performance?

***Please take part in our survey to support this research***

Resilience has broad connotations and is often associated with emotional resilience and the ability to bounce back in the face of adversity – here we might say a person can be resilient to dramatic events or incidents. These qualities are important for medical professionals but when we talk about ‘resilience strategies’ we mean something quite different.

‘Resilience strategies’ are the informal, unofficial and inventive ways that people choose to behave to reduce the likelihood of error and to improve their performance. We do not include the things people do when they are just following official rules. Resilience strategies are either novel and inventive things that people do, or things assumed to be the normal way of doing something in some context that are nevertheless not part of the formalised procedures.

For example, I do a four-point check before I leave the house; this is fairly normal practice for me now and reduces the likelihood of me forgetting my wallet, keys, phone or travel card. I know some people do similar and others don’t. I also write lists and set reminders on my phone to reduce the chance that I forget something, and I might leave something by the front door so I don’t forget it before I leave home. Other examples can be found in this stream:


When I was observing nurses give chemotherapy treatment as part of my research, they would double-check with each other to make sure that they had the right medication for the right patient. This helps reduce error, but it isn’t a resilience strategy because it is written into their formal procedures – they are just following the rules.

There were other things that I saw them do which weren’t written into the rules, which I’ve tried to capture here:

Furniss, D., Back, J. & Blandford, A. (2011). Unwritten Rules for Safety and Performance in an Oncology Day Care Unit: Testing the Resilience Markers Framework. Proc. Fourth Resilience Engineering Symposium.

Some of the things I noticed included:

  • Using the trolley and tray to organise work, rather than just as something to carry medication and equipment. This helped nurses check what medication they had and whether they had administered it all at the end of the treatment. The trolley and tray served to bound a unit of work so it could be monitored more easily.


  • The experienced nurses also monitored what people who were new to the department were doing, e.g. one healthcare worker was replacing equipment on a trolley and didn’t know that a certain brand of equipment was preferred over another because it had less chance of leaking.
  • When programming two infusion pumps in parallel, nurses would intentionally programme one fully and then the other, so there was less chance of confusing numbers and medication.

Other things I’ve noticed since that study include:

  • Sticking sample request stickers on patients’ doors when they are in their own side rooms, so when a clinician enters they are reminded of what samples need to be collected if an appropriate opportunity presents itself.
  • Letting some patients know how to silence their own infusion pump alarms when it’s considered convenient and safe to do so. Officially, no patient is meant to interact with their own infusion pump unless it is a special PCA pump. Note that some patients who have worked out how to control the devices themselves are discouraged from doing so when it is not safe, whether because of their medication, condition or mental state.

Recently on Twitter I have noticed nurses sharing other tips, which aren’t part of the official rules or procedures, and other nurses have benefitted from this advice, e.g.:

  • A nurse recently posted that her daughter was having a panic attack and asked for advice on what to do. A number of other nurses got back to her with different pieces of advice. The strategy that helped was putting an elastic band on the patient’s wrist, flicking it, and focusing on her breathing. The nurse who gave the advice said she had an arsenal of strategies to use in this circumstance, but that this was her favourite.
  • On a recent #SNTtwitchat, nurses were sharing advice for student nurses who were just starting. One piece of advice, which was well received, was to carry a small notepad around to note down conditions, treatments and medications, so the students can see what they need to read up on later.

This sort of informal knowledge sharing is great. I wonder whether it would be possible to share these sorts of resilience strategies and tips more formally, or at least to find out more about how this already goes on. If you have thoughts on this please share them below – this might include specific strategies, how to share them, how to organise them and pick the best ones, or just whether this sort of approach seems promising or not.

We have started a line of research to investigate this and we want to talk to nurses and other medical professionals about this work:

Further reading on resilience strategies:

Furniss, D., Back, J. & Blandford, A. Cognitive Resilience: Can we use Twitter to make strategies more tangible? Proc. ECCE 2012.

Furniss, D., Barber, N., Lyons, I., Eliasson, L. & Blandford, A. (in press). Unintentional Nonadherence: Can a spoonful of resilience help the medicine go down? BMJ Quality & Safety.

Can resilience strategies help with diabetes management?

To err is human…, but the proactive creation of resilience strategies to reduce the likelihood of error is human too.

Resilience strategies are those things we do to help us avoid error and recover from it. For example, this might be something like putting your umbrella by the front door so it’s not forgotten when you leave the house, setting a phone alarm so an appointment is not missed, or checking that you have your keys before leaving so you’re not locked out. Until recently no one had given this class of behaviours a common name; at UCL we have turned research efforts towards understanding these behaviours and started to think about their different applications.

“Can a spoonful of resilience help the medicine go down?” is the title of a paper that is currently in press for BMJ Quality & Safety. In this viewpoint paper we consider the application of resilience strategies to the problem of people forgetting to take their medication – a huge problem that has big financial and healthcare costs, and one that is under-researched.

Patients already adopt strategies to help them with adherence, e.g. setting phone reminders of when to take medication, having pills with dinner, and leaving medication in a place where they’ll see it as part of their daily routine. In our paper we propose studying these strategies more closely so we can understand them and share suitable ones with people who would benefit from them.

One of the nice things about this approach is that we turn to the community for the answers and seek to share the most useful practices. The community is therefore supporting itself, and will have real solutions for real problems they have in common. Even outside of medicine we do not always work out the best approach to an issue ourselves, and sometimes we can benefit from the advice of others. For example, when my girlfriend and I first met we lived in separate places, and I was frustrated because I would often wash my clothes and then forget them in the washing machine, so I’d need to wash them again. She suggested leaving the laundry basket out on the floor, or a tea towel in a telling place, so that it would remind me there’s stuff in there. It worked – no more forgetting (well, at least until my mum came round and tidied away the laundry basket). But the point is that we can share these strategies, improve practice and reduce error.

Could this approach be turned to benefit the diabetes community too?

One of the things that got me interested in this application is that someone at my girlfriend’s work puts her insulin pen in her lunch box, so she has it at the right time everyday. I thought this was a simple but clever strategy. It’s also a good example of a resilience strategy; it simplifies her life and reduces the likelihood of an error. I imagine that other people with diabetes might make simple mistakes every now and then too (we all do it), or maybe frustrating and fairly frequent mistakes like my washing machine example. I think that the same community could have simple and clever strategies that they could share with each other for reducing error and making life easier. We’re only just starting out with this research and we would love to hear your thoughts.

We are collecting resilience strategies and errors through Errordiary – you can either tweet including the hashtags #errordiary or #rsdiary depending on what you are posting, or you can post directly through the website. Whatever way you post, we suggest using the tag ‘#diabetes’ in the post so people can search for this string of letters on the website and find all the posts related to diabetes.

We are also recruiting for focus groups that will take place soon at UCL’s main campus. If you live in or around London and want to take part we would love to hear from you. Update: the focus groups were held and went well.

The survey is open and there is a prize pot of £150 to give away! Fill it in here.

A big Errordiary competition is coming soon (15th Oct); subscribe to the Errordiary mailing list to be one of the first to find out about these events.

Doctors, nurses and everyone makes mistakes. Can we talk about that?

I recently came across Dr Goldman’s TED talk titled “Doctors make mistakes. Can we talk about that?” It’s a great talk, and its sentiment fits perfectly with the research we are currently doing in and around Errordiary.

Dr Goldman paints a picture of healthcare where there are good apples (people that don’t make mistakes) and bad apples (people that make mistakes). The current way to improve the system is to get rid of the bad apples, so we’re only left with good apples, and hence no more mistakes. Of course no one wants to be classed as a bad apple, so when people do make mistakes they keep them quiet, bury them and hide them, because they don’t want to be labelled a bad apple.

Here’s the twist: everyone makes mistakes. Why are we talking about good and bad apples when it just encourages unfruitful discussion (pardon the pun) and forces learning underground? And why are we getting rid of so-called bad apples? Dr Goldman says that if we removed everyone who makes mistakes from healthcare there would be no one left! Martin Bromiley goes further to make a moving case for why it’s good to keep ‘bad apples’ in the system: “they can spread those very personal lessons on to their colleagues, and all of them will be much better clinicians as a result, and of that there is no doubt.”

Dr Goldman’s talk doesn’t just apply to doctors, but to nurses and everyone else too. Obviously it is more acceptable to admit your failures when you work outside of healthcare. In her TED talk, Kathryn Schulz is right when she says that in an abstract way we all know that making mistakes is part of being human. But what do we expect when doctors and nurses don their overalls and clock in for their shift? That’s right: welcome the super-humans! Except they’re not. There are no super-humans. Everyone makes mistakes.

Dr Goldman calls for a change in culture, to redefine doctors as humans that make mistakes, rather than super-humans that don’t. Healthcare professionals are human (shock horror), and this includes nurses too! His drive is not just so doctors and nurses can feel better about making the mistakes they would make anyway, but it is a drive for a better system. So stories are not hidden, messages are not buried, and learning is not driven underground.

Errors are absolutely ubiquitous, so what we need are error-tolerant systems and lots of learning, so that as errors and near misses happen we squeeze all the learning out of them. In this view errors are not only inevitable; they can also be a healthy part of the system, i.e. they teach people about what could go wrong so they are more resilient to these sorts of errors in the future. Much as our bodies are given a small dose of a vaccine to make them more resilient to a disease, so errors can make a system more resilient to accidents. It’s not only character building: these lessons can be invaluable, even though they come at a high cost.

After he spoke to thunderous applause and despite being mobbed by well-wishers, Sully gave me a 27-minute lecture on patient safety I won’t soon forget.

“Everything that we know in aviation, every procedure that we have, every rule in the book, every technique that we have, ultimately is because someone somewhere died,” Sully told me.

“What we have learned are lessons purchased at great cost – many of them literally bought with blood.”

Excerpt from Brian Goldman’s article, read in full here:

This request for a change in culture is essentially a change from a blame culture to a learning, or just, culture. This change means that people in the system are still accountable, but there is more emphasis on finding improvements and learning than on finding whose fault it was. It also means running an organisation where people are encouraged to share their errors rather than hide them – to talk about them more.

So, can we talk about errors more, whether we are a doctor, a nurse, a patient or a member of the general public? The theory says that we would learn more and we’d develop a safer system in the long term. However, there are real challenges with this request, as doctors and nurses fear discipline, being sued and being labelled a bad apple. They’re not alone in this: many professionals outside of healthcare also do not want to talk about errors because they feel it might undermine their credibility. Also, even if doctors and nurses wanted to talk about their errors more, perhaps the public would just rather not know about them. There is a desire to maintain the belief that the healthcare systems we depend upon are faultless, or close to it – why worry us if we can’t do anything about it?

Talking about errors, and raising this sort of debate is something we are striving to do through Errordiary. Please have a look at the site, register and get involved.

At the moment we are recruiting for focus groups to shed light on some of these issues – we want to learn from people’s views on this. If you live in or around London and would like to take part, we’d be interested to hear from you. More details can be found here: