Lack of Formal Documentation – Not a Root Cause

When conducting root cause analysis, “Lack of formal documentation” is a suggested root cause I have often come across. It seems superficially like a good, actionable root cause: let’s get some formal documentation of our process in place. But I always ask, “Will formally documenting the process stop the issue from recurring?” What if people don’t follow the formally documented process? What if the existing process is poor and we are simply documenting it? It might help, of course. But it can’t be the only answer. Which means this is not the root cause – or at least it’s not the only root cause.

When reviewing a process, I always start by asking those in the process what exactly they do and why. They will tell you what really happens. Warts and all. When you send the request but never get a response back. When the form is returned but the signature doesn’t match the name. When someone goes on vacation with their work in process and no-one knows what’s been done or what’s next. Then I take a look at the Standard Operating Procedure (SOP), if there is one. It never matches.

So, if we get the SOP to match the actual process, our problems will go away, won’t they? Of course not. You don’t only need a clearly defined process. You need people who know the process and follow it. And you also want the defined process to be good. You want it carefully thought through and the ways it might fail considered. You can then build an effective process – one that is designed to handle the possible failures. And there is a great tool for this – Failure Mode and Effects Analysis (FMEA). Those getting used to Quality-Based Risk Management as part of implementing section 5.0 of ICH E6 (R2) will recognise the approach of scoring risks by Likelihood, Impact and Detectability. FMEA takes you through each of the process steps to develop your list of risks and prioritise them prior to modifying the process to make it more robust. This is true preventive action: trying to foresee issues and stop them from ever occurring. If you send a request but don’t get a response back, why might that be? Could the request have gone into spam? Could it have gone to the wrong person? How might you handle it? Etc. Etc.
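To make the FMEA scoring concrete, here is a minimal sketch in Python. The failure modes, the 1–5 scales and the scores are hypothetical examples (borrowing the “request with no response” scenario from above), not from any real FMEA; the Risk Priority Number (RPN) shown is the common Likelihood × Impact × Detectability product.

```python
# Hypothetical FMEA sketch: each failure mode is scored 1-5 for
# Likelihood, Impact and Detectability (a higher Detectability score
# means the failure is HARDER to detect).
failure_modes = [
    # (description, likelihood, impact, detectability)
    ("Request goes to spam folder", 4, 3, 4),
    ("Request sent to wrong person", 2, 3, 3),
    ("Recipient on vacation, no backup", 3, 4, 2),
]

def risk_priority_number(likelihood, impact, detectability):
    """RPN: the usual FMEA prioritisation score."""
    return likelihood * impact * detectability

# Rank the failure modes so process changes target the riskiest first.
ranked = sorted(
    failure_modes,
    key=lambda fm: risk_priority_number(*fm[1:]),
    reverse=True,
)

for description, l, i, d in ranked:
    print(f"RPN {risk_priority_number(l, i, d):3d}: {description}")
```

With these made-up scores, the hard-to-detect spam-folder failure ranks highest, which is exactly the kind of failure mode you would then design the process to handle.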

Rather than the lack of a formal documented process being a root cause, it’s more likely that there is a lack of a well-designed and consistently applied process. And the action should be to agree the process and then work through how it might fail to develop a robust process. Then document that robust process and make sure it is followed. And, of course, monitor the process for failures so you can continuously improve. Perhaps more easily said than done. But better to work on that than spend time formally documenting a failing process and think you’ve fixed the problem.

Here are more of my blog posts on root cause analysis where I describe a better approach than Five Whys. Got questions or comments? Interested in training options? Contact me.

 

Text: © 2019 DMPI Ltd. All rights reserved.

Image: Standard Operating Procedures – State Dept, Bill Ahrendt

Please FDA – Retraining is NOT the Answer!

The FDA has recently issued a draft Q&A Guidance Document on “A Risk-Based Approach to Monitoring of Clinical Investigations”. Definitely worth taking a look. There are 8 questions and answers. Two that caught my eye:

Q2. “Should sponsors monitor only risks that are important and likely to occur?”

The answer mentions that sponsors should also “consider monitoring risks that are less likely to occur but could have a significant impact on the investigation quality.” These are the High Impact, Low Probability events that I talked about in this post. The simple model of calculating risk by multiplying Impact and Probability essentially prioritises a High Impact, Low Probability event the same as a Low Impact, High Probability event. But many experts in risk management think these should not be prioritised equally: High Impact, Low Probability events should be prioritised higher. So I think this is a really interesting answer.
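A tiny worked example makes the point. The impact and probability values below are made up for illustration, and the impact-squared weighting is just one possible adjustment, not something from the FDA guidance:

```python
# Two hypothetical risks with the same simple Impact x Probability score.
risks = {
    "High Impact, Low Probability": {"impact": 5, "probability": 1},
    "Low Impact, High Probability": {"impact": 1, "probability": 5},
}

# The simple product cannot tell them apart: both score 5.
simple_scores = {
    name: r["impact"] * r["probability"] for name, r in risks.items()
}
print(simple_scores)

# One possible (hypothetical) fix: weight impact more heavily, so the
# High Impact, Low Probability event rises to the top of the list.
weighted_scores = {
    name: r["impact"] ** 2 * r["probability"] for name, r in risks.items()
}
print(weighted_scores)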

Q7. “How should sponsors follow up on significant issues identified through monitoring, including communication of such issues?”

One part of the answer here has left me aghast. “…some examples of corrective and preventive actions that may be needed include retraining…” I have helped investigate issues in clinical trials so many times, and run root cause analysis training again and again. I always tell people that retraining is not a corrective action. Corrective actions should be based on the root cause(s). See a previous post on this and the confusing terminology. If you think someone needs retraining, ask yourself “why?” Could it be:

      • They were trained but didn’t follow the training. Why? Could it be that one or more of the Behavior Engineering Model categories was not supported, e.g. they didn’t have time, they didn’t have the right tools, they weren’t provided with regular feedback to tell them how they were doing? Etc. If it’s one of these, then focus on that. Retraining will not be effective.
      • They haven’t ever received training. Why? Maybe they were absent when the rest of the staff was trained and there was no plan to make sure they caught up later. They don’t need retraining – they were never trained. They need training. And is it possible that there might be others in this situation? Who else might have missed training and needs training now? Maybe at other sites too.
      • There was something missing from the training (as looks increasingly likely as one possible root cause in the tragic case of the Boeing 737 Max). Then the training needs to be modified. And it’s not about retraining one person or one site on training they had already received. It’s about training everyone on the revised training. Of course, later on, you might want to try to understand why an important component was missing from the training in the first place.

I firmly believe retraining is never the answer. There must be something deeper going on. If your only action is retraining, then you’ve not got to the root cause. I can accept reminding as an immediate action – but it’s not based on a root cause. It is more about providing feedback and is only going to have a short-term effect. An elephant may never forget but people do.

Got questions or comments? Interested in training options? Contact me.

 

Text: © 2019 DMPI Ltd. All rights reserved.

Beyond Human Error

One of my most frequently viewed posts is on human errors. I am intrigued by this. I’ve run training on root cause analysis a number of times and occasionally someone will question my claim that human error is not a root cause. Of course, it may be on the chain of cause-and-effect but why did the error occur? And you can be sure it’s not the first time the error has occurred – so why has it occurred on other occasions? What could be done to make the error less likely to occur? Using this line of questioning is how we can make process improvements and learn from things that go wrong rather than just blame someone for making a mistake and “re-training” them.

There is another approach to errors which I rather like. I was introduced to it by SAM Sather of Clinical Pathways. It comes from Gilbert’s Behavior Engineering Model and provides six categories that need to be in place to support the performance of an individual in a system:

Category – Example questions:

      • Expectations & Feedback – Is there a standard for the work? Is there regular feedback?
      • Tools, Resources – Is there enough time to perform well? Are the right tools in place?
      • Incentives & Disincentives – Are incentives contingent on good performance?
      • Knowledge & Skills – Is there a lack of knowledge or skill for the tasks?
      • Capacity & Readiness – Are people the right match for the tasks?
      • Motives & Preferences – Is there recognition of work well done?

 

Let’s take an example I’ve used a number of times: getting documents into the TMF. As you consider Gilbert’s Behavior Engineering Model you might ask:

    • Do those submitting documents know what the quality standard is?
    • Do they have time to perform the task well? Does the system help them to get it right first time?
    • Are there any incentives for performing well?
    • Do they know how to submit documents accurately?
    • Are they detail-oriented and likely to get it right?
    • Does the team celebrate success?

I have seen TMF systems where most of the answers to those questions are “no”. Is it any wonder that there are rejection rates of 15%, cycle times of many weeks and TMFs that are never truly “inspection ready”?

After all, “if you always do what you’ve always done, you will always get what you’ve always got”. Time to change approach? Let’s get beyond human error.

Got questions or comments? Interested in training options? Contact me.

 

Text: © 2019 DMPI Ltd. All rights reserved.

DIGR-ACT® is a registered trademark of Dorricott Metrics & Process Improvement Ltd.

Picture: Based on Gilbert’s Behavior Engineering Model

Searching For Unicorns

I read recently that we have reached “peak unicorn”. I wonder if that is true. I joined a breakout discussion at SCOPE in Florida last month entitled “RBM and Critical Reasoning Skills” and the discussion shifted to unicorns. The discussion was about how difficult it is to find people with the right skills and experience for central monitoring. They need to understand the data and the systems. They need to have an understanding of processes at investigator sites. And they need to have the critical reasoning skills to make sense of everything they are seeing, to dig into the data and to escalate concerns to a broader group for consideration. Perhaps this is why our discussion turned to unicorns – these are people who are perhaps impossible to find.

It does, though, strike me in our industry how much we focus on the need for experience. Experience can be very valuable, of course, but it can also lead to “old” ways of thinking without the constant refreshing of a curious mind, new situations and people. And surely we don’t have to just rely on experience? Can’t we train people as well? After all, training is more than reading SOPs and having that recorded in your training record for auditors to check. It should be more than just the “how” for your current role. It should give you some idea of the “why” too and even improve your skills. I asked the group in the breakout discussion whether they thought critical reasoning skills can be taught – or do they come only from experience? Or are they simply innate? The group seemed to think it was rather a mixture, but the people who excel at this are those who are curious – who want to know more. Those who don’t accept everything at face value.

If we can help to develop people’s skills in critical reasoning, what training is available? Five Whys is often mentioned. I’ve written about some of the pitfalls of Five Whys previously. I’m excited to announce that I’ve been working with SAM Sather of Clinical Pathways to develop a training course to help people with those critical thinking skills. We see this as a gap in the industry and have developed a new, synthesized approach to help. If you’re interested in finding out more, go to www.digract.com.

Unfortunately, looking for real unicorns is a rather fruitless exercise. But by focusing on skills, perhaps we can help to train future central monitors in the new ways they need to think as they are presented with more and more data. And then we can leave the unicorns to fairy tales!

 

Text: © 2019 DMPI Ltd. All rights reserved.

Deliver Us From Delivery Errors

I returned home recently to find two packages waiting for me. They had been delivered while I was out. One was something I was expecting. The other was not – it was addressed to someone else. And at a completely different address (except the house number). How did that happen, I wondered? I called the courier company. After waiting 15 minutes to get through, the representative listened to the problem and was clearly perplexed as the item had been signed for on the system. Eventually he started, “Here’s what I can do for you…” and went on to explain how they could pick it up and deliver it to the right address. Problem solved.

Except that it caused me inconvenience (a 20-minute call) for which no apology ever came. Their customer did not receive the service they paid for (the package would now be late). The package was put at risk – I could have kept it and no-one would have known. There was no effort at trying to understand how the error was made. They seemed too busy for this. It has damaged their reputation – I would certainly not use that delivery firm. It was simply seen as a problem to resolve. Not an opportunity to improve.

The next day, a neighbour came round to hand over a mis-delivered parcel. You guessed it – the same courier company had delivered a separate package, one meant for us, to a neighbour. It’s great our neighbour brought it round. But the company will never hear of that error.

So many learnings from this! If the company was customer-focused, they would really want to understand how such errors occur (by carrying out root cause analysis). And they would want to learn from the problems rather than just resolving each one individually. They should take a systemic approach. They should also consider that the data they hold on the number of errors (mis-deliveries in this case) is incomplete. Helpful people sort out mis-deliveries for them every day without them even knowing. When they review data on the scale of the problem, they should be aware that their data is an underestimate. And as for customer service, I can’t believe I didn’t even get a “sorry for the inconvenience”. According to a recent UK survey, 20% of people have had a parcel lost during delivery in the last 12 months. This is, after all, a critical error. Any decent company would want to really understand the issue and put systems in place to try to prevent future issues.

To me, this smacks of a culture of cost-cutting and lack of customer focus. Without a culture of continuous improvement, they will lose ground against their competitors. I have dealt with other courier companies and some of them are really on the ball. Let’s hope their management realises they need to change sooner rather than later…

 

Text: © 2018 Dorricott MPI Ltd. All rights reserved.

Don’t Waste a Good Mistake…Learn From It

Everyone is so busy. There’s not enough time to even think! This seems to be a challenge in many areas of business – we expect more and more from fewer people. Tom DeMarco describes this situation in his book “Slack” which I have recently re-read. And I think he’s on to something when he quotes “Lister’s Law – People under time pressure don’t think faster.” And of course, that’s right. Put people under time pressure and they will try to cut out wasted time. And they can re-prioritize so they spend more time on that task. They can work longer hours. But eventually, there is a limit and so people start to take cognitive short-cuts…”this problem is the same as one I’ve encountered before and so the solution must be the same”. Of course, that might be the right conclusion but if you don’t have the available time to interrogate it a little further then you run the risk of implementing the wrong solution and even making the problem worse.

One of the reasons I often hear as to why people don’t do root cause analysis is that they don’t have the time. People don’t want to be seen analysing a problem – much better to be taking action. But what if the action is the wrong action and is not based on the root cause? If the action is “re-training” you can be sure no-one has taken the time to really understand why the problem occurred. Having a good method you can rely on is part of the battle (I suggest DIGR® of course). But even knowing how is no good if you simply don’t have the time. Not having the time is ultimately a management issue. If managers asked “why” questions more and encouraged their staff to take time to think, question and get to root cause rather than rushing to a short-term fix, we would have true learning.

If we are not learning from things that go wrong to try to stop them recurring, then we have missed an opportunity. If the culture of an organization is one of learning and improvement, then management must support staff with the right encouragement to understand, and with good tools. But above all, they must provide the time to really understand an issue, get to root cause and implement actions to try to stop recurrence. And if your manager isn’t providing time and encouraging you in this, challenge them on it – and get them to read this blog!

As Robert Kiyosaki said “Don’t waste a good mistake…learn from it.”

 

Text: © 2018 Dorricott MPI Ltd. All rights reserved.

DIGR® is a registered trademark of Dorricott Metrics & Process Improvement Ltd.

Is more QC ever the right answer? Part II

In part I of this post, I described how some processes, by adding a QC step, can end up being the worst of all worlds – they take longer, cost more and give quality the same as (or worse than) a one-step process. So why would anyone implement a process like this? Because “two sets of eyes are better than one!”

What might a learning approach with better quality and improved efficiency look like? I would suggest this:

In this process, we have a QC role and the person performing that role takes a risk-based approach to sampling the work and works together with the Specialist to improve the process by revising definitions, training etc. The sampling might be 100% for a Specialist who has not carried out the task previously. But would then reduce down to low levels as the Specialist demonstrates competence. The Specialist is now accountable for their work – all outputs come from them. If a high level of errors is found then an escalation process is needed to contain the issue and get to root cause (see previous posts). You would also want to gather data about the typical errors seen during the QC role and plot them (Pareto charts are ideal for this) to help focus on where to develop the process further.
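The sampling rule described above can be sketched in a few lines. This is illustrative only: the thresholds, error counts and error categories below are hypothetical, not from the post, but the shape is the one described – 100% checking for a new Specialist, stepping down as competence is demonstrated, escalating back up if errors climb, and tallying error types for a Pareto view.

```python
from collections import Counter

def sampling_rate(items_checked, errors_found):
    """Hypothetical risk-based QC sampling rule."""
    if items_checked < 50:
        return 1.0  # new to the task: check everything
    error_rate = errors_found / items_checked
    if error_rate > 0.05:
        return 1.0  # escalate and contain: back to full checking
    if error_rate > 0.01:
        return 0.25
    return 0.05  # demonstrated competence: spot-check only

# Tallying error types found during QC for a Pareto view
# (made-up counts for a TMF metadata-entry task):
errors = [
    "wrong document type", "wrong country", "wrong document type",
    "missing investigator", "wrong document type",
]
pareto = Counter(errors).most_common()
print(pareto)  # most frequent error type first
```

With real data, plotting those counts as a Pareto chart shows at a glance which error type to tackle first when developing the process further.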

This may remind you of the move away from 100% Source Document Verification (SDV) at sites. The challenge with a change like this is that the process is not as simple – it requires more “thinking”. What do you do if you find a certain level of errors? This is where the reviewer (or the CRA in the case of SDV) needs a different approach. It can be a challenge to implement properly. But it should actually make the job more interesting.

So, back to the original question: is more QC ever the answer? Sometimes – but make sure you think through the consequences and look for other options first.

In my next post, I’ll talk about a problem I come across again and again. People don’t seem to have enough time to think! How can you carry out effective root cause analysis or improve processes without the time to think?

Text: © 2018 Dorricott MPI Ltd. All rights reserved.

Is More QC Ever the Right Answer? Part I

In a previous post, I discussed whether retraining is ever a good answer to an issue. Short answer – NO! So what about that other common one of adding more QC?

An easy corrective action to put in place is to add more QC. Get someone else to check. In reality, this is often a band-aid because you haven’t got to the root cause and are not able to tackle it directly. So you’re relying on catching errors rather than stopping them from happening in the first place. You’re not trying for “right first time” or “quality by design”.

“Two sets of eyes are better than one!” is the common defence of multiple layers of QC. After all, if someone misses an error, someone else might find it. Sounds plausible. And it does make sense for processes that occur infrequently and have unique outputs (like a Clinical Study Report). But for processes that repeat rapidly this approach becomes highly inefficient and ineffective. Consider a process like that below:

Specialist I carries out work in the process – perhaps entering metadata in relation to a scanned document (investigator, country, document type etc.). They check their work and modify it if they see errors. Then they pass it on to Specialist II, who checks it and modifies it if they see any errors. Then Specialist II passes it on to the next step. Two sets of eyes. What are the problems with this approach?

  1. It takes a long time. The two steps have to be carried out in series i.e. Specialist II can’t QC the same item at the same time as Specialist I. Everything goes through two steps and a backlog forms between the Specialists. This means it takes much longer to get to the output.
  2. It is expensive. A whole process develops around managing the workflow with some items fast-tracked due to impending audit. It takes the time of two people (plus management) to carry out the task. More resources means more money.
  3. The quality is not improved. This may seem odd, but think it through. There is no feedback loop in the process for Specialist I to learn from any errors that escape to Specialist II, so Specialist I continues to let those errors pass. And Specialist II will also make errors – in fact, the rework they do might actually add more errors. They may not agree on what is an error. This is not a learning process. And what if the process is under stress due to lack of resources and tight timelines? With people rushing, do they check properly? Specialist I knows that Specialist II will pick up any errors so doesn’t check thoroughly. And Specialist II knows that Specialist I always checks their work so doesn’t check thoroughly. And so more errors come out than if Specialist II had not been there at all. Having everything go through a second QC as part of the process takes away accountability from the primary worker (Specialist I).
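The effect in point 3 can be illustrated with some simple arithmetic. The catch rates below are hypothetical numbers chosen to make the point, not measurements: one accountable Specialist who checks carefully can beat two Specialists in series who each check less thoroughly because they rely on one another.

```python
def escape_rate(catch_rates):
    """Fraction of errors that slip past every check in the series."""
    missed = 1.0
    for rate in catch_rates:
        missed *= (1.0 - rate)
    return missed

# One careful, accountable checker catching 95% of errors:
one_careful = escape_rate([0.95])        # 5% of errors escape

# Two checkers, each relying on the other and catching only 70%:
two_relaxed = escape_rate([0.70, 0.70])  # 9% of errors escape

print(f"One careful checker: {one_careful:.0%} escape")
print(f"Two relaxed checkers: {two_relaxed:.0%} escape")
```

The arithmetic of “two sets of eyes” only works if both sets keep looking as hard as one accountable pair would – which, as argued above, is exactly what tends not to happen.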

So let’s recap. A process like this takes longer, costs more and gives quality the same (or worse) than a one step process. So why would anyone implement a process like this? Because “two sets of eyes are better than one!”

What might a learning approach with better quality and improved efficiency look like? I will propose an approach in my next post. As a hint, it’s risk-based!

Text: © 2018 Dorricott MPI Ltd. All rights reserved.

I Must Do Better Next Time

I was interviewed recently by LMK Clinical Research Consulting (podcast here). I was intrigued when in the interviewer’s introduction, he said that from reading my blog he knew that I “have a fundamentally positive outlook with how humans interact with systems”. I suppose that’s true but I’d not thought of it that way before. I do often quote W. Edwards Deming: “Nobody comes to work to do a bad job” and “A bad system will beat a good person every time”. The approach is really one of process thinking – it’s not that people don’t matter in processes, they are crucial. But processes should be designed to take account of the variation in how people work. They should be designed around the people using them. No point blaming the individual when things go wrong – time to learn and try to stop it going wrong next time. I wrote previously about the dangers of a culture of blame from the perspective of getting to root cause. Blame is corrosive. Most people don’t want to open up in an environment where people are looking for a scapegoat – so your chance of getting to root cause is much less.

Approaching blame in this way has an interesting effect on me. When things go wrong in everyday life, my starting point isn’t to blame myself (or someone else) but rather to think “why did that go wrong?” A simple everyday example…I was purchasing petrol (“gas” in American English) and there were two card readers at the till. The retailer asked me to put my card in – which I did. He immediately said “No – not that one!” So, I took it out and put it in the other one. “That’s pretty confusing having two of them,” I said. To which he replied, “no it’s not!” I can see how it’s not confusing to him because he is using the system every day, but to me it was definitely confusing. I don’t think he was particularly interested in my logic on this, so I paid and said “Good-bye”. Of course, I don’t know why he had two card readers out – what was the root cause? But even without knowing the root cause, he certainly could have put a simple correction in place by telling me which card reader to put my card into.

There’s no question, we can all learn from our mistakes and we should take responsibility for them. But perhaps by extending the idea of no blame to ourselves, we can focus on what we can do to improve rather than simply thinking “I must do better next time.”

 

Text: © 2018 Dorricott MPI Ltd. All rights reserved.

Get rid of plastic packaging – are you mad?

There has been much in the UK media recently about the need to eliminate plastic packaging. The shocking pictures from the BBC series “Blue Planet” showing just how much plastic ends up in our oceans has been a wake-up call. We have to do something to fix this. And it seems that the solution is obvious – we even have 200 members of parliament writing to major supermarkets calling for plastic-free aisles. Let’s rid the world of plastic packaging. But I worry that we are in danger of “throwing the baby out with the bath water.”

It is important to understand what the problem is first – what is the problem we are trying to solve? Then let’s investigate the problem and see if we can understand the root causes. After that, we can focus our solutions on the root causes. This is the most efficient and effective way to solve problems. Jumping straight to solutions without even understanding the problem risks unfocused, inefficient actions and unintended consequences.

So what is the problem? Too much plastic in the oceans. What sort of plastic? Mostly packaging. Where does it come from? Mainly 10 rivers – 8 in Asia and 2 in Africa. Why from those rivers and not other ones? They pass through very populated areas where there is limited collection and even less recycling of plastic waste. Of course, the reasons for this limited collection and recycling are many and varied, but by focusing on those we have a good chance of having a real impact on the problem and reducing the new plastic going into the oceans. We could also work on ways to try to reduce the plastic that is already there and reduce our use of unnecessary plastic packaging such as bottled water and plastic-coated single-use coffee cups.

But – the focus in the media and by the MPs seems to be on getting rid of plastic packaging in the UK altogether. Given what the problem is and the source for much of the plastics in the oceans, getting rid of plastic packaging in the UK seems an odd solution. It doesn’t appear to be focused on the root cause(s). And, of course, it does not consider the unintended consequences. In many circumstances, plastics are the most effective and efficient type of packaging. Using films and modified atmosphere packaging to wrap fresh meats can more than double shelf life. Cucumbers can last weeks rather than days when shrink-wrapped. These huge increases in shelf life mean much more efficient supply chains with larger, more efficient production runs, fewer deliveries, less stock rotation and, most importantly, much less waste from farm to plate. 1/3 of food in the UK is thrown away – food that uses resources to be grown, processed and transported. Plastics, when used appropriately and handled well at end-of-life, are a real boon to the environment by substantially reducing food waste.

We must try to reduce packaging to a minimum – reduce, reuse, recycle. But let’s define the problem first before we go rushing off into seemingly simple, populist solutions that may have unintended consequences. The first step in solving a problem is to define the problem. Then try to understand the root cause(s) and develop solutions focused on those root cause(s).

Packaging has an important job to do. And plastic packaging plays a very important role in keeping food waste down.

Let’s not get rid of plastic packaging!

 

Text: © 2018 Dorricott MPI Ltd. All rights reserved.