Thanks as always for the feedback on previous posts, and welcome to our new subscribers. Writing the last couple of weeks’ posts reminded me that we have been having the same conversation around training for many years… It goes something like this:
“But we must do the face-to-face because the subject contains practical elements” (insert here any subject, such as Administering Medication, Moving and Handling, Buccal Midazolam, or even the Care Certificate).
I think the reason this conversation has come back into focus over recent weeks is the guidance from Skills for Care and CQC, which we discussed last week, around using digital methods for delivering and assessing learning. As a result, the finance people are saying “this is great, we can save money by using eLearning”, but the training people are saying “no, we can’t do this, because…”. Alongside this, we are in the midst of a massive change as a result of current circumstances, yet another reason why this conversation has come back into focus.
So let’s break it down…
Why do the training people feel so strongly about the face-to-face delivery? In my experience, it is because they care about the outcome, they care about the people they support and they generally want what is best – and we agree.
The second reason the training people feel so strongly is that there is some truth in the old adage of “if it ain’t broke, don’t fix it”: we have always delivered learning in this way, we know it and it works!
Lastly, there is a lot to be said for everyone getting together in a training room, having discussions, sharing experiences, learning and building connections with their colleagues, and again we agree. One of the critical elements in Lead to Succeed Module Two – “Developing Positive Culture” – is a culture that values learning and sharing experiences, and face-to-face is a great way of building this.
You may have noticed a potential challenge here, in particular with the final point, given the current Skills for Care guidance (September 2020; see the screenshot from the Skills for Care webinar presentation). Whilst it would be nice to get everyone together in a room, sharing examples and having conversations, that is far from possible at the moment, which means we have to look at things from a different perspective.
Could vs Can
This is something I plan to blog about more fully in the future. Briefly, though, it is a mindset thing. What we can do is always limited by time, money and resources, and that can sometimes limit our thinking, e.g. “we can’t do eLearning because staff don’t like it” or “we can’t do x or y because it will take too much time”. You get the idea.
But what we could do is creative and can facilitate a mindset shift. What could we do? Well, if our mindset is positive, we could do anything. Whilst that might take a bit of effort or COVID creativity, it does shift our thinking into a space where we might just come up with something that we can do.
Things change and, if we are open to all possibilities, then who knows what might happen!
Time to capitalise on the COVID Creativity.
Hopefully you saw our recent posts, “Lockdown Learning” and “But eLearning Does Not Work”. With a little bit of effort and a bit of a nudge, we found that those people who don’t like eLearning can actually use technology. And, with a little more effort, we found that we could deliver learning using different methods, whilst still achieving the evidence we need for CQC, etc.
The new normal means we cannot carry on as we were and things do need to change.
So let’s go back to the other two points we raised earlier: “if it ain’t broke, don’t fix it” and people caring about the outcome.
We are not suggesting that you should get rid of face-to-face training altogether, but let’s at least be open to what we could do. Saying “if it ain’t broke, don’t fix it” is almost as insipid as “we have always done things this way”.
Let’s take a moment to go over the essentials
The key thing is to be clear about what we are trying to achieve, or what it is we actually need – i.e. the reason we are doing the training in the first place. The purpose of training is to facilitate learning: the stuff you take away from the training and put into practice. As we have said before, our “Know, Understand and Do” methodology is all about putting learning into practice.
What we need (especially to meet CQC Regulations 17 and 18) are safe and competent staff with skills relevant to the needs of the people we support. Regulation 18 does not say “everyone must be in date”; it says that staff need to be competent.
So does delivering training mean that staff are competent? Clearly the answer is NO. Does the fact that they have a certificate mean they are competent? NO. If the training matrix is up to date and everyone is “green”, does that make them competent? NO.
So does delivering the training give us what we need? Partly, yes, but the method of delivering the learning is something we could look at. If we change the method but achieve similar or better results, does that give us what we need? Quite possibly, yes.
Think of it this way: there has been a medication error, so everyone attends training. Working backwards slightly, in order to be administering medication in the first place, the person has in all likelihood already completed medication administration training, but something still went wrong. Is attending a second time going to fix the issue, when attending the first time did not deliver the learning and practice required?
This reminds me of an article Professor Martin Green wrote back in 2018, in which he outlines that the NHS spends £100,000 per MINUTE on training (just let that sink in for a second). Yet, despite spending £100k a minute on training, we still have incidents like Mid Staffordshire Hospital and the resulting Francis Report and Cavendish Review. In the article, he essentially makes the same point that I hope I am making: training is not the panacea, or fix-all solution, when there is an issue. So, it is all very well to say “if it ain’t broke, don’t fix it”, but is that really true?
Lastly, the training people care about the outcome, and so do we – passionately; it is why our mission is to Improve Lives Through Learning. However, if we care about the outcome, then the method for achieving it should not be an issue, as long as we achieve the safe and competent staff we need to support the people we support.
What if there was a way to evidence what staff do and don’t know? Briefly, practice is inextricably linked to knowledge: if staff know how to do something, the chances of them actually doing it increase dramatically. The flip side, of course, is that you simply cannot apply in practice something you don’t know.
So, if we take a cross-section of any group of staff, you will have a mixture of people with years of experience to draw on, some new staff and some staff who fit somewhere in the middle of that range. BUT we deliver the same course to all of those people. I often hear managers say, “we need to be sure that staff know how to do it our way”. I do get that, but if you deliver one course to the diverse audience I have outlined, will it really facilitate learning and, by extension, translate into application in day-to-day practice? For some, maybe; for others, possibly not.
So, if we could establish the knowledge levels within the staff group and then focus our efforts accordingly, that has to be more effective – and possibly cheaper too.
Think of the experienced people I just described: if they could quickly and easily benchmark themselves against an agreed set of learning outcomes, you would naturally build a portfolio (and evidence for CQC) showing that a particular member of staff has the knowledge. If we then add to that real-world observations of practice, we achieve what CQC are looking for – “safe and competent staff”.
Take the new staff who, in theory, know nothing because they have not attended the training before. A useful by-product of eLearning is using it to prepare them for the course, so the course is less daunting because they already know the essentials: they have heard the terminology and have at least some level of awareness. Therefore, during the face-to-face session, they can contribute more fully and potentially not be overwhelmed by the topic.
Lastly, the people in the middle. By finding out what they already do or don’t know, as the case may be, we might find that person X really needs more training and person Y does not, so WHY send both on the course?
Given that time, money and resources are under extreme pressure at the moment, finding out who NEEDS the training has to be the way forward. Train the people that need it, but build competence portfolios for everyone as part of the process, and focus our stretched budget where it is needed most.
Click’s Smart Assessment does exactly this. Throughout the course, learners create a person-centred development plan which allows them to check their knowledge and access the learning at any stage of the course. In addition, they can build an evidence portfolio to showcase the best bits of the practical or face-to-face activity they have also completed – a digital solution to support you in delivering blended learning.