In The News
It was Alex Burghart who, as skills minister, set the target of a 67 per cent overall achievement rate for apprenticeship standards, back in June 2022.
Figures released subsequently have revealed that the achievement rate for standards fell from 51.8 per cent in 2020/21 to 51.4 per cent in 2021/22. The overall achievement rate, covering both apprenticeship standards and the old-fashioned apprenticeship frameworks, fell from 57.7 per cent to 53.4 per cent.
As FE Week reported, this means that three-quarters of apprenticeship providers were below the target. Undeterred by the stubborn rate, Burghart’s successor Robert Halfon used a letter to apprenticeship providers earlier this year to restate the aim of achieving 67 per cent by 2024/25.
So how is the government helping providers achieve that target? Halfon’s letter, published in March, set out several initiatives to support improvements to the achievement rate.
This included an ‘Apprentice Support Centre’ that would collate information on the support available for apprentices in one location. The Institute for Apprenticeships and Technical Education, the letter continued, was also carrying out a series of exceptional funding band reviews for certain standards.
However, IfATE chief executive Jennifer Coupland told the FE Week Annual Apprenticeship Conference in March that only half of the 20 apprenticeships in scope for the reviews would go through the process, which was meant to conclude in April.
However, are achievement rates the metric we ought to be measuring apprenticeship providers against? When Burghart set out the aim at the Association of Employment and Learning Providers’ national conference, the sector expressed its doubts about the achievability and suitability of the objective.
“There is a danger that we are trying to compare frameworks to standards when they are not like for like,” AELP chief executive Jane Hickie said.
When the 2021/22 achievement rate data was released in March, Hickie further said the way rates are calculated is “out of date” and the government should instead “focus on outcomes, not outputs”. This includes tracking learner progression and earnings following an apprenticeship.
Despite this, ministers seem determined to push providers up the hill towards the 67 per cent target.
Interview
This month, MBKB's business director and CEO Mark Bremner gives us his provider's experience of achievement rates, and explains how the system of measuring success needs to change if ministers hope to raise the rate above 70 per cent.
How does MBKB support apprentices to finish their programme of training?
We have a completely flexible and tailored learner journey that is reviewed every three months.
That allows us to take into account their progress towards end-point assessment and also address any emerging threats or challenges throughout the programme.
We'll give them indicative grades six months before gateway, so that they're aware of what they're working towards.
We also put in place action plans or extra support throughout the programme, if they need it.
Some standards have professional qualifications embedded within them. We know that can affect achievement as apprentices will maybe complete the professional part of it and then leave the apprenticeship.
It can be an issue. We point out that the apprenticeship is the driving licence, if you like: it proves they can actually do the job, whereas the qualification just proves they know how to do the job.
We also make clear that it will affect the employer's and our success rate if they only do the certificate element.
I did have a conversation with another provider this week about that and they were discussing the potential of threatening the employer with a 20 per cent invoice if the apprentice pulls out after only achieving the certificate.
I don't think that's the way to go. But it was an interesting take on these things. I think it's about transparency at the beginning, and ensuring the apprentices and managers see the value of both elements.
What are some of the factors that cause apprentices to abandon training and how have breaks in learning affected dropouts?
Some of the things that affect dropouts, first and foremost, are work pressures and the fact that many organisations appear to be short-staffed.
The lack of value that I think is sometimes attached to the apprenticeship is another factor, where the training is mandated by the organisation but it’s not something individuals want to use. We do try to get around that by tailoring the programmes and demonstrating the benefits.
Our dropout rate is pretty low in comparison to the sector and our peers.
Breaks in learning can be disruptive and are not always the best option.
I've never known so many breaks in learning as now. There are two reasons for that. The first is that there's more recognition of factors such as mental health, which is a positive thing overall. In terms of the apprenticeships, we can add in additional modules that deal with resilience and so on to try and assist.
Whereas historically you would be on a break in learning (BIL) for a longer-term reason such as maternity leave, or perhaps a physical condition like a broken leg, there seem to be many ’shorter’ BILs now.
The second is that breaks in learning may be used more frequently to try and protect success rates, as ‘overstayers’ are penalised in the calculation. Yet in truth, once on a BIL, the apprentice may not return. Wouldn’t it be better to put in additional support and relieve some of the time pressure on the apprentice? That would keep them engaged and help with their welfare and career development in ‘lock step’. Driving the programme on a time-bound strategy is very unhelpful and flies in the face of what we are all trying to achieve.
I do understand the pressures of work and apprenticeships and the stresses that everybody's under. However, we believe our programmes and modules – due to the additional support and specialist modules – can actually be used to have a positive impact on welfare and mental health. Seventy-five per cent of our breaks are due to mental health concerns, but I would estimate our support can address many of these.
Are achievement rates an effective means of measuring the quality of training?
Yes and no. I don't think there's anything wrong with achievement rates or success rates. What's fundamentally flawed is the way they are calculated.
The way the government is measuring achievement is fundamentally flawed
The reason is that 15 years ago the success rate was a simple measure, based on the period from August 1 to July 31 the following year: of the people who exited that training company in the year, how many achieved their aim? That is a simple, very adequate measure.
It allows for organisations and training providers to go above and beyond with support. It allows the apprentices to take additional time with gateway if they've got anxiety about tests or prefer to stagger the EPA elements.
The new measures, where it's all focused on the apprentice's original target date, don't give that flexibility. If somebody's target date was July 20 and they go on holiday for the last two weeks of July, so they achieve in August, that counts as a negative on the success rate.
Retention for me has always been a great measure as well. How many people who started with you have you still got on board? That is an accurate measure of success and of the support provided by the ITP.
When the national achievement rate data was published, we scored 57 per cent and I had four training companies reach out to me and say congratulations. They said it was “excellent to see that this is aligned to your Ofsted ‘Outstanding’”.
My response to all four of them was: it's horrific. If as a sector we are celebrating 57 per cent, we are in all sorts of trouble. Our internal measure of success, using the methodology that I outlined earlier, is above 74 per cent for every single programme, which is much more worthy of congratulations.
I understand why the four providers congratulated me, as we are considerably higher than most and it is a challenging time for providers. We’re in the top five to 10 per cent. But that’s a pretty poor statistic, isn’t it?
The way they're measuring it is fundamentally flawed. That's the thing.
If the way of measuring success were adjusted, do you think the 67 per cent target is perhaps too low?
Absolutely. I know historically they used to take account of different sectors and I think it depends on what programme you're looking at.
I think the government could go one of two ways. If they're going to leave the measures as they are now, they need to accept that 50 per cent is a good minimum level of performance. Anything above that should be green on their apprenticeship accountability framework.
If they change it to the measure I suggest, then I would think 70 per cent should be the target.
It does massively depend upon the sector, obviously. That wouldn’t work for the health and social care sector, retail, hospitality, leisure, and tourism, due to the pressures and the cohorts involved. While we do not deliver in those sectors, I admire the providers that are in there and value the work they do.
Is this because those are sectors with lower average pay and a high turnover of employees?
Yes, particularly the health and social care sector. The turnover of staff is phenomenal, and I'm not surprised. It's a tough, tough job. Again, I admire everybody that's in that sector. I couldn't do that as a role; it takes a special person and the pressures are high.
So, I completely understand why those sectors have higher levels of breaks in learning and dropouts. It's not a criticism, but it needs to be recognised in terms of the success rate pressures put on them.
How important for providers and apprentices was the decision to lift the cap on SME apprenticeship starts?
Massively, hugely important for several reasons. One, it removed the inequality between SMEs and levy-payers, giving employers access to a better range of options.
Obviously, I'm completely biased, from an MBKB point of view, but we didn't have a direct contract prior to the levy, so we had to subcontract and that was a very painful experience.
There are a lot of excellent new providers that are really strong as well but don't have access to direct government funds. Without the lifting of that cap, their growth was restricted, which impacted them and, most importantly, the employers they worked with. Lifting the cap offered opportunities for SMEs to engage with these exciting new providers; it gave a fairer playing field.
Is there anything else you want to add?
At the heart of it, we completely agree with formal measures of quality because they’re required. I could point you in the direction of fantastic training providers that do an excellent job, but there are poor providers as well. So I think the success and retention rates are extremely valid.
I do think there's an urgent review required of how they're measured. If you weren’t in the training sector and, the first time you looked at it, you saw we’re only at 57 per cent, you would think it's barely a pass, and that’s not accurate at all.
The government’s current measurement formula makes the whole sector look worse than it actually is, and if ministers are trying to increase the awareness and uptake of apprenticeships, then they need to be showing them in their best light.
So, there's a definite middle ground, which the government is not at, that controls quality but also doesn't make the sector look poor.
Opinion
Simon Ashworth, director of policy at AELP, outlines how the government could modernise the way success and quality are measured within apprenticeships.
Since the advent of apprenticeship standards and the wider apprenticeship reforms, AELP has regularly questioned whether the current success rate methodology informing Qualification Achievement Rates (QAR) is fit for purpose.
The existing methodology goes back to the historic qualification-based apprenticeship frameworks. These have, of course, mostly been phased out and replaced by apprenticeship standards, which are longer and harder, are based on knowledge, skills, and behaviours, and include external validation through end-point assessment.
There have been some welcome tweaks to the rules and measures, such as extending the period of change of employer to up to 12 weeks through a break in learning, but these haven’t gone far enough, fast enough. As a result, we still see huge problems with the measure. For example, the way the QAR calculates overall success can in some scenarios count the same apprentice as a non-completion on multiple occasions, yet successful completions are only ever counted once. The methodology also fails to take account of how employers, and learners, are reacting to a tight labour market with large levels of churn.
Although we recognise that accurately and appropriately measuring quality in apprenticeship delivery is an important factor in ensuring accountability, the outdated model needs to change
Although we recognise that accurately and appropriately measuring quality in apprenticeship delivery is an important factor in ensuring accountability, the outdated model needs to change. Disaggregating achievements should be a priority: with such a broad range of sectors and occupations available, there are significant variations, and the underlying labour market and occupation-specific challenges get lost in any overall narrative.
Alongside this, we are calling for five other changes to the framework which would deliver a more accurate accountability system. These are:
Remove the QAR pass rate measure and replace it with a pass rate that relates to EPA. Assessment is a key facet of the reformed system and needs greater transparency and oversight.
Extend data capture via the Individualised Learner Record (ILR) to include a range of reasons for withdrawal (non-completion). The current data capture fields on the ILR are too narrow to capture the underlying reasons for non-completion.
Make the methodology more reflective of quality by removing withdrawals that are out of the providers’ control from the QAR measures, such as if the apprentice has been dismissed from their job.
Expand the methodology to include wider success measures. In further education, longitudinal outcomes remain underdeveloped as a tool, but measuring earnings and other indicators of success, such as promotion and progression, would give a fuller picture.
Develop a more inclusive set of accountability measures that better reflect the role of employers and their specific behaviours in the wider apprenticeship system. This should include capturing feedback from providers and apprentices on their experiences.
These changes would go a long way towards modernising the way we measure success and quality within apprenticeships and, in turn, would give both employers and learners the widest possible information when it comes to choosing which qualification to undertake.