4 Oct 2024

AI in FE: A Class Act or Artificial Hype?

In this month's edition, we take a closer look at the role that Artificial Intelligence (AI) could play in further education provision across the UK.

A graduation cap sits atop a glowing light bulb resting on an open book. Surrounding the light bulb are icons representing education, technology, and science, with a chalkboard background filled with various mathematical and scientific symbols.

In the News

Are Labour making progress on skills reform?

A man with silver hair stands at a podium, speaking into a microphone. He is wearing a white shirt and a maroon tie, with a backdrop featuring the Union Jack and the words "Britain's Future."

It’s been another important month for the further education sector.

We kicked off with the news that former Shadow Minister for Children and Early Years, Helen Hayes, had been appointed Chair of the Education Select Committee. The Dulwich and West Norwood MP pledged an "ambitious programme of inquiries", including how best to create a fit-for-purpose skills system.

In the weeks that followed, as the Labour Conference loomed, two major reports were released advocating for change in the FE and apprenticeships system.

The first, conducted by Multiverse, explored how the proposed new Growth and Skills Levy could be designed most effectively to address skills shortages and unlock economic growth. The levy, they concluded, should be employer-led, easily accessible, open to everyone, and driven by excellence. The report also pitched the introduction of a "Right to Reskill" programme to "encourage employers to increase investment in their workforce and help employees respond to changes in the workplace brought about by new technology."

Another report that attracted attention came from the NFER, which found that pay differentials between colleges, schools and industry are contributing to significant recruitment and retention shortages in FE colleges across the country. Among its recommendations, it called for increased funding to the sector to help colleges match FE teacher pay with that of school teachers, and for a long-term, evidence-based strategy, backed by concrete resources, to help reduce FE teacher workload.

But perhaps the key headline of this month came after the Prime Minister’s speech at the Labour conference in Liverpool, in which he fleshed out the government’s plans to fix the ‘broken’ skills system that they inherited. Sir Keir Starmer confirmed that Labour's headline levy reform - the introduction of a new 'Growth and Skills Levy' - will allow funding for shorter apprenticeships by lifting the current 12-month minimum duration requirement. The reformed levy will also be targeted at training for key growth sectors, will support the introduction of foundation apprenticeships for young people, and will encourage businesses to fund more of their higher-level training outside of the levy.

"We've got to give businesses more flexibility to adapt to real training needs, and also unlock the pride and ambition that young people feel when building a future not just for themselves, but for their community", the Prime Minister said at the Labour conference. "We will rebalance funding in our training system back to young people and align that with what businesses really need."

Alongside their report this month, Multiverse were keen to emphasise the need for a workforce able to "take advantage of the opportunities offered by new technologies". In fact, the potential impacts of technology were identified by the Prime Minister as one of the challenges that the country "desperately has to face up to".

Perhaps the most significant technological innovation on everyone's lips right now is AI, and what it might mean for the future. 

But what might AI look like in practice in the FE, skills and apprenticeships sector? How do we weigh up its risk versus reward? How exactly can - and should - teachers employ its power in the classroom? Could it prove a catalyst for a 'skills revolution' across the country?

To consider these questions - and more - in further detail, in this month's edition we have the pleasure of sitting down with two brilliant experts in the field: Richard Foster-Fletcher, Executive Chair at Morality and Knowledge in Artificial Intelligence (MKAI), and Rebecca Bradley, Specialist in AI Ethics and Governance at R People Ltd. We explore their thoughts on the potential, limitations, and solutions for AI's use in education across the UK.

Interview: Richard Foster-Fletcher, Chair, MKAI.org

"The government must make a bold, strategic investment to put AI tools in the hands of every teacher in further education."

A smiling man with short, light brown hair wearing a black t-shirt stands in front of a blurred urban background.

1. Where do you think the further education sector currently stands in terms of its interaction with artificial intelligence, including GenAI? In what ways do you feel it has been “too cautious” in considering its potentially transformative effects?

The FE sector is already demonstrating pockets of innovation, where educators are actively exploring how AI can reduce workloads and enhance learning. In fact, there’s a strong sense that AI is a positive force, with many in the sector recognising its potential. However, this is a generational shift—AI, particularly tools like ChatGPT, requires a different approach to problem-solving. It's not simply a digital skill, but a process that involves strategic, iterative thinking. Educators must learn how to prompt, refine, and engage with AI in ways that generate meaningful outcomes, which goes beyond traditional training.

Where I see caution, it’s more at the structural level—from the government and broader strategies that are still too cautious in truly backing the sector’s innovators. The real shift needed is recognising that AI works best when it becomes an invisible enabler—running in the background, enhancing teachers’ strengths without requiring constant interaction. Many educators want AI to be omnipresent, not something they consciously “use” at every turn. The challenge lies in supporting this more intuitive, integrated approach while avoiding imposing overly simplistic solutions from the top down.

2. How could the government go about improving the awareness and accessibility of GenAI in further education? What would be the “bold” moves that you are hoping for?

The government must make a bold, strategic investment to put AI tools in the hands of every teacher in further education. This is not about a few upskilling sessions—AI adoption is complex and requires serious, ongoing support. Safeguards must be built into these systems to ensure AI tools are ethical, protect privacy, and pose no risk to teachers. Educators need a system that fully supports them in embedding AI into their work, enabling them to focus on teaching, not managing technology.

This is just the beginning—today’s tools are relatively simple productivity enhancers, not true AI. If we fail to harness these now, we won’t be ready for the far more advanced technologies that are coming. The UK must prepare educators to master current tools and be equipped to embrace each new wave of AI. This will transform the sector, making it more efficient, innovative, and globally competitive.

3. Which key areas of further education provision do you believe AI could have the greatest impact in, and how?

AI’s most immediate and impactful role in further education is in scaling personalisation. With today’s productivity tools, AI can enable teachers to provide tailored support and feedback to more students in less time. Instead of relying on rigid marking systems or one-size-fits-all assessments, AI allows teachers to seamlessly personalise assignments and feedback to fit each student’s learning journey. This enhances individual learning without replacing the essential human connection in the classroom.

We’re not talking about AI replacing teachers or grading students—far from it. AI should serve as an extension of the teacher’s abilities, enabling them to manage larger groups of students while maintaining a high level of personal engagement. This approach is achievable right now with current AI tools, and it needs to be rolled out immediately. By embracing these tools, we ensure that the sector is ready for the next wave of AI advancements rather than still trying to harness the first.

4. Where do you think the limitations of artificial intelligence in education lie?

The limitations of AI in education lie in its current inability to replace the human connection that is central to effective teaching. AI can assist with tasks like providing feedback and maintaining consistency, but the relational aspects of teaching—understanding individual student needs, fostering creativity, and building engagement—cannot be replicated by AI. AI should be a support tool, not a replacement for essential educational functions like assessment.

Another limitation arises when AI is treated as a digital skill rather than a deeper shift in how educators engage with technology. It’s not about simply learning a tool—AI adoption requires a fundamental change in mindset. Without this, educators will struggle to harness both current AI tools and the more advanced technologies yet to come.

AI’s usage will also be limited unless these systems are designed with privacy and ethics at their core. Teachers need to trust that these tools protect both their students and themselves, ensuring privacy by design.

Finally, AI must align with the sector’s sustainability goals. Colleges are committed to carbon net-zero, and AI’s energy consumption must not add to the sector’s environmental burden. 

You can connect with Richard on LinkedIn.


Interview: Rebecca Bradley, AI Ethics and Governance Specialist, R People Ltd

"When it comes to AI in education, best practice really needs to be grounded in three key areas: transparency, ethics, and inclusivity."

A woman with shoulder-length brown hair and green eyes stands outdoors, looking confidently at the camera. She wears a black leather jacket, and the background features blurred greenery.

1. Where do you think the further education sector currently stands in terms of its interaction with artificial intelligence, including GenAI? Do you think that many have been too hesitant in considering its implementation?
 
The further education sector is really starting to tap into the potential of GenAI, and there are some great examples of best practice out there. However, for many, there’s still a sense of uncertainty. It’s not a lack of vision - it’s more that these tools, while familiar to some, can still feel complex to navigate. Recent advancements, particularly in large language models, have opened up new opportunities, but they’ve also heightened the sense of overwhelm.
 
This feeling of hesitancy is understandable. Learning how to implement GenAI safely and effectively, while keeping up with such a rapid pace of technological change, can feel daunting - especially without a clear roadmap.
 
What we find makes the difference is when people start to understand how AI works. By explaining its history, what it can do, and how it can be responsibly integrated into further education, you start to see the hesitation lift, and initial reluctance often turns into excitement. Suddenly, AI becomes a tool that personalises learning, simplifies administrative tasks, and helps prepare learners for a tech-driven future, rather than something to be feared.
 
2. Given the breadth of artificial intelligence's capabilities, what should form the foundations of 'best practice' rules and regulations?
 
When it comes to AI in education, best practice really needs to be grounded in three key areas: transparency, ethics, and inclusivity. Transparency means making sure that everyone - students, educators, and leaders - understands how AI systems work and how decisions are made. Ethics are just as crucial; we need to be confident that AI isn’t reinforcing biases, especially in things like recruitment, assessment, or personalised learning. And then there’s inclusivity: AI should be accessible to all learners, no matter their background or level of tech knowledge.
 
The real challenge now is ensuring that everyone, not just the early adopters, has the opportunity to engage with these tools. It’s not enough to focus on those who already understand GenAI - we need to bring everyone on board to prevent knowledge gaps that could lead to inequalities in education. By ensuring that everyone has access to these tools, we can avoid creating disadvantages and make sure the benefits are shared across the board.
 
Getting these foundations right is key. When AI is implemented properly, it can be a game-changer - making learning more personal and ensuring that everyone, no matter their background, can benefit from what AI has to offer.
 
3. To what extent do you believe tackling perceptions about the risks of artificial intelligence could improve uptake in education?
 
Addressing the risks surrounding AI is one of the biggest hurdles to improving uptake in education. A lot of the hesitation isn't because people don't see the potential; it often stems from a misconception that AI is a threat - whether it's fears about data misuse, job loss, or unintended bias in decision-making. But when we start having open conversations about these risks, showing people that AI can be implemented safely and ethically, it helps put them more at ease.
 
We’ve created the first City & Guilds assured courses in the UK for both AI Governance and Ethics, and Prompt Engineering, to help tackle some of the big challenges in using AI responsibly. The Governance course looks at five core pillars - security, ethics, privacy, transparency, and bias - ensuring AI is used safely and fairly. Our Prompt Engineering course focuses on using AI models to get the best, most accurate outputs while maintaining safety and consistency.
 
4. Is there a role for the Government to play in enhancing accessibility and awareness of AI's capabilities in education?
 
Absolutely! The Government can have a huge impact by helping to make AI more accessible and ensuring that everyone in education understands its potential. This could mean introducing national strategies that encourage AI literacy at all levels - ensuring teachers, students, and leaders alike are equipped with the skills and knowledge they need. Providing funding for AI tools and investing in training would go a long way in preparing the FE sector to adopt AI safely.
 
There’s also a real opportunity to support research on AI’s impact in education, which could shape future policy and ensure the technology is used responsibly. By making sure GenAI tools are available to those in more disadvantaged areas, the Government can help ensure that everyone benefits from the opportunities AI can offer in education.
 
5. Which key areas of further education provision do you believe AI could have the greatest impact in, and how?
 
AI has huge potential to make a real difference in further education, particularly in personalising learning and easing the workload of administrative tasks. Imagine AI systems that adapt to each learner’s progress and strengths - helping learners get the support they need at just the right time. This level of personalisation can make a big impact on student outcomes.
 
Aside from learning, AI can also take on routine tasks like marking, scheduling, and other admin jobs that take up valuable time. By streamlining these processes, focus can be freed up for what matters most: teaching and supporting students.
 
You can connect with Rebecca on LinkedIn or email rebecca@rpeople.co.uk.
