
AI in secondary schools: embracing the future, thoughtfully

What AI actually means for pupils, parents and teachers – and why people still matter most

Rachel Clay

Chief Strategist - Public Sector

AI is not a passing trend. I see it as a transformative force that is already reshaping how students learn, how teachers teach and how schools operate. We are moving beyond simple AI tools towards advanced ‘agentic’ AI systems – those that can act with a degree of autonomy, make decisions, adapt to learners and initiate actions rather than simply respond to prompts. The reality is clear: AI isn’t going away. It is already shaping how knowledge is accessed, skills are developed and progress is measured. The real question for education is no longer whether AI should be present in schools, but how it is used, why it is used, and who remains in control.

Using AI well requires care and clear intent. As these tools become more capable – offering feedback, guiding learning pathways or automating decisions – the responsibility on schools becomes even greater. AI must be used in ways that enhance learning while protecting fairness and trust, supporting safeguarding and upholding equity and ethics.

This begins with education. As leaders, teachers and decision-makers, we must understand both the capabilities and the limitations of AI. When used well, agentic AI can personalise learning, reduce administrative workload and increase engagement. However, students must remain active, critical and ethically aware thinkers, guided by human judgement rather than directed unquestioningly by algorithms. AI should empower learners, not quietly steer them without transparency.

The role of schools in educating parents

Schools cannot implement AI effectively without bringing parents with them. Much of the resistance in education stems from understandable concerns: fear of cheating, screen time, data misuse or the loss of ‘real learning.’ When these concerns aren’t addressed openly, mistrust grows.

Schools have a responsibility to:

  • Explain what AI tools are being used and why
  • Clarify what AI is not being used for
  • Be transparent about data, safeguarding and boundaries
  • Address myths and misunderstandings directly

Parent education might include workshops, information evenings, clear policies and practical examples of how AI supports learning rather than replaces it. Showing parents how critical thinking, ethics and media literacy are being taught can shift the conversation from fear to partnership.

Equally important is equity. Schools must recognise that access to technology varies at home and ensure that learning does not depend on private subscriptions, powerful devices or constant connectivity. Fair access and inclusive design are essential, not optional.

A parent’s perspective

As a parent, I’m acutely aware of how quickly the world is changing. My child is growing up in a world where AI is no longer a futuristic idea; it’s already in their pocket, embedded in their devices and increasingly present in their schoolwork. The possibilities are exciting: personalised learning experiences, new ways to explore ideas and tools that can remove barriers to learning.

At the same time, it can feel daunting. I worry about fairness, data privacy and whether my child is learning how to think independently, rather than deferring to AI for answers. I also worry about exposure to AI-generated content beyond the classroom, including persuasive chatbots, deepfake images and videos that blur the line between reality and fiction. Avoiding AI isn’t realistic; it’s already part of their world. The better approach is to engage with it openly, guide its use and educate ourselves so we can help our children navigate this landscape safely, ethically and confidently.

Why we should be excited about AI in secondary schools

AI holds genuine potential to enhance teaching and learning in ways that were previously difficult to achieve.

1. Personalised learning

Personalised learning is often misunderstood as making work easier. In reality, its greatest value lies in stretching students, especially in areas they care deeply about.

When used responsibly, AI can:

  • Identify gaps and strengths
  • Adapt pace without lowering expectations
  • Suggest enrichment rather than repetition
  • Support deeper exploration of interests

For example, a student passionate about environmental science could explore richer data sets, ethical debates or real-world case studies, while a student interested in creative writing might receive targeted feedback, stylistic challenges or prompts that push them beyond their comfort zone.

Importantly, teachers remain central. AI may surface patterns or possibilities, but educators provide the relational knowledge, professional judgement and pastoral care that ensure challenge is appropriate, supportive and meaningful.

2. Reducing teacher workload

Teachers spend significant time on marking, planning and administration. AI can help with objective assessments, draft feedback and analysing patterns in student progress. Agentic tools can even flag students who may be struggling emotionally or academically. Used responsibly, this frees teachers to focus on relationships, mentoring and high-quality teaching.

3. Supporting inclusion and accessibility

For students with additional learning needs, such as dyslexia or English as an additional language, AI can offer text-to-speech, real-time translation, reading support and structured guidance. Systems can adjust support dynamically, helping students build confidence without drawing attention to differences. This can play a powerful role in creating genuinely inclusive classrooms.

4. Engaging and interactive learning

AI-powered simulations, adaptive quizzes and virtual tutors can make learning more interactive and responsive. When designed ethically, these tools can encourage curiosity and independence, rather than promoting passive consumption, and help students take ownership of their learning journey.

What should we be wary of?

The promise of AI comes with real risks, particularly as tools become more capable.

Bias and fairness

AI systems learn from data. If that data reflects existing bias, inequalities can be reinforced. Human oversight remains essential to question outputs and ensure that assessments and opportunities remain fair.

Teaching students to think, not just generate

One of the greatest risks is uncritical acceptance. When generative AI produces fluent answers instantly, students may treat outputs as authoritative rather than provisional.

Critical thinking in an AI-enabled classroom must go beyond asking, “Did the AI write this?” and instead focus on:

  • Where did this answer come from?
  • What assumptions might it be making?
  • What might be missing or oversimplified?
  • How can I check or improve it?

This requires AI literacy, not just digital literacy. Students need to understand that AI does not think or hold values; it predicts language based on patterns in data. That distinction matters.

Data privacy and safeguarding

AI relies heavily on student data. Safeguarding, GDPR compliance and transparency about how data is used are non-negotiable. Schools must understand where data is stored, how long it is retained and whether it is used to train future models. With agentic AI, the risks increase if systems act or adapt without clear human oversight.

Deepfakes and misinformation

AI makes it increasingly easy to generate realistic images, audio and video. In schools, this raises serious safeguarding concerns, from fake content involving students or staff to the spread of misinformation that can damage trust and wellbeing. Students need education on media literacy, consent and the ethical use of generative tools.

Mental health, trust and over-reliance

Over-reliance on AI can affect confidence, motivation and self-esteem, and students may struggle to develop resilience and independent thinking. Schools must ensure technology supports learning without replacing human connection, feedback and care.

Equity and access

Not all students have equal access to devices or connectivity outside school. AI strategies must consider these realities to avoid widening existing gaps. Equity must be part of every AI strategy, not an afterthought.

Finding the right balance

AI can feel both exciting and intimidating. It opens doors to personalised learning and new experiences, but it also challenges us to rethink safeguarding, trust and responsibility in education. Avoiding AI isn’t an option; using it wisely is. That means ensuring students learn to question outputs, understand its limitations and use it as a tool rather than a crutch.

Responsibility is shared:

  • Schools and leaders: develop clear AI strategies, invest in staff training and put safeguarding first.
  • Teachers: explore AI tools that genuinely enhance learning, integrate them thoughtfully and model critical, responsible use.
  • Parents: talk openly with children, understand the tools they use and support healthy, balanced use.
  • Students: use AI as a learning partner, but remember that curiosity, creativity and critical thinking remain uniquely human skills.

If we get this right, AI can strengthen education rather than undermine it, preparing young people not just to live with technology, but to shape its future responsibly.

You can find out more about how Softcat supports schools, colleges and universities by visiting our Education Hub. You can also view our Data, AI and Automation solutions here.