
Developing an AI Strategy in Schools

Primary
Secondary
Pupil Referral Unit
SEND
Leadership & Implementation
CPD & Training Materials
Key Stage 1
Key Stage 2
Opinion Piece
Practitioners
Andrew Dax

Head of Digital Strategy, Queen Anne's School

This piece offers a clear roadmap for introducing AI in a balanced and informed manner. It is written specifically for school leaders seeking to integrate Artificial Intelligence (AI) within their establishments, focuses on Narrow AI and Generative AI, and sets out how pioneering institutions are harnessing their potential effectively.

This piece stresses the importance of online safety, advocating that schools teach students about the sensible and responsible use of AI. The guide illustrates AI’s capability to reinforce pedagogical methodologies and enhance the operational efficiency of schools. Nonetheless, it is imperative to acknowledge potential pitfalls: there are concerns that students and staff may develop an excessive reliance on AI if its use is not taught properly, as well as looming data privacy challenges.

1) Policy and Strategy

From the mere identification of AI tools to the development of a holistic strategy for their implementation, schools have come a long way. AI is categorised into two primary types. The first, Narrow AI, is tailored for specific tasks; examples include Microsoft's Reading Progress and Presentation Coach. The second, Generative AI, is capable of producing human-like responses; popular examples include ChatGPT, Bing Chat (which uses the same underlying technology as ChatGPT) and Google Bard.

Schools such as Queen Anne's School, Epsom College, and Cottesmore School have taken the lead, followed by many others, in introducing clear AI policies. This trend underscores the importance of having not only a policy in place but also a distinct strategy for implementation. Ideally, this strategy should be spearheaded by a member of the leadership team to ensure any implementation of AI is in line with the school’s policy and appropriate for its intended application.

Recommendations:

  • Schools should look at adopting, or adapting from, the established AI policies of pioneering institutions.
  • A clear distinction between policy and strategy is vital. The AI strategy should align seamlessly with the overarching school policy.
  • Strategic goals should reflect the broader curriculum aims, making AI an enabler rather than just an add-on.

2) E-Safety

AI not only brings about innovative educational methodologies but also introduces potential risks that students might face online.

A strategic approach, as observed in leading schools, is the development of an E-Safety curriculum that specifically addresses the nuances introduced by AI. This curriculum ensures students are aware of both the potential and the pitfalls of online AI tools. Partnerships with organisations, such as the Online Safety Alliance, underscore the significance of teaching E-Safety aspects related to AI. Their certification programs are designed to equip students, staff and parents with the necessary skills and knowledge to navigate the digital world safely.

In addition to formal curricula, communication plays a pivotal role. Regular updates via school newsletters shed light on AI topics covered each term. This keeps both students and parents informed about the ever-evolving landscape of AI in education. Furthermore, school websites serve as repositories, offering continuous updates on AI and its associated E-Safety guidelines.

AI Utilisation at Home:

An aspect that schools must be conscious of is the potential usage of AI tools at home. While controls can be placed within the school environment, where support from staff is available, it is essential for parents and guardians to be aware and involved. The school might limit the use of certain AI tools, yet students may have unrestricted access to the same platforms at home, which makes parental awareness a crucial component of the E-Safety equation.

Data Sharing and AI Policy:

One of the paramount concerns with the integration of AI is data privacy. The distinction between different AI types, particularly Narrow AI and Generative AI, is crucial when it comes to data usage and sharing. Transparent communication about how data is used, stored, and shared is essential to dispel concerns and foster trust. Schools must ensure that all stakeholders, especially students and parents, are fully aware of the data policies in place.

Recommendations:

  • Schools should integrate AI-specific modules within their E-Safety curricula, ensuring comprehensive coverage of potential risks and safeguards.
  • Regular communication channels, such as newsletters and website updates, should be utilised to keep students and parents well-informed of the latest developments in AI and E-Safety.
  • Workshops for parents can be instrumental in extending the E-Safety net beyond school boundaries, ensuring students practise safe online behaviour even at home.
  • Data policies should be regularly reviewed and updated, with an emphasis on transparency and clarity. Any changes or amendments should be promptly communicated to all stakeholders.

3) Teaching and Learning

The integration of AI tools in teaching methodologies needs to be carefully managed. There is little evidence of the tangible impact AI tools have on enhancing student learning or promoting faster progress. A number of leading schools are adopting an action research model to monitor, track and critically analyse the impact AI tools are having on learning and attainment. Notably, the innovative approach of Royal Grammar School Newcastle to the use of AI in language classes offers enlightening insights. Sarah Buist provides further details about RGS's action research method, exploring the tangible impact of AI on student progression.

The Head of MFL adopted Microsoft Reading Progress, an AI tool tailored for language learning. The tool allows students to read specific texts aloud and then offers precise feedback to both the students and the teacher, gauging reading accuracy, pace, and overall fluency.

To measure its efficacy, an 8-week trial was initiated involving two Year 8 Spanish classes. One class, the intervention group, used Reading Progress for their homework, while the other continued with traditional methods. Although both groups improved their reading, the class using Reading Progress demonstrated a markedly superior enhancement in their language skills: reading speed and accuracy increased significantly in the intervention group, and mispronunciations decreased. In an era where reading and critical thinking are vital skills, this research highlights the potential of Narrow AI tools to support student reading progress in multiple languages. This work was completed in conjunction with WhatWorked Education.

Furthermore, there are plans to extend this intervention to other year groups and subject areas. The ultimate goal is to understand the broad applicability and effectiveness of such AI tools across diverse subjects and age groups.

For action research to be effective, it is essential to identify a need or purpose that an AI tool can address. The teacher ought to present their research question and methodology to the SLT member leading the initiative. Clear communication with students and parents about the trial, its tests, and anticipated outcomes is paramount. It is also advisable to highlight potential risks or pitfalls and, where feasible, to mitigate them. Upon completing a trial, the results should be thoroughly analysed, collecting both qualitative and quantitative data wherever possible.

Schools also need to address the issue of students using Generative AI to complete their homework. Staff will need to devise more imaginative approaches to ensure students undertake independent or out-of-class work that cannot merely be completed using AI. Ironically, staff can employ AI to help them reshape an existing task so that students are compelled to complete it on their own. These tasks emphasise personal reflection and judgement, critical thinking, creativity, observation, and real-world application. Given the current task, the desired learning outcomes, and other parameters, the AI can suggest a suitable alternative task, supply the necessary information, and provide a marking rubric.
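As an illustration of the kind of structured input described above, the sketch below assembles the current task, desired learning outcomes, and constraints into a single prompt that a teacher could paste into whichever AI tool their school has approved. It is a minimal, hypothetical sketch: the function name, field labels, and example task are assumptions for illustration, not part of any specific product or school policy.

```python
# Minimal sketch: assembling a structured task-redesign prompt.
# The helper name, field labels and example task are illustrative
# assumptions, not taken from any specific AI product or school policy.

def build_redesign_prompt(current_task: str, learning_outcomes: list[str],
                          constraints: list[str]) -> str:
    """Combine the existing task, outcomes and constraints into one prompt."""
    outcomes = "\n".join(f"- {o}" for o in learning_outcomes)
    limits = "\n".join(f"- {c}" for c in constraints)
    return (
        "You are helping a teacher redesign a homework task so that it "
        "cannot simply be completed by a generative AI tool.\n\n"
        f"Current task:\n{current_task}\n\n"
        f"Desired learning outcomes:\n{outcomes}\n\n"
        f"Constraints:\n{limits}\n\n"
        "Suggest a revised task that emphasises personal reflection, "
        "critical thinking, creativity and real-world application, and "
        "provide a short marking rubric."
    )


if __name__ == "__main__":
    prompt = build_redesign_prompt(
        current_task="Write 300 words summarising the causes of the First World War.",
        learning_outcomes=[
            "Evaluate competing historical explanations",
            "Support judgements with evidence",
        ],
        constraints=["Completable within one week", "Suitable for Year 9"],
    )
    print(prompt)  # Paste into the school's approved AI tool for suggestions.
```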

Recommendations:

  • Action research should be a model for schools looking to objectively measure the impact of AI tools on student progress.
  • The use of any AI tools with students should be followed by rigorous analysis to ascertain their efficacy – this is also true of any staff use.
  • Feedback mechanisms, especially ones that allow real-time student feedback, should be central to any AI tool. This immediate feedback loop can considerably enhance the learning process.
  • Results from such research should be collated, analysed, and shared across departments to ensure best practices are propagated.
  • The integration of AI tools should be a continual process, informed by regular feedback and refinements based on research findings.
  • Collaboration with tech companies ensures that schools remain at the forefront of AI educational technology, leveraging the latest tools to benefit the students.
  • Re-design tasks to engage students in meaningful independent and out-of-class work, ensuring genuine learning and reducing reliance on AI for task completion.

4) Teaching Staff and Professional Services

AI's integration into schools isn't restricted only to direct teaching and learning methodologies; its scope extends to enhancing the efficiency of administrative and professional services. This broad applicability means that the staff can utilise AI tools not just for pedagogical purposes, but also for several operational tasks that traditionally consumed a significant amount of time.

One area where AI has shown immense potential is in the realm of exam analysis. Trials show how Generative AI can be employed for analysing exam results, thereby streamlining the otherwise cumbersome process. By automating the task of number crunching, teaching staff can allocate more time to strategic interventions based on the insights derived from the analysis. This means educators can focus more on understanding trends, areas of improvement, and student needs rather than getting bogged down by the data itself.
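To give a concrete sense of the routine number crunching that can be automated before staff interpret the results, the sketch below summarises a hypothetical exam results file. The file name, column headings, and pass threshold are illustrative assumptions only.

```python
# Minimal sketch: automating routine exam-results analysis so staff can
# focus on interpretation. The file name, columns and pass mark are
# hypothetical assumptions for illustration.
import pandas as pd

# Expected columns: pupil, class, subject, score (percentage)
results = pd.read_csv("year8_exam_results.csv")

# Per-class summary: mean, spread, cohort size and pupils below a pass mark.
summary = (
    results.groupby(["subject", "class"])["score"]
    .agg(
        mean_score="mean",
        std_dev="std",
        pupils="count",
        below_40=lambda s: int((s < 40).sum()),
    )
    .round(1)
)

print(summary)
# A generative AI tool could then be asked to draft a narrative commentary
# on this anonymised summary, which staff review before acting on it.
```

The point of the sketch is the division of labour: the mechanical summary is automated, while interpreting trends and deciding on interventions remains with staff.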

AI tools can also be used for tasks such as email management. By categorising, prioritising, and even drafting responses, AI can help reduce the administrative burden on staff while improving communication quality. The immediate benefit is that it allows educators to dedicate more of their time to actual teaching, student interactions, and professional development.

However, there is a looming challenge of over-reliance, particularly among newer teaching staff. The balance between traditional teaching methods and AI assistance is a delicate one.

Recommendations:

  • Schools should actively explore and integrate AI tools that promise to increase the efficiency of administrative tasks.
  • Encourage staff to share experiences, creating a rich repository of case studies and real-world experiences that can guide future implementations.
  • Reiterate the idea that AI is a tool to enhance teaching, not replace it, ensuring that human-led instruction remains at the heart of education.
  • Continuous training should be provided to staff to familiarise them with the functionalities of these AI tools, ensuring they're used to their full potential.
  • Regular reviews should be scheduled to evaluate the effectiveness of these AI tools in administrative tasks, ensuring they align with the school's operational and strategic goals.
  • There should be an emphasis on a balanced approach, ensuring that while AI aids in administrative tasks, the human touch and discernment in tasks like email communication are retained.

5) Students

Students of varying ages have shown an inclination towards AI tools. Narrow AI is certainly a safe place for KS1-3 students to interact with and benefit from AI. However, Generative AI should be approached with far greater caution: there are concerns about the potential for students to use Generative AI inappropriately or to have it complete work for them, so any use of Generative AI should be teacher-led and closely monitored.

At KS5 there is more scope for Generative AI to support students in their studies. For example, students are being guided to use tools like ChatGPT for immediate feedback on essays. Prompts should be provided by the teacher, so students know exactly what to use and how; at the same time, students should be learning how to create structured prompts to get the best output. This structured approach ensures students receive guidance on how to improve rather than merely what to amend. When students submit their work, a transparent process is followed: they provide their original draft, the AI's feedback, and their revised work. Such an open and honest policy ensures educators can discern AI's influence from the student's individual effort.

Recommendations:

  • Before introducing students to AI, they should be educated about the E-Safety implications of the technology.
  • A phased introduction of AI tools should be employed, ensuring students of all ages understand their functionalities and the boundaries of their use.
  • Emphasis should be placed on creating a transparent learning environment. Students should be educated on the open and honest policy, encouraging them to understand and integrate AI feedback into their work so that their teacher can review its use and provide further guidance.
  • A monitoring mechanism should be in place to oversee student interactions with AI, flagging issues such as academic honesty.
  • Periodic reviews should ensure the balance between student autonomy and AI-guided feedback is maintained.
  • Clear guidelines, emphasising ethical considerations, should be established for AI tool usage at home and school.

6) Parents and Guardians

The role of parents and guardians in the age of AI-driven education is crucial. Their understanding and endorsement are pivotal for the effective and ethical use of AI at home. Schools are making strides by ensuring they remain in the loop, with efforts such as newsletters and website updates centred on AI topics.

Recommendations:

  • Schools should invest in organising workshops to familiarise parents with the evolving AI landscape in education.
  • Clarity is essential. A clear line of communication should be established, ensuring parents are aware of AI's potential and pitfalls.
  • The school's stance on data usage, especially concerning AI applications, should be transparently communicated to alleviate concerns.

7) Concerns

AI's integration brings forth an array of challenges. At the forefront is the concern of academic honesty. With tools at their fingertips, there is a substantial risk of students and staff leaning too heavily on AI, raising concerns about plagiarism and the erosion of critical thinking and genuine learning.

Furthermore, there are specific concerns for SEND (Special Educational Needs and Disabilities) and EAL (English as an Additional Language) students. These students, due to their unique needs, might either over-rely on AI tools or find them challenging to navigate. Over-reliance could potentially impede their genuine academic progress.

Another critical concern relates to Early Career Teachers (ECTs). As newcomers to the profession, they might be tempted to depend excessively on AI for lesson planning and content creation, thereby possibly losing the personal touch and the craft of traditional teaching methods.

Recommendations:

  • Schools should put in place stringent guidelines and offer support to ensure academic honesty is maintained in the era of AI.
  • Special considerations should be made for SEND and EAL students. Tailored guidance and support structures should be established to ensure they harness AI tools effectively without becoming overly reliant.
  • Early Career Teachers should be provided with structured guidance and mentorship. This ensures that while they are introduced to the world of AI in teaching, they also understand the importance of human-led pedagogy.
  • Workshops and training sessions should focus on striking the right balance between AI tools and traditional teaching methods to ensure holistic development.
  • Implement robust procedures around academic honesty, ensuring students and staff are aware of the ramifications of unethical AI use.
  • Regular reviews should be in place to ensure that a balance between AI tools and human-led instruction is maintained.
  • Data protection should be paramount, with transparent policies around AI-driven data collection and usage.

Conclusion:

The integration of Artificial Intelligence in schools is a promising journey with its set of challenges. By developing a well-rounded strategy encompassing all stakeholders, schools can harness the transformative power of AI. Addressing concerns and ensuring continuous evolution in tandem with technological advancements will be pivotal in shaping the future of AI in education.
