Calls for Transparency on AI Use in Curriculum Review
The Department for Education (DfE) has yet to confirm whether artificial intelligence (AI) was used to analyse more than 7,000 submissions to its recent curriculum and assessment review. Although the contract specification states that AI techniques are "likely" to form part of the analysis, no official statement has clarified the extent or manner of any AI involvement.
Concerns from School Leaders
School leaders' unions, including the NAHT, have urged the DfE to be fully open about the role AI played in reviewing the evidence collected. The NAHT worries that AI might overlook important nuances in responses, which could influence the review's outcomes. Sarah Hannafin, NAHT's head of policy, emphasised that contributors deserve assurance that their detailed input has been properly considered.
Similarly, Pepe Di'Iasio, general secretary of the Association of School and College Leaders, called for clarity on how AI was used and what safeguards were in place to ensure all responses were fairly accounted for. So far, neither the DfE nor the review panel has told the public or respondents whether AI was involved.
Risk of Missing Nuance
The NAHT highlighted that AI's effectiveness in analysing education-related consultations is unproven, especially at this scale. While AI might identify broad trends, there is concern it could miss subtle details, examples, and context that human analysis would capture.
Such gaps could affect the review’s recommendations, which have significant implications for schools and pupils. The union stresses that getting the analysis right is essential given the potential impact.
AI and the ‘Human in the Loop’ Principle
The DfE's contract for the analysis specifies that suppliers are expected to use "cost-effective analytical techniques", including AI methods such as natural language processing. It also calls for ethical considerations to be addressed and requires a "human in the loop" to validate AI findings.
Alma Economics, the company awarded the contract, highlights its AI capabilities on its website. It promotes a tool called Cobflow, designed for public consultation analysis, which combines AI with safeguards for human oversight and compliance with statutory requirements.
However, despite repeated requests, neither Alma Economics nor the DfE has confirmed whether Cobflow, or any other AI tool, was used for this particular analysis.
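To illustrate what a "human in the loop" safeguard can look like in practice, here is a minimal sketch of consultation-response triage. It is an illustration only, built on assumptions: the theme names, keywords, confidence threshold and the classify_response helper are hypothetical and are not drawn from the DfE contract, Alma Economics or Cobflow.

```python
# Illustrative sketch only: a minimal "human in the loop" pattern for
# consultation analysis. The themes, keywords, threshold and helper below
# are hypothetical, not taken from the DfE contract or Cobflow.

from dataclasses import dataclass

# Hypothetical themes a reviewer might tag responses against.
THEME_KEYWORDS = {
    "assessment load": ["exam", "assessment", "test", "workload"],
    "curriculum breadth": ["arts", "breadth", "creative", "vocational"],
    "inclusion": ["send", "inclusion", "disadvantage", "accessibility"],
}

# Below this confidence, the response is routed to a human reviewer.
CONFIDENCE_THRESHOLD = 0.6


@dataclass
class Classification:
    theme: str
    confidence: float
    needs_human_review: bool


def classify_response(text: str) -> Classification:
    """Score each theme by keyword hits; ambiguous responses go to a human."""
    lowered = text.lower()
    scores = {
        theme: sum(lowered.count(keyword) for keyword in keywords)
        for theme, keywords in THEME_KEYWORDS.items()
    }
    total = sum(scores.values())
    best_theme = max(scores, key=scores.get)
    confidence = scores[best_theme] / total if total else 0.0
    return Classification(
        theme=best_theme if total else "unclassified",
        confidence=confidence,
        # Low-confidence or unmatched responses are flagged so a person,
        # not the model, decides how the nuance is recorded.
        needs_human_review=confidence < CONFIDENCE_THRESHOLD,
    )


if __name__ == "__main__":
    sample = (
        "Exam pressure is crowding out creative subjects, "
        "and SEND pupils need more tailored assessment."
    )
    # This mixed-theme response scores below the threshold and is flagged.
    print(classify_response(sample))
```

In a pattern like this, the automated pass only proposes a theme; any response it cannot classify confidently is routed to a person, which is the kind of safeguard school leaders are asking the DfE to confirm was in place.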
Why This Matters for Education Professionals
Education professionals who contributed to the review deserve transparency about how their input was processed. Understanding whether AI was used—and how—affects confidence in the review’s conclusions.
Given AI's growing presence in data analysis, education professionals may benefit from staying informed about AI tools and their implications. Resources such as Complete AI Training offer practical courses covering AI skills relevant to education and beyond.
Summary
- The DfE has not confirmed AI’s role in analysing 7,000+ curriculum review responses.
- School leaders call for transparency to ensure the process fairly reflects all contributions.
- Concerns focus on AI potentially missing nuanced feedback from education experts.
- Contract specs mention AI use with required human oversight, but details remain unclear.
- Education professionals are encouraged to seek clarity and consider AI training to understand these tools better.