Gen AI in Schools: The Role of School Leaders
By Michael Sciffer and Alice Leung

Generative Artificial Intelligence (Gen AI) tools such as ChatGPT have been touted as inevitably transforming teaching and learning. Such claims often serve to sell professional learning to teachers and technology to schools, while lacking pedagogical rigour. Limited professional learning budgets and classroom teaching time can be wasted on products that are less effective than established teaching practices.
A recent example is the hype surrounding coding in schools. Gen AI has since eliminated many entry-level coding jobs, and the number of computer programming jobs in the US has shrunk by 28% in two years (Van Dam, 2025). Replacing traditional curriculum content with narrow technical skills leaves students at risk of redundancy before they even enter the workforce.
Gen AI raises a broad range of ethical issues for the teaching profession that should be considered before its tools are applied in schools, from classroom learning activities through to student management systems. When adopting new technology, principals and teachers should consider not only the safety and well-being of students, but also what is gained and what is lost compared with more traditional teaching strategies.
Thus, the professional leadership of principals and teachers is paramount in the adoption of Gen AI in schools. The Australian Framework for Generative Artificial Intelligence in Schools is a useful resource for schools to effectively and ethically plan for the use of Gen AI in teaching and learning. It was developed through consultation with education departments, non-government sector peak bodies, ACARA, AERO, and AITSL.
The Framework provides high-level professional guidance to principals and teachers on the appropriate use of Gen AI in schools. It cuts through the edu-tech marketing hype to centre the professional leadership of principals and teachers. It highlights that principals are responsible for ensuring that Gen AI enhances learning and critical and creative thinking, respects the professionalism of teachers, does not cause harm or reinforce biases, and is used transparently.
Is Gen AI intelligent?
AI has been promoted, and feared, as a broadly intelligent system that can replace human thinking and reasoning. But “actually existing AI” (Selwyn, 2024) only serves a narrow set of goals through mathematical computation. The large language models (LLMs) that power Gen AI reassemble pre-existing text scraped from the internet based on statistical probabilities, not comprehension. They replicate human text without creativity, imagination, or intent: Gen AI generates text from statistical patterns and does not understand the text it creates.
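To make this concrete, here is a toy sketch (in Python, using an invented miniature corpus) of purely statistical text generation: a bigram model that picks each next word only from observed word-pair frequencies. Real LLMs use neural networks trained on vast corpora rather than simple lookup tables, but the underlying principle of predicting the next word from statistical patterns, with no model of meaning, is the same.

```python
import random
from collections import defaultdict

# A toy illustration of statistical text generation: a bigram model
# that picks each next word purely from observed word-pair frequencies.
# Nothing in this program represents what any word means.

corpus = (
    "the teacher guides the class and the class follows the teacher "
    "the student asks the teacher and the teacher answers the student"
).split()

# Count which words follow which: a table of statistical patterns.
followers = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word].append(next_word)

def generate(start_word: str, length: int = 10) -> str:
    """Generate text by repeatedly sampling a statistically likely next word."""
    word = start_word
    output = [word]
    for _ in range(length):
        candidates = followers.get(word)
        if not candidates:  # dead end: this word was never followed by anything
            break
        word = random.choice(candidates)  # sampling reflects observed frequency
        output.append(word)
    return " ".join(output)

print(generate("the"))  # fluent-looking output, zero comprehension
```

The output can look fluent because it recycles plausible word sequences from its training data, yet the program has no representation of meaning at all.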
This is critical for Gen AI users, especially students, to recognise. The tools are designed to encourage humans to interact with them in natural language, which can create the impression of an actual conversation. This lends Gen AI tools an authority, and invites an emotional engagement, that is not founded in reality.
What does current research say?
A central theme of the educational research literature is that principals and teachers should reject the narrative that the teaching profession has little control over technological change in learning, or that existing ways of teaching and schooling are barriers to technological “progress”. Instead, the profession remains central to ensuring that quality teaching is delivered in a way consistent with the needs of students and the curriculum.
Principals should ask:
- Does an AI tool add value to existing practices?
- Is the tool potentially harmful to professional standards?
- Will the tool diminish rich learning opportunities?
- Does the tool solve problems that matter to teaching and learning?
Avoiding harm
The educational research literature has identified a range of potential harms of Gen AI that principals should consider when adopting it in schools. These include:
- Potential narrowing of curriculum, pedagogy, and assessment to fit the processing needs of AI tools and the pedagogical beliefs of software developers,
- Algorithmic discrimination against students based on their social backgrounds,
- Lack of transparency in AI decision-making for teachers, students, parents, and governments,
- Inconsistencies and hallucinations (errors) in the information provided, and
- Undermining the capacity of students to work and think independently.
Conclusion
The key takeaway is that the professional authority of principals and teachers remains central to quality teaching and learning. Gen AI is one tool among many that may complement existing instructional practices. Claims that the teaching profession needs to get out of the way are naive about the learning needs of students, the complexities of classrooms and schools, and the major limitations of the technology.
Principals continue to have a leading role in ensuring the highest standards of teaching in schools while protecting the well-being of students. From classroom teachers to education ministers, it is incumbent on all of us to maintain the integrity of high professional standards.
Alice Leung and Michael Sciffer presented at the University of Newcastle’s Quality Teaching in Practice Conference 2025. The materials from their session can be found here. Alice is a head teacher who leads whole-school technology. Michael is a school counsellor and PhD candidate.
References
Capraro, V., et al. (2024). The impact of generative artificial intelligence on socioeconomic inequalities and policy making. PNAS Nexus, 3(6), pgae191. https://academic.oup.com/pnasnexus/article/3/6/pgae191/7689236
Selwyn, N. (2024). On the limits of artificial intelligence (AI) in education. Nordisk tidsskrift for pedagogikk og kritikk, 10(1), 3-14. https://pedagogikkogkritikk.no/index.php/ntpk/article/view/6062/9573
Selwyn, N. (2025). No easy answers… what (social) science says about digitisation and schools. In A-L. Godhe & S. S. Hasemi (Eds.), Digital kompetens för lärare (pp. 33-42). Gleerups. https://doi.org/10.31235/osf.io/3jpk5_v1
Van Dam, A. (2025, March 14). More than a quarter of computer-programming jobs just vanished. What happened? The Washington Post. https://www.washingtonpost.com/business/2025/03/14/programming-jobs-lost-artificial-intelligence/