While the integration of AI in schools holds significant promise for personalised learning, its rising use also comes with substantial, often unforeseen, downsides for students. This image starkly contrasts the idealised vision of AI in education with the potential negative realities, highlighting risks such as diminished critical thinking, increased social isolation, and an over-reliance that could foster academic dishonesty. Image (and typos) generated by Nano Banana.
Source
Education Week
Summary
A new report from the Center for Democracy and Technology warns that the rapid adoption of AI in schools is undermining students’ relationships, critical thinking and data privacy. In 2024–25, 85 % of teachers and 86 % of students used AI, yet fewer than half received any formal training. The report highlights emotional disconnection, weaker research skills and risks like data breaches and tech-fuelled bullying. While educators acknowledge AI’s benefits for efficiency and personalised learning, experts urge schools to prioritise teacher training, AI literacy, and ethical safeguards to prevent harm. Without adequate guidance, AI could deepen inequities rather than improve learning outcomes.
Key Points
AI use has surged across US classrooms, with 85 % of teachers and 86 % of students using it.
Students report weaker connections with teachers and peers due to AI use.
Teachers fear declines in students’ critical thinking and authenticity.
Less than half of teachers and students have received AI-related training.
Experts call for stronger AI literacy, ethics education and policy guardrails.
A new global survey reveals that Australian teachers are among the leading adopters of artificial intelligence in classrooms worldwide, pioneering its integration into daily teaching practices. This image celebrates Australia’s significant role in transforming educational environments into hubs of AI-augmented learning, showcasing educators actively embracing technology to enhance student engagement and outcomes. Image (and typos) generated by Nano Banana.
Source
The Conversation
Summary
According to the OECD’s 2024 Teaching and Learning International Survey (TALIS), Australian teachers rank among the world’s highest users of artificial intelligence in education, with 66 % of lower secondary teachers reporting AI use—well above the OECD average of 36 %. Most use AI for lesson planning and content learning, though fewer apply it for grading or analysing student data due to privacy and ethical concerns. The survey also highlights serious teacher stress, with Australia ranking third-highest in reported workplace stress and first in frequent stress incidents. Despite satisfaction with academic preparation, teachers feel undertrained in behaviour management, signalling the need for systemic support alongside technological adoption.
Key Points
66 % of Australian teachers use AI, placing them fourth globally.
AI is mostly used for planning and learning, not assessment or data analysis.
Australian teachers report some of the highest stress levels in the OECD.
Only half felt adequately trained in managing student behaviour.
The report calls for policies balancing teacher wellbeing with technological progress.
This image showcases the growing integration of AI in Pakistani classrooms, specifically at the University of Lahore. It depicts a dynamic learning environment where students and educators are actively engaging with artificial intelligence, highlighting the nation’s efforts to adapt its educational system to the demands of a technology-driven future. Image (and typos) generated by Nano Banana.
Source
The News (Pakistan)
Summary
Dr Kashif Salik highlights how artificial intelligence could transform education in Pakistan, especially amid challenges from climate disasters, poor infrastructure, and entrenched inequalities. While AI offers opportunities for resilient and inclusive learning—through online platforms, personalised tutoring, and adaptive instruction—its benefits remain limited by inadequate connectivity, teacher training, and gendered access to technology. The article calls for integrating AI into broader education reform, emphasising digital literacy, climate awareness, and psychological well-being. Salik argues that responsible use of AI can bridge educational gaps and sustain learning during crises, but only if supported by policy, funding, and equitable access.
Key Points
AI can improve access to education during crises and support remote learning.
Pakistan’s poor infrastructure, low digital literacy, and gender divide hinder adoption.
Initiatives like Ataleek and global grants show potential for scalable e-learning.
AI could personalise instruction and strengthen resilience in the education system.
Reform must combine technology with inclusive, climate-aware education policies.
Some people are for it, some people are against it, some people write guidelines about it. What unites most of these ‘factions’ is an almost total inability to use it effectively. Yes, I am talking about AI, the super useful, super intuitive thing that is changing your life by, well, generating some misspelled logos.
This blog post offers something different. Three real things you can do. I will tell you what they are, how to do them, why you should do them, and how much it will cost you. And I am not talking about the things everyone else is talking about.
I know about all this because, ever since ChatGPT launched in late 2022, I have been all over it. I have incorporated AI tools into all aspects of my work (even my life) and I have built an entire business on the back of it. Here we go then.
Thing Number 1: Updating Your Resources
Academics spend a great deal of time updating things. We update reading lists, handbooks, lecture notes, guides, textbooks, all of the time. Sometimes we update material in areas where we have deep expertise; sometimes we update a handbook in a module that got dumped on us because someone left. Here is how to leverage technology to make this process faster and less painful. You will need a subscription to ChatGPT (£20+VAT, prices as of September 2025) and access to Gemini (via a Google Pro subscription, currently free for a year for students and staff, or £18.99/month).
Upload the material you wish to update, for example a chapter from a textbook. Select the deep-thinking or research option and ask the bot to review your uploaded text for updates and recommend changes. Once this is ready, ask it to incorporate the changes into your text, or input them manually as needed. Then take the updated document and run it through the other model, asking it to check for accuracy.
The result is a document that has been updated, verified and often reworded for clarity. Both models offer references and explain their rationale, so you can verify manually and check every step of the process.
Conclusion? You have updated your resource, and it took you one morning instead of a week.
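If you are comfortable with a little scripting, the same review-then-verify loop can be driven through the OpenAI Python SDK instead of the chat interface, which is handy when you have many chapters to process. This is a minimal sketch, not an official workflow: the model name, the prompt wording and the build_review_request helper are my own illustrative choices, and you will need your own API key.

```python
# Sketch: ask a model to review a textbook chapter for needed updates.
# The helper only assembles the request; the live call sits in main(),
# which you run yourself once an OPENAI_API_KEY is set.

def build_review_request(chapter_text: str) -> list[dict]:
    """Assemble the chat messages asking for an update review of one chapter."""
    return [
        {
            "role": "system",
            "content": (
                "You are an academic editor. Review the following textbook "
                "chapter, identify material that is out of date, and recommend "
                "specific changes, with references and a short rationale for each."
            ),
        },
        {"role": "user", "content": chapter_text},
    ]


def main() -> None:
    # Requires: pip install openai, and an OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()
    with open("chapter.txt", encoding="utf-8") as f:
        messages = build_review_request(f.read())
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    print(response.choices[0].message.content)
```

Run the output through a second model (Gemini, say) with an "check this for accuracy" prompt, exactly as described above, before you accept any change.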
Thing Number 2: Creating Podcasts
Yes, we can lecture, we can offer tutorials, and we can upload PDFs to the VLE. But why stop there? Students benefit from multi-modal forms of learning. The surest way to get the message across is to deliver the same information live, via text, audio and video, multiple times.
What has been stopping us from doing this effectively? The answer is that most of us are terrible in front of a camera. Yes, students may appreciate a video podcast, but if you look like the hostage in a ransom-request video, you are unlikely to hold their attention.
Here is what to do. Use one of the bots you already subscribe to, ChatGPT or Gemini (as above), or even Copilot (via an institutional licence or £19/month), to turn your lecture notes, book chapter or even recordings of live sessions into a podcast script. You can select length, style and focus to match your intended audience.
You then go to ElevenLabs ($22+VAT/month) and make a clone of your voice. This sounds scary, but it isn't. Just one tip: do not believe the '5 minutes of recording and you'll fool your mom' spiel. You need quality recordings of your voice that run for a couple of hours to get good, reliable results.
Once you have your voice clone, go to HeyGen ($29+VAT/month) and create your video clone. This can be done (at high spec) either by uploading a green-screen recording of yourself of about 20 minutes, or by using a new photo-to-video feature (I know this sounds unlikely, but it works very well).
You are now good to go, having cloned yourself in audio and video. You can bring the two together in HeyGen, feed it your pre-prepared scripts and bingo, you can produce unlimited videos where you narrate your lecture scripts looking like a Hollywood star, and no one needs to expect a ransom note.
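For batch production, ElevenLabs also exposes its text-to-speech over an API, so you can turn a folder of scripts into audio without clicking through the website. A minimal sketch follows; the endpoint shape, header name and model ID reflect my reading of the public API docs and may change, and the voice ID and file names are placeholders you would replace with your own.

```python
# Sketch: build a text-to-speech request for the ElevenLabs v1 API.
# The helper only constructs the call; main() performs it, and is for
# you to run once you have a real API key and voice-clone ID.

API_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"


def build_tts_request(voice_id: str, script: str) -> tuple[str, dict, dict]:
    """Return the URL, headers and JSON body for one text-to-speech call."""
    url = API_URL.format(voice_id=voice_id)
    headers = {
        "xi-api-key": "YOUR_API_KEY",  # placeholder: substitute your real key
        "Content-Type": "application/json",
    }
    body = {
        "text": script,
        "model_id": "eleven_multilingual_v2",  # assumption: check current model IDs
    }
    return url, headers, body


def main() -> None:
    # Requires: pip install requests
    import requests

    url, headers, body = build_tts_request("your-voice-id", "Welcome to week one.")
    audio = requests.post(url, headers=headers, json=body)
    with open("episode1.mp3", "wb") as f:
        f.write(audio.content)
```

The video side still goes through HeyGen's interface as described above; this only automates the audio half.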
Thing Number 3: Generating Student Feedback
Most of us are used to generating student feedback on the basis of proformas and templates that combine expectations on learning outcomes with marking grids and assessment criteria.
What students crave (and keep telling us in surveys such as the NSS in the UK) is personalised feedback. However, hardly anyone has the time to personalise templates to student scripts in a way that is deeply meaningful and helpful. We usually copy-paste the proforma and stick on an extra line inviting students to 'come and see me if you have questions'.
The bots described above (ChatGPT, Gemini, Copilot etc.) do a fantastic job of adapting templates to individual student scripts. Yes, this requires you to upload a great deal of university resource, plus the student scripts, to the bot, so more on this below.
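The prompt that makes this work is simply the proforma, the assessment criteria and the individual script stitched together with clear instructions. A sketch of that stitching is below; the wording and the build_feedback_prompt helper are my own illustrative assumptions, and any of the chatbots mentioned above will accept the result.

```python
# Sketch: combine a feedback proforma, the assessment criteria and one
# student script into a single prompt for personalised feedback.

def build_feedback_prompt(template: str, criteria: str, script: str) -> str:
    """Merge the marking template, the criteria and one student script."""
    return (
        "Adapt the feedback template below to this specific student script. "
        "Quote short passages from the script as evidence for each point, "
        "and keep the template's structure and tone.\n\n"
        f"--- FEEDBACK TEMPLATE ---\n{template}\n\n"
        f"--- ASSESSMENT CRITERIA ---\n{criteria}\n\n"
        f"--- STUDENT SCRIPT ---\n{script}"
    )
```

Paste the result into your chatbot of choice (or send it via an API as in the first sketch), one script at a time, and review every draft before it reaches a student.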
And Now The Small-Print
First of all, we've racked up quite a bill here. You must be thinking: why should I pay for all this out of my own pocket? The answer is that you should not, but even if you did, it would not be that bad. You do not need to keep paying the subscriptions beyond the point you need them (all of these are pay-as-you-go) and most offer free trials or discounts at the beginning of your subscription. You may even get your university to pay for some of it (Copilot, for example).
Secondly, aren't there data protection and privacy concerns? Yes, there are. All of the above assumes you either own the resources you are uploading or have permission to do so. This is a tall order at most institutions, which take an unnecessarily guarded view of AI. Some concern is real, though. For example, I do not like the latest iteration of Gemini's terms, which forces third-party reviewers onto your content if you want full functionality.
Thirdly, won't a kid in Texas go without a shower every time you say thank you to ChatGPT? I choose not to care. The Americans have bigger fish to fry at the moment.
There you have it, then: three concrete things you can do with AI as an academic, with the costs explained. Bonus feature? You can use all these tools to launch your own school on social media, like I did!
Dr Ioannis Glinavos
Academic Entrepreneur
Dr Ioannis Glinavos is a legal academic and education innovator with a strong focus on the intersection of AI and EdTech. As a pioneer in integrating artificial intelligence tools into higher education, he has developed engaging, student-centred resources—particularly in law—through podcasts, video content, and interactive platforms. His work explores how emerging technologies can enhance learning outcomes and critical thinking, while keeping students engaged in synchronous and asynchronous content.
A prominent professor warns that the widespread use of AI is actively depriving students of opportunities to develop critical thinking skills. This image dramatically visualizes AI as a looming, pervasive force in the academic lives of students, offering quick solutions that may bypass the deeper cognitive processes essential for genuine intellectual growth and independent thought. Image (and typos) generated by Nano Banana.
Source
Business Insider
Summary
Kimberley Hardcastle, assistant professor of business and marketing at Northumbria University, warns that generative AI is not just facilitating plagiarism—it’s encouraging students to outsource their thinking. Based on Anthropic data, about 39 % of student-AI interactions involved creating or polishing academic texts and another 33 % requested direct solutions. Hardcastle argues this is shifting the locus of intellectual authority toward Big Tech, making it harder for students to engage with ambiguity, weigh evidence, or claim ownership of ideas. She urges institutions to focus less on policing misuse, and more on pedagogies that preserve critical thinking and epistemic agency.
Key Points
39.3 % of student-AI chats were about composing or revising assignments; 33.5 % requested direct solutions.
AI output is often accepted uncritically because it presents polished, authoritative language.
The danger: students come to trust AI explanations over their own reasoned judgement.
Hardcastle views this as part of a larger shift: tech companies increasingly influence how “knowledge” is framed and delivered.
She suggests the response should emphasise pedagogy: design modes of teaching that foreground critical thinking over output policing.