A new global survey reveals that Australian teachers are among the leading adopters of artificial intelligence in classrooms worldwide, pioneering its integration into daily teaching practices. This image celebrates Australia’s significant role in transforming educational environments into hubs of AI-augmented learning, showcasing educators actively embracing technology to enhance student engagement and outcomes. Image (and typos) generated by Nano Banana.
Source
The Conversation
Summary
According to the OECD’s 2024 Teaching and Learning International Survey (TALIS), Australian teachers rank among the world’s highest users of artificial intelligence in education, with 66 % of lower secondary teachers reporting AI use—well above the OECD average of 36 %. Most use AI for lesson planning and content learning, though fewer apply it for grading or analysing student data due to privacy and ethical concerns. The survey also highlights serious teacher stress, with Australia ranking third-highest in reported workplace stress and first in frequent stress incidents. Despite satisfaction with academic preparation, teachers feel undertrained in behaviour management, signalling the need for systemic support alongside technological adoption.
Key Points
66 % of Australian teachers use AI, placing them fourth globally.
AI is mostly used for planning and learning, not assessment or data analysis.
Australian teachers report some of the highest stress levels in the OECD.
Only half felt adequately trained in managing student behaviour.
The report calls for policies balancing teacher wellbeing with technological progress.
This image showcases the growing integration of AI in Pakistani classrooms, specifically at the University of Lahore. It depicts a dynamic learning environment where students and educators are actively engaging with artificial intelligence, highlighting the nation’s efforts to adapt its educational system to the demands of a technology-driven future. Image (and typos) generated by Nano Banana.
Source
The News (Pakistan)
Summary
Dr Kashif Salik highlights how artificial intelligence could transform education in Pakistan, especially amid challenges from climate disasters, poor infrastructure, and entrenched inequalities. While AI offers opportunities for resilient and inclusive learning—through online platforms, personalised tutoring, and adaptive instruction—its benefits remain limited by inadequate connectivity, teacher training, and gendered access to technology. The article calls for integrating AI into broader education reform, emphasising digital literacy, climate awareness, and psychological well-being. Salik argues that responsible use of AI can bridge educational gaps and sustain learning during crises, but only if supported by policy, funding, and equitable access.
Key Points
AI can improve access to education during crises and support remote learning.
Pakistan’s poor infrastructure, low digital literacy, and gender divide hinder adoption.
Initiatives like Ataleek and global grants show potential for scalable e-learning.
AI could personalise instruction and strengthen resilience in the education system.
Reform must combine technology with inclusive, climate-aware education policies.
Despite sophisticated AI capabilities, admissions essays generated by artificial intelligence are often characterised by generic phrasing and a distinct lack of personal voice, making them relatively easy to spot. This image depicts an admissions officer using AI detection software and her own critical judgment to identify an AI-generated essay, underscoring the challenges and tools in maintaining authenticity in student applications. Image (and typos) generated by Nano Banana.
Source
Inside Higher Ed
Summary
Cornell University researchers have found that AI-generated college admission essays are noticeably generic and easily distinguished from human writing. In a study comparing 30,000 human-written essays with AI-generated versions, the latter often failed to convey authentic personal narratives. When researchers added personal details for context, AI tools tended to overemphasise keywords, producing essays that sounded even more mechanical. While the study’s authors note that AI can be helpful for editing and feedback, they warn against using it to produce full drafts. The team also developed a detection model that could identify AI-generated essays with near-perfect accuracy.
Key Points
Cornell researchers compared AI and human-written college admission essays.
AI-generated essays lacked authenticity and were easily recognised.
Adding personal traits often made AI writing sound more artificial.
AI can provide useful feedback for weaker writers but not full essays.
A detection model identified AI-written essays with high accuracy.
This image visually argues that AI literacy is not an entirely new concept but rather an evolution or “disguise” of existing digital and media literacy skills. It highlights the interconnectedness of understanding digital tools, critically evaluating information, and navigating algorithmic influences, suggesting that foundational literacies provide a strong basis for comprehending and engaging with artificial intelligence effectively. Image (and typos) generated by Nano Banana.
Source
Psychology Today
Summary
Diana E. Graber argues that “AI literacy” is not a new concept but a continuation of long-standing digital and media literacy principles. Triggered by the April 2025 executive order Advancing Artificial Intelligence Education for American Youth, the sudden focus on AI education highlights skills schools should have been teaching all along—critical thinking, ethical awareness, and responsible participation online. Graber outlines seven core areas where digital and media literacy underpin AI understanding, including misinformation, digital citizenship, privacy, and visual literacy. She warns that without these foundations, students face growing risks such as deepfake abuse, data exploitation, and online manipulation.
Key Points
AI literacy builds directly on digital and media literacy foundations.
An executive order has made AI education a US national priority.
Core literacies—critical thinking, ethics, and responsibility—are vital for safe AI use.
Key topics include misinformation, cyberbullying, privacy, and online safety.
The article urges sustained digital education rather than reactionary AI hype.
Some people are for it, some people are against it, some people write guidelines about it. What unites most of these ‘factions’ is an almost total inability to use it effectively. Yes, I am talking about AI, the super useful, super intuitive thing that is changing your life by, well, generating some misspelled logos.
This blog post offers something different. Three real things you can do. I will tell you what they are, how to do them, why you should do them, and how much it will cost you. And I am not talking about the things everyone else is talking about.
I know about all this because, ever since ChatGPT launched in late 2022, I have been all over it. I have incorporated AI tools into all aspects of my work (even my life) and I have built an entire business on the back of it. Here we go then.
Thing Number 1: Updating Your Resources
Academics spend a great deal of time updating things. We update reading lists, handbooks, lecture notes, guides, textbooks, all of the time. Sometimes we update material in areas where we have deep expertise; sometimes we update a handbook in a module that got dumped on us because someone left. Here is how to leverage technology to make this process faster and less painful. You will need a subscription to ChatGPT (£20+VAT, prices as of September 2025) and access to Gemini (via a Google Pro subscription, currently free for a year for students and staff, or £18.99/month).
Upload the material you wish to update, for example a chapter from a textbook. Select the deep thinking or deep research option and ask the bot to review the uploaded text for updates and recommend changes. Once this is ready, ask it to incorporate the changes into your text, or input them manually as needed. Then take the updated document and run it through the other model, asking it to check for accuracy.
The result is a document that has been updated, verified and often reworded for clarity. Both models offer references and explain their rationale, so you can verify manually and check every step of the process.
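If you prefer scripting to clicking, the same first pass can be run through an API rather than the web interface. The sketch below is illustrative only: it assumes an OpenAI API key in your environment (API usage is billed separately from the £20 ChatGPT subscription), a plain-text export of your chapter saved as chapter_3.txt, and a model name you would swap for whatever is current.

```python
# Minimal sketch of the review pass, assuming the OPENAI_API_KEY environment
# variable is set and the chapter has been exported to plain text.
from openai import OpenAI

client = OpenAI()

with open("chapter_3.txt", encoding="utf-8") as f:
    chapter = f.read()

prompt = (
    "Review the following textbook chapter for material that is out of date. "
    "List recommended changes with a short rationale and a source for each, "
    "then rewrite the affected passages.\n\n" + chapter
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; substitute whichever model you actually use
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

The verification step is the same call pointed at the other provider's API, with the updated text attached and a prompt asking it to check the changes for accuracy.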
Conclusion? You have updated your resource, and it took you one morning instead of a week.
Thing Number 2: Creating Podcasts
Yes, we can lecture, we can offer tutorials, indeed we can upload PDFs to the VLE. But why stop there? Students benefit from multi-modal forms of learning. The surest way to get the message across is to deliver the same information live, via text, audio and video, multiple times.
What has been stopping us from doing this effectively? The answer is that most of us are terrible in front of a camera. Yes, students may appreciate a video podcast, but if you look like the hostage in a ransom video, you are unlikely to hold their attention.
Here is what to do. Use one of the bots you already subscribe to, ChatGPT or Gemini (as above), or even Copilot (via an institutional licence or £19/month), to turn your lecture notes, book chapter or even recordings of live sessions into a podcast script. You can select length, style and focus to match your intended audience.
You then go to ElevenLabs ($22+VAT/month) and make a clone of your voice. This sounds scary, but it isn't. Just one tip: do not believe the '5 minutes of recording and you'll fool your mom' spiel. You need quality recordings of your voice that run for a couple of hours for good, reliable results.
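If you want audio-only episodes straight from the voice clone, ElevenLabs also exposes an API you can script against. Treat the sketch below as a rough illustration rather than a recipe: the voice ID comes from your ElevenLabs dashboard, the endpoint and field names reflect my reading of the public documentation (check the current docs), and long scripts need to be split into chunks.

```python
# Rough sketch: narrate a podcast script with a cloned ElevenLabs voice.
# Endpoint, headers and field names follow my reading of the public API docs;
# verify against the current documentation before relying on this.
import os
import requests

VOICE_ID = "your-cloned-voice-id"          # shown in the ElevenLabs dashboard
API_KEY = os.environ["ELEVENLABS_API_KEY"]

with open("podcast_script.txt", encoding="utf-8") as f:
    script = f.read()  # long scripts exceed per-request limits; chunk them

resp = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY},
    json={"text": script, "model_id": "eleven_multilingual_v2"},
    timeout=300,
)
resp.raise_for_status()

# The response body is the audio itself (MP3 by default).
with open("episode_01.mp3", "wb") as f:
    f.write(resp.content)
```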
Once you have your voice clone, go to HeyGen ($29+VAT/month) and create your video clone. This can be done (at high spec) either by uploading roughly 20 minutes of green-screen footage of yourself or by using a new photo-to-video feature (I know this sounds unlikely, but it works very well).
You are now good to go, having cloned yourself in audio and video. You can bring the two together in HeyGen, feed it your pre-prepared scripts and bingo, you can produce unlimited videos where you narrate your lecture scripts looking like a Hollywood star, and no one needs to expect a ransom note.
Thing Number 3: Generating Student Feedback
Most of us are used to generating student feedback on the basis of proformas and templates that combine expectations on learning outcomes with marking grids and assessment criteria.
What students crave (and keep telling us in surveys such as the NSS in the UK) is personalised feedback. Hardly anyone, however, has the time to tailor templates to student scripts in a way that is deeply meaningful and helpful. We usually copy-paste the proforma and stick on an extra line inviting them to 'come and see me if you have questions'.
The bots described above (ChatGPT, Gemini, Copilot etc.) do a fantastic job of adapting templates to individual student scripts. Yes, this requires uploading a great deal of university material, plus the student scripts, to the bot, so more on this below.
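For marking at scale, the same adaptation can be scripted. The sketch below is a minimal illustration under some assumptions: an OpenAI API key, a plain-text export of your marking proforma, suitably anonymised student scripts in a local folder, and institutional permission to process them this way (see the small print below). File names and the model name are placeholders.

```python
# Minimal sketch: adapt one marking proforma to each (anonymised) student
# script and save personalised feedback. File names and model are placeholders.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

rubric = Path("marking_proforma.txt").read_text(encoding="utf-8")
Path("feedback").mkdir(exist_ok=True)

for script_path in sorted(Path("scripts").glob("*.txt")):
    submission = script_path.read_text(encoding="utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative; use whichever model you subscribe to
        messages=[{
            "role": "user",
            "content": (
                "Using the marking proforma below, draft personalised feedback "
                "for this student's script. Quote specific passages when "
                "justifying each point, and keep the proforma's structure.\n\n"
                f"PROFORMA:\n{rubric}\n\nSCRIPT:\n{submission}"
            ),
        }],
    )
    out = Path("feedback") / f"{script_path.stem}_feedback.txt"
    out.write_text(response.choices[0].message.content, encoding="utf-8")
```

You still read and edit every draft before release; the point is that the first draft starts personalised rather than generic.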
And Now The Small-Print
First of all, we've racked up quite a bill here. You must be thinking: why should I pay for all this out of my own pocket? The answer is that you should not, but even if you did, it would not be that bad. You don't need to keep paying the subscriptions beyond the point you need them (all of these are pay-as-you-go), and most offer free trials or discounts at the start of your subscription. You may even get your university to pay for some of it (for example, Copilot).
Secondly, aren’t there data protection and privacy concerns? Yes, there are. All of the above assumes you either own the resources you are uploading or have permission to upload them. This is a tall order for most institutions, which take an unnecessarily guarded view of AI. Some of the concern is real, though. For example, I don’t like the latest iteration of Gemini’s terms, which forces third-party reviewers onto your content if you want full functionality.
Thirdly, won’t a kid in Texas go without a shower every time you say thank you to ChatGPT? I choose not to care about that one. The Americans have bigger fish to fry at the moment.
There you have it then: three concrete things you can do with AI as an academic, with the costs explained. Bonus feature? You can use all these tools to launch your own school on social media, like I did!
Dr Ioannis Glinavos
Academic Entrepreneur
Dr Ioannis Glinavos is a legal academic and education innovator with a strong focus on the intersection of AI and EdTech. As a pioneer in integrating artificial intelligence tools into higher education, he has developed engaging, student-centred resources—particularly in law—through podcasts, video content, and interactive platforms. His work explores how emerging technologies can enhance learning outcomes and critical thinking, while keeping students engaged in synchronous and asynchronous content.