Students Who Lack Academic Confidence More Likely to Use AI


In a modern university library setting, a young female student with a concerned expression is intently focused on her laptop. A glowing holographic interface floats above her keyboard, displaying "ESSAY ASSIST," "RESEARCH BOT," and "CONFIDENCE BOOST!" with an encouraging smiley face. In the background, other students are also working on laptops. Image (and typos) generated by Nano Banana.
Research suggests a correlation between a lack of academic confidence in students and an increased likelihood of turning to AI tools for assistance. This image depicts a student utilising an AI interface offering “confidence boost” and “essay assist,” illustrating how AI can become a crutch for those feeling insecure about their abilities in the academic environment. Image (and typos) generated by Nano Banana.

Source

Inside Higher Ed

Summary

A survey by Inside Higher Ed and Generation Lab finds that 85 % of students say they used generative AI for coursework in the past year. Students with lower self-perceived academic competence or confidence are more likely to lean on AI tools, especially when they are unsure of the material or reluctant to ask peers or instructors for help. The study distinguishes between instrumental help-seeking (seeking clarification and explanations) and executive help-seeking (using AI to complete the work itself). Students who trust AI more are also more likely to use it. The authors argue that universities need clearer AI policies and stronger support structures so that students do not feel pushed into overreliance.

Key Points

  • 85 % of surveyed students reported using generative AI for coursework in the past year.
  • Students with lower academic confidence, or who are uncomfortable asking peers for help, tend to rely more on AI.
  • AI use splits into two modes: instrumental (asking questions, seeking clarification) versus executive (using the AI to generate or complete work).
  • Trust in AI correlates with higher usage, even after controlling for other variables.
  • Many students call for clear, standardised institutional policies on AI use to reduce ambiguity.

Keywords

URL

https://www.insidehighered.com/news/student-success/academic-life/2025/09/30/students-who-lack-academic-confidence-more-likely-use

Summary generated by ChatGPT 5


AI Is Robbing Students of Critical Thinking, Professor Says


In a grand, traditional university library, a menacing, cloaked digital entity with glowing red eyes representing AI looms over a group of students seated at a long table, all intensely focused on laptops with glowing blue faces. Thought bubbles emanate from the AI, offering to "GENERATE ESSAY," "SUMMARIZE," and "GIVE ANSWER." In the background, a visibly frustrated professor gestures emphatically, observing the scene. Image (and typos) generated by Nano Banana.
A prominent professor warns that the widespread use of AI is actively depriving students of opportunities to develop critical thinking skills. This image dramatically visualizes AI as a looming, pervasive force in the academic lives of students, offering quick solutions that may bypass the deeper cognitive processes essential for genuine intellectual growth and independent thought. Image (and typos) generated by Nano Banana.

Source

Business Insider

Summary

Kimberley Hardcastle, assistant professor of business and marketing at Northumbria University, warns that generative AI is not just facilitating plagiarism: it is encouraging students to outsource their thinking. Drawing on Anthropic usage data, she notes that about 39 % of student-AI interactions involved creating or polishing academic texts, while another 33 % sought direct solutions. Hardcastle argues this shifts the locus of intellectual authority toward Big Tech, making it harder for students to engage with ambiguity, weigh evidence, or claim ownership of ideas. She urges institutions to focus less on policing misuse and more on pedagogies that preserve critical thinking and epistemic agency.

Key Points

  • 39.3 % of student-AI chats were about composing or revising assignments; 33.5 % requested direct solutions.
  • AI output is often accepted uncritically because it arrives in polished, authoritative language.
  • The danger: students come to trust AI explanations over their own reasoned judgement.
  • Hardcastle views this as part of a larger shift: tech companies increasingly influence how “knowledge” is framed and delivered.
  • She suggests the response should emphasise pedagogy: designing modes of teaching that foreground critical thinking rather than merely policing output.

Keywords

URL

https://www.businessinsider.com/ai-chatgpt-robbing-students-of-critical-thinking-professor-says-2025-9

Summary generated by ChatGPT 5


How to test GenAI’s impact on learning


In a futuristic classroom or lab, a large holographic screen prominently displays a glowing human brain at its center, surrounded by various metrics like "GENAI IMPACT ASSESSMENT," "CREATIVITY INDEX," and "CRITICAL THINKING SCORE." Several individuals, some wearing VR/AR headsets, are engaged with individual holographic desks showing similar data, actively analyzing GenAI's effects on learning. Image (and typos) generated by Nano Banana.
As generative AI becomes more prevalent, understanding its true impact on student learning is paramount. This image envisions a sophisticated approach to assessing GenAI’s influence, utilising advanced metrics and holographic displays to quantify and analyse its effects on creativity, critical thinking, and overall educational outcomes. Image (and typos) generated by Nano Banana.

Source

Times Higher Education

Summary

Thibault Schrepel argues against speculation and for empirical classroom experiments to measure how generative AI truly affects student learning. He outlines simple, scalable experimental designs, such as comparing groups barred from AI, groups using it without guidance, and groups trained in prompting and critique, to compare outcomes in recall, writing quality, and reasoning. Schrepel also suggests activities such as having students build AI research assistants, comparing human and AI summaries, and using AI as a Socratic tutor. He emphasises that AI will not uniformly help or hurt; its impact depends on how it is used, taught, and assessed.

Key Points

  • Use controlled classroom experiments with different levels of AI access/training to reveal real effects.
  • Recall or rote learning may not change much; AI’s effects show more in reasoning, argumentation and writing quality.
  • Activities like comparing AI vs human summaries or having AI play the role of interlocutor can highlight strengths and limitations.
  • Prompting, critique, and metacognitive reflection are central to converting AI from crutch to tool.
  • Banning AI outright is less useful than enabling pedagogical experimentation and shared insight across faculty.

Keywords

URL

https://www.timeshighereducation.com/campus/how-test-genais-impact-learning

Summary generated by ChatGPT 5


Universities give up using software to detect AI in students’ work


In a university meeting room, a holographic display shows a broken padlock icon, symbolizing the failure or abandonment of AI detection software. Professionals are seated around a conference table, some looking at laptops. Image (and typos) generated by Nano Banana.
Universities are re-evaluating their strategies for academic integrity as many are moving away from relying on software to detect AI-generated content in student assignments. This shift reflects growing complexities and challenges in accurately identifying AI’s role in students’ work. Image (and typos) generated by Nano Banana.

Source

RNZ

Summary

Several New Zealand universities, including Massey, Auckland, and Victoria, have abandoned AI-detection software in student assessments, citing unreliability and inconsistency. Massey University’s move followed a major online exam monitoring failure in 2024, after which academics reported that detection results were often misused to accuse students. Research shows detection tools are easy to fool, leading institutions to shift towards alternative strategies: secured in-person assessments, oral defences, and checking document version histories. Universities stress they are not giving up on integrity but adapting to a changing environment by embedding AI literacy and focusing on preventative measures rather than flawed detection.

Key Points

  • Massey, Auckland, and Victoria universities no longer use AI detection software due to poor reliability.
  • Detection tools were inconsistent, with some staff misusing results to accuse students.
  • Alternative checks include document history tracking, professional judgement, and oral exams.
  • Universities focus on secured assessments (e.g. labs, studios, exams) rather than online monitoring.
  • Shift aims to prioritise AI literacy, ethics, and learning-centred approaches over surveillance.

Keywords

URL

https://www.rnz.co.nz/news/national/574517/universities-give-up-using-software-to-detect-ai-in-students-work

Summary generated by ChatGPT 5


How AI is reshaping education – from teachers to students


A split image depicting the impact of AI on education. On the left, a female teacher stands in front of a holographic 'AI POWERED INSTRUCTION' diagram, addressing a group of students. On the right, students are engaged with 'AI LEARNING PARTNER' interfaces, one wearing a VR headset. A central glowing orb with 'EDUCATION TRANSFORMED: AI' connects both sides, symbolizing the pervasive change AI brings to both teaching and learning. Image generated by Nano Banana.
From empowering educators with intelligent instruction tools to providing students with personalised AI learning partners, artificial intelligence is fundamentally reshaping every facet of education. This image illustrates the transformative journey, highlighting how AI is creating new dynamics in classrooms and preparing both teachers and learners for a future redefined by technology. Image (and typos) generated by Nano Banana.

Source

TribLIVE

Summary

In this article, educators in a Pennsylvania school district discuss how AI is being woven into teaching practice and student learning, not by replacing teachers but by amplifying their capacity. AI tools like Magic School help teachers personalise lesson plans, adjust reading levels, reduce repetitive tasks, and monitor student use. A "traffic light" system labels assignments by the level of AI use allowed. New teachers are required to learn AI tools, and students begin learning about AI and its ethical use from the early grades. The district emphasises that AI should not replace human work but should free teachers to focus more on interpersonal connection and higher-order thinking.

Key Points

  • Magic School is used to adapt assignments by subject, grade, and reading level, giving teachers flexibility.
  • Teachers are being trained and supported in AI adoption via workshops, pilot programs, and guided experiments.
  • A colour-coded “traffic light” system distinguishes when AI is allowed (green), allowed for some parts (yellow), or disallowed (red).
  • Starting in early grades, students are taught what AI is and how to use it ethically; higher grades incorporate more active use.
  • The goal: reduce the time teachers spend on repetitive tasks so they can devote more energy to student interaction and complex thinking.

Keywords

URL

https://triblive.com/local/regional/heres-how-ai-is-reshaping-education-from-teachers-to-students/

Summary generated by ChatGPT 5