Students are using AI tools instead of building foundational skills – but resistance is growing


The convenience of AI tools poses a growing dilemma for students: relying on them for quick answers versus engaging in the hard work of building foundational knowledge. While the allure of efficiency is strong, a movement towards prioritising true understanding and essential skills is gaining momentum. Image generated by Nano Banana.

Source

ZDNet

Summary

The rapid uptake of AI in education is fuelling concerns that students are outsourcing critical thinking and failing to build long-term skills. While AI helps with grading, planning, and coding, academics worry about “hollow” assignments that lack depth and originality. Some professors highlight students’ inability to explain code produced with AI, exposing gaps in understanding. In response, a coalition of technology faculty issued an open letter urging universities to resist uncritical adoption, warning of dependence, loss of expertise, and damage to academic freedom. Advocates argue AI should supplement—not replace—foundational skills, with careful vetting and practical use.

Key Points

  • AI is heavily used in classrooms, but risks undermining deep learning and original thought.
  • Examples show students submitting near-identical AI essays or failing to explain AI-written code.
  • Professors call for limits and redesigns to safeguard academic freedom and integrity.
  • Concerns include declining quality of computer science education and over-reliance on prompting tools.
  • Best practice is to adopt AI deliberately, ensuring it serves genuine educational purposes.

URL

https://www.zdnet.com/article/students-are-using-ai-tools-instead-of-building-foundational-skills-but-resistance-is-growing/

Summary generated by ChatGPT 5


90% Of College Students Use AI: Higher Ed Needs AI Fluency Support Now


With a staggering 90% of college students now integrating AI tools into their academic lives, the demand for robust AI fluency support in higher education has never been more critical. This image underscores the widespread adoption of AI by students, signalling an urgent need for institutions to adapt their curricula and resources to equip learners for an AI-driven future. Image generated by Nano Banana.

Source

Forbes

Summary

AI is now deeply embedded in student life: roughly 90% of college students report using AI tools, and the evidence suggests institutions are lagging in supporting this shift. Many students use AI for learning support—brainstorming, drafting, reviewing—but worry about its limitations, its risks, and the lack of clear policy. Educators argue that AI fluency should be integrated into curricula so students can use it responsibly, distinguish strong from weak output, and avoid over-reliance. The piece calls on higher education to embed AI ethics and practical AI skills to prepare students for a changing work environment.

Key Points

  • About 90% of college students now use AI tools in their academic work.
  • Students use AI for brainstorming, feedback, editing, and drafting—not necessarily to cheat—but feel underprepared to distinguish good from bad AI output.
  • There is a gap between student usage and institutional support; many students believe their universities aren’t keeping pace.
  • AI fluency (understanding how AI works, its limitations, ethical issues) is increasingly seen as a necessary component of modern education.
  • Clear policy, guidance, and curricular integration are needed to ensure AI is a help, not a crutch.

URL

https://www.forbes.com/sites/avivalegatt/2025/09/18/90-of-college-students-use-ai-higher-ed-needs-ai-fluency-support-now/

Summary generated by ChatGPT 5


AI has turned college exams into a ‘wicked problem’ with no obvious fix, researchers warn


The integration of AI has transformed college exams into a complex ‘wicked problem’ with no easy solutions, as educational researchers increasingly warn. This image visualises the dilemma, where traditional assessment methods clash with advanced AI tools, creating a challenging landscape for academic integrity and fair evaluation. Image generated by Nano Banana.

Source

Business Insider

Summary

A recent study from Deakin University warns that AI’s impact on university exams represents a “wicked problem” — one without a single clean solution. Educators are struggling to balance academic integrity, creative assessment, and workload. Some are experimenting with mixed approaches (e.g. AI-free vs AI-permitted assignments, oral exams, personalised prompts), but each introduces trade-offs in logistics, fairness, and staff burden. The study argues universities should abandon the search for one-size-fits-all fixes and instead adopt flexible, context-sensitive, iterative strategies that acknowledge complexity rather than resolve it once and for all.

Key Points

  • AI has made exam design and assessment much harder; there is no consensus among faculty about how to respond.
  • Trade-offs are everywhere: authenticity vs manageability, creativity vs scalability.
  • Some teachers offer AI-free and AI-allowed tasks; others try oral exams or reflective work—but all find challenges.
  • Assessment methods are being revised to make misuse harder, but costs (time, logistics, ethics) are substantial.
  • The “wicked problem” framing suggests there’s no perfect fix; success may lie in local experimentation and flexible policy rather than universal rules.

URL

https://www.businessinsider.com/ai-college-exams-wicked-problem-no-clear-fix-researchers-warn-2025-9

Summary generated by ChatGPT 5


Generative AI in Higher Education Teaching and Learning: Sectoral Perspectives


Source

Higher Education Authority

Summary

This report, commissioned by the Higher Education Authority (HEA), captures sector-wide perspectives on the impact of generative AI across Irish higher education. Through ten thematic focus groups and a leadership summit, it gathered insights from academic staff, students, support personnel, and leaders. The findings show that AI is already reshaping teaching, learning, assessment, and governance, but institutional responses remain fragmented and uneven. Participants emphasised the urgent need for national coordination, values-led policies, and structured capacity-building for both staff and students.

Key cross-cutting concerns included threats to academic integrity, the fragility of current assessment practices, risks of skill erosion, and unequal access. At the same time, stakeholders recognised opportunities for AI to enhance teaching, personalise learning, support inclusion, and free staff time for higher-value educational work. A consistent theme was that AI should not be treated merely as a technical disruption but as a pedagogical and ethical challenge that requires re-examining educational purpose.

Key Points

  • Sectoral responses to AI are fragmented; coordinated national guidance is urgently needed.
  • Generative AI challenges core values of authorship, originality, and academic integrity.
  • Assessment redesign is necessary—moving towards authentic, process-focused approaches.
  • Risks include skill erosion in writing, reasoning, and information literacy if AI is overused.
  • AI literacy for staff and students must go beyond tool use to include ethics and critical thinking.
  • Ethical use of AI requires shared principles, not just compliance or detection measures.
  • Inclusion is not automatic: without deliberate design, AI risks deepening inequality.
  • Staff feel underprepared and need professional development and institutional support.
  • Infrastructure challenges extend beyond tools to governance, procurement, and policy.
  • Leadership must shape educational vision, not just manage risk or compliance.

Conclusion

Generative AI is already embedded in higher education, raising urgent questions of purpose, integrity, and equity. The consultation shows both enthusiasm and unease, but above all a readiness to engage. The report concludes that a coordinated, values-led, and inclusive approach—balancing innovation with responsibility—will be essential to ensure AI strengthens, rather than undermines, Ireland’s higher education mission.

URL

https://hea.ie/2025/09/17/generative-ai-in-higher-education-teaching-and-learning-sectoral-perspectives/

Summary generated by ChatGPT 5


How are UK students really using AI?


From research assistance to code generation and personalised study aids, AI is subtly and overtly reshaping how UK students approach their academic work. This mosaic illustrates the multifaceted ways technology is being integrated into learning, posing questions about its true impact and future trajectory in higher education. Image generated by Nano Banana.

Source

Times Higher Education

Summary

About two-thirds (66%) of UK undergraduate students report using AI for work or study, with one in three doing so at least weekly. ChatGPT is by far the most popular tool. Universities vary in how they regulate or guide AI use: some discourage it, others set boundaries but offer little training, and few actively teach ethical AI practice. Most students use AI to understand concepts, summarise content, or identify sources, while a smaller but significant share admits to using AI to create or partially create graded work. Many believe AI has helped their grades, though others see little or negative impact. Clearer guidance and teaching around ethical, effective AI use are needed.

Key Points

  • 66% of UK undergraduates use AI for degree-related work; 33% use it at least weekly.
  • The most common applications: explaining difficult concepts (81%), summarising sources (69%), finding sources, and improving existing work.
  • About 14% of AI-using students admit to using AI in graded work in ways that could constitute cheating (creating or heavily editing submissions) — roughly 9% of all students.
  • 47% of AI-using students report frequently encountering “hallucinations” (incorrect or false information) from AI tools.
  • Universities’ policies are mixed: some actively discourage use; many simply warn; only a minority proactively teach students how to use AI well and ethically.

URL

https://yougov.co.uk/society/articles/52855-how-are-uk-students-really-using-ai

Summary generated by ChatGPT 5