Students’ complicated relationship with AI: ‘It’s inherently going against what college is’


A student stands in a grand, traditional library, looking conflicted between two glowing holographic displays. To the left, a blue 'AI: EFFICIENCY' display shows data and code. To the right, an orange 'COLLEGE: UNDERSTANDING' display hovers over an open book and desk lamp. The image symbolizes the internal conflict students face regarding AI in academia. Generated by Nano Banana.
Navigating the academic world with new AI tools presents a complex dilemma for students. This image illustrates the tension between the efficiency offered by AI and the foundational pursuit of deep understanding inherent to college education. It captures the internal debate students face as technology challenges traditional learning. Image generated by and typos courtesy of Nano Banana.

Source

The Irish Times

Summary

Many students express tension between using generative AI (GenAI) tools like ChatGPT and the traditional values of university education. Some avoid AI because they feel it undermines academic integrity or the effort they have invested; others see benefit in using it for organising study, generating ideas, or offloading mundane parts of coursework. Concerns include fairness (getting better grades for less effort), the accuracy of chatbot-generated content, and environmental impact. Students also worry about the loss of critical thinking and the changing nature of assignments as AI becomes more common. There is a call for clearer institutional guidelines, greater awareness of policies, and equitable access and use.

Key Points

  • Using GenAI can feel like “offloading work,” conflicting with the idea of self-learning, which many students believe defines college life.
  • Students worry about fairness: those who use AI may gain advantage over those who do not.
  • Accuracy is a concern: ChatGPT sometimes provides false information; students are aware of this risk.
  • Some students avoid AI entirely so as not to invite suspicion of cheating, and worry about being falsely accused even when they have not used it.
  • Others find helpful uses: organising references, creating study timetables, acting as a “second pair of eyes” or “study companion.”

Keywords

URL

https://www.irishtimes.com/life-style/people/2025/09/20/students-complicated-relationship-with-ai-chatbots-its-inherently-going-against-what-college-is/

Summary generated by ChatGPT 5


Students are using AI tools instead of building foundational skills – but resistance is growing


In a dimly lit library, a focused male student interacts with a glowing holographic display from his laptop, showing complex data. A red, crackling energy line extends from the display towards his hands. On the desk, an open notebook titled 'FOUNDATIONAL SKILLS: UNDERSTANDING' is filled with handwritten equations. Other students are visible in the background, implying a widespread trend. The scene contrasts AI tool usage with fundamental learning. Generated by Nano Banana.
The convenience of AI tools poses a growing dilemma for students: relying on them for quick answers versus engaging in the hard work of building foundational knowledge. While the allure of efficiency is strong, a movement towards prioritising true understanding and essential skills is gaining momentum. Image generated by Nano Banana.

Source

ZDNet

Summary

The rapid uptake of AI in education is fuelling concerns that students are outsourcing critical thinking and failing to build long-term skills. While AI helps with grading, planning, and coding, academics worry about “hollow” assignments that lack depth and originality. Some professors highlight students’ inability to explain code produced with AI, exposing gaps in understanding. In response, a coalition of technology faculty issued an open letter urging universities to resist uncritical adoption, warning of dependence, loss of expertise, and damage to academic freedom. Advocates argue AI should supplement—not replace—foundational skills, with careful vetting and practical use.

Key Points

  • AI is heavily used in classrooms, but risks undermining deep learning and original thought.
  • Examples show students submitting near-identical AI essays or failing to explain AI-written code.
  • Professors call for limits and redesigns to safeguard academic freedom and integrity.
  • Concerns include declining quality of computer science education and over-reliance on prompting tools.
  • Best practice is to adopt AI deliberately, ensuring it serves genuine educational purposes.

Keywords

URL

https://www.zdnet.com/article/students-are-using-ai-tools-instead-of-building-foundational-skills-but-resistance-is-growing/

Summary generated by ChatGPT 5


Professors experiment as AI becomes part of student life


In a modern university lecture hall, three professors (two female, one male) stand at a glowing, interactive holographic table, actively demonstrating or discussing AI concepts. Students are seated at desks, some using laptops with glowing AI interfaces, and one student wears a VR headset. A large holographic screen in the background displays 'AI Integration Lab: Fall 2024'. The scene depicts educators experimenting with AI in a learning environment. Generated by Nano Banana.
As AI increasingly integrates into daily student life, professors are actively experimenting with new pedagogical approaches and tools to harness its potential. This image captures a dynamic classroom setting where educators are at the forefront of exploring how AI can enrich learning, adapt teaching methods, and prepare students for an AI-driven future. Image generated by Nano Banana.

Source

The Globe and Mail

Summary

AI has shifted from novelty to necessity in Canadian higher education, with almost 60% of students now using it. Professors are experimenting with different approaches: some resist, others regulate, and many actively integrate AI into assessments. Concerns remain about diminished critical thinking, but educators like those at the University of Toronto and University of Guelph argue that ignoring AI leaves graduates unprepared. Strategies include teaching students to refine AI-generated drafts, redesigning assignments to require human input, and adopting oral assessments. The consensus is that policies alone cannot keep pace; practical, ethical, and reflective engagement is essential for preparing students to use AI responsibly.

Key Points

  • Nearly 60% of Canadian students use AI for coursework; globally, the figure rises to over 90%.
  • Professors face a choice: resist, regulate, or embrace AI; ignoring it is seen as untenable.
  • Innovative teaching methods include refining AI drafts, training prompt skills, and oral assessments.
  • Concerns persist about weakening critical thinking and creativity.
  • Preparing students for AI-rich workplaces requires embedding literacy, ethics, and adaptability.

Keywords

URL

https://www.theglobeandmail.com/business/article-professors-experiment-as-ai-becomes-part-of-student-life/

Summary generated by ChatGPT 5


Opposing the inevitability of AI at universities is possible and necessary


In a grand, traditional university library setting, a group of professionals and academics stand around a conference table, actively pushing back with their hands raised towards a large, glowing holographic brain that represents AI. The brain is split with blue (calm) and red (active/threatening) elements, and a "STOP AI" sign is visible on a blackboard in the background. Image (and typos) generated by Nano Banana.
While the integration of AI into universities often feels unstoppable, this image visualizes the argument that actively opposing its unchecked inevitability is not only possible but crucial. It suggests that a proactive stance is necessary to guide the future of AI in academia rather than passively accepting its full integration. Image (and typos) generated by Nano Banana.

Source

Radboud University

Summary

Researchers from Radboud University argue that AI’s spread in academia is being framed as inevitable, but pushback is both possible and essential. They warn that uncritical adoption—especially when backed or funded by industry—threatens academic freedom, distorts research priorities, risks deskilling students, and contributes to misinformation and environmental harm. The paper urges universities to reassert their values: hold transparent debates, maintain independence from industry influence, preserve consent, and retain human judgement as central to education and research.

Key Points

  • AI adoption in universities is often assumed to be inevitable, but this is a narrative device, not a necessity.
  • Industry funding of AI research risks conflicts of interest and the distortion of knowledge.
  • Uncritical AI use risks deskilling students (critical thinking, writing).
  • Universities adopting AI redefine what counts as knowledge and who defines it.
  • Call for transparency, debate, consent, independence, and retaining human judgment.

Keywords

URL

https://www.ru.nl/en/research/research-news/opposing-the-inevitability-of-ai-at-universities-is-possible-and-necessary

Summary generated by ChatGPT 5


‘It’s a monster’: How generative AI is forcing university professors to rethink learning


In a dimly lit, traditional university lecture hall, a monstrous, multi-limbed, glowing blue digital creature with glowing red eyes looms large behind a professor at a podium. Around tables in the foreground, other professors in academic robes express concern and confusion, some pointing at the creature, while a blackboard in the background reads "RETHINK CURRICULUM" and "HUMAN PROMPT." Image (and typos) generated by Nano Banana.
Described by some as a “monster,” generative AI is fundamentally challenging established educational paradigms. This image dramatically illustrates the immense, even intimidating, presence of AI in academia, compelling university professors to urgently rethink and innovate their approaches to learning and curriculum design. Image (and typos) generated by Nano Banana.

Source

The Irish Times

Summary

Professors in Ireland are rethinking what learning and assessment mean as generative AI becomes widespread. With students using tools like ChatGPT for brainstorming, summarisation, and essay writing, faculty are concerned not just about plagiarism but about diminished reflection, reading, and originality. Responses include replacing take‑home essays with in‑class/open‑book work, designing reflective and relational assignments, and rebuilding community in learning. Faculty warn education is becoming transactional, focused on grades over growth, and AI use may hollow out critical thinking unless institutions redesign pedagogy and policies.

Key Points

  • Widespread AI use by students undermines traditional essays and originality.
  • Professors replace take‑home essays with in‑class/open‑book assessments.
  • Assignments now stress reflection, relational thinking, vulnerability — areas AI struggles with.
  • Students under pressure turn to AI instrumentally, prioritising grades over growth.
  • Institutions face resource challenges in redesigning assessments and policies.

Keywords

URL

https://www.irishtimes.com/ireland/education/2025/09/09/its-a-monster-how-generative-ai-is-forcing-university-professors-to-rethink-learning/

Summary generated by ChatGPT 5