AI Literacy Is Just Digital and Media Literacy in Disguise


In a modern library setting, a diverse group of four students and a female professor are gathered around a glowing, interactive table displaying "AI LITERACY: DIGITAL & MEDIA LITERACY." Overhead, a holographic overlay connects "DIGITAL LITERACY," "MEDIA LITERACY" (with news icons), and "AI LITERACY SKILLS" (with brain and circuit icons), illustrating the interconnectedness of these competencies. Image (and typos) generated by Nano Banana.
This image visually argues that AI literacy is not an entirely new concept but rather an evolution or “disguise” of existing digital and media literacy skills. It highlights the interconnectedness of understanding digital tools, critically evaluating information, and navigating algorithmic influences, suggesting that foundational literacies provide a strong basis for comprehending and engaging with artificial intelligence effectively.

Source

Psychology Today

Summary

Diana E. Graber argues that “AI literacy” is not a new concept but a continuation of long-standing digital and media literacy principles. Triggered by the April 2025 executive order Advancing Artificial Intelligence Education for American Youth, the sudden focus on AI education highlights skills schools should have been teaching all along—critical thinking, ethical awareness, and responsible participation online. Graber outlines seven core areas where digital and media literacy underpin AI understanding, including misinformation, digital citizenship, privacy, and visual literacy. She warns that without these foundations, students face growing risks such as deepfake abuse, data exploitation, and online manipulation.

Key Points

  • AI literacy builds directly on digital and media literacy foundations.
  • An executive order has made AI education a US national priority.
  • Core literacies—critical thinking, ethics, and responsibility—are vital for safe AI use.
  • Key topics include misinformation, cyberbullying, privacy, and online safety.
  • The article urges sustained digital education rather than reactionary AI hype.

Keywords

URL

https://www.psychologytoday.com/us/blog/raising-humans-in-a-digital-world/202510/ai-literacy-is-just-digital-and-media-literacy-in

Summary generated by ChatGPT 5


Universities can turn AI from a threat to an opportunity by teaching critical thinking


In a grand, tiered university lecture hall, a male professor stands at a podium addressing an audience of students, all working on laptops. Above them, a large holographic display illustrates a transformation: on the left, "AI: THE THREAT" is shown with icons for plagiarism and simplified thinking. In the middle, "CRITICAL THINKING: THE BRIDGE" connects to the right panel, "AI: OPPORTUNITY," which features icons for problem-solving and ethical AI use. Image (and typos) generated by Nano Banana.
Universities have the potential to transform AI from a perceived threat into a powerful educational opportunity, primarily by emphasising and teaching critical thinking skills. This image visually represents critical thinking as the crucial bridge that allows students to navigate the challenges of AI, such as potential plagiarism and shallow learning, and instead harness its power for advanced problem-solving and ethical innovation.

Source

The Conversation

Summary

Anitia Lubbe argues that universities should stop treating AI primarily as a threat and instead use it to develop critical thinking. Her research team reviewed recent studies on AI in higher education, finding that generative tools excel at low-level tasks (recall and comprehension) but fail at high-level ones like evaluation and creativity. Traditional assessments, still focused on memorisation, risk encouraging shallow learning. Lubbe proposes redesigning assessments for higher-order skills—asking students to critique, adapt, and evaluate AI outputs. This repositions AI as a learning partner and shifts higher education toward producing self-directed, reflective, and analytical graduates.

Key Points

  • AI performs well on remembering and understanding tasks but struggles with evaluation and creation.
  • Current university assessments often reward the same low-level thinking AI already automates.
  • Teachers should design context-rich, authentic assessments (e.g. debates, portfolios, local case studies).
  • Students can use AI to practise analysis by critiquing or improving generated content.
  • Developing AI literacy, assessment literacy, and self-directed learning skills is key to ethical integration.

Keywords

URL

https://theconversation.com/universities-can-turn-ai-from-a-threat-to-an-opportunity-by-teaching-critical-thinking-266187

Summary generated by ChatGPT 5


AI technology ‘replacing critical thinking in university lectures’ – student


In a grand, Gothic-style university lecture hall, rows of students are seated, all intently focused on glowing laptops. At the front, a large, cold blue holographic display titled "AI LECTURE AUTOMATION SYSTEM" prominently states: "Critical Thinking: Replaced. Information: Delivered." A small whiteboard in the background sarcastically asks, "AI IS 'HERE,' WHERE'S THE PROF?!" Image (and typos) generated by Nano Banana.
According to a student’s observation, AI technology is alarmingly “replacing critical thinking in university lectures,” transforming the learning environment into one focused solely on information delivery. This dystopian image visualises a future where traditional human instruction is minimised and AI automates the lecture process, raising serious questions about the impact on students’ cognitive development and the very essence of higher education.

Source

Waikato Times

Summary

A University of Waikato student has voiced concern that widespread use of AI in lectures is eroding students’ ability to think critically. Speaking anonymously, the fourth-year student said many peers now use ChatGPT to generate lecture notes, discussion questions, and ideas—essentially outsourcing thinking itself. While she acknowledged that AI has benefits when used judiciously, she worries it encourages intellectual passivity and dependence. The student warned that such habits could eventually harm employability, as employers increasingly seek graduates with strong analytical and critical-thinking skills.

Key Points

  • Students are using ChatGPT to generate lecture notes and workshop discussion prompts.
  • The student fears this practice undermines the purpose of higher education—to cultivate independent thinking.
  • She admits AI has value when used responsibly but sees overreliance as damaging to learning.
  • The trend risks producing graduates who lack the analytical abilities employers prize most.
  • The concern reflects wider tensions in universities over balancing AI’s benefits and harms.

Keywords

URL

https://www.waikatotimes.co.nz/nz-news/360843243/ai-technology-replacing-critical-thinking-university-lectures-student

Summary generated by ChatGPT 5


How AI Is Rewriting the Future of Humanities Education


In a grand, ornate university library, a group of diverse students and professors are seated around a long, candlelit wooden table, engaged in a discussion. Above them, a large holographic display titled "THE FUTURE OF HUMANITIES EDUCATION" shows a central figure holding "THE HUMANITIES MANIFESTO" with a quill, flanked by "PAST" and "FUTURE" panels detailing AI-powered interpretation, digital ethics, and evolving roles of AI in humanities. Image (and typos) generated by Nano Banana.
Artificial intelligence is not merely influencing but actively “rewriting” the trajectory of humanities education, prompting a re-evaluation of its foundational principles and methodologies. This image captures a moment of deep academic reflection, visualising how AI is introducing new tools for interpretation, posing ethical challenges, and ultimately shaping a dynamic new future for the study of human culture and thought.

Source

Forbes

Summary

Mary Hemphill argues that while AI is rapidly changing technical and STEM fields, its impact on the humanities may be even more profound. She sees AI not just as a tool but as a collaborator—helping students explore new interpretations, generate creative prompts, and push boundaries in writing, philosophy, or cultural critique. But this is double-edged: overreliance risks hollowing out the labour of thinking deeply, undermining the craft faculty value. Hemphill suggests humanities courses must adapt via “AI-native” pedagogy: teaching prompt literacy, interrogative reading, and critical layering. The goal is to use AI to elevate human thinking, not replace it.

Key Points

  • Humanities may shift from sourcing facts to exploring deeper interpretation, guided by AI-assisted inquiry.
  • Students should be taught prompt literacy—how to interrogate AI outputs, not accept them.
  • “AI-native” pedagogy adapts assignments to expect AI use, layering in critical human engagement.
  • Overreliance on AI can weaken students’ capacity for independent thinking and textual craftsmanship.
  • Humanities faculty must lead design of AI integration that preserves the values of the discipline.

Keywords

URL

https://www.forbes.com/sites/maryhemphill/2025/10/01/how-ai-is-rewriting-the-future-of-humanities-education/

Summary generated by ChatGPT 5


AI Is Robbing Students of Critical Thinking, Professor Says


In a grand, traditional university library, a menacing, cloaked digital entity with glowing red eyes representing AI looms over a group of students seated at a long table, all intensely focused on laptops with glowing blue faces. Thought bubbles emanate from the AI, offering to "GENERATE ESSAY," "SUMMARIZE," and "GIVE ANSWER." In the background, a visibly frustrated professor gestures emphatically, observing the scene. Image (and typos) generated by Nano Banana.
A prominent professor warns that the widespread use of AI is actively depriving students of opportunities to develop critical thinking skills. This image dramatically visualises AI as a looming, pervasive force in the academic lives of students, offering quick solutions that may bypass the deeper cognitive processes essential for genuine intellectual growth and independent thought.

Source

Business Insider

Summary

Kimberley Hardcastle, assistant professor of business and marketing at Northumbria University, warns that generative AI is not just facilitating plagiarism—it’s encouraging students to outsource their thinking. According to Anthropic data, about 39% of student-AI interactions involved creating or polishing academic texts, and another 33% requested direct solutions. Hardcastle argues this is shifting the locus of intellectual authority toward Big Tech, making it harder for students to engage with ambiguity, weigh evidence, or claim ownership of ideas. She urges institutions to focus less on policing misuse and more on pedagogies that preserve critical thinking and epistemic agency.

Key Points

  • 39.3% of student-AI chats were about composing or revising assignments; 33.5% requested direct solutions.
  • AI output is often accepted uncritically because it presents polished, authoritative language.
  • The danger: students come to trust AI explanations over their own reasoned judgement.
  • Hardcastle views this as part of a larger shift: tech companies increasingly influence how “knowledge” is framed and delivered.
  • She suggests the response should emphasise pedagogy: design modes of teaching that foreground critical thinking over output policing.

Keywords

URL

https://www.businessinsider.com/ai-chatgpt-robbing-students-of-critical-thinking-professor-says-2025-9

Summary generated by ChatGPT 5