Homework Is Facing an Existential Crisis: Has AI Made It Pointless?


A split image contrasting traditional homework with AI-influenced study. The left side shows a frustrated teenage boy sitting at a cluttered desk under a lamp, struggling with a textbook and papers, with a large red 'X' overlaid on him, signifying the traditional struggle. The background features a messy bulletin board and bookshelves. The right side shows the same boy relaxed in a modern, blue-lit setting, calmly using a tablet. Above his tablet, a friendly, glowing holographic AI tutor figure appears, surrounded by flowing data, equations, and digital interfaces, representing effortless, AI-assisted learning. Image (and typos) generated by Nano Banana.
As AI revolutionizes learning, traditional homework faces an existential crisis. This image dramatically contrasts the classic struggle with assignments against the ease of AI-assisted learning, raising a fundamental question: has artificial intelligence rendered conventional homework pointless, or simply redefined its purpose? Image (and typos) generated by Nano Banana.

Source

Los Angeles Times

Summary

Howard Blume explores how the rise of artificial intelligence is forcing educators to reconsider the value of homework. According to the College Board, 84 per cent of U.S. high school students now use AI for schoolwork, leading some teachers to abandon homework entirely while others redesign tasks to make AI misuse harder. Educators such as Alyssa Bolden in Inglewood now require handwritten essays to limit AI reliance, while others emphasise in-class mastery over at-home repetition. Experts warn that poorly designed homework, amplified by AI, risks undermining learning and widening inequality. Yet research suggests students still benefit from meaningful, creative assignments that foster independence, time management, and deeper understanding. The article concludes that AI hasn’t made homework obsolete—it has exposed the need for better, more purposeful learning design.

Key Points

  • 84 per cent of U.S. high school students use AI for schoolwork, up from 79 per cent earlier in 2025.
  • Teachers are divided: some have scrapped homework, while others are redesigning it to resist AI shortcuts.
  • AI challenges traditional measures of academic effort and authenticity.
  • Experts urge teachers to create engaging, meaningful assignments that deepen understanding.
  • Poorly designed homework can increase stress and widen learning gaps, particularly across socioeconomic lines.
  • The consensus: students don’t need more homework—they need better homework.

Keywords

URL

https://www.latimes.com/california/story/2025-10-25/homework-useless-existential-crisis-ai

Summary generated by ChatGPT 5


Teachers Worry AI Will Impede Students’ Critical Thinking Skills. Many Teens Aren’t So Sure


A split image contrasting teachers' concerns about AI with teenagers' perspectives. On the left, a worried female teacher stands in a traditional classroom, gesturing with open hands towards a laptop on a desk. A glowing red 'X' mark covers the words "CRITICAL THINKING" and gears/data on the laptop screen, symbolizing the perceived threat to cognitive skills. On the right, three engaged teenagers (two boys, one girl) are working collaboratively on laptops in a bright, modern setting. Glowing keywords like "PROBLEM-SOLVING," "INNOVATION," and "CREATIVITY" emanate from their screens, representing AI's perceived benefits. A large question mark is placed in the middle top of the image. Image (and typos) generated by Nano Banana.
A clear divide emerges in the debate over AI’s impact on critical thinking: while many teachers express concern that AI will hinder students’ cognitive development, a significant number of teenagers remain unconvinced, often viewing AI as a tool that can enhance their problem-solving abilities. This image visualises the contrasting viewpoints on this crucial educational challenge. Image (and typos) generated by Nano Banana.

Source

Education Week

Summary

Alyson Klein reports on the growing divide between teachers and students over how artificial intelligence is affecting critical thinking. While educators fear that AI tools like ChatGPT are eroding students’ ability to reason independently, many teens argue that AI can actually enhance their thinking when used responsibly. Teachers cite declining originality and over-reliance on AI-generated answers, expressing concern that students are losing confidence in forming their own arguments. Students, however, describe AI as a useful study companion—helping clarify concepts, model strong writing, and guide brainstorming. Experts suggest that the key issue is not whether AI harms or helps, but how schools teach students to engage with it critically. Educators who integrate AI into lessons rather than banning it outright are finding that students can strengthen, rather than surrender, their analytical skills.

Key Points

  • Teachers fear AI use is diminishing critical thinking and originality in student work.
  • Many students view AI as a learning aid that supports understanding and creativity.
  • The divide reflects differing expectations around what “thinking critically” means.
  • Experts recommend structured AI literacy education over prohibition or punishment.
  • Responsible AI use depends on reflection, questioning, and teacher guidance.

Keywords

URL

https://www.edweek.org/technology/teachers-worry-ai-will-impede-students-critical-thinking-skills-many-teens-arent-so-sure/2025/10

Summary generated by ChatGPT 5


Most Teachers Rethinking How They Set Assignments Due to AI


A diverse group of eight teachers or educators are gathered around a conference table in a modern library or academic setting, engaged in a discussion. Two male teachers stand and point at a large, glowing holographic display above the table, which is split into two sections: "TRADITIONAL ASSIGNMENT DESIGN" and "AI-INTEGRATED PROJECTS." Each section contains pie charts, diagrams, and keywords like "CRITICAL THINKING," "HUMAN-AI COLLABORATION," and "ETHICS," illustrating a shift in pedagogical approaches. A large red bracket and arrow point from the traditional to the AI-integrated section, symbolizing the transition. Other teachers at the table are working on laptops with glowing interfaces. Image (and typos) generated by Nano Banana.
A significant majority of teachers—8 out of 10—are actively re-evaluating their assignment design strategies in response to the rise of AI. This shift reflects a crucial effort to adapt educational methods, ensuring assignments remain relevant, promote critical thinking, and address the capabilities and challenges presented by artificial intelligence. Image (and typos) generated by Nano Banana.

Source

Tes

Summary

A British Council survey of 1,000 UK secondary teachers reveals that 79 per cent have changed how they design assignments because of artificial intelligence. The rapid integration of AI tools into student learning is reshaping assessment practices and communication skills in classrooms. While 59 per cent of teachers are creating assignments that incorporate AI responsibly, 38 per cent are designing tasks to prevent its use entirely. Teachers report declines in writing quality, originality, and vocabulary, as well as shorter attention spans among students. Education leaders, including Amy Lightfoot of the British Council and Sarah Hannafin of the NAHT, call for guidance, training, and proportional expectations to help schools manage AI’s growing influence while maintaining academic integrity and creativity.

Key Points

  • 79 per cent of teachers have altered assignment design due to AI.
  • 59 per cent integrate AI intentionally, while 38 per cent design tasks to exclude it.
  • Teachers report reduced writing quality, narrower vocabulary, and shorter attention spans.
  • 60 per cent worry AI is changing how students communicate and express ideas.
  • Education unions call for clearer national guidance and funded teacher training on AI use.
  • Experts highlight the need to balance innovation with safeguarding originality and ethics.

Keywords

URL

https://www.tes.com/magazine/news/secondary/teachers-rethinking-assignments-artificial-intelligence

Summary generated by ChatGPT 5


My Students Use AI. So What?


A confident female teacher stands at the center of a modern classroom, holding up a tablet that displays a world map, symbolizing global connection. She looks directly at the viewer with a slight smile. Around her, a diverse group of college-aged students are seated at collaborative tables, actively working on laptops that show glowing digital interfaces. Above the entire scene, a large, vibrant word cloud hovers, with prominent words like "CREATIVITY," "INNOVATION," "COLLABORATION," "CRITICAL THINKING," and "PROBLEM-SOLVING," all associated with AI and learning. The words are illuminated with a soft, energetic glow. Image (and typos) generated by Nano Banana.
In a world where AI is ubiquitous, some educators are embracing its presence in the classroom. This image captures the perspective of a teacher who views AI not as a threat, but as an integral tool that can foster creativity, innovation, and critical thinking, challenging traditional views on technology in education. Image (and typos) generated by Nano Banana.

Source

The Atlantic

Summary

John McWhorter, a linguist and professor at Columbia University, argues that fears about artificial intelligence destroying academic integrity are exaggerated. He contends that educators should adapt rather than resist, acknowledging that AI has become part of how students read, write, and think. While traditional essay writing once served as a key training ground for argumentation, AI now performs that function efficiently, prompting teachers to develop more relevant forms of assessment. McWhorter urges educators to replace formulaic essays with classroom discussions, personal reflections, and creative applications that AI cannot replicate. Grammar and stylistic rules, he suggests, should no longer dominate education; instead, AI can handle mechanical precision, freeing students to focus on reasoning and ideas. For McWhorter, the goal is not to preserve outdated academic rituals but to help students learn to think more deeply in a changed world.

Key Points

  • The author challenges alarmist narratives about AI eroding higher education.
  • AI has replaced traditional essay writing as a mechanical exercise but not genuine thought.
  • Teachers should create assessments that require personal insight and classroom engagement.
  • Grammar and stylistic conventions are becoming obsolete as AI handles technical writing.
  • AI allows students to focus on creativity, reasoning, and synthesis rather than busywork.
  • The shift mirrors earlier transitions in media—from print to digital—without diminishing intellect.

Keywords

URL

https://www.theatlantic.com/ideas/archive/2025/10/ai-college-crisis-overblown/684642/

Summary generated by ChatGPT 5


Making Sense of GenAI in Education: From Force Analysis to Pedagogical Copilot Agents

by Jonathan Sansom – Director of Digital Strategy, Hills Road Sixth Form College, Cambridge
Estimated reading time: 5 minutes
A laptop displays a "Microsoft Copilot" interface with two main sections: "Force Analysis: Opportunities vs. Challenges" showing a line graph, and "Pedagogical Copilot Agent" with an icon representing a graduation cap, books, and other educational symbols. In the background, a blurred image of students in a secondary school classroom is visible. Image (and typos) generated by Nano Banana.
Bridging the gap: This image illustrates how Microsoft Copilot can be leveraged in secondary education, moving from a “force analysis” of opportunities and challenges to the implementation of “pedagogical copilot agents” that assist both students and educators. Image (and typos) generated by Nano Banana.

At Hills Road, we’ve been living in the strange middle ground of generative AI adoption. If you charted its trajectory, it wouldn’t look like a neat curve or even the familiar ‘hype cycle’. It’s more like a tangled ball of wool: multiple forces pulling in competing directions.

The Forces at Play

Our recent work with Copilot Agents has made this more obvious. If we attempt a force analysis, the drivers for GenAI adoption are strong:

  • The need to equip students and staff with future-ready skills.
  • Policy and regulatory expectations, from DfE and Ofsted, to show assurance around AI integration.
  • National AI strategies that frame this as an essential area for investment.
  • The promise of personalised learning and workload reduction.
  • A pervasive cultural hype, blending existential narratives with a relentless ‘AI sales’ culture.

But there are also significant restraints:

  • Ongoing academic integrity concerns.
  • GDPR and data privacy ambiguity.
  • Patchy CPD and teacher digital confidence.
  • Digital equity and access challenges.
  • The energy cost of AI at scale.
  • Polarisation of educator opinion, and staff change fatigue.

The result is persistent dissonance. AI is neither fully embraced nor rejected; instead, we are all negotiating what it might mean in our own settings.

Educator-Led AI Design

One way we’ve tried to respond is through educator-led design. Our philosophy is simple: we shouldn’t just adopt GenAI; we must adapt it to fit our educational context.

That thinking first surfaced in experiments on Poe.com, where we created an Extended Project Qualification (EPQ) Virtual Mentor. It was popular, but it lived outside institutional control – not enterprise and not GDPR-secure.

So in 2025 we have moved everything in-house. Using Microsoft Copilot Studio, we created 36 curriculum-specific agents, one for each A Level subject, deployed directly inside Teams. These agents are connected to our SharePoint course resources, ensuring students and staff interact with AI in a trusted, institutionally managed environment.

Built-in Pedagogical Skills

Rather than thinking of these agents as simply ‘question answering machines’, we’ve tried to embed pedagogical skills that mirror what good teaching looks like. Each agent is structured around:

  • Explaining through metaphor and analogy – helping students access complex ideas in simple, relatable ways.
  • Prompting reflection – asking students to think aloud, reconsider, or connect their ideas.
  • Stretching higher-order thinking – moving beyond recall into analysis, synthesis, and evaluation.
  • Encouraging subject language use – reinforcing terminology in context.
  • Providing scaffolded progression – introducing concepts step by step, only deepening complexity as students respond.
  • Supporting responsible AI use – modelling ethical engagement and critical AI literacy.

These skills give the agents an educational texture. For example, if a sociology student asks: “What does patriarchy mean, but in normal terms?”, the agent won’t produce a dense definition. It will begin with a metaphor from everyday life, check understanding through a follow-up question, and then carefully layer in disciplinary concepts. The process is dialogic and recursive, echoing the scaffolding teachers already use in classrooms.

The Case for Copilot

We’re well aware that Microsoft Copilot Studio wasn’t designed as a pedagogical platform. It comes from the world of Power Automate, not the classroom. In many ways we’re “hijacking” it for our purposes. But it works.

The technical model is efficient: one Copilot Studio authoring licence, no full Copilot licences required, and all interactions handled through Teams chat. Data stays in tenancy, governed by our 365 permissions. It’s simple, secure, and scalable.

And crucially, it has allowed us to position AI as a learning partner, not a replacement for teaching. Our mantra remains: pedagogy first, technology second.

Lessons Learned So Far

From our pilots, a few lessons stand out:

  • Moving to an in-tenancy model was essential for trust.
  • Pedagogy must remain the driver – we want meaningful learning conversations, not shortcuts to answers.
  • Expectations must be realistic. Copilot Studio has clear limitations, especially in STEM contexts where dialogue is weaker.
  • AI integration is as much about culture, training, and mindset as it is about the underlying technology.

Looking Ahead

As we head into 2025–26, we’re expanding staff training, refining agent ‘skills’, and building metrics to assess impact. We know this is a long-haul project – five years at least – but it feels like the right direction.

The GenAI systems that students and teachers are often using in college were in the main designed mainly by engineers, developers, and commercial actors. What’s missing is the educator’s voice. Our work is about inserting that voice: shaping AI not just as a tool for efficiency, but as an ally for reflection, questioning, and deeper thinking.

The challenge is to keep students out of what I’ve called the ‘Cognitive Valley’, that place where understanding is lost because thinking has been short-circuited. Good pedagogical AI can help us avoid that.

We’re not there yet. Some results are excellent, others uneven. But the work is underway, and the potential is undeniable. The task now is to make GenAI fit our context, not the other way around.

Jonathan Sansom

Director of Digital Strategy,
Hills Road Sixth Form College, Cambridge

Passionate about education, digital strategy in education, social and political perspectives on the purpose of learning, cultural change, wellbeing, group dynamics, – and the mysteries of creativity…


Software / Services Used

Microsoft Copilot Studiohttps://www.microsoft.com/en-us/microsoft-365-copilot/microsoft-copilot-studio

Keywords