
Source
The Conversation
Summary
This article argues that generative AI is shifting the locus of knowledge production from academic institutions into the hands of Big Tech platforms. As students and educators increasingly rely on AI tools, the power to define what counts as knowledge (what is true, what is cited, what is authoritative) is ceded to private firms. That shift risks marginalising critical thinking, local curricula, and disciplinary expertise. The author calls for reclaiming epistemic authority: universities must define their own ways of knowing, educate students not just in content but in evaluative judgement, and negotiate more equitable relationships with AI platforms so that academic integrity and autonomy are not compromised.
Key Points
- Generative AI tools increasingly mediate how knowledge is accessed, curated, and presented, making Big Tech a gatekeeper.
- Reliance on AI may weaken disciplinary expertise and the role of scholars as knowledge producers.
- Students may accept AI outputs uncritically, transferring their trust from faculty to algorithmic systems.
- To respond, higher education must build students' epistemic literacy (understanding how we know what we know) and insist that AI remain assistive, not authoritative.
- Universities should set policies, technical frameworks, and partnerships that protect research norms, attribution, and diverse knowledge systems.
Keywords
URL
Summary generated by ChatGPT 5