O’Sullivan, James, Colin Lowry, Ross Woods & Tim Conlon. Generative AI in Higher Education Teaching & Learning: Policy Framework. Higher Education Authority, 2025. DOI: 10.82110/073e-hg66.
Summary
This policy framework provides a national, values-based approach to guiding the adoption of generative artificial intelligence (GenAI) in teaching and learning across Irish higher education institutions. Rather than prescribing uniform rules, it establishes a shared set of principles to support informed, ethical, and pedagogically sound decision-making. The framework recognises GenAI as a structural change to higher education—particularly to learning design, assessment, and academic integrity—requiring coordinated institutional and sector-level responses rather than ad hoc or individual initiatives.
Focused explicitly on teaching and learning, the framework foregrounds five core principles: academic integrity and transparency; equity and inclusion; critical engagement, human oversight, and AI literacy; privacy and data governance; and sustainable pedagogy. It emphasises that GenAI should neither be uncritically embraced nor categorically prohibited. Instead, institutions are encouraged to adopt proportionate, evidence-informed approaches that preserve human judgement, ensure fairness, protect student data, and align AI use with the public mission of higher education. The document also outlines how these principles can be operationalised through governance, assessment redesign, staff development, and continuous sector learning.
Key Points
The framework offers a shared national reference point rather than prescriptive rules.
GenAI is treated as a systemic pedagogical challenge, not a temporary disruption.
Academic integrity depends on transparency, accountability, and visible authorship.
Equity and inclusion must be designed into AI adoption from the outset.
Human oversight and critical engagement remain central to learning and assessment.
AI literacy is positioned as a core capability for staff and students.
Privacy, data protection, and institutional data sovereignty are essential.
Assessment practices must evolve beyond reliance on traditional written outputs.
Sustainability includes both environmental impact and long-term educational quality.
Ongoing monitoring and sector-wide learning are critical to responsible adoption.
Conclusion
The HEA Policy Framework positions generative AI as neither a threat to be resisted nor a solution to be uncritically adopted. By grounding AI integration in shared academic values, ethical governance, and pedagogical purpose, it provides Irish higher education with a coherent foundation for navigating AI-enabled change while safeguarding trust, equity, and educational integrity.
Moving from debate to action: Implementing a cross-departmental strategy to shape the future of GenAI in higher education. Image (and typos) generated by Nano Banana.
Debate about GenAI continues to swing between those pushing rapid adoption and those advocating caution, for example, panic about “AI taking over the classroom” and outrage at Big Tech’s labour practices. Both concerns are important, but are these and others causing inaction? In many cases, we are quietly watching students hand their data and their critical thinking over to the very Big Tech companies we are arguing against (while we still fly on holidays, stream on smart TVs and buy the same devices from the same companies). Pretending that GenAI in education is the one place we finally draw an ethical line, while doing nothing to make its use safer or more equitable, is not helpful. By all means, keep debating, but not at the cost of another three or four cohorts.
This opinion post suggests three things: what universities need to address now, a minimal set of GenAI functions that should be available to staff and students, and a four-step teaching process to help lecturers rethink their role with GenAI.
Three things universities need to address now
1. Tell students and staff clearly what they can use (Déjà vu?)
Students and staff deserve clarity on which GenAI tools they have access to, what they can use them for and which ones are institutionally supported. Has your university provided this? No more grey areas or “ask your lecturer”. If people do not know this, it pushes GenAI use into secrecy. That secrecy hands more power to Big Tech to extract data and embed bias, while quietly eroding students’ and staff’s critical thinking.
2. Untangle GenAI from “academic integrity”
Tightly linking GenAI to academic integrity was a mistake! It has created an endless debate about whether to permit or prohibit GenAI, which pushes use further underground. At this point, there is no real equity and no real academic integrity. Use of GenAI cannot simply be stopped, and it cannot reliably be proved or disproved, so pretending otherwise, while holding endless anti-AI discussions, will not lead to a solution. There is no putting GenAI back in the bottle!
3. Treat GenAI as a shared responsibility
GenAI affects curriculum design, assessment, student support, digital literacy, employability, libraries, disability support, IT, policy and everywhere in between. It cannot sit on the shoulders of one department or lead. Every university needs a cross-departmental AI strategy that includes the student union, academic leads, IT, the data protection office, careers, student support, administration and teaching and learning personnel. Until leadership treats GenAI as systemic, lecturers will keep firefighting contradictions and marking assignments they know were AI-generated. Bring everyone to the table, and do not adjourn until decisions have been made that give students and staff clarity (even if that clarity is dynamic in nature, do not leave them navigating this alone for another three years).
What GenAI functions should be provided
At a minimum, institutions should give safe, equitable access to:
A campus-licensed GenAI model: one model for all staff and students to ask questions, draft, summarise, explain and translate text, including support for multilingual learners.
Multimodal creation tools: tools to create images, audio, video (including avatars), diagrams, code, etc., with clear ethical and legal guidance.
Research support tools: tools to support research tasks such as transcribing, coding, summarising, theme mapping and citations, in ways that reinforce critical exploration.
Assessment and teaching design tools: tools to draft examples, case variations, rubrics, flashcards, questions, etc., with outputs stored inside institutional systems.
Custom agents: staff create and share custom AI agents configured for specific purposes, such as subject-specific scaffolding for students or workflow agents for planning, resource creation and content adaptation, with interactions kept within institutional systems.
Accessibility-focused GenAI: tools that deliver captions, plain-language rewrites, alt text and personalised study materials. Many institutions already have these in place.
These are safer GenAI tools for exploration, collaboration and reflection. But what do staff and students do with them? This is where something like Gen.S.A.R. comes in: a potential approach where staff and students explore together with GenAI, and one that is adaptable to different contexts and disciplines.
Gen.S.A.R.
Gen.S.A.R. is simply a suggested starting point; there is no magic wand, but this may help to ignite practical ideas from others. It suggests a shift from passive content delivery to constructivist and experiential learning.
GenAI exploration and collaborative knowledge construction
Scrutinise and share
Apply in real-world contexts with a low or no-tech approach
Reflect and evaluate
It keeps critical thinking, collaboration and real-world application at the centre, with GenAI as a set of tools rather than a replacement for learning. Note: a set of tools, not a human!
Phase 1: GenAI, constructing, not copy-pasting
Students use GenAI, the lecturer, and reputable sources to explore a concept or problem linked to the learning outcomes. Lecturers guide this exploration as students work individually or in groups. With ongoing lecturer input, students may choose whether to use GenAI or other sources, but all develop an understanding of GenAI’s role in learning.
Phase 2: Scrutinise and Share
The second phase focuses on scrutinising and sharing ideas with others, not just presenting them as finished facts. Students bring GenAI outputs, reputable sources and their own thinking into dialogue. They interrogate evidence, assumptions and perspectives in groups or class discussion (social constructivism, dialogic teaching). The lecturer, as the content expert, oversees this process, identifies errors, draws attention to them and helps students clarify GenAI outputs.
Phase 3: Apply, low-tech, real-world
Screens step back. Students apply what they have discovered in low or no-tech ways: diagrams, mind maps, zines, prototypes, role plays, scenarios. They connect what they discovered to real contexts and show understanding through doing, making, explaining and practical application.
Phase 4: Reflect, evaluate and look forward
Students then evaluate and reflect on both their learning process and the role of GenAI. Using written, audio, video or visual reflections, they consider what they learned, how GenAI supported or distorted that learning and how this connects to their future. This reflective work, combined with artefacts from earlier phases, supports peer, self and lecturer assessment and moves us towards competency and readiness-based judgements.
Resourcing Gen.S.A.R.
Yes, smaller class sizes and support would be required, but aspects of this can be implemented now (and are being implemented by some already). Time shifts to facilitation, co-learning, process feedback and authentic evaluation (fewer three-thousand-word essays). This approach is not perfect, but it is an approach, and one that draws on long-standing learning theories, including constructivism, social constructivism, experiential learning, and traditions in inclusive and competency-based education.
There’s No Stopping It, Time to Shape It
GenAI is not going away. Exploitative labour practices, data abuse and profit motives are real (and not exclusive to AI), and naming these harms is essential, but letting these debates stall any movement is not helpful. Universities can choose to lead (and I commend, not condemn, those who already are) with clear guidance, equitable access to safe GenAI tools and thoughtful learning design. The alternative is all the risks that come with students and staff relying on personal accounts and workarounds.
For the integrity of education itself, it is time to translate debates into action. The genie is not going back in the bottle, and our profit-driven society is not only shaped by Big Tech but also by the everyday choices of those of us living privileged lives in westernised societies. It is time to be honest about our own complicity, to step out of the ivory tower and work with higher education students to navigate the impact GenAI is having on their lives right now.
Note: My views on GenAI for younger learners are very different; the suggestions here focus specifically on higher education.
Frances O’Donnell
Instructional Designer, ATU
Exploring the pros and cons of AI & GenAI in education, and indeed in society. Currently completing a Doctorate in Education with a focus on AI & Emerging Technologies.
Passionate about the potential education has to develop one’s self-confidence and self-worth, but frustrated by the fact that it often does the opposite. AI has magnified our tendency to over-assess and our inability to truly move away from rote learning.
Whether I am working as an instructional designer, delivering workshops or researching, I think we should work together to make education a catalyst for change, where learners are empowered to become confident as well as socially and environmentally conscious members of society. With or without AI, let’s change the perception of what success looks like for young people.