Universities: GenAI – There’s No Stopping, Start Shaping!

By Frances O’Donnell, Instructional Designer, ATU
Estimated reading time: 8 minutes
Moving from debate to action: Implementing a cross-departmental strategy to shape the future of GenAI in higher education. Image (and typos) generated by Nano Banana.

Debate about GenAI continues to swing between those pushing rapid adoption and those urging caution, citing, for example, panic about “AI taking over the classroom” and outrage at Big Tech’s labour practices. Both concerns matter, but are these and other concerns causing inaction? In many cases, we are quietly watching students hand their data and their critical thinking over to the very Big Tech companies we are arguing against (while we still fly on holidays, stream on smart TVs and buy the same devices from the same companies). Pretending that GenAI in education is the one place we finally draw an ethical line, while doing nothing to make its use safer or more equitable, is not helpful. By all means, keep debating, but not at the cost of another three or four cohorts.

This opinion post offers three things: a set of issues universities should address now, a minimal set of GenAI functions that should be available to staff and students, and a four-step teaching process to help lecturers rethink their role with GenAI.

Three things universities need to address now

1. Tell students and staff clearly what they can use (Déjà vu?)

Students and staff deserve clarity on which GenAI tools they have access to, what they can use them for and which ones are institutionally supported. Has your university provided this? No more grey areas or “ask your lecturer”. If people do not know what is permitted, GenAI use is pushed into secrecy. That secrecy hands more power to Big Tech to extract their data and embed bias, while quietly eroding their cognitive abilities.

2. Untangle GenAI from “academic integrity”

Tightly linking GenAI to academic integrity was a mistake! It has created an endless debate about whether to permit or prohibit GenAI, which pushes use further underground. At this point, there is no real equity and no real academic integrity. Use of GenAI cannot simply be stopped, proved or disproved, so pretending otherwise, while holding endless anti‑AI discussions, will not lead to a solution. There is no putting GenAI back in the bottle!

3. Treat GenAI as a shared responsibility

GenAI affects curriculum design, assessment, student support, digital literacy, employability, libraries, disability support, IT, policy and everywhere in between. It cannot sit on the shoulders of one department or lead. Every university needs a cross‑departmental AI strategy that includes the student union, academic leads, IT, the data protection office, careers, student support, administration and teaching and learning personnel. Until leadership treats GenAI as systemic, lecturers will keep firefighting contradictions and marking assignments they know were AI-generated. Bring everyone to the table, and don’t adjourn until decisions have been made that give students and staff clarity (even if that clarity is dynamic in nature – do not continue to leave them navigating this alone for another three years).

What GenAI functions should be provided

At a minimum, institutions should give safe, equitable access to:

  • A campus-licensed GenAI model
    One model for all staff and students to ask questions, draft, summarise, explain and translate text, including support for multilingual learners.
  • Multimodal creation tools
    Tools to create images, audio, video (including avatars), diagrams, code, etc., with clear ethical and legal guidance.
  • Research support tools
    Tools to support research, transcribing, coding, summaries, theme mapping, citations, etc., that reinforce critical exploration.
  • Assessment and teaching design tools
    Tools to draft examples, case variations, rubrics, flashcards, questions, etc., with outputs stored inside institutional systems.
  • Custom agents
    Staff create and share custom AI agents configured for specific purposes: subject-specific scaffolding for students, or workflow agents for planning, resource creation and content adaptation. Keep interactions within institutional systems (a minimal sketch of what such an agent could look like follows this list).
  • Accessibility-focused GenAI
    Tools that deliver captions, plain language rewrites, alt text and personalised study materials. Many institutions already have these in place.
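
To make the “custom agents” item above concrete, here is a minimal sketch, assuming Python with the OpenAI SDK purely as a stand-in for whatever campus-licensed model an institution actually runs. The endpoint URL, API key handling, model name (“campus-gpt”) and the scaffolding prompt are all illustrative assumptions, not a recommendation of any vendor or a description of an existing institutional system.

```python
# Minimal sketch of a shareable "custom agent": a campus-hosted model plus a
# vetted, subject-specific system prompt. All names below are hypothetical.
from openai import OpenAI

# Point the client at an institutionally hosted endpoint (hypothetical URL),
# so that staff and student interactions stay inside institutional systems.
client = OpenAI(
    base_url="https://ai.example-university.ie/v1",  # placeholder endpoint
    api_key="CAMPUS_API_KEY",                        # issued by the institution
)

# The "agent" is little more than a system prompt that staff can version,
# share and adapt per module or discipline.
STATS_SCAFFOLD = (
    "You are a study scaffold for an introductory statistics module. "
    "Guide the student with questions and worked structure; do not hand "
    "over final answers to assessed problems."
)

def ask_agent(student_query: str) -> str:
    """Send a student query through the shared scaffolding agent."""
    response = client.chat.completions.create(
        model="campus-gpt",  # placeholder for the institution's licensed model
        messages=[
            {"role": "system", "content": STATS_SCAFFOLD},
            {"role": "user", "content": student_query},
        ],
    )
    return response.choices[0].message.content

print(ask_agent("How do I decide between a t-test and a Mann-Whitney U test?"))
```

The point of the sketch is that a custom agent need not be heavyweight: a shared, reviewable system prompt behind an institutional endpoint already gives staff something they can audit, adapt and hand to students safely.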

These safer GenAI tools support exploration, collaboration and reflection. But what do staff and students do with them? This is where something like Gen.S.A.R. comes in: a potential approach in which staff and students explore together with GenAI, and one that is adaptable to different contexts and disciplines.

Gen.S.A.R.

Gen.S.A.R. is simply a suggested starting point; there is no magic wand, but this may help to ignite practical ideas from others. It suggests a shift from passive content delivery to constructivist and experiential learning.

  • GenAI exploration and collaborative knowledge construction
  • Scrutinise and share
  • Apply in real-world contexts with a low or no-tech approach
  • Reflect and evaluate

It keeps critical thinking, collaboration and real-world application at the centre, with GenAI as a set of tools rather than a replacement for learning. Note: GenAI is a set of tools, not a human!

Phase 1: GenAI, constructing, not copy-pasting

Students use GenAI, the lecturer, and reputable sources to explore a concept or problem linked to the learning outcomes. Lecturers guide this exploration as students work individually or in groups. With ongoing lecturer input, students may choose whether to use GenAI or other sources, but all develop an understanding of GenAI’s role in learning.

Phase 2: Scrutinise and Share

The second phase focuses on scrutinising and sharing ideas with others, not just presenting them as finished facts. Students bring GenAI outputs, reputable sources and their own thinking into dialogue. They interrogate evidence, assumptions and perspectives in groups or class discussion (social constructivism, dialogic teaching). The lecturer – the content expert – oversees this process, identifies errors, draws attention to them and helps students clarify GenAI outputs.

Phase 3: Apply, low-tech, real-world

Screens step back. Students apply what they have discovered in low or no-tech ways: diagrams, mind maps, zines, prototypes, role plays, scenarios. They connect what they discovered to real contexts and show understanding through doing, making, explaining and practical application.

Phase 4: Reflect, evaluate and look forward

Students then evaluate and reflect on both their learning process and the role of GenAI. Using written, audio, video or visual reflections, they consider what they learned, how GenAI supported or distorted that learning and how this connects to their future. This reflective work, combined with artefacts from earlier phases, supports peer, self and lecturer assessment and moves us towards competency and readiness-based judgements.

Resourcing Gen.S.A.R.: yes, smaller class sizes and support would be required, but aspects of this can be implemented now (and are being implemented by some already). Time shifts to facilitation, co-learning, process feedback and authentic evaluation (fewer three-thousand-word essays). This approach is not perfect, but at least it is an approach, and one that draws on long‑standing learning theories, including constructivism, social constructivism, experiential learning, and traditions in inclusive and competency‑based education.

There’s No Stopping It, Time to Shape It

GenAI is not going away. Exploitative labour practices, data abuse and profit motives are real (and not exclusive to AI), and naming these harms is essential, but letting these debates stall all movement is not helpful. Universities can choose to lead (and I commend, not condemn, those who already are) with clear guidance, equitable access to safe GenAI tools and thoughtful learning design. The alternative is to accept all the risks that come with students and staff relying on personal accounts and workarounds.

For the integrity of education itself, it is time to translate debates into action. The genie is not going back in the bottle, and our profit-driven society is not only shaped by Big Tech but also by the everyday choices of those of us living privileged lives in westernised societies. It is time to be honest about our own complicity, to step out of the ivory tower and work with higher education students to navigate the impact GenAI is having on their lives right now.

Note: My views on GenAI for younger learners are very different; the suggestions here focus specifically on higher education.

Frances O’Donnell

Instructional Designer
ATU

Exploring the pros and cons of AI & GenAI in education, and indeed in society. Currently completing a Doctorate in Education with a focus on AI & Emerging Technologies.

Passionate about the potential education has to develop one’s self-confidence and self-worth, but frustrated by the fact it often does the opposite. AI has magnified our tendency to overassess and our inability to truly move away from rote learning.

Whether I’m working as an instructional designer, delivering workshops or researching, I think we should work together to make education a catalyst of change, where learners are empowered to become confident as well as socially and environmentally conscious members of society. With or without AI, let’s change the perception of what success looks like for young people.

UWA and Oxford Partner for Generative AI in Higher Education


Global collaboration in the age of AI: UWA and Oxford University join forces to pioneer the integration and study of generative artificial intelligence within the landscape of higher education. Image (and typos) generated by Nano Banana.

Source

University of Western Australia

Summary

The University of Western Australia and the University of Oxford announced a formal partnership that positions generative AI as a strategic driver in the future of higher education. The collaboration focuses on advancing responsible AI research, developing governance models and integrating generative AI into teaching and learning in ways that uphold academic integrity and inclusivity. Both institutions highlight that the rapid acceleration of AI requires coordinated international responses that balance innovation with ethical safeguards. The partnership will explore curriculum transformation, staff development and AI-informed pedagogical frameworks intended to support both student learning and broader institutional capability building. By aligning two globally significant universities, the initiative signals a trend toward cross-border cooperation designed to shape sector-wide AI standards. It also indicates growing recognition that AI adoption in higher education must be underpinned by shared values, transparent methodologies and research-based evidence. This collaboration aims to become a blueprint for how universities can jointly shape the future of AI-enabled education while ensuring that human expertise remains central.

Key Points

  • Major partnership between UWA and Oxford to advance responsible AI
  • Focus on governance, research and curriculum innovation
  • Reflects global shift toward collaboration on AI strategy
  • Emphasises ethical frameworks for AI adoption in higher education
  • Positions AI as core to long-term institutional development

URL

https://www.uwa.edu.au/news/article/2025/november/uwa-and-oxford-partner-for-generative-ai-in-higher-ed

Summary generated by ChatGPT 5.1


AI Adoption & Education for SMEs

by Patrick Shields – AI PhD Researcher, Munster Technological University
Estimated reading time: 5 minutes
Small and Medium-sized Enterprises (SMEs) are increasingly looking to leverage AI, but successful adoption requires proper education and strategic integration. This image represents the crucial need for training and understanding to empower SMEs to harness AI for business growth and innovation. Image (and typos) generated by Nano Banana.

Aligning National AI Goals With Local Business Realities

As third-level institutions launch AI courses across multiple disciplines this semester, there is a unique opportunity to support an essential business cohort in this country: small and medium-sized enterprises (SMEs). In Ireland, SMEs account for over 99% of all businesses according to the Central Statistics Office. They also happen to be struggling with AI adoption in comparison to their multinational counterparts.

Recent research has outlined how SMEs are adopting AI in a piecemeal and fragmented fashion, with just 10% possessing any AI strategy at all. Not having a strategy may indicate an absence of policy, and therein lies a significant communication issue at the heart of the AI adoption challenge. Further insights describe how four out of five business leaders believe AI is being used within their companies with little to no guardrails. This presents a significant challenge to Ireland’s National AI Strategy, which was originally published in 2021 but has since been updated to include several initiatives, such as the objective of establishing an AI awareness campaign for SMEs. The Government recognises that to achieve the original goal of 75% of all businesses embracing AI by 2030, and all the investment that this will encourage, it will be essential to address the gap between SMEs and their multinational counterparts. Perhaps these endeavours can be supported at third-level, especially given the percentage of businesses that fall into the SME bracket and the demand for upskilling.

Turning AI Potential Into Practical Know-How

Having spent the summer months of 2025 meeting businesses as part of a Chamber of Commerce AI mentoring initiative in the South East of Ireland, I believe there is a significant education gap here that third-level institutions could help to meet. It became clear that the business representatives I spoke to had serious questions about how to properly commence and embrace their AI journeys. For them, it wasn’t about the technical element, because many of their existing programs and applications were adding AI features and introducing new AI-enabled tools which they could easily access. The prominent issue was managing the process of deploying the technology in a way that matched employee buy-in with integrated, sustained and appropriate usage for maximum benefit. They require frameworks and education to roll this out effectively.

A real-world story:

As I returned to my AI adoption PhD studies this Autumn, I had the pleasure of meeting a part-time student who was employed by a local tech company in Cork. He wished to share the story of an AI initiative his employer had embarked upon, which left him feeling anxious. The company had rolled out an AI meeting transcription tool to the surprise of its employees. There had been no prior communication about its deployment, and the tool was now in active use inside the organisation. This particular student felt that the AI was useful but had its limitations, such as not being able to identify speakers in the AI-generated meeting transcripts. He had his doubts as to whether the tool would stay in use at his workplace in the future, and he had not received any policy documents related to its correct handling. He also stated that he was not aware if the organisation had an AI strategy, and the manner in which the technology had been integrated into daily operations had left him and his colleagues feeling quite uneasy. He felt that the team would have benefited enormously from some communication before and during the rollout. This same student was looking to commence a course in effective AI adoption and proclaimed his belief that the industry was crying out for more training and development in this area.

The above tale of a potentially failed deployment is unfortunately not an isolated case. Reports in the US have shown that up to 95% of AI pilots fail before they ever make it to full production inside organisations. There may be many complex reasons for this, but one must surely be a lack of understanding of the cultural impact of such change on teams, compounded by many examples of inadequate communication. It appears to me that despite the global investment in technology and the genuine intention to embrace AI, organisations continue to struggle with the employee education aspect of this transformation. If employers prioritise training and development in partnership with education providers, they may dramatically increase their chances of success. This may include the establishment of joint frameworks for AI deployment and management, with educational courses aligned to emerging business needs.

In adopting a people-development approach, companies may not only improve the chances of AI pilot success but also foster trust, alignment and buy-in. Surely this is the real promise of AI: a better, brighter organisational future, starting this winter, where your greatest asset – your people – are not left out in the cold but are supported by higher education.

Links

https://www.irishtimes.com/business/economy/2024/08/19/smes-account-for-998-of-all-businesses-in-ireland-and-employ-two-thirds-of-workforce/

https://www.tcd.ie/news_events/articles/2025/ai-expected-to-add-250bn-to-irelands-economy-by-2035

https://enterprise.gov.ie/en/publications/national-ai-strategy.html

https://enterprise.gov.ie/en/publications/national-ai-strategy-refresh-2024.html

https://www.forbes.com/sites/andreahill/2025/08/21/why-95-of-ai-pilots-fail-and-what-business-leaders-should-do-instead

Patrick Shields

AI PhD Researcher
Munster Technological University

I am a PhD Researcher at Munster Technological University, researching how small and medium-sized businesses adopt Artificial Intelligence with a particular focus on the human, strategic and organisational dynamics involved. My work looks beyond the technical layer, exploring how AI can be introduced in practical, low-friction ways that support real business outcomes.

