The Transformative Power of Communities of Practice in AI Upskilling for Educators

By Bernie Goldbach, RUN EU SAP Lead
Estimated reading time: 5 minutes
A diverse group of five educators collaboratively studying a glowing, holographic network of digital lines and nodes on a table, symbolizing their shared learning and upskilling in Artificial Intelligence (AI) within a modern, book-lined academic setting. Image (and typos) generated by Nano Banana.
The power of collaboration: Communities of Practice are essential for educators to collectively navigate and integrate new AI technologies, transforming teaching and learning through shared knowledge and support. Image (and typos) generated by Nano Banana.

When the N-TUTORR programme ended in Ireland, I remained seated in the main Edtech25 auditorium to hear some of the final conversations among key players. They stood at a remarkable intersection of professional development and technological innovation. And some of them issued a call to action for continued conversation, perhaps engaging with generative AI tools within a Community of Practice (CoP).

Throughout my 40-year teaching career, I have walked pathways to genuine job satisfaction that extended far beyond simple skill acquisition. In my case, this satisfaction emerged from the synergy between collaborative learning, pedagogical innovation, and the excitement of exploring uncharted territory alongside peers who share my commitment to educational excellence.

Finding Professional Fulfillment Through Shared Learning

The journey of upskilling in generative AI feels overwhelming when undertaken in isolation. I am still looking for a structured CoP for Generativism in Education. This would be a rich vein of collective discovery. At the moment, I have three colleagues who help me develop my skills with ethical and sustainable use of AI.

Ethan Mollick, whose research at the Wharton School has illuminated the practical applications of AI in educational contexts, consistently emphasises that the most effective learning about AI tools happens through shared experimentation and peer discussion. His work demonstrates that educators who engage collaboratively with AI technologies develop more sophisticated mental models of how these tools can enhance rather than replace pedagogical expertise. This collaborative approach alleviates the anxiety many educators feel about technological change, replacing it with curiosity and professional confidence.

Mairéad Pratschke, whose work emphasises the importance of collaborative professional learning, has highlighted how communities create safe spaces where educators can experiment, fail, and succeed together without judgment. This psychological safety becomes the foundation upon which genuine professional growth occurs.

Frances O’Donnell, whose insights at major conferences have become invaluable resources for educators navigating the AI landscape, directs the most effective AI workshops I have attended. O’Donnell’s hands-on training at conferences such as CESI (https://www.cesi.ie), EDULEARN (https://iceri.org), ILTA (https://ilta.ie), and Online Educa Berlin (https://oeb.global) has illuminated the engaging features of instructional design that emerge when educators thoughtfully integrate AI tools. Her instructional design frameworks demonstrate how AI can support the creation of personalised learning pathways, adaptive assessments, and multimodal content that engages diverse learners. O’Donnell’s emphasis on the human element in AI-assisted design resonates deeply with Communities of Practice.

And thanks to Frances O’Donnell, I discovered the AI assistants inside H5P.

Elevating Instructional Design Through AI-Assisted Tools

Improving the quality of instructional design, in the hands of clever educators, represents the most significant leap I have made when combining AI tools with collaborative professional learning. The commercial version of H5P (https://h5p.com) has revolutionised my workflow when creating interactive educational content. The smart import feature of H5P.com complements my teaching practice. I can quickly design rich, engaging learning experiences that would previously have required specialised technical skills or significant time investments. I have discovered ways to create everything from interactive videos with embedded questions to gamified quizzes and sophisticated branching scenarios.

I hope I find a CoP in Ireland that is interested in several of the H5P workflows I have adopted. For the moment, I’m revealing these remarkable capabilities while meeting people at education events in Belgium, Spain, Portugal, and the Netherlands. It feels like I’m a town crier who has a notebook full of shared templates. I want to offer links to the interactive content that I have created with H5P AI and gain feedback from interested colleagues. But more than the conversations at the conferences, I’m interested in making real connections with educators who want to actively participate in vibrant online communities where sustained professional learning continues.

Sustaining Innovation with Community

Job satisfaction among educators has always been closely tied to their sense of efficacy and their ability to make meaningful impacts on student learning. Communities of Practice focused on AI upskilling amplify this satisfaction by creating networks of mutual support where members celebrate innovations, troubleshoot challenges, and collectively develop best practices. When an educator discovers an effective way to use AI for differentiation or assessment design, sharing that discovery with colleagues who understand the pedagogical context creates a profound sense of professional contribution.

These communities also ease the professional tension that proficient AI users currently face. Mollick’s observations about blowback against widespread AI adoption in education reveal a critical imperative: stand together within a network that validates the quality of teaching and provides constructive feedback. When shared with a community, individual risk-taking morphs into collective innovation, making the professional development experience inherently more satisfying and sustainable.

We need the spark of N-TUTORR inside an AI-focused Community of Practice. We need to amplify voices. Together we need to become confident navigators of innovation. We need to co-create contextually appropriate pedagogical approaches that effectively leverage AI in education.


A History Professor Says AI Did Not Break College; It Exposed How Broken It Already Was


A dramatic, conceptual image showing a crumbling, old-fashioned column (representing "Traditional College Structure") with cracks widening as digital light and AI code seep into the fissures, emphasizing that AI revealed existing weaknesses rather than caused the damage. Image (and typos) generated by Nano Banana.
Unmasking the flaws: A history professor’s perspective suggesting that AI merely shone a light on the structural vulnerabilities and existing problems within higher education, rather than being the sole source of disruption. Image (and typos) generated by Nano Banana.

Source

Business Insider

Summary

This article features a U.S. history professor who argues that generative AI did not cause the crisis currently unfolding in higher education but instead revealed long-standing structural flaws. According to the professor, AI has exposed weaknesses in assessment design, unclear expectations placed on students and unsustainable workloads carried by academic staff. The sudden visibility of AI-generated essays and assignments has forced institutions to confront the limitations of traditional assessment models that rely heavily on polished written output rather than demonstrated cognitive processes. The professor notes that AI has unintentionally highlighted inequities in student preparation, inconsistencies in grading norms and the mismatch between institutional rhetoric and actual resourcing. Rather than attempting to suppress AI, the article argues that higher education should treat this moment as an opportunity to redesign curricula, diversify assessments and rethink the broader purpose of university education. The piece positions AI as a catalyst for long-overdue reform, emphasising that genuine improvement will require institutions to invest in pedagogical redesign, staff support and clearer communication around learning outcomes.

Key Points

  • AI highlighted systemic weaknesses already present in higher education
  • Exposed flaws in assessment design and grading expectations
  • Revealed pressures on overworked teaching staff
  • Suggests AI could drive constructive reform
  • Encourages rethinking pedagogy and institutional priorities

URL

https://www.businessinsider.com/ai-didnt-break-college-it-exposed-broken-system-professor-2025-11

Summary generated by ChatGPT 5.1


‘We Could Have Asked ChatGPT’: Students Fight Back Over Course Taught by AI


A digital illustration of a group of diverse students standing in a classroom, looking frustrated and pointing towards an empty podium where a holographic projection of a generic AI avatar is visible. The text "WE COULD HAVE ASKED CHATGPT" is superimposed above the students. Image (and typos) generated by Nano Banana.
The revolt against automation: Capturing the frustration of students pushing back against educational institutions that rely on AI to replace human instructors. Image (and typos) generated by Nano Banana.

Source

The Guardian

Summary

Students on a coding apprenticeship at the University of Staffordshire say they were “robbed of knowledge” after discovering that large portions of their course materials—including slides, assignments and even voiceovers—were generated by AI. Despite university policies restricting students’ use of AI, staff appeared to rely heavily on AI-generated teaching content, leading to accusations of hypocrisy and declining trust in the programme. Students reported inconsistent editing, generic content and bizarre glitches such as a mid-video switch to a Spanish accent. Complaints brought little change, and although human lecturers delivered the final session, students argue the damage to their learning and career prospects has already been done. The case highlights rising tensions as universities increasingly adopt AI tools without transparent standards or safeguards.

Key Points

  • Staffordshire students discovered widespread use of AI-generated slides, tasks and videos.
  • AI usage contradicted strict policies prohibiting students from submitting AI-generated work.
  • Students reported generic content, inconsistent editing and AI voiceover glitches.
  • Repeated complaints yielded limited response; a human lecturer was added only at the end.
  • Students fear lost learning, reduced programme credibility and wasted time.

URL

https://www.theguardian.com/education/2025/nov/20/university-of-staffordshire-course-taught-in-large-part-by-ai-artificial-intelligence

Summary generated by ChatGPT 5


UWA and Oxford Partner for Generative AI in Higher Education


A digital illustration merging the architectural styles of the University of Western Australia (UWA) and the University of Oxford. A traditional university shield or crest is split in two, with one half featuring a classic coat of arms and the other half displaying generative AI code and glowing digital patterns, symbolizing their partnership in advanced education. Image (and typos) generated by Nano Banana.
Global collaboration in the age of AI: UWA and Oxford University join forces to pioneer the integration and study of generative artificial intelligence within the landscape of higher education. Image (and typos) generated by Nano Banana.

Source

University of Western Australia

Summary

The University of Western Australia and the University of Oxford announced a formal partnership that positions generative AI as a strategic driver in the future of higher education. The collaboration focuses on advancing responsible AI research, developing governance models and integrating generative AI into teaching and learning in ways that uphold academic integrity and inclusivity. Both institutions highlight that the rapid acceleration of AI requires coordinated international responses that balance innovation with ethical safeguards. The partnership will explore curriculum transformation, staff development and AI-informed pedagogical frameworks intended to support both student learning and broader institutional capability building. By aligning two globally significant universities, the initiative signals a trend toward cross-border cooperation designed to shape sector-wide AI standards. It also indicates growing recognition that AI adoption in higher education must be underpinned by shared values, transparent methodologies and research-based evidence. This collaboration aims to become a blueprint for how universities can jointly shape the future of AI-enabled education while ensuring that human expertise remains central.

Key Points

  • Major partnership between UWA and Oxford to advance responsible AI
  • Focus on governance, research and curriculum innovation
  • Reflects global shift toward collaboration on AI strategy
  • Emphasises ethical frameworks for AI adoption in higher education
  • Positions AI as core to long-term institutional development

URL

https://www.uwa.edu.au/news/article/2025/november/uwa-and-oxford-partner-for-generative-ai-in-higher-ed

Summary generated by ChatGPT 5.1


This is not the end but a beginning: Responding to “Something Wicked This Way Comes”

By Kerith George-Briant and Jack Hogan, Abertay University Dundee
Estimated reading time: 5 minutes
A conceptual illustration showing a digital roadmap splitting into two distinct, glowing paths, one labeled "Secure Assessment" and the other "Open Assessment." The background blends subtle academic motifs with swirling binary code, symbolizing the strategic integration of Generative AI into higher education assessment practices. Image (and typos) generated by Nano Banana.
Navigating the future: The “Two-Lane Approach” to Generative AI in assessment—balancing secure testing of threshold concepts (Lane 1) with open collaboration for developing AI literacy and critical thinking (Lane 2). Image (and typos) generated by Nano Banana.

O’Mahony’s provocatively titled blog, “Something Wicked This Way Comes”, outlined feelings we recognised from across the sector: Generative AI (GenAI) tools have created unease, disruption, and uncertainty. At the same time, we felt that GenAI offered huge opportunities, and since higher education has led and celebrated innovation across disciplines for centuries, we were intrigued by how this would translate into our assessment practices.

At Abertay University, we’ve been exploring the “wicked problem” of whether to change teaching practices through a small-scale research project entitled “Lane Change Ahead: Artificial Intelligence’s Impact on Assessment Practices.” Our findings agree with O’Mahony’s observations: while GenAI does pose a challenge to academic integrity and traditional assessment models, it also offers opportunities for innovation, equity, and deeper learning. We must respond thoughtfully and acknowledge that views on GenAI vary widely.

Academic Sensemaking

To understand colleagues’ perspectives and experiences, we applied Degn’s (2016) concept of academic sensemaking to our interviews about GenAI. Findings showed that some assessment designers are decoupling, designing assessments that use GenAI outputs without requiring students to engage with the tools. Others are defiant or defeatist, allowing limited collaboration with GenAI tools but awarding a low percentage of the grade to that output. And some are strategic and optimistic, embracing GenAI as a tool for learning, creativity, and employability.

The responses show the reasons for unease are not just pedagogical; they’re deeply personal. GenAI challenges academic identity. Recognising this emotional response is essential to supporting staff if change is needed.

Detection and the Blurred Line

And change is needed, we would argue. Back in 2023, Perkins et al.’s analysis of Turnitin’s AI detection capabilities revealed that while 91% of fully AI-generated submissions were flagged, the average detection within each paper was only 54.8%, and only half of those flagged papers would have been referred for academic misconduct. Similar studies since then have continued to show the same types of results. And if detection isn’t possible, setting an “absurd line”, as Corbin et al. describe it, is ever more incongruous. There is no reliable way to tell whether a student stopped at using AI for brainstorming or engaged critically with AI-paraphrased output. Some may read this and think it’s game over; however, if we embrace these challenges and adapt our approaches, we can find solutions that are fit for purpose.

From Fear to Framework: The Two-Lane Approach

So, what is the solution? Our research explored whether the two-lane approach developed by Liu and Bridgeman would work at Abertay, where:

  • Lane 1: Secure Assessments would be conducted under controlled conditions to assure learning of threshold concepts and
  • Lane 2: Open Assessments would allow unrestricted use of GenAI.

Our case studies revealed three distinct modes of GenAI integration:

  • AI Output Only – Students critiqued AI-generated content without using GenAI themselves. This aligned with Lane 1 and a secure assessment method focusing on threshold concepts.
  • Limited Collaboration – Students used GenAI for planning and for a minimal piece of output within a larger assessment that otherwise did not allow GenAI use. Students developed some critical thinking, but were not able to apply this learning to the whole assessment.
  • Unlimited Collaboration – Students engaged fully with GenAI, with reflection and justification built into the assessment. Assessment designers reported that students produced higher-quality work and demonstrated enhanced critical thinking.

Each mode reflected a different balance of trust, control, and pedagogical intent. Interestingly, the AI Output pieces were secure and used to build AI literacy while meeting PSRB requirements, which asked for certain competencies and skills to be tested. The limited collaboration had an element of open assessment, but the percentage of the grade awarded to the output was minimal, and an absurd line was created by asking for no AI use in the larger part of the assessment. Finally, the assessments with unlimited collaboration were designed because those colleagues believed that writing without GenAI was not authentic, and they believed that employers would expect AI literacy skills, perhaps not misplaced based on the figure given in O’Mahony’s blog.

Reframing the Narrative: GenAI as Opportunity

We see the need to treat GenAI as a partner in education, one that encourages critical reflection. This will require carefully scaffolded teaching activities to develop students’ AI literacy and avoid cognitive offloading. Thankfully, ways forward have begun to appear, as noted in the work of Gerlich and Jose et al.

Conclusion: From Wicked to ‘Witch’ lane?

As educators, we have a choice. We can resist and decouple from GenAI, or we can choose to lead the narrative strategically and optimistically. Although the pathway forward may not be a yellow brick road, we believe it’s worth considering which lane may suit us best. The key is that we don’t do this in isolation, but take a pragmatic approach across our entire degree programme, considering the level of study and the appropriate AI literacy skills.

GenAI acknowledgement:
Microsoft Copilot (https://copilot.microsoft.com) – used to create a draft blog from our research paper.

Kerith George-Briant

Learner Development Manager
Abertay University

Kerith George-Briant manages the Learner Development Service at Abertay. Her key interests are in building best practices in using AI, inclusivity, and accessibility.

Jack Hogan

Lecturer in Academic Practice
Abertay University

Jack Hogan works within the Abertay Learning Enhancement (AbLE) Academy as a Lecturer in Academic Practice. His research interests include student transitions and the first-year experience, microcredentials, skills development and employability. 
