Artificial Intelligence and Teaching and Learning

Purpose

This guidance supports staff in using Generative AI (GenAI) responsibly and effectively in teaching and learning. It aligns with University of Manchester policies and promotes academic integrity, digital fluency, and inclusive education.

This article covers:

  1. Core Principles for Using GenAI
  2. Recommended Tools
  3. Practical Uses of GenAI in Teaching

Core Principles for Using GenAI

To achieve best practice when using GenAI, adhere to the core principles below.

Support, not substitute

AI should enhance—not replace—student thinking, creativity, and academic development.

Transparency

Staff should be open about when and how AI is used. Students must cite AI-generated content appropriately.

Accuracy

Always verify AI outputs against trusted academic sources.

Privacy

Never input personal, confidential, or sensitive data into AI tools.

Copyright

Do not upload or share copyrighted materials with AI platforms.

Recommended Tools

Below are some AI tools recommended by the University.

Microsoft Copilot: The University recommends Microsoft Copilot for AI-related work. It is GDPR-compliant and protects University and personal data. Even so, always consider carefully whether entering personal information into an AI tool is necessary.

AI Hub: The University of Manchester's central resource for AI tools, training, and policy updates.

Practical Uses of GenAI in Teaching

You may be wondering exactly what the benefits of GenAI in education are. This section highlights how GenAI, when used correctly, enhances the teaching and learning experience for both educators and students.

For Educators:

  • Design inclusive learning experiences, especially for students with neurodiverse needs or English as an additional language.
  • Encourage critical engagement with AI outputs to build digital literacy.
  • Use AI to support feedback generation, content summarisation, or idea generation (with caution and transparency).

Academic staff should take time to understand how AI is being used within their respective industries and plan how these approaches are reflected in the curriculum.

For Students (with guidance):

  • Brainstorm and structure assignments.
  • Clarify complex concepts or summarise readings.
  • Translate or refine grammar in their own writing.
  • Explore alternative perspectives or mind map ideas.
  • Cross-check understanding with course materials and academic sources.
  • Generate tasks and activities that support student revision.

Key Principles for AI-Resilient Assessment Design

  • Focus on Higher-Order Thinking Skills: Design tasks that require critical analysis, creativity, problem-solving, ethical reasoning, and reflection—areas where AI struggles to fully replicate human thought.
  • Emphasize Process Over Product: Assess not just the final submission but the student’s learning process. For example, you could:
    • Have students submit drafts of larger pieces of work as they progress through the course and encourage reflection on how the work has evolved.
    • Ask students to document how AI has been used in their work and encourage reflection on its usefulness.
  • Integrate AI Responsibly: We can’t turn AI off! Using AI will become a key life skill, so having students think about how AI will affect not only their studies but also their working lives will help them understand the issues in a broader context.
  • Diversify Assessment Types: A broader range of assessment types not only makes AI misuse more difficult but also gives students with different needs an opportunity to excel in different modes of assessment.
  • Set Clear AI Use Policies: Establish transparent guidelines on acceptable AI use, addressing academic integrity, privacy, intellectual property, and ethical considerations.
  • Promote Student Accountability: Include components like follow-up discussions or interviews about submitted work to verify understanding and discourage AI misuse.

Avoid the Exam ‘Gold Rush’

AI creates challenges for assessment, and it may be tempting to revert to a model of 100% invigilated exams to avoid them. However, good assessment should be authentic, reflecting the real challenges students will face in their working lives, and invigilated exams rarely do this. Additionally, not every student performs well under exam conditions, so assessing solely by exam limits some students from reaching their full potential.

Practical Strategies

  • Process-Product Assessment: Evaluate both the final output and the collaborative steps students take with AI tools, encouraging deeper engagement and learning.
  • Frequent Low-Stakes Quizzes: Use more frequent, lower-stakes quizzes to encourage ongoing learning checks, making it harder to rely solely on AI for success.
  • AI Assessment Scale (AIAS): A tool to design tasks that promote critical thinking and responsible AI use, assessing both the process and product of student work.
  • Use of AI Detection Tools: While detection tools have limitations, they can be part of a broader strategy alongside redesigned assessments and policies.

Further Resources

Academic Integrity and Referencing

Students must cite any AI-generated content used in their work. For example, in the Harvard Manchester style, AI tools should be referenced under the “Software” category. Uncited use of AI-generated content may be considered plagiarism or academic misconduct.

Refer students to:
Library support for Digital Skills

Academic Integrity

Event details (The University of Manchester Library)

Further Reading & Resources

An Essay in Seconds: What AI Reveals About Broken Assessments and the Need for Pedagogical Reimagining

Generative artificial intelligence (AI) in education – GOV.UK

Russell Group principles on generative AI in education

How we use generative AI tools | Communications | University of Cambridge

Generative AI | Policies | People Experience | Vanderbilt University
