
AI in education: your questions answered

All the latest questions and answers on AI in education at UCL.

Do you have a question that is not answered? Submit through this form. 

Is there an institutional licence for any AI software?  

UCL staff and students can access Microsoft CoPilot, which can be used for both text and image generation. With commercial data protection, it is intended as a more secure alternative to other GenAI services. If you wish to use GenAI, this is the safest way to do so.

If you're logged into Microsoft CoPilot with your UCL credentials, what goes in – and what comes out – is not saved or shared, and your data is not used to train the models. 

Find out more and how to access Microsoft CoPilot on UCL's Microsoft Office 365 SharePoint site.

This is a complex, ethically problematic and fast-moving area. There are real concerns about generative AI tools' use and management of data, the lack of transparency about the training data they have used, and ethical concerns about the way their models have been trained and checked. Read an introduction to what GenAI tools are and how they work.

What do I tell my students about using AI? 

It's important that we help to build students' critical AI literacy and to create an environment of transparency where students can voice their concerns.

We recommend that educators take time to explore the capabilities and fallibilities of AI with students and signpost where to find guidance. To familiarise yourself, perhaps start with our brief introduction to GenAI.

Students (and staff) can now undertake an optional course, Generative AI and Academic Skills, on UCL Moodle. The course explores GenAI's current capabilities and limitations, ethical issues surrounding these technologies, and their use in scholarly activities. The 'essential' part of the course takes about three hours to complete.

We need to encourage students to treat anything GenAI produces with scepticism. These tools can generate superficial, inaccurate and unhelpful outputs that nevertheless appear to be original and thoughtful.

We need to be clear with students about when and how use of AI is permitted in assessment. Read guidance for staff on the three categories of AI use in assessments.

Where students are using generative AI in assessed work, they should acknowledge it openly. Find guidance for students on how to acknowledge the use of generative AI.

How can AI support learning? 

There are many ways that this technology can be useful for students and many opportunities for educators.

For example, Professor Sharples suggests that prompting the AI to act as a 'Socratic opponent' could be a particularly useful tool for students. A dialogue with AI can provide a helpful challenge to a student's thinking, encourage them to critically engage with the technology, and could be used to inform an argumentative essay. Watch highlights of his talk on GenAI and education.

Generative AI shows the potential to supplement certain academic tasks, such as summarising long articles (under 3,000 words), acting as a tutor for students, or suggesting scholars to include in a literature review. There are also developments in translanguaging generative AI and new AI tools for sign languages. 

However, at present, limitations prevent it from replacing traditional learning and assessment methods.

A recent ChangeMakers project at UCL's School of Slavonic and East European Studies (SSEES) found that students who seek to use generative AI tools for academic misconduct (such as ghost-writing essays) are unlikely to achieve more than below-average marks. For students looking to cut corners, generative AI tools may prevent failure, but they are highly unlikely to deliver better-than-average marks.

Generative AI tools still require considerable supervision and evaluation of their outputs. In an academic setting, this requires a contextual understanding of the subject matter.

When do I need to decide which category of AI use to apply to my assessments? 

Staff should discuss with students the category that their assessments fall into at the beginning of each module, and again as specific assessment deadlines approach. Read guidance for staff on the categories.

If students use GenAI inappropriately, is this considered contract cheating?

No. This will be classified as collusion or plagiarism, defined as the representation of other people's work or ideas as the student's own without appropriate referencing or acknowledgement. This includes use of GenAI tools that exceeds what is permitted in the assessment brief.

What do I do if I suspect a student of using GenAI inappropriately?

In the first instance you should report it in the same way as any other academic misconduct, using the appropriate form, to your Exam Board Chair, who may initiate an Investigatory Viva. Find more details on the process in UCL's Academic Manual. The purpose of the viva is to assess whether, on the balance of probabilities, there is prima facie evidence to support the conclusion that the work, or sections of it, were not authored by the student. Find guidance on conducting an Investigatory Viva for academic year 2023-24.

Please also refer to UCL's Feedback and Assessment SharePoint site to access UCL’s most up to date policies and guidance.