
UCL News


More secure generative AI tool available for staff and students 

12 March 2024

UCL staff and students can now access Microsoft Copilot. Provided you log in with your UCL credentials, Copilot is a safer generative AI tool than consumer-oriented generative AI services.

[Image: two students working at desktop computers]

What can generative AI do? 

Generative AI is a type of artificial intelligence technology that can produce various types of content, including text and images. It uses deep-learning models which are trained on very large amounts of data. Based on this data, it can generate content by making new predictions based on the patterns it finds.  

This type of technology is not brand new, but it has advanced considerably in recent years, attracting many more users and a great deal of attention.

Copilot draws on large language models such as GPT-4 and the image generator DALL-E, as well as data from public websites, meaning it can generate both text and images. 

Copilot offers a lot of potential for UCL staff and students, including improving productivity, removing inefficiencies, and supporting creativity and content generation in the way we work and study. But much is still unknown, and there are important cautions to be aware of before using AI at UCL, which are covered below.

You can read an introduction to generative AI for more information on how it works, and its strengths and weaknesses. 

Copilot gives better protection 

Using public AI services could expose UCL's data. Generally, if you are using 'open' internet tools not covered by a licence, any data you put in may become public and is usually considered the property of the AI developer.

If you sign into Microsoft Copilot with your UCL credentials you can be confident that: 

  • your prompts, questions and results are protected 
  • the chat data is not saved 
  • Microsoft has no eyes-on access.  

This means no one can view your data and your data is not used to train the AI models.  

Microsoft Copilot is grounded in data from the public web and provides verifiable answers with citations, along with visual answers that include graphs and charts. It is designed in line with Microsoft's AI principles.  

Remain cautious of data protection  

Although data is more secure when using Microsoft Copilot, staff and students should still be careful about what they enter into the tool.

You should not enter personal, confidential or special category data into the tool. We advise that you only enter content for which you have the copyright.

Remember that the same data protection rules apply to AI as to any other app we use.

If you are logged in with your UCL credentials, Copilot will not store your queries or results, so data entered into the tool will not be subject to Freedom of Information or Subject Access Requests.

Remain cautious of inaccuracies  

You should treat what is generated with scepticism. Generative AI tools have been known to produce false or misleading results, often known as ‘hallucinations’. This can be caused by biases in the data, gaps in the data or incorrect assumptions.  

The ethical concerns and limitations of generative AI tools apply to Copilot, as to other available services. Please be aware that the output can be superficial and inaccurate.   

How to use it 

To access Copilot, visit the UCL Microsoft 365 SharePoint site.  

You must sign in with your UCL credentials to have the benefit of Microsoft's commercial data protection.  

Copilot applied to different uses  

Different staff and student groups at UCL will use Copilot in different ways in their work and study.

We are encouraging groups of staff with similar roles to come together and work through more specific guidance, FAQs and training opportunities. The aim is for our community to become familiar with the opportunities and cautions in using Copilot.  

A working group for AI in Education is well-established and you can find resources and guidance on using AI in education on the UCL generative AI hub.   

We will be working with experts across our community to share further guidance on working groups and best practice in the coming months.  


Further learning