
How we raised our NSS feedback and assessment scores by 26% in three years

Caroline Garaway, Senior Lecturer in UCL Anthropology, explains the department’s comprehensive approach, which included constant dialogue with students.

[Image: A student in the Petrie Museum]

24 April 2018

UCL Anthropology has dramatically improved its NSS scores on feedback and assessment over the last few years, taking them from stubbornly low scores in the 50s in the years leading up to 2014 to above both the UCL and sector averages by 2017.

The department recognised that it was falling short on several levels and took a broad approach to tackling the issue: making sure students recognised feedback when they received it, increasing the transparency of feedback deadlines, undertaking a thorough analysis of undergraduates’ assessment journeys, and improving marking criteria and the consistency of the quality and quantity of feedback.

Dr Garaway says: "Critical to the success of the approach were the tangible improvements we made. But it was also very much about the management of student expectations and constant dialogue with them about what they felt was needed and what was being done."

Step 1: learn from best practice

  • We took lessons from UCL Linguistics, who have very good rates of student satisfaction with assessment and feedback;
  • attended UCL Arena sessions;
  • carried out desk research.

Step 2: identify easy wins

  • To help students recognise feedback, we created a checklist of the different types of feedback for each module and put it on Moodle.
  • We now make sure that students know where, when and how they can get further feedback. Simple name changes can help: we changed staff ‘office hours’ to ‘Student Feedback & Consultation hours.’
  • We made clear to students the department’s commitment to improving assessment and feedback and their role in helping us.

Step 3: carry out more intensive work to identify problems and solutions

With the help of the UCL Arena Centre, we ran the TESTA (Transforming the Experience of Students Through Assessment) process in 2015-16, which included:

  • An audit of the number and types of assessments set across our undergraduate programmes from first to third year;
  • a standardised student questionnaire relating to the learning outcomes of assessment, which allowed us to compare ourselves with the wider HE social science sector;
  • focus groups to hear from our students about how they perceived their assessment ‘journey’.

Alongside TESTA, the department used a tool to map the pattern of assessments over each academic year, revealing bottlenecks where assessments clustered, usually at the ends of terms.
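For readers who want to attempt something similar, the sketch below is a hypothetical illustration in Python, not the tool the department used; the module names and dates are invented. It simply counts coursework deadlines per week, which is enough to make end-of-term clusters visible.

    # A minimal sketch of mapping assessment deadlines across a year to spot
    # bottlenecks. Module names and dates are invented for illustration.
    from collections import Counter
    from datetime import date

    deadlines = [
        ("Anthropology essay 1", date(2016, 12, 12)),
        ("Ethnographic report", date(2016, 12, 14)),
        ("Anthropology essay 2", date(2016, 12, 16)),
        ("Group presentation", date(2017, 2, 20)),
        ("Anthropology essay 3", date(2017, 3, 24)),
        ("Project draft", date(2017, 3, 27)),
    ]

    # Count deadlines per ISO (year, week) to reveal end-of-term pile-ups
    per_week = Counter(d.isocalendar()[:2] for _, d in deadlines)

    for (year, week), n in sorted(per_week.items()):
        flag = "  <-- potential bottleneck" if n >= 2 else ""
        print(f"{year} week {week:2d}: {'#' * n}{flag}")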

There was much to be satisfied about, but also much to learn:

  • assessment types were varied (which students much appreciated), but there were too many of them, with intense deadline bottlenecks, usually at the ends of terms;
  • the nature of assessment was not always clear at the point of module selection, and as modules got under way, deadlines could change or conflicting information could be found in different places;
  • and when feedback did arrive, sometimes late, it was at times minimal, inconsistent and unclear.

Step 4: implement solutions

Deadlines

Individual module convenors are encouraged to space deadlines across terms, and even into the holidays, reducing bottlenecks, allowing effective use of feedback and enabling staff to choose times when they can get the marking done well and on time.

They are also advised to consider reducing the number of assessments. All module deadlines, for feedback as well as for hand-in, are published in an open spreadsheet, enforcing early communication of deadlines and helping students with module choice. Any deadline changes are managed centrally, and convenors experiencing problems with their marking load ask for marking assistance and/or communicate changes to students well in advance. This transparency has resulted in a 90% reduction in late feedback.

Marking criteria and feedback

We adapted the UCL framework to improve our departmental marking criteria and created a new feedback rubric, which was tested and revised with staff and students. The assessment criteria for each individual piece of coursework (standard or bespoke) are explicitly stated on a standardised Moodle front page for each course. The role of the departmental writing tutor has been enhanced: they now provide stand-alone sessions for students on understanding marking criteria, as well as weekly consultation hours where students can discuss their feedback. Minimum standards for feedback have also been developed, and we run sessions with all new Teaching Fellows and Postgraduate Teaching Assistants to share examples of good feedback.

Step 5: create a cycle of continuous improvement

Our next plan is to improve the way we support assessment and feedback literacy among students. Elsewhere in the UK sector there are examples of pioneering, inexpensive new technology being used to help students engage with marking criteria, write essays and access further study skills support, as well as to provide video feedback that improves student satisfaction.

Whilst low NSS scores may have provided the impetus for an overhaul of assessment practice in the department, improved NSS scores are, of course, not the biggest gain. Instead, the process has increased our collective understanding of the issues and problems faced by everyone in the department when it comes to assessment, be they students, academics or professional services staff. This development of mutual respect and responsibility is the key to improved outcomes, not just for assessment, and not just for students, but for the wellbeing of the department as a whole.