
Peer review with the Moodle Workshop activity – a close look

Now on the second iteration, Dr Mira Vogel reports on some opportunities and lessons learned.

[Image: Dr Mira Vogel takes part in a workshop at the UCL Education Conference 2018]

31 July 2015

Working with Digital Education, the UCL Arena Centre has been trialling Workshop (Moodle’s peer assessment tool) to run a peer review activity with participants.

The scenario

Participants write a 500-word case study about any aspect of learning, teaching and assessment, mapped to aspects of the UK Professional Standards Framework, and review three others.

The review takes the form of summary comments (i.e. no numeric marks, no rubric, no structured questions to answer).

They have roughly a week to prepare the submission and a week to carry out the assessments. Participation is strongly encouraged but not compulsory.

From the evaluation of the first iteration

36 participants gave feedback at the end of 2014.

  • 29 participants rated the experience of giving assessment positively (12 ‘fine’, 14 ‘good’ and 3 ‘excellent’), while 7 rated it negatively (5 ‘unsatisfactory’ and 2 ‘poor’).
  • The experience of receiving assessment was rated less positively (6 ‘fine’, 3 ‘good’ and no ‘excellent’), while 4 rated it ‘unsatisfactory’ and 3 ‘poor’.

The concept was sound and the software worked fine, but the management of the activity needed some attention.

The first problem was one of disorientation – “finding my feedback was not straightforward”.

  • We addressed this in the next iteration by using the instructions fields (in the Workshop settings) and by making announcements in person and via the News Forum.

The second and related problem was to do with lack of notification – “it wasn’t very clear how to use the system and no emails were received”; “working fine but it needs to be improved – notification; instructions”; “I did not receive any alert or instructions on how to check if the feedback from my colleague was in”.

  • We addressed this by putting entries in each group leader’s diary to notify, remind and instruct participants about what to do at each stage.

The third problem was that several participants didn’t receive any reviews. This was because the activity was grouped, with a consequently smaller pool of reviewers for each submission; it was compounded by the activity not being compulsory, and exacerbated by the fact that Moodle doesn’t send out alerts when the phases switch e.g. from submission to assessment.

  • We addressed this straightforwardly by removing the groups setting and undertaking to notify participants about what to expect and when; the rough simulation below illustrates why small pools are so fragile.
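To make the third problem concrete, here is a rough back-of-envelope simulation in Python. The 60% participation rate is an assumption for illustration, and the model is ours, not Moodle’s actual allocation algorithm; it only shows how quickly a small pool runs out of eligible reviewers.

    import random

    def shortfall(pool_size, participation_rate, wanted=3, trials=10_000):
        """Fraction of submissions left with fewer than `wanted` eligible
        reviewers, when reviewers are drawn only from participants in the
        same pool and self-assessment is disallowed. A rough model of the
        failure mode, not Moodle's allocator."""
        short = submissions = 0
        for _ in range(trials):
            # Each pool member independently decides whether to take part.
            participants = sum(random.random() < participation_rate
                               for _ in range(pool_size))
            submissions += participants
            # With no self-review, eligible reviewers = participants - 1.
            if participants - 1 < wanted:
                short += participants
        return short / submissions if submissions else 0.0

    print(f"pools of 4:  {shortfall(4, 0.6):.0%} of submissions short of 3 reviewers")
    print(f"pool of 36:  {shortfall(36, 0.6):.0%}")

With quartets, the great majority of submissions fall short of three eligible reviewers; with the whole cohort in one pool, effectively none do.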

Decisions, decisions – settings and reasons

Below are some of the less obvious settings, our rationale, and implications.

  • Grading strategy: Comments – this gives a single field into which participants type or paste summary comments.
  • Grades: none; neither for the submission nor the peer assessment.
  • Instructions for submission: as briefly as possible, what participants need to do to make a successful submission.
  • Submissions after the deadline: we left this set to No (unchecked). Rather than manually allocating submissions to reviewers, we wanted Moodle to handle this with a scheduled allocation at the submission deadline. Workshop (unlike Turnitin PeerMark) runs this allocation once only, which means that unless somebody was prepared to go into the Workshop and manually allocate late submissions, those late submissions would go unreviewed. Disallowing late submissions gives a very hard cut-off but greatly reduces the admin burden; this is what we ultimately decided to do, hoping that we could increase participation through good instructions and some scheduled reminders.
  • Instructions for assessment: since the activity required reviewers to leave just a single summary comment, all we did here was direct attention to the guidance on relating the case study to the Professional Standards Framework, and remind participants that Moodle form fields do not autosave.
  • Students may assess their own work: we left this set to No (unchecked), since one aim of the activity was to share practice.
  • Overall feedback mode: this is the setting that gives a text field for the summary comments; we set it to Enabled And Required.
  • Maximum number of feedback files: set to zero, since we wanted the experience of reading the feedback to be as seamless as possible.
  • Use examples: for this low stakes peer review we didn’t require participants to assess examples.
  • Open for assessment / open for submission: we set the assessment phase to begin directly as the submission phase closed; this meant we also had to set up Scheduled Allocation to run at that time.
  • Switch to the next phase after the submissions deadline: we set this to Yes (checked); in combination with Scheduled Allocation this would reduce the amount of active supervision required on the part of staff.
  • Group mode: we left this set to No Groups. Groups of four (learning sets which we call Quartets) had been set up on Moodle, but the previous iteration had shown that, when applied to a Workshop set not to allow self-assessment, they would diminish the pool of possible submissions and possible reviewers, and were vulnerable to non-participation.
  • Grouping: contrasting with Groups, this allows a given activity or resource to be hidden from everyone except the chosen grouping. We’d set up Groupings in the Moodle area corresponding to UCL schools, because the sessions (and therefore the deadlines) for each happen at different times. So we set up Moodle Workshops which were duplicates in every respect except the dates.
  • Scheduled allocations: these can be set up via a link from the dashboard.
  • Enable scheduled allocations: Yes (checked) for the reasons above. This would happen once at the end of the Submission Phase.
  • Number of reviews: we set three per submission, but if we had wanted to shift the emphasis onto the reviewing process (rather than ensuring that each submission got three reviews) we could have set three per reviewer; the sketch after this list illustrates the difference.
  • Participants can assess without having submitted anything: we left this set to No (unchecked) reasoning that participants were more likely to receive reviews if we kept the pool of reviewers limited to those who were actively participating. (That said, we could do with finding out more about how Workshop allocates reviews if they are set to allocate to reviewers rather than to submissions.)
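The difference between the two allocation targets is easier to see in a toy sketch. The names and the allocator itself are hypothetical, and Moodle’s real algorithm also balances workloads; this only illustrates what each target guarantees.

    import random

    def allocate(authors, n=3, per="submission"):
        """Toy allocator pairing (reviewer, submission), skipping self-review.
        per="submission": every submission receives n reviews.
        per="reviewer":   every reviewer writes n reviews.
        Illustrative only, not Moodle's actual allocation algorithm."""
        pairs = []
        if per == "submission":
            for author in authors:
                others = [r for r in authors if r != author]
                pairs += [(r, author) for r in random.sample(others, n)]
        else:
            for reviewer in authors:
                others = [a for a in authors if a != reviewer]
                pairs += [(reviewer, a) for a in random.sample(others, n)]
        return pairs

    cohort = ["ana", "ben", "chloe", "dev", "ed"]
    print(allocate(cohort, per="submission"))  # three reviews land on every submission
    print(allocate(cohort, per="reviewer"))    # every reviewer writes three reviews

Per-submission allocation guarantees every author gets feedback but can leave reviewer workloads uneven; per-reviewer allocation evens out the reviewing work but lets some submissions receive more or fewer than three reviews.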

Dates for diaries

Whatever the settings, where participants are unfamiliar with the process any peer review activity needs quite active supervision.

For this reason, Arena staff (who have many other commitments) put dates in their diaries to monitor participation and send reminders, as well as to keep track of which phase the activity was in. Of particular note: to release the feedback to participants, a staff member needs to actively close the activity by switching it to the Closed phase in the Workshop dashboard.
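Since those diary dates all hang off the two deadlines, they are easy to generate rather than work out by hand each time. A small sketch follows; the example date and the reminder offsets are our own illustrative choices, not a UCL or Moodle standard.

    from datetime import date, timedelta

    def reminder_schedule(submission_deadline: date, assessment_days: int = 7):
        """Generate diary dates around the two Workshop deadlines.
        Offsets are illustrative, not a UCL or Moodle convention."""
        assessment_deadline = submission_deadline + timedelta(days=assessment_days)
        return {
            "remind: submissions due soon": submission_deadline - timedelta(days=2),
            "check allocation ran / announce assessment phase": submission_deadline,
            "remind: reviews due soon": assessment_deadline - timedelta(days=2),
            "close workshop to release feedback": assessment_deadline,
        }

    for task, when in reminder_schedule(date(2015, 7, 13)).items():
        print(when.isoformat(), "-", task)

Keeping the monitoring dates tied to the actual deadlines also means that duplicating the Workshop for another school only involves changing one date.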