- Author(s): Dr Rosalind Duhs
- Title: Assessment and feedback to students
- Subject: HE - Education
- Keywords: UKOER, UKPSF, OMAC, CPD4HE, assessment, feedback
- Language(s): English
- Material type(s): Text, Presentation
- File format(s): ZIP, HTML, PDF, RTF, ODT, PPT, DOC
- File size: Various
- Publish Date: 6th July 2011, 12th August 2011
- Licence: CC-BY-NC-SA
Assessment for Learning I
Explanation of PowerPoint Presentation for Workshop Facilitators and HE Teachers
This introduction to assessment for learning is based on the PowerPoint presentation entitled ‘Assessment for Learning I’. The slides are reproduced, and comments are provided below each one to support anyone who would like to use the presentation, or parts of it, for session facilitation.
This text will also be useful for self-study. Links to further resources are provided. Teachers in higher education will find the approaches and techniques suggested here useful. They involve participants in active learning which is effective for all teaching in higher education (Prince, 2004).
A note on the meaning of ‘participant’ and ‘student’: ‘participant’ is used to denote staff taking part in the session; ‘student’ is used for the university learners they teach.
The reason for starting with a quick round of introductions is that discussion is important for learning of this kind. Workshop participants are much more likely to take part in an exchange of views if they have introduced themselves in the learning environment. Hearing your own voice early on, simply saying who you are and what you do, makes it less challenging to speak when more complex issues are raised. This strategy helps to overcome reluctance to talk.
The points about participants’ interest in assessment and their concerns are valuable. If the areas most relevant to participants are highlighted, they will engage and feel motivated.
It is useful to record the key concepts which emerge during this opening discussion on a flip chart which you can display and refer to as you work through the rest of the workshop. Participants will then know that their concerns are being addressed and you can ensure that you make links to their experience and teaching context.
Intended learning outcomes help the facilitator to clarify what participants should be able to do by the end of the session. They provide a structure and lead into relevant and helpful learning activities to enable participants to attain the outcomes which have been specified.
These definitions are important to teachers who work with assessment. It will be difficult for participants to follow the session if they are not familiar with these basic terms.
Basing learning on personal experience is an effective way of creating links between previously-acquired knowledge and new knowledge. The discussion of experience often highlights central issues which can also be referred to subsequently. The ‘think pair share’ model, whereby individual reflection is followed by a discussion between neighbours, then opened out to the whole group, usually works well.
In this example, the ‘think pair share’ approach also enables participants to consider the situation of the learner whose performance is being assessed. Empathy with the ‘victim’ of assessment helps teachers to appreciate that some forms of assessment can be stressful and may not reflect ability fairly.
This slide builds on the Slide 5 activity. Participants are now invited to focus on a predicament which is all too common: students not knowing what is expected of them. Anna is a real student who studied at UCL, so this is an authentic case. It is useful for participants to suggest their own strategies. These can be further developed through discussion and built on as the workshop progresses.
Alverno College in the United States pioneered an approach to assessment which promoted multifaceted (all-round) learning. Mentkowski’s (2000) research follows the development of college students and their perceptions of the impact of their learning on their subsequent careers. Each of the bullet points in this slide is clarified below.
Authenticity (‘contexts related to life roles’) is important if assessment is to be useful in the long term. For instance, students are unlikely to be required to sit down and write long texts about what they know by hand within strict time constraints during their professional life, as they are during traditional examinations. They are more likely to be asked to present their ideas, enter into a dialogue with clients, patients, colleagues, or customers, and use their judgement to make decisions. Assessment tasks involving presentations, role play, negotiation, and the opportunity to complete projects are examples of assessment tasks which are closer to life roles.
Sharing learning outcomes and assessment criteria with students from the start of a course or module is basic good practice. Students who know what is expected of them are more likely to rise to challenges and do better. Self-assessment is a central part of the Alverno model. Professional learning and general preparation for work involve the ability to identify areas for further development. Self-awareness and reflection on learning are therefore important attributes for learners, whatever their area of study.
‘Multiplicity’ implies that a variety of assessment tasks should be used and that these should build on each other, becoming broader and increasingly complex over time as students learn and develop. It makes sense to assess in different ways because many skills and abilities can then be demonstrated, ranging from academic writing encompassing argument and analysis to clear oral communication of complex ideas.
Here is a link which provides information on assessment tasks related to a range of learning outcomes (see pp. 35-36). Although it was written for engineering, the approach is applicable to all disciplines.
A useful brief overview is also available here:
Feedback on performance to guide future improvement is an essential building block of learning. ‘External perspectives’ and ‘performance’ suggest that the skills needed to demonstrate knowledge and understanding are also assessed (e.g. writing ability, presentation skills).
Participants’ ideas as to why we assess student learning can be generated through discussion in small groups. They can be typed directly onto an additional PowerPoint slide, a separate (full-screen) Word document, or recorded on a paper flip chart which can be displayed. Another technique is to project the slides onto a whiteboard and write around the text on the slide. This approach is often useful when comments or explanations need to be added.
Participants can compare their ideas with those on slide 9. The facilitator will not need to explain much as these points are not difficult. It is good to leave participants the space to read this slide themselves, reflect, and make any points or ask questions.
This slide illustrates the ‘backwash’ phenomenon. In essence, student approaches to learning are shaped by assessment, especially summative assessment (Struyven, Dochy, & Janssens, 2005). As they progress through higher education, students become steadily more strategic in selecting topics of study which they believe will be relevant for summative assessment.
The focus now turns to learning outcome 1 on formative assessment. The type of learning fostered through assessment is considered first. There is an overlap between formative and summative assessment so much of this section is relevant to both.
If participants prefer, it is possible to alter the order of slides. If time is short, aspects of summative assessment (learning outcome 2) can be discussed first, for example.
This slide gives participants the chance to speculate about what higher order learning might be. If this seems challenging or tedious, you could omit slide 12.
Workshop participants appreciate the opportunity to consider this slide and discuss it with their neighbours. Subjects vary; it is often necessary to begin with ‘adoptive learning’ before ‘adaptive learning’ is undertaken. However, the teacher’s aim should be to stretch and challenge students, so assessment tasks should be designed to assess adaptive learning whenever possible.
This slide relates to the concept of higher order learning. Higher order learning is active as it involves processing knowledge. Learning through understanding occurs when learners are involved in working with knowledge, not only listening to the teacher. The quote (White, 1966) comes from a novel by Nobel Prizewinning Australian writer, Patrick White.
The Structure of Observed Learning Outcomes (SOLO) taxonomy can be difficult to understand, and it is advisable to critique it. Biggs differentiates phases of learning and portrays them like steps, the pinnacle of attainment being the ability to carry out challenging tasks such as ‘generalise, hypothesise, reflect’. These belong to the category of ‘extended abstract’. Here, complex relationships between areas of knowledge are applied to newly-shaped notions and discoveries (the round shape). Learning may not follow this model sequentially and is unlikely to be clear-cut. Learners naturally move between phases.
Participants need to study this figure and consider their personal interpretation before the group discusses it. Participants’ subject areas, student groups, and the level of the courses they teach will influence their approach.
Biggs’ model is useful because it acts as a springboard for considering assessment tasks designed to test a range of learning outcomes.
The issue of motivation is key to worthwhile, deep approaches to study in HE. Motivation is strongly linked to assessment (Struyven et al., 2005). If learners are looking for a short cut to qualification rather than focusing on a more profound involvement in learning, they will not benefit to the full from their university education. Enthusiasm and interesting ways of working with content help students to approach learning with more engagement, as does encouraging feedback.
Participants recognise this type of comment from their own learning histories. They seldom admit to making them, however. The problem with this type of feedback (which is very common) is that students may not be able to use it to improve. They may not know how to analyse, write logically, or why a section of their text is not relevant. They may be encouraged by knowing that their work is good, interesting, or even very good, but they may not be able to replicate that quality in future work if they do not understand what has led to these positive comments.
All these points are straightforward and it is easy to appreciate why these strategies have been recommended.
There is a bit too much material here. It might be better to divide slide 20. It’s best to get participants to consider this content and discuss it. If you simply read slides out loud, participants might just as well read them at home. No one can focus on reading a content-rich slide and listen to explanations at the same time. It’s also helpful to break content up and use the animation options to show and discuss one point at a time. Try to avoid intricate animation which will distract your audience (although you might find it entertaining yourself).
Slides 22 and 23 are normally supplied as a handout. They refer to written work such as essays and reports. This material could be studied at home and discussion could take place in a virtual learning environment (VLE).
SENLEF is ‘Student Enhanced Learning through Effective Feedback’. The 2004 HEA project generated useful materials, including case studies (Juwah et al., 2004).
The aim of this activity is to enable participants to reflect on how the approaches to providing feedback to students in Slide 25 can be integrated into their teaching. Diverse subjects can lead to interesting debates on the principles.
These short cuts have all been tried and tested. Recent projects on audio feedback show that it works well. This link provides references to a number of case studies. http://www.bioscience.heacademy.ac.uk/resources/projects/merry.aspx
Luke’s blog is authentic. It gives rise to some relevant and compelling discussion on the impact of end-of-course assessment. Many students who have succeeded in the traditional exam system like to continue with exams. However, Struyven et al. (2005) suggest that a range of modes of assessment is more likely to result in deep approaches to learning. They refer to a range of studies which confirm this.
There is some confusion about marking systems in higher education. Some assessment tasks are objective; when this is the case, for example in mathematics, a full range of marks, including very high marks, is awarded. When subjective judgements are made (in assessment tasks where a variety of responses is possible, such as essays), the allocation of marks is an art rather than a science. Often a combination of criterion- and norm-referenced approaches is used: criteria and marking schemes are applied initially, and if too many students fail or score high marks, norm-related adjustments may take place. There is a sense that a normal distribution of marks, with most examinees scoring around the 2.1 level, should occur. In fact, pure criterion-referencing, where each student’s performance is considered without regard to the performance of others in the cohort, is often regarded as best practice; an exceptional cohort of students could then legitimately achieve a lot of high scores. In the real world, however, the judgement of experienced academic staff is relied on to achieve reliability in the sense of the fair allocation of marks to students (Sadler, 2009). Double marking and external markers strengthen the impression of inter-scorer reliability.
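As a purely illustrative sketch (not part of the original workshop materials), the contrast between criterion- and norm-referenced marking can be shown in a few lines of Python. The pass mark, target mean, and raw marks below are invented for the example:

```python
from statistics import mean, stdev

def criterion_referenced(raw_marks, pass_mark=40):
    """Each script is judged against fixed criteria: the cohort's
    overall performance has no effect on any individual result."""
    return ["pass" if m >= pass_mark else "fail" for m in raw_marks]

def norm_referenced(raw_marks, target_mean=62, target_sd=8):
    """Marks are rescaled so that the cohort fits an expected
    distribution (here centred near the 2.1 boundary), regardless
    of how strong or weak the cohort actually was."""
    mu, sd = mean(raw_marks), stdev(raw_marks)
    return [round(target_mean + (m - mu) / sd * target_sd) for m in raw_marks]

cohort = [35, 48, 55, 61, 70, 82]  # invented raw marks
print(criterion_referenced(cohort))
print(norm_referenced(cohort))
```

Note how the norm-referenced function would pull an exceptional cohort's marks down towards the expected mean, which is exactly the tension with criterion-referencing described above.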
The different assessment tasks suggested here are discussed in relation to the demonstration and testing of a range of skills. Too often in the UK, writing is the mainstay of assessment tasks. When academic staff use oral assessment, such as short vivas, they are generally favourably impressed by the way they can gauge student understanding through face-to-face dialogue. In many other European countries, oral assessment is the norm. One of its advantages is that marks can be generated quickly. Perhaps shortened traditional exams could be combined with vivas.
The themes which have emerged during the session are drawn together in this slide. Ultimately, learning has to be done by students themselves. They cannot expect to be spoon-fed, as this would not achieve higher order learning. A choice of assessment task is motivating. These examples are more likely to provide students with the opportunity to approach their studies in an engaged way and develop skills and attributes which will serve them well beyond university. This approach to learning and assessment is thoroughly developed at Alverno, as outlined in slide 7.
These suggestions for a variety of ways of approaching learning and assessment can be supported by the provision of links to resources.
There are many teaching, learning and assessment resources available from US, Australian and UK universities and the Higher Education Academy, for example:
Invite questions to conclude, and link to what participants have come up with. Invite them to consider what they will be introducing in their teaching and assessment as a result of the session.
Biggs, J. (2003). Teaching for Quality Learning at University. 2nd ed. Buckingham: The Society for Research into Higher Education & Open University Press.
Mentkowski, M., & Associates (2000). Learning that lasts: integrating learning, development, and performance in college and beyond. San Francisco: Jossey-Bass.
Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D., & Smith, B. (2004). Enhancing student learning through effective formative feedback. Retrieved 13 March 2011, from http://www.heacademy.ac.uk/assets/York/documents/resources/resourcedatabase/id353_senlef_guide.pdf
Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 223-231.
Sadler, D. R. (2009). Grade integrity and the representation of academic achievement. Studies in Higher Education, 34(7), 807-826.
Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about evaluation and assessment in higher education: a review. Assessment & Evaluation in Higher Education, 30(4), 331–347.
White, P. (1966). The Solid Mandala. New York: Viking Press.
Assessment and feedback to students by Dr Rosalind Duhs is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.0 UK: England & Wales License.
Based on a work at www.ucl.ac.uk.
Contact us: email@example.com