Academic Digital Event

Assessment for Learning #iblc10


Mark Russell, Dominic Byrne

Issues in the sector with assessment…

Assessment is one of the drivers of learning; all learning leads to assessment.

End-loaded assessment, lack of student engagement in the course itself, and rising student numbers. The summative assessment model has largely pushed out formative assessment, so the chances of low-stakes assessment, feedback and feed-forward opportunities are reduced – and students don't get a great experience from this.

Uni of Herts perspective – support transformational change across the institution, not just the ESCAPE project – what are the barriers stopping change? Provide guidance and resources to practitioners so they can use them without having to be as engaged as the central teams are. Transcend disciplinary boundaries, which are usually very strong – Blended Learning conferences are unusual in this respect…

Trying to support the whole institution, not just the enthusiasts. Help staff make transformational changes, especially around assessment. Hints and tips, some quick how-tos… needed to build on the things that work (so look at the literature).

Good assessment:

  • Engages students with the assessment criteria
  • Supports personalised learning
  • Ensures feedback leads to improvement
  • Focuses on student development
  • Stimulates dialogue
  • Considers student & staff effort.
    • The principles aren't hard to find – Graham Gibbs, David Wheat, etc.
    • Two Centres for Learning – Uni of Herts & Oxford Brookes.
    • Liz McDowell, Assessment for Learning – esp. student characteristics, etc.
    • For those interested in using them it's confusing – which set to use?! And are they institution-specific?
  • Now six themes of good practice (see above!) – supported by all the other principles named above… Engaging not just at the hints-and-tips level but at a conceptual level.
  • The magic number is around seven – short enough to be remembered…
  • Toolkit approach – one-to-one or workshop-type sessions. Stimulates dialogue and conversation, and raises awareness of assessment practices.
    • Assessment Learning Diagnostic. Appreciative approach (what are you doing well, and what could you do more of?) but essentially: what is the problem that you have? Uses the questions from the other principles (e.g. Gibbs, NSS, etc.) – stimulates thinking about what good practice looks like… identify 'the biggest hole in the bucket' to focus on first…
    • Features & Consequences – helps staff understand what their diagnostic might mean… Ties into bigger debates/strategies re: retention.
    • Collected case studies of what has happened with assessment; staff don't know what will be done with the data they hand in, so track (using Word comments) how it fits within the big six themes.
    • Not expecting everyone to want to engage – so use 3-minute TIPs (Themes In Practice), produced with Camtasia.
  • About people and barriers/boundaries. Not all were volunteers – some were volunteered by Heads of School… So used 'appreciative enquiry' – look at and grow the positive. Not here to 'fix a problem'; wanted to audit what was good about what people were doing and grow that – improve or increase the elements that work. Very reflective on practice. Mapped those to the themes… Each gave a short presentation about what their module was about… Helen started to identify how it mapped to a theme… Fishbone analysis and timelines…
  • Ideas came from the tutor about what was and wasn't working; started to look at the strengths of sports students and found assessments that built on those strengths (sociable, practical, etc.). Students undertook Belbin tests… Wiki discussion working really well – the tutor can see the discussions, and had a fixed time each week to log in and see what was going on. Tutor using wikis, videos, discussions, etc. – he had support initially and is now doing it himself. Students can critique the video… e.g. a student says "I don't understand x" – students and tutor can help each other. Ongoing engagement. He nearly didn't do it as it felt too risky – it would have been easy to leave things as they were, but he felt confident of support, so was ready to try it.
  • Respond to the challenge at a local level (module leaders effect the change – take the risk to make the change, and have ownership of it).
  • Hard to get people to reflect on their own practice in a way that will effect change.
  • USE TECHNOLOGY TO SUPPORT THE PEDAGOGY. All are passionate about the subject, but not about the technology – so you have to show them that it works…


  • Can access StudyNet; ask via Cloudworks.
  • JISC-funded project; the intention is to provide the community with resources – available on StudyNet.
  • The module worked on – was it a module with a high failure rate? Student engagement and staff time were the issue – what is the evidence that it worked? Got congratulations on exam performance, etc. – the bottom-line things that others are interested in! (The issue wasn't failure rate; it was lack of attainment.)
  • Drivers – doing things better in less time. Satisfaction, being able to provide good feedback and getting to know our students.