Categories
Digital

Essay Feedback via Screencasting

Interesting – I have seen a few people experimenting with on-screen marking, and the general consensus was that it took longer – but this article indicates it is actually starting to save time:

http://www.sxc.hu/photo/1403785

A simple, inexpensive technique allowing lecturers to record personalised video feedback while marking papers is being used to improve the student experience, and could be scaled up to help online learners.

Lecturers at Cardiff Metropolitan University are using screen-capture technology to give students the impression of being present during the marking process, allowing more targeted feedback to be given.

Using widely available computer recording programs, many of which are free, lecturers record themselves marking students’ papers, explaining where errors have been made and showing corrections. They then upload the footage to their university’s learning platform or email clips direct to students.
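
The workflow the article describes – record screen and microphone while marking, then produce one clip per student for upload or email – could be sketched as below. This is a hypothetical illustration: the tool choice (ffmpeg on Linux/X11), device names, and file-naming scheme are my own assumptions, not details from the article.

```python
# Hypothetical sketch of the marking-screencast workflow: capture the screen
# and microphone while marking, naming the clip per student so it can be
# uploaded to the VLE or emailed. The ffmpeg flags are a real Linux/X11
# invocation, but the tool, devices and naming are illustrative assumptions.

def recording_command(student_id: str) -> list[str]:
    """Build an ffmpeg command that records screen + mic into one clip."""
    return [
        "ffmpeg",
        "-f", "x11grab", "-i", ":0.0",     # capture the desktop
        "-f", "pulse", "-i", "default",    # capture microphone audio
        "-c:v", "libx264", "-c:a", "aac",  # widely playable encoding
        f"feedback_{student_id}.mp4",      # one clip per student
    ]

cmd = recording_command("st1234")
```

The command list could then be launched with `subprocess.run(cmd)` when marking starts, keeping the per-student naming consistent for later upload.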

Read full article.

Categories
Academic

Feedback: Before, During, After

http://www.sxc.hu/photo/866529

I’m currently working on a project called ‘FASTECH’, funded by JISC, so any stories on Feedback/Assessment are of interest. I’ve also been working on a module called Manipulating Media for the past 18 months, which we’ve developed with clear expectations of the assignment, ongoing ‘consultancy’, and a mix of peer/tutor feedback:

In a paper entitled “Reconceptualising assessment feedback: A key to improving student learning?”, published in the latest issue of Studies in Higher Education, the researchers say that a “fault line” exists between secondary and tertiary education.

In particular, they say that young people develop a set of expectations about academic support as a result of their experience at school, but when they get to university, these expectations are shattered by what is on offer.

To address this, the authors advise that the first year of higher education should be viewed as a transitional stage between the supported learning provided in secondary education and the independence currently expected at university.

During this year, students should be given “preparatory” guidance before an assignment, “in-task” guidance during the project and “performance feedback” at the end.

The authors, Chris Beaumont and Michelle O’Doherty of Edge Hill and Lee Shannon of Liverpool Hope, say universities should change their approach from isolated “events” of summative performance feedback to a continual “guidance process”.

This should include a greater emphasis on verbal and one-to-one interaction between tutor and student, they say. They also suggest that feedback should be standardised to a greater degree.

Read full story.

Categories
Academic Digital

JISC e-Learning Webinars: Making Assessment Count

Friday 3rd February 2012 1-2pm

Online via Blackboard Collaborate

Presenters: Professor Peter Chatterton (Daedalus e-World Ltd) and Professor Gunter Saunders (University of Westminster)

The objective of Making Assessment Count (MAC) is primarily to help students engage more closely with the assessment process, either at the stage where they are addressing an assignment or at the stage when they receive feedback on a completed assignment. In addition, an underlying theme of MAC is to use technology to help connect students’ reflections on their assessment with their tutors. To facilitate the reflection aspect of MAC, a web-based tool called e-Reflect is often used. This tool enables tutors to author self-review questionnaires for students. On completion of an e-Reflect questionnaire, a report is generated for the student containing responses linked to the options the student selected on the questionnaire.
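
The idea of a report whose lines are linked to the options a student selected could be sketched roughly as follows. This is not the real e-Reflect tool – the questions, options, and canned feedback lines are invented for illustration.

```python
# Minimal sketch of an e-Reflect-style self-review questionnaire: tutors
# author questions with pre-written feedback per option, and a report is
# generated from whatever the student selected. All question and feedback
# text here is an invented example, not taken from the real e-Reflect tool.

QUESTIONNAIRE = {
    "planning": {
        "question": "How far ahead did you start planning the assignment?",
        "options": {
            "early": "Good – starting early leaves time to act on feedback.",
            "late": "Try drafting an outline in week one of the next assignment.",
        },
    },
    "feedback_use": {
        "question": "Did you revisit feedback from your previous assignment?",
        "options": {
            "yes": "Excellent – keep linking old feedback to new work.",
            "no": "Book a tutorial to walk through your last feedback sheet.",
        },
    },
}

def generate_report(selections: dict[str, str]) -> list[str]:
    """Return the feedback lines linked to the options the student selected."""
    report = []
    for key, choice in selections.items():
        item = QUESTIONNAIRE[key]
        report.append(f"{item['question']} -> {item['options'][choice]}")
    return report

report = generate_report({"planning": "late", "feedback_use": "yes"})
```

The point of the design is that reflection stays cheap for the tutor: the feedback lines are authored once, and each student’s report is assembled automatically from their selections.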

The session will provide an overview of MAC and highlight some of the variant MAC processes being developed by six different universities, as well as drawing out the strengths and weaknesses of MAC. There will be a demonstration of how the e-Reflect tool works, but the presenters will also show how MAC can work without it. Participants will be asked for their views on the affordances offered by MAC, as well as their input into identifying barriers and enablers in applying MAC in their own institutional and subject contexts.

The webinar is free to attend.

The Webinar

I joined about 20 minutes into the event, after teaching ‘Social Media for Job Hunting’. Thanks to @sarahknight for sending me the login details, which hadn’t arrived!

MACE: https://sites.google.com/a/staff.westminster.ac.uk/mace/home

The immediate reaction from staff is that the workload for learning journals is likely to be high, but they find that a few comments actually don’t take that long, especially in comparison to the improvement demonstrated by students.

Offers the opportunity for small, but detailed, positive feedback suggesting actions… rather than “So..?” as a typical written comment on an assignment.

What should the balance be between ‘e’ and face-to-face feedback?

My comment:

I’m interested to see how coaching practice, etc. is impacting upon how things work. With manipulating-media.co.uk we give them ‘consultancy sessions’ as a group as feedforward, before they submit their assignments, and they write a reflective blog post every week. They tend to use FB to connect with each other. I agree that we should look at the ‘e’, but ultimately it’s about ensuring that it meets the needs of the course.

Some chat comments:

  • The record of what everyone says is increasingly important, a big advantage of ‘e’.
  • Is dialogue about scaffolding or about generating cognitive conflict? These are different processes and different models of dialogue – is all dialogue equally productive?
  • Awarding micro grades for demonstrating action on feedback

Categories
Academic Digital Event

Effective Assessment in a Digital Age Workshops @jisc

From challenge to change…

Using principles of good practice, work with colleagues towards an effective model for the use of technology in assessment and feedback.

A series of free workshops based around the JISC Effective Assessment in a Digital Age publication and associated online resources will take place during January–March 2011. Workshops will be held in London (20 January 2011), Birmingham (16 February 2011), Bristol (March 2011, date to be confirmed) and Newcastle (24 March 2011). The JISC e-Learning Programme will be working in partnership with the JISC Regional Support Centres on these events.

These workshops will be exploring how the use of technology in HE and HE in FE, linked to principles of good practice in assessment and feedback, can help promote more effective learning. These workshops, which draw on the work of recent JISC-funded projects as well as related significant developments in the area of assessment, will have a practical, hands-on flavour with a focus on how to move from current challenges towards sustainable change.

The workshops will be suitable for:

  • Lecturers, tutors and course leaders who design assessment and feedback for their learners on HE-level courses
  • Intermediaries with a role in supporting practitioners with assessment, and technology-enhanced assessment (learning technologists, e-learning/ILT champions, staff developers, educational developers, academic registry)

Further information together with the registration form for the London workshop is now available from www.jisc.ac.uk/assessworkshops

Original copy: @jamesclay

Categories
Academic Digital Event

Assessment for Learning #iblc10


Mark Russell, Dominic Byrne

Issues in the sector with assessment…

Assessment is one of the drivers of learning; all learning leads to assessment…

End-loaded assessment, lack of student engagement in the course itself & rising student numbers. The summative assessment model has largely pushed out formative assessment… opportunities for low-stakes assessment/feedback & feed-forward are reduced – and students don’t get a great experience from this.

Uni of Herts perspective – support transformational change across the institution, not just the ESCAPE project – what are the barriers stopping change? Provide guidance & resources to practitioners so they can use them without having to be as engaged as the central teams are. Transcend disciplinary boundaries… which are usually very strong – Blended Learning conferences are unusual…

Trying to support the whole institution, not just the enthusiasts. Help staff make transformational changes, especially around assessment. Hints & tips, some quick how-tos… needed to build on those things that work (so look at the literature).

Good assessment:

  • Engages students with the assessment criteria
  • Supports personalised learning
  • Ensures feedback leads to improvement
  • Focuses on student development
  • Stimulates dialogue
  • Considers student & staff effort.
    • Not hard to see the principles – Graham Gibbs, David Wheat, etc.
    • 2 Centres for Learning – Uni of Herts & Oxford Brookes.
    • Liz McDowell, Assessment for Learning – esp. student characteristics, etc.
    • Hard for those interested in using them – confusing: which to use?! And are they institution-specific…
  • Now 6 themes of good practice (see above!) – supported by all the other principles, etc. named above… Not just engaging at the hints & tips level, but at a conceptual level.
  • Magic number around 7 – can be remembered…
  • Tool Kit approach – 1-2-1 or workshop type sessions. Stimulates dialogue & conversation, and raises the perception of assessment practices.
    • Assessment Learning Diagnostic. Appreciative approach (what are you doing well & what could you do more of?) but essentially: what is the problem that you have? Uses the questions from the other principles (e.g. Gibbs, NSS, etc.) – stimulates thinking as to what good practice looks like… identify which is ‘the biggest hole in the bucket’ to focus on first…
    • Features & Consequences – helps staff understand what their diagnostic might mean… Tie into bigger debates/strategies, etc. re: retention.
    • Collected case studies as to what has happened with assessment – staff don’t always know what is to be done with the data they hand in – then track (using Word comments) how this fits within the big 6 themes.
    • Not expecting everyone to want to engage – so use 3 minute TIPS (themes in practice). Camtasia stuff.
  • About people, barriers/boundaries. Not all volunteers – some were volunteered by Heads of School… So used ‘appreciative enquiry’ – look at & grow the positive. We’re not here to ‘fix a problem’, but wanted to do an audit of what was good about what they were doing & grow that – improve or increase elements that work. Very reflective on practice. Mapped those to the themes… Each gave a short presentation about what their module was about… Helen started to identify how each mapped to a theme… Fishbone analysis & timelines…
  • Ideas came from the tutor about what was/wasn’t working, and started to look at the strengths of sports students & found assessments that built on those strengths (sociable, practical, etc.). Students undertook Belbin tests… Wiki discussion working really well – can see the discussions – he had a fixed time of the week to log in & see what was going on. Tutor using wikis, videos, discussions, etc. – he had support initially and now the tutor is doing it himself. Students can critique the video… e.g. students say “I don’t understand x” – students/tutor can help each other. Ongoing engagement. Nearly didn’t do it as it felt too risky – would be easy to leave it as it was, but felt confident of support so felt ready to try it.
  • Respond to the challenge on a local level (module leaders effect the change – take the risk to make the change – have ownership of the change).
  • Hard to get people to reflect on their own practice in a way that will effect change.
  • USE TECHNOLOGY TO SUPPORT THE PEDAGOGY. All passionate about the subject, but not about the technology – so have to show them that it works…
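
The ‘biggest hole in the bucket’ idea behind the diagnostic above could be sketched as below: staff rate their module against the six themes of good practice and focus first on the weakest one. The theme names come from the list above; the ratings and scoring scale are invented for illustration.

```python
# Hedged sketch of the 'biggest hole in the bucket' diagnostic: rate a module
# against the six themes of good assessment, then pick the weakest theme to
# work on first. Theme names are from the ESCAPE notes above; the 1-5 ratings
# here are invented for illustration.

THEMES = [
    "Engages students with the assessment criteria",
    "Supports personalised learning",
    "Ensures feedback leads to improvement",
    "Focuses on student development",
    "Stimulates dialogue",
    "Considers student & staff effort",
]

def biggest_hole(scores: dict[str, int]) -> str:
    """Return the lowest-rated theme - the first one to focus on."""
    return min(scores, key=scores.get)

ratings = dict(zip(THEMES, [4, 3, 2, 4, 1, 3]))  # illustrative self-ratings
focus = biggest_hole(ratings)  # -> "Stimulates dialogue"
```

The appreciative framing in the notes (grow what works) would sit alongside this: the scores identify where to focus, not a verdict on the module.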

QUESTIONS

  • Can access StudyNet, ask via Cloudworks.
  • JISC-funded project; the intention is to provide the community with stuff – it is available on StudyNet.
  • The module worked on, was it a module that had a high failure rate? Student engagement & staff time was the issue – what is the evidence that it worked? Got congratulations on exam performance, etc. – the bottom line things that others are interested in! (Wasn’t low failure rate, was lack of attainment)
  • Drivers – doing things better in less time. Satisfaction, able to provide good feedback & get to know our students.