The new edition of the toolkit is now available from Routledge in paperback or hardback. This new edition would not have appeared without valuable work from Joe Penketh and many academic friends including Kay Sambell, Sue Beckingham, Peter Hartley, Steve McHanwell, Pina Franco, Marita Grimwood, Mark Glynn, Michelle Morgan, and Belinda Cooke, with helpful encouragement from Karen Hustler of AdvanceHE. Sally Brown made wonderful additions to the toolkit.
9th April 2025
Reflection (from my archives)
Folk told me that this little paper had gone missing from the Escalate collection, so here it is as I originally sent it, way back. It was called ‘Evidencing Reflection: putting the ‘w’ into reflection’: ev-ref-y.doc. Any feedback gratefully received, as ever.
Choosing and Using Fit-for-Purpose Assessments: Kay Sambell and Sally Brown
You might have found this already on Heriot-Watt University’s website. I think it is a great source for ideas on diversifying assessment.
Choosing-and-using-fit-for-purpose-assessment-methods-1.pdf
Change of email address! (now happened!)
Google decided to introduce steep charges for custom email addresses such as my former one, so it disappeared in September. You can still contact me via my standard gmail address, which remains free; you can obtain it by DM-ing me on Twitter @Racephil.
My website should continue as normal.
Sally’s latest podcast on assessment
Now added to the series compiled by Kevin Merry of De Montfort University is an audio podcast by Professor Sally Brown on ‘Speaking Specifically about Seminal Assessment Influences’. I think this is excellent, and readers of my website will find it fascinating. The link to Sally’s website post is: https://sally-brown.net/2022/07/18/speaking-specifically-about-seminal-assessment-influences/
Podcast by DMU about Ripples model
Kevin Merry of De Montfort University posed a series of questions to me this week about learning, feedback and assessment, and I am pleased to link this post to the recording of our discussion. Thanks, Kevin, for such interesting questions and for giving me the opportunity to respond to them. All feedback welcome. https://anchor.fm/kevin-merry/episodes/Episode-11-Ripples-on-a-Pond-with-Phil-Race-e1g66qm
Do we assess what we’ve taught, or something else?
This was the title of a little contribution I made last week to the University of Kent’s series on Digitally Enhanced Education, published on YouTube. It took me a minute or two to get into the swing of it, but I think I raised some important questions. Thanks to Dr Phil Anthony and colleagues for organising the sessions, and for making the whole webinar, including my slides and talk, available at:
Backwards to normal?
Now that UK Education Ministers are assuming that the worst of the pandemic is over, there’s talk of how to measure school-based achievement in 2022. This seems to me to be going back to using time-constrained, on-site, unseen written assessments, at least in part, with all the stress and anxiety they bring for school students. In the UK Guardian newspaper on 30 September, the minister for education, Nadhim Zahawi, said of public exams for schoolchildren in the coming year: “We are committed to rigorous standards being fairly applied, and exams are the fairest way to assess students, which is why they will take place next year”. I would argue that this is certainly a contested matter and, in my view, completely wrong.
In higher education globally, when traditional exams could no longer be on the menu in 2020-21, much energy and creativity was focused on how else we could gauge students’ learning, what else could be used as evidence of their achievement, and how this might lead to long-term improvements in education. A flavour of the wide range of possibilities is reflected in the work of Sally Brown, Kay Sambell and contributors at http://sally-brown.net/kay-sambell-and-sally-brown-covid-19-assessment-collection/; many of these approaches are directly transferable to school education.
I’ve long been aware of the limitations of traditional exams and am hoping that the fruits of many practitioners’ labours during the pandemic will inform the future shape of assessment, bringing into play the many other ways to assess than those of the old exam paradigm.
My thoughts are presently focused on the following questions:
- What are the main dangers in our attempts to quantify students’ learning, using this particular mode of assessment?
- How far are we now straying in our test conditions from the everyday learning environment (not least during the pandemic) which surrounds learning, progress and everyday practice?
- Traditional exams can, to some extent, address the question of who is actually completing the work (‘whodunit?’, as I call it), something which traditional essays, for example, handle poorly. But what do unseen, time-constrained exams actually measure, and what are their principal failings?
- How far are we still from really being able to measure how well students have succeeded in their efforts to embrace our syllabus content?
- How distant is our syllabus content from the authentic reality of what we hope students will achieve?
I plan in future posts to explore some of these questions with fresh views inspired by our experience of the pandemic, and to propose some ways forward to make sure that assessment adds value to the student experience rather than being a cause of dismay, disappointment, and disillusionment.
A better 2021?
Last year was unprecedented?
Here’s wishing you a much better one for 2021. For me, last year also brought the pain that led to a replacement hip this month, so not much sitting at a computer – but now I’m back again.
Still thinking – and a great deal to ponder about learning and assessment in particular. Are we heading in better directions at last in higher education? Waiting, watching and hoping.