
The future means change for High Stakes Assessment

Written by Lucy Glynn | Mar 23, 2022 1:30:10 PM

In February, the Westminster Education Forum (WEF) brought together a group of experts from across the education industry to discuss the future of assessment in England. Ian Castledine, RM’s Head of Proposition, was on the panel, and I recently caught up with him to discuss some of the highlights.

Why do you think that forums like WEF are so important?

I think platforms like the Westminster Education Forum are important, mostly because you're getting a range of different views from a range of different sources. As a result, there is healthy discussion and debate that ultimately leads to better decision-making.

For example, one of the topics that came up at the forum was coursework, specifically its place within the assessment landscape across England. One speaker was so compelling that you could come away thinking, “why would anybody in their right mind think that adding coursework to English assessment was a good thing?” However, an equally distinguished expert posed a solid opposing argument, and that allowed for an open conversation about its value.

My personal view is that coursework is such an important thing to consider building into your assessment landscape. The risk has always been around trust: how do you know that your students aren’t going and seeking help, which would result in you assessing somebody else’s work, not your students’? However, if you use coursework with careful consideration, or correlate it with results from multiple methods of assessment, you can start to triangulate your students’ true ability and build a more accurate picture.

To me, forums like WEF are crucial for hearing the breadth of debate and discussion, and for allowing you to make informed judgments.

The discussion at the forum was centred around the future of high-stakes assessment in England – what did you take away from the debate?

What I took from the whole forum is that there is industry-wide recognition that the pandemic has accelerated change, and that there is a willingness to change amongst educationalists. But equally, there's a call from the industry not to throw the baby out with the bathwater. What I think is meant by that is that just because summative exams have their limitations doesn't mean they are without use in the process of assessment.

For example, there is an emerging debate around the future of GCSEs. The GCSE system in England dates back nearly 40 years, and a lot has changed, both in the skills that we want students to have and in our understanding of adolescent brains and the impact of mental health challenges. Does the current mode of GCSE assessment reflect that learning – or could it build on some of the learning from approaches emerging in technical and vocational qualifications?

But as you look at using technology to support more creative means of assessment, there are challenges to be addressed, such as digital poverty. A view expressed across the panel was the importance of establishing assessment that is increasingly fair, increasingly inclusive, and that best supports all learners.

And to that point, do you have a strong opinion?

Absolutely. We [RM] see value in a range of assessment styles and methodologies being used to lead to qualifications. I personally think that the broader the range of evidence you've got about candidate ability, the better. It both validates and consolidates the authenticity of the result.

The question should always come back to: what is the purpose of this assessment? Qualifications should be aligned to the thing that happens next for a student, whether that be a GCSE, A level, or a vocational or professional qualification. It needs to make sense and be applicable to the next phase of that student’s journey. Whether you’re an employer, a college admissions officer, or using assessment for any other purpose, having an assessment system that gives you a fair and accurate view of what you need to know about a student is so important. And for students, having an experience of learning and assessment that has purpose will lead to more engaged and better-skilled individuals with the best chance of success in the future.

What's interesting for me is being a part of the conversation and seeing the changes. Let’s take the New Zealand Qualifications Authority (NZQA), for example. They are on a journey of transforming their assessment processes through digital assessment, building on some significant changes to their approach to learning and assessment made a few years ago.

Assessment for them is no longer simply a quantitative measure. Instead, assessors look at all the evidence in the responses that students give in their assessments and consider whether they are meeting or exceeding the required standard.

They have learned to use data and insights to look at how students respond to questions and to adjust their methodology, ultimately leading to improved educational outcomes. What is important to note in this case is that it takes time. NZQA have been working incrementally with RM over a period of years to adopt digital assessment and have engaged stakeholders consistently throughout – including teaching staff and students. They have enabled schools and students to opt into digital assessment when it was right for them to do so. Such a shift cannot be rushed.

In education, leaders often think and talk about the need to develop 21st-century skills. Does our high-stakes assessment provision reflect the same approach?

In some cases, yes.

My son is prepping for his A levels now. If I think about his computer science course as an example, he is certainly learning skills in programming and software design which, if he were to start work at RM next week, are inherently transferable and would give him a strong grounding. It is interesting to consider how different subject areas need to adapt to meet the emerging needs in the world and in the workplace; maybe there is a place for more open-book exams which enable students to research and process information on the fly, more typical of the skills that are needed in the workplace. What about the development and assessment of softer skills, so highly regarded by employers?

Is it right that we force people back to pen and paper for longer exams? Personally, the way that I process information, refine my thoughts, and reach a final piece of work is iterative. Adopting on-screen assessment is one way of enabling that type of 21st-century skill to be used and assessed, moving on from the analogue limitations of pen and paper.

Of course, in other areas simply trying to replace paper is not the answer, and RM are working with several organisations to think through how different subject areas which lend themselves well to pen and paper could be digitised – again to better reflect the emerging needs and skills expected in working life.

Going back to my earlier point, the qualification needs to be applicable to the next phase in a student’s life, and therefore so does the assessment process. Your result should not necessarily be affected by poor memory recall or poor writing skills if that is not the purpose of the assessment. For example, if you are being assessed in computer science, and the goal is to make sure that you are ready for a career in that area, the main thing an employer is looking for is that you have the skills to excel in the workplace.

As you highlight, the conclusion from the panel was that “change was needed”. In your opinion, where do we start?

I'll come back to the answer that I gave when I spoke at the forum, which was: We start by engaging all the stakeholders. Work out how you engage all your stakeholders and keep them engaged.

That includes the students and the teachers, because young people must be at the heart of the conversation. For examinations that are so high profile and central to the student experience, learners must be fully included in the path to change. Having their input opens up the conversation around purpose and helps to determine where technical solutions can bring fairer, more inclusive, and more relevant assessments for everyone.