Centre determined grading in 2021

Due to Covid-19, exams were cancelled for a second time in 2021. Centre determined grading (CDG) replaced the usual standardised assessments for qualifications like GCSEs and A levels in Wales.

Compared to the exams-based system, two key aspects of the CDG approach were more responsibility for schools and colleges for grading, and more flexibility in the approach to assessment and grading.

The CDG approach was needed in a difficult context: with exams unable to go ahead, it ensured learners could still gain qualifications and progress.

But the approach is also a case study that we can learn from. One way of learning from it is to seek to better understand the perspective of school and college staff who took on the extra responsibility and workload to deliver it.

What research did we commission?

We commissioned Opinion Research Services (ORS) to conduct a two-stage research project. A survey was disseminated to centres via exam officers and received just under 400 responses. A further 30 in-depth interviews were then conducted with survey participants.

The study design was primarily qualitative, aiming to capture in-depth insights into the perspective of centre staff. It was not designed to be quantitative, so it does not generate statistics that are representative of the teaching profession in Wales. This means we need to be careful about generalising the findings of this study to the whole population of teachers in Wales.

Centre staff perspectives will be informed by considerations that we as a regulator would be unaware of, but the opposite is also true: centre staff will not be aware of all the considerations and constraints involved in implementing an emergency substitute for a national assessment system. This limits how definitive we can be in drawing overall conclusions about the CDG approach and its implementation from these research findings. Others involved in the process might have different views and a different perspective. And different sorts of evidence would be needed to evaluate, for example, the effect of the approach on results.

What did we find?

We asked participants to explain their approach to assessment. The number of assessments organised for the purpose of grading varied substantially although these assessments are also likely to have varied in length. Some participants noted that they selected evidence for grading based on what they felt would be acceptable for the awarding body. Respondents tended to grade for the learners they taught and pick a sample for a colleague in the centre to moderate. Most stated that it was easy to agree grades with colleagues during internal standardisation and that they felt this operated as an effective quality assurance process.

Where staff collaborated with other centres, respondents reported additional benefits, such as codeveloping centre policies, adopting effective practice, and gaining confidence in grading. Nonetheless, participants commented that grading with other centres did not work as well when it increased the time pressure on staff and lacked organisation. 

Although the research highlighted some positive aspects of CDGs, participants also raised fundamental concerns with this sort of approach to assessment, as well as some concerns about implementation. For example, centre staff pointed to potential inconsistency in approaches to assessment and grading across centres, leading to possible unfairness for learners. Participants tended to have greater confidence in the grades awarded in their centre than they did in grades awarded in other centres. Further, although views on the manageability of the process were mixed, most said centre determined grading took a substantial amount of time and impacted both their teaching time and their personal wellbeing. There were also some criticisms of aspects of the process, such as external quality assurance, the absence of new unseen assessments based on the usual standardised assessments, and limited or delayed guidance and training.

What did participants think about the future?

Having experienced CDGs, participants varied in their opinions about future involvement with grading when qualifications are next reformed.

Most of those who said that they would like greater involvement in the future felt that greater weighting given to teacher assessment would increase fairness and equity.

Those who opposed greater involvement felt that external assessments would be fairer and more equitable to learners, as they are more impartial and more consistent. They also argued that external assessment is more manageable for teachers and centres.

Overall, the degree to which opinions varied in this study is not surprising to us as it reflects the benefits and drawbacks of greater involvement of teachers, and the differences of opinion that exist on the topic, especially in a high stakes assessment context. Proponents and opponents of greater involvement of teachers tend to draw on different fairness and equity arguments and have different perspectives on manageability for teachers and centres. These differences of opinion are important to reflect on when thinking about future qualifications.

What have we taken from the findings overall?

The findings suggest to us that while the flexibility of the CDG approach was critical to ensuring learners received qualifications in an emergency context, that flexibility led to differing and sometimes negative impacts that polarised opinions among the research participants. The findings from this research, including the impacts of the CDG approach on teachers and on teaching time, have been influential in our preference for exams returning this summer, as well as in how we have thought about contingency plans. When we think about longer term reforms to qualifications, the context is likely to be important to the acceptability of the impacts of an approach like centre determined grading. It is likely that such an extreme approach to assessment would be less acceptable outside of an emergency context.

Acknowledgements

I would like to thank participants in the research for their time and for the insights they provided. I would also like to thank all those involved in delivering qualification grades in 2021, including centre staff and awarding body staff, for the hard work to ensure that a solution was found for awarding qualifications in difficult circumstances.

Full report and summary report.

By Tom Anderson, Head of Research and Statistics