Blog: Beka Goh, Design Project Lead
Formative Assessment – understanding students’ misconceptions
Understanding misconceptions is at the heart of formative assessment. Beka Goh, Mathematics Mastery’s Design Project Lead, explains our new partnership with Eedi as we start to offer new diagnostic tests to our partner schools.
Evidence points to high-quality formative assessment producing the greatest learning gains for students. However, the terms assessment for learning (AfL) and formative assessment are used to describe a broad range of practices, some highly effective and some less so. In this blog, we’ll look at what is meant by formative assessment, some practices that are effective and how our new assessment offer fits with this. We’ll also consider the role of summative assessment and how we’ve aligned this to our curriculum.
What do we mean by formative assessment?
Dylan Wiliam, whose seminal work with Paul Black (Inside the Black Box, 1998) made formative assessment a priority for policy makers and schools alike, draws a distinction between AfL and formative assessment. He says that assessment for learning – where the intention is to help you meet students’ needs better – only becomes formative assessment when the evidence you gather is used to adapt teaching to address students’ needs. He says, “if you’re not using the evidence to do something that you couldn’t have done without the evidence, you’re not doing formative assessment.” (‘Assessment for Learning: What, why and how?’, 2006). Formative assessment is when you actively use evidence of student learning to make professional judgements about what to do next in your teaching.
What makes effective formative assessment?
If we’d played a game of word association before you read this, I’d bet that if I’d said ‘assessment’, the word ‘dialogue’ wouldn’t have immediately sprung to mind. Perhaps more likely would be words such as examination, data, grades, spreadsheets, tracking or progress. Dialogue is also something that maths teachers may not traditionally associate with their subject. Yet it is by giving students opportunities to talk, and by listening carefully to what they say, that we gather some of the richest data on their understanding, in order to influence our next moves.
In a lesson plenary, I once saw a teacher using a diagnostic question (one designed to identify and understand students’ mistakes and misconceptions), of the multiple-choice variety found on Craig Barton’s popular website. The teacher posed the question, had students write their chosen answer (A, B, C or D) on mini-whiteboards and then asked students to hold them up. There were a range of answers, approximately half correct, and the rest distributed across the three remaining answers, each aimed at diagnosing a specific misconception. The teacher asked a student with the correct answer to explain how they knew it was correct, and then the lesson ended. As far as I could see, the teacher wrote nothing down, and so it was unclear how they might have used the students’ responses to move learning forward. The intention was there, but not the execution.
Now imagine if that teacher had instead withheld the correct answer and given students with all four answers the opportunity to justify why they believed their answer to be correct. By listening to their explanations, the teacher would gain useful insights into students’ understanding, as well as being able to probe them with timely questioning. The data from this dialogue could then inform the teacher’s next move: which questions to ask (and to whom), which tasks to choose, and which examples to give. By engaging in dialogue with each other, those speaking would become more aware of their own reasoning and level of understanding and some might become aware of their own mistakes. Those listening would also become more aware of their own understanding, by comparing others’ ideas with their own. Is there a danger here that some students who provided correct answers initially might change their minds, following another student’s faulty reasoning? Perhaps. But this itself informs teachers that the student wasn’t so secure in their understanding after all, once again providing food for thought about what the next steps might be.
Of course, the quality of the question and its multiple-choice answers is crucial in determining how useful the gathered data is for formative purposes. Then there’s the quality of the evidence we gather from the class. Dylan Wiliam regularly comments on this point, saying that correct answers from confident, articulate students are not a good indication of what is happening in the heads of all students in the class, yet it is often these students who are most willing to share their thoughts. So how can we gather more information from more students? Beyond classroom dialogue, there are many ways of improving the quality and equity of the data we collect, from mini-whiteboards to lollipop sticks, and more recently technology has begun to offer new ways to gain insight into our students’ thinking.
Mathematics Mastery has recently teamed up with edtech organisation Eedi. Co-founded by Craig Barton, Eedi is an online platform that uses diagnostic questions to identify student misconceptions. The platform uses multiple-choice questions (MCQs), mapped onto our scheme of work, to provide teachers with useful student data that they can act on.
As an example, consider a multiple-choice question on adding two fractions:
There are a number of features of this question that are worth noting:
• Each ‘distractor’ is designed with a common misconception in mind. For example, a student who selects option ‘A’ may think that the sum can be calculated by summing the numerators and the denominators.
• There are two correct responses, expressed in different ways.
• The question is drawing from a fairly narrow domain. Making the question specific enables the feedback to be focused and effective.
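The original question isn’t reproduced here, but a hypothetical question of the same kind makes the misconception concrete. Suppose the question asked for the sum of one third and one quarter (the specific fractions are our assumption, not taken from the actual Eedi question):

```latex
% Hypothetical diagnostic question: \frac{1}{3} + \frac{1}{4} = ?
% Distractor A -- the misconception of summing numerators and denominators:
\[ \frac{1}{3} + \frac{1}{4} \neq \frac{1+1}{3+4} = \frac{2}{7} \]
% Correct working, via a common denominator:
\[ \frac{1}{3} + \frac{1}{4} = \frac{4}{12} + \frac{3}{12} = \frac{7}{12} \]
% A second correct response, expressed in a different but equivalent form:
\[ \frac{7}{12} = \frac{14}{24} \]
```

A student choosing the distractor isn’t guessing randomly; they are applying a consistent (but faulty) rule, which is exactly what a well-designed diagnostic question is built to surface.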
Learners can access the tailored sets of MCQs in lessons or at home, as part of a class discussion (as described above), or during times of individual work. Students not only select the answer(s) they believe to be correct but also give a reason for their choice, which gives the teacher valuable insight into student understanding and provides a springboard for dialogue. So, MCQs can be a useful tool to expand the scope of insights that teachers can gather into their students’ mathematical conceptions.
Where does summative assessment fit with this?
One of the strategies offered by Wiliam and Hodgen in ‘Mathematics Inside the Black Box’ is to use summative tests formatively. However, Daisy Christodoulou, in ‘Making Good Progress?’, recommends proceeding with caution. She says that while summative assessments can be used to inform teaching, for example by comparing performance in a specific domain across two classes with similar ability profiles, this will only allow teachers to make general inferences about students’ performance in that domain. This is because a summative assessment uses a limited number of questions to assess each domain, making it impossible to draw a reliable inference about a student’s learning within a subcategory. Summative assessment questions are also designed specifically to distinguish between students. Christodoulou points out that this makes questions less precise and specific than they would be if they had been designed with formative assessment in mind, making it far harder to really understand what conceptions students hold simply by looking at their correct or incorrect written responses.
We therefore recognise that schools need high-quality summative assessments at certain points too, and that’s why we’re teaming up with experts from AQA to hone our own exam question writing. We’ll be using their advice to produce an end-of-year, curriculum-aligned summative assessment to support schools in making a judgement about students’ mathematical performance as a whole.
So, where does this leave us?
At Mathematics Mastery we want to help teachers get the best possible regular insights into their students’ understanding – insights that provide clear guidance for their next teaching moves. We’re excited that our partnership with Eedi will help achieve this, building on the foundation of classroom dialogue that we believe underpins a holistic approach to formative assessment.