
Consider two of Dylan Wiliam's five main AFL strategies from our work with the SSAT EFA project:

  • Engineering effective classroom discussion, questions and learning tasks that elicit evidence of learning
  • Providing feedback that moves learning forward

iPads offer a range of opportunities to support both of these strategies, and I have found Socrative to be a very useful app in this respect. I have used it very successfully with my Year 7s over the last week. My starting point was to think about the key terms and key knowledge that I wanted the students to gain while studying the topic (elections and democracy). I then constructed multiple-choice tests on Socrative. Each test was designed to check recall and understanding of the key information we had covered in a previous activity or lesson. Within the tests I built in wrong answers that students might commonly select (subject- and topic-specific misconceptions).

Without iPads, I would have two options. I could take the tests in, mark them and then plan my next lesson around the results (picking up on mistakes and misconceptions in the process). Alternatively, I could have students mark their own or each other's tests and try to pick out class misconceptions. The latter option is better in that misconceptions can be dealt with in the same lesson; it is worse in other ways, though, as I lose valuable lesson time and I won't pick up trends across the class unless I survey the class on every question (another huge waste of time).

What Socrative has allowed me to do is get a live feed of student answers, then break down each question in front of the class and ask why they think certain answers were or were not selected. I can target individual students to explain their answers. I can also adapt my teaching then and there to re-explain a key concept.

For example, I ran a quiz on the key terms we had learnt to do with Parliament. The live chart below shows that some questions were answered really well, others not well at all.

[Screenshot: Socrative live results chart showing the percentage of correct answers for each question]

By tapping the question number or the 'percentage correct' figure, you can access a breakdown of that question. I do this 'live' on the board. Opening the first question and tapping the 'How'd we do?' tab shows which incorrect answers were chosen. Clearly the class understood the name of the second chamber, but I was able to quickly clarify why the incorrect answers were incorrect (not just that they were incorrect).

[Screenshot: answer breakdown for the first question]

The second question showed me that the majority of the class did not know this (only 36% answered correctly). I showed the breakdown to the class (see below) and was therefore able to spend more time questioning and explaining why the incorrect answers were wrong. Crucially, the incorrect answers were all key terms linked to the overall topic in some way, rather than random wrong words. This allowed me to explore the confusion and address the misconceptions at the point of the mistake (rather than in the following lesson). Rather than simply stating that Lords are also known as Peers, it was more important for the students to understand why Lords cannot be MPs. The other two distractors prompted interesting discussion: Lords can be ministers, and judges can be Lords; however, neither 'ministers' nor 'judges' is another name for the Lords. It was important for students to understand this.

[Screenshot: answer breakdown for the second question]

Robert Bjork's research on multiple-choice questions concluded:

“…properly constructed multiple-choice practice tests can be important learning events for students. Achieving “proper construction” of such tests—which requires that incorrect alternatives be plausible, but not so plausible that they are unfair—is, however, a challenge. As any teacher who has used multiple-choice tests can testify, writing good multiple-choice items is very hard work, whereas writing poor ones is relatively easy. Thus, when people accuse multiple-choice tests of being bad tests, that accusation, statistically, has some truth to it.”

To conclude, Socrative multiple-choice quizzes are an excellent AFL strategy if the questions are structured to address student misconceptions and if the answers are analysed immediately, at the end of the test. The test can then be given again as a starter (perhaps in the following lesson) to check that the misconceptions have been cleared up.

Further reading:

This article explains Hinge Questions really well: the idea of constructing a single multiple-choice question at a key point in the lesson to assess misconceptions.

This article argues that multiple-choice questions can provide real depth and rigour if constructed properly.
