My school has a policy of using personal learning checklists (PLCs). Originally, we used ones provided by our partners, but their usefulness was limited. Last year, Tom Sherrington offered a critique of ‘can do’ checklists and I agree with much of what he wrote.
But surely there must be value in students being able to diagnose for themselves whether they know ‘stuff’ or not? So do these criticisms apply to all uses of checklists, or just to the ineffective uses described?
I think we have established a way of using checklists as effective diagnostic tools and also as tools useful for retrieval practice and establishing good independent learning methods.
To achieve this, we have written our own checklists with the guiding principle that they must be useful learning resources for students.
For this to be the case, students first need to be able to assess themselves accurately against the criteria, something not possible in many of the examples Tom gave. Secondly, there needs to be a mechanism by which students can easily get useful feedback on their assessments.
To achieve the first aim, we wrote all of our criteria (for GCSE, these were based on knowledge components of the specification) as questions or tasks (see example below). Rather than ‘can do’ lists, they become ‘do’ lists. In this way they serve as both self-assessment checklists and also useful retrieval practice.
Initially, we asked students to use the checklist as a kind of primitive SLOP, just ‘P’ really. Students would work through the checklist doing what each task asked. If they could do it correctly (self or teacher checked) they could tick or ‘green’ their first check box. If not, they would mark the box red.
For this to be effective, students must have a way of knowing if their responses are correct. This was partly achieved by making tasks knowledge focused, which removes a lot of the grey area around how good an answer is. To help with assessment, we also made knowledge organisers (KOs) that complement the checklists. By using the two together, students could easily self-test and check the accuracy of their responses.
In his piece, Tom suggested that the emphasis should move away from over-diagnosis of knowledge gaps and towards learning that plugs those gaps. The KOs that we have made also act as a learning tool, allowing students to easily find the factual information they are lacking. Where students have graded criteria as red, the KO provides the resource for them to do follow-up learning work.
On subsequent reviews of the checklist, students would review all criteria again, including those they marked green, but this time, to speed up the process, those criteria previously marked green might just be done mentally or verbally with a peer, rather than written or drawn out. However, answers to those marked red would still need to be written out. Students would then check answers and complete the second box on their checklist as red or green.
We decided that each criterion should be reviewed six times. Not because six has any special significance, but because it strikes a balance between being achievable and helping us to reinforce the idea that even previously green material needs to be revisited again and again, because it can easily be forgotten.
A further criticism of the ‘can do’ lists was that it is impossible for students to self-assess for complex statements, for example “I can explain the importance of ethical, environmental and legal considerations when creating computer systems.” I agree wholeheartedly and we have indeed done what Tom suggested would be required to be able to assess this. We have broken down complex tasks into the pre-requisite knowledge that would support a detailed response. We then use class time to model how to use the knowledge to build these complex responses.
We’ve been using our new checklists this year and they have helped students to build confidence. They work a bit like boiling the frog: they make learning accessible, giving students who need it building blocks of accessible, compliance-based learning that can be built on later. They have also helped to motivate students by providing them with ownership and a record of their learning, as well as being the effective learning and diagnostic tools described above.
I’m a bit gutted that I joined the Twitter party a little late and have only recently come across freely shared SLOP. Exactly how I see these checklists working alongside SLOP in the future I do not know; I am not sure whether they are complementary or whether SLOP does a similar thing more thoroughly. However, in the spirit of sharing, I have made all of our checklists and KOs available here if you want to try them yourself.