Continuing my data-trends theme this week, I’ve realized (consciously, at least) that when quantitative data is shown to students, it’s usually a report of some kind: a report card, a progress report, test scores, etc. It’s all final information – it isn’t actionable.
Yesterday, we played a review game where students submitted answers to a Google Form, which then spat out a grade based on those submissions. The purpose was to get them thinking about the information and to give them some immediate feedback – as a class – on how they did.
Seeing the gauges, along with their standing relative to all reported classes, helped students visualize where they fell on a preparedness scale in context.
It also helped the class see which specific standards they needed to go back and review. I didn’t want individual information for this activity because the goal was generalized, actionable information.
Michael Pershan prompted my thinking this morning with his video (and comments) entitled, “Why Our Hints Don’t Help.” It’s extremely insightful (go take a look) and really helped me think through how I talk about quantitative information. It should be a call to action, not just a report.
It also changes the way students see these scores. They aren’t final anymore – each is a spot check on their progress, a reality check for their perception of how well they have learned the information. It also leads to more good questions like, “Where can we find more information on [X]?” It’s a visual prompt, more than anything, which helps set the course for subsequent learning.