Return to Diagnostic Assessment
I give students frequent formative assessments to monitor their progress with the material. These include formal biweekly (and sometimes weekly) "short cycle assessments," which I record in an Excel tracker. As seen below, this tracker shows overall mastery for individual students and for the class as a whole. That lets me target particular students for remediation when only a fraction of the class missed a skill, or re-teach the entire class when everyone needs it.
When students take their short cycle assessments on the computer, I can track their mastery by standard. As seen below, I can view class-wide mastery of each question, and each question lists the standard it tests.
I also give students daily exit tickets, which let me track their mastery day by day. I try to vary the question types on the exit tickets, depending on whether I want to test quick mastery of the mechanics of a skill or whether students can explain their thinking. I build my station-rotation groups from exit ticket data so that I can properly target students for small-group re-teaching the next day.
Learning from Mistakes
With short cycle assessments, I have students complete quiz corrections and then a reassessment. Some weekly quizzes cover fluency skills, such as the integers quiz attached below; students re-take these each week with different numbers, and I replace their quiz grade with their overall highest grade. Once they reach 100%, they no longer have to retake the quiz and they qualify for an ice cream social. So far, 10% of the class has achieved 100% on the quiz. This encourages students to reach mastery of the skill and learn from their past mistakes. Overall mastery on my weekly integers quiz has risen from 25% the first time students took it to 60% the seventh time.
I also assign projects as mid-unit assessments. For example, in the Fortnite project shown below, I had students create a table, graph, and equation for a player who wins 3 out of every 4 games that he plays. They then had to explain whether the relationship was proportional and why. While 95% of the class successfully made a table, graphed the relationship, and wrote an equation, only 25% were able to articulate why the relationship was proportional. Their performance on this project told me to focus more of my instruction on the concept of proportionality rather than the mechanics. Every day afterward, their Do Nows included a question asking whether a relationship was proportional and, if so, why.
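For reference, the math behind the project prompt can be sketched as follows (using w for games won and g for games played, labels of my own choosing rather than the project's):

```latex
% Sample table for a player who wins 3 out of every 4 games:
%   g (games played):  4    8    12   16
%   w (games won):     3    6     9   12
% The ratio of wins to games is constant, so the relationship is proportional:
\[
  \frac{w}{g} = \frac{3}{4}
  \quad\Longrightarrow\quad
  w = \frac{3}{4}\,g
\]
% The constant of proportionality is k = 3/4, and the graph is a
% straight line through the origin -- the two hallmarks students
% were asked to identify in their explanations.
```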
For example, the student whose project appears on the far left was able to write a table and graph for the relationship, but failed to explain why it was proportional or to write an equation. The example in the middle had a graph, table, and equation, but did not think carefully enough about which variable ought to be independent and which dependent. The example on the right had all of the elements, though the writing could still be more precise. Her project also showed that she thought more critically about why she was making certain decisions, such as the choice of independent and dependent variables, and she was able to fix her mistakes when she noticed them. She was learning through sitting down and working through the project.
I was able to use these observations from halfway through the unit to inform my instruction for the rest of the proportional relationships unit.
i-Ready Lesson Quizzes
When students work on i-Ready lessons, I can also track how much they are passing those lessons. This allows me to further target students for differentiated instruction in i-Ready and/or being pulled for small group reteaches of lessons they struggle to pass on i-Ready.
Student by student, I can also see by what margin students passed and how often they retook a lesson. For example, the student below had to retake the "Recognizing proportional relationships" lesson a second time in order to pass, and even then only barely passed with a 75%.