The Missing Link
It is that time of year again: data analysis and reporting on proficiency. It is also the time of year when everyone hopes to have good news to share, or is thinking about how to characterize the information when the news is not good.
We spend a lot of time deciding how to characterize proficiency, and we often do it with percentages. We hope to see the percentages improve, since a percentage summarizes the data in a way most people understand. Unfortunately, that number does not always tell the real story. That alone could be its own post.
However, I want to focus on something else today. What if we reported which students were not meeting proficiency? Would that be a better way of discussing the kinds of supports students need? In Indiana, I spent a lot of time each year reporting that roughly 80% of Grade 3 students met proficiency on the annual reading assessment. The message became much more powerful when we shifted the narrative to "1 in 5 Hoosier students cannot meet this threshold." Everyone seemed to believe 80% proficient was a good number until we said 1 in 5. The second framing makes everyone more uncomfortable.
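The arithmetic behind that reframing is simple, and it can be automated for any proficiency rate. Here is an illustrative sketch (the function name and wording are my own, not from any reporting system) that converts a percent-proficient figure into the "1 in N" framing:

```python
def reframe_not_proficient(pct_proficient: float) -> str:
    """Express a proficiency rate as a '1 in N' statement
    about the students who did NOT meet the threshold."""
    not_proficient = 1.0 - pct_proficient
    if not_proficient <= 0:
        return "all students met proficiency"
    # 20% not proficient -> 1 / 0.20 -> 1 in 5 students
    n = round(1 / not_proficient)
    return f"1 in {n} students did not meet proficiency"

# 80% proficient reads very differently once reframed:
print(reframe_not_proficient(0.80))  # → 1 in 5 students did not meet proficiency
```

The same rate, stated two ways: the percentage invites a glance, while the "1 in N" form invites the reader to picture an actual classroom.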
Shifting the narrative to who is not proficient also opens specific conversations about the student populations with the greatest need. Which populations contribute to the not-proficient category? Are those students receiving the right supports? Are they receiving supports at all?