2021 Assessments: The Former Education Commissioner’s Perspective

Statewide assessments are more than just a box-checking exercise—they’re critical tools for ensuring transparency and promoting equity. Yet the 2021 assessments have been the subject of fierce debate, leaving policymakers, educators, and parents divided on the value and costs of testing students after an unprecedented school year.

We now know that most states will hold statewide assessments this year; however, state and local leaders must still determine how to effectively and equitably administer these tests. To learn more, DQC’s Allie Ball spoke to DQC Board Member Stephen Pruitt, former Kentucky Commissioner of Education and current president of the Southern Regional Education Board (SREB). Pruitt highlights the need for teacher data literacy and calls for more open conversations between state and local leaders about what we can learn from this year’s results.

Question #1: What are the biggest issues related to 2021 assessments that states must still address?

Pruitt: While there is certainly value in knowing where students stand, I understand why people are concerned about what might come out of this year’s assessments and what that will mean. It is important to remember that this conversation is steeped in the history of statewide assessments—which are typically used as post-mortems for how teachers and schools did during a school year and come with various stakes attached. People need to shift from that mindset to using assessments as a tool to understand what kids know and need right now.

States should address concerns directly and make it clear that this year will not be business as usual—that this is the “year of the asterisk.” They should be very upfront about what this data can and cannot tell us, and how it will be used.

Question #2: What can we learn from local assessments, and how should state leaders be thinking about their role in the context of this year?

Pruitt: There is a lot to learn from local assessments, and I think the idea of using them in lieu of statewide assessment this year has some merit. However, it is important to consider their level of rigor, reliability, and validity, as well as their potential for bias. Developing and scoring assessments is about a lot more than just writing questions and marking them right or wrong. Most teachers only have to take one course on assessments, and that doesn’t get into the ins and outs of developing high-quality assessments.

There is an opportunity for investment in teacher assessment literacy. When I was Kentucky Education Commissioner, we went down the road of using a performance-based assessment at the local level. We had to provide a lot of training and support in order to get a cohort of teachers that understood assessment development and scoring. There’s also a need for state leadership in supporting local assessments. South Carolina, for example, proposed using a series of interim assessments developed by the state but administered locally. This ensures that the tests have validity and reliability checks built in while still providing some local leeway.

Question #3: What should states do about this question of participation in this year’s assessments? Is there anything they should do to measure the learning of students who don’t participate?

Pruitt: State leaders first need to consider how they will report data from this year’s assessments—a large opt-out movement may impact that decision. If only 50 percent of students participate in an assessment, the results could skew much lower or higher than reality, depending on which half that is. You may not be helping anyone by sharing that information. That’s why there is a 95 percent participation requirement in the first place—as your n-count decreases, the potential for erroneous conclusions increases. Whether or not they meet that 95 percent threshold, state leaders need to get out ahead on communications and make it clear what that data means and why it is not being used for accountability.

For those students who do not participate, state and local leaders also need to think about how to best understand how these students are doing. This requires shifting our assessment mindset from summative to formative—from evaluating proficiency to diagnosing learning. States could lead in this area by developing process guides for evaluating students’ current knowledge levels, or by sharing resources to help classroom teachers develop assessments. They should not just leave it up to districts to figure out—there needs to be a relationship, and states need to show leadership.

Question #4: What are the most important pieces of information we can gain from this year’s assessment data? In what ways will this information support COVID-19 recovery efforts heading into next year?

Pruitt: There are so many unknowns about this year’s assessments—and not just about how students will perform. We don’t know how many students will participate, whether a disproportionate number from any particular student group will sit out or take the test, or how that might influence the results. The best thing that can come out of this situation is a good conversation between state and local leaders about what we can learn from the data—one that goes beyond just mean score or overall participation. That information can also help to set priorities for all the federal money they will be receiving.

I also hope that this year demonstrates that we need information beyond just performance to evaluate the type of education we’re providing students. We need to find a way to measure things that have been traditionally hard to measure. Chronic absenteeism cannot be the only measure for school quality. We should also be looking at factors like access to advanced coursework, career and technical education offerings, dual enrollment, and any other policies that sort students and create unnecessary barriers. I’m a big believer in assessments, but they’re not the only piece of data that matters.