Long-awaited data from the most recent Civil Rights Data Collection (CRDC) has been released. This new data from the 2020-21 school year provides an important, mid-pandemic update on key indicators of opportunity for students in our nation’s public schools. As a researcher and advocate, I’m excited to explore the data, understand its limitations, and dig into the new features and tools that came with this year’s release. The release also includes a new and improved data tool that visualizes CRDC data for schools, districts, states, and the country, allowing comparisons against district, state, and national averages, not just for this year but for all the biennial collections going back to 2011-12. For example, if you want to see how the number of Black students participating in dual enrollment programs at your local high school compares to the district, state, and national averages, you can do that directly with the tool. Visit civilrightsdata.ed.gov to check out the data for the schools and districts in your community.

This new data from the CRDC is some of the best available to highlight the experiences of students across all U.S. public schools. EdTrust has used past CRDC data to identify opportunity gaps for students of color and students from low-income backgrounds. With the data provided by the CRDC, EdTrust was able to:

  • Identify the varied reasons that Black and Latino students have been locked out of advanced coursework.
  • Find that, in many states, Black and Latino students are more likely to attend schools with high percentages of novice teachers.
  • Highlight that most states are not providing adequate access to school counselors for students of color and students from low-income backgrounds, especially in earlier grades.
  • Shine a light on exclusionary discipline practices that result in higher rates of suspension for Black and Native girls than for their White peers.

Along with the data tool, the Office for Civil Rights (OCR) released briefs that explain what the data says about student experiences during the 2020-21 school year and help us better understand how the data was collected and checked for quality. While the CRDC has always been, and will continue to be, a powerful tool for understanding students’ access to resources in schools, researchers and advocates should be thoughtful about how they use this year’s data. Several factors make this an anomalous year: some students attended school virtually for most of the year while others did not, and those arrangements varied across the country; school climate data looks different in virtual settings; and more data than usual was suppressed due to errors (such as seemingly inaccurate or missing data). Given the varied experiences of students learning virtually or in school buildings, here are some questions to ask when using the data:

Can I compare experiences of students by race/ethnicity to understand whether certain groups are being short-changed in specific schools or districts?

Yes. This is what the CRDC is for, and that hasn’t changed. The CRDC provides direct comparisons of in-school resources for different groups of students and is especially useful for comparing schools and districts. For example, I feel confident using the CRDC to examine differences in AP course enrollment for Black students in 2020-21 at the two high schools I attended (Career High School and Hillhouse High School).

Can I compare data across districts or across states?

Yes, but do so with caution. When making comparisons across districts or states, pay attention to missing data, especially for larger districts. For example, according to the 2020-21 CRDC, fewer than 1% of students in New York State were suspended, but New York City Public Schools’ data on students who experienced exclusionary discipline (e.g., in- and out-of-school suspensions) was suppressed for data quality reasons. This means the 1% suspension figure for New York State does not capture the one-third of the state’s students enrolled in New York City Public Schools. These patterns of missing data can skew state-level summaries, and, if there are enough errors, national summaries as well. The key is to understand what the limitations might be and to be clear about them when using the data.
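To make the New York example concrete, here is a minimal sketch, using entirely made-up numbers (not the actual CRDC values), of how suppressing one large district’s data can pull a state-level suspension rate away from the true value:

```python
# Toy illustration with made-up numbers: a hypothetical state with three
# districts, where the large city district (about one-third of the state's
# students) has its discipline data suppressed for quality reasons.
districts = {
    "Large City": {"students": 1_000_000, "suspended": 40_000, "suppressed": True},
    "District B": {"students": 1_000_000, "suspended": 8_000, "suppressed": False},
    "District C": {"students": 1_000_000, "suspended": 10_000, "suppressed": False},
}

def suspension_rate(data, include_suppressed):
    """Share of students suspended, optionally excluding suppressed districts."""
    rows = [d for d in data.values() if include_suppressed or not d["suppressed"]]
    students = sum(d["students"] for d in rows)
    suspended = sum(d["suspended"] for d in rows)
    return suspended / students

true_rate = suspension_rate(districts, include_suppressed=True)
reported_rate = suspension_rate(districts, include_suppressed=False)

print(f"True rate:     {true_rate:.1%}")      # 1.9%
print(f"Reported rate: {reported_rate:.1%}")  # 0.9%
```

With these illustrative figures, the state’s published rate understates the true rate by roughly half, simply because the suppressed district suspends students at a higher rate than the districts that reported.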

Can I compare this year’s results to prior years to examine trends?

Yes and no. The differences between 2020-21 and other years for individual schools and districts reflect real differences in students’ experiences that should be explored; comparisons across districts and states, however, are more challenging. Researchers, policymakers, and advocates can compare this new data against older data and other data sources as a “gut check” for whether the 2020-21 data seems irregular, but they should not compare it with prior years to make concrete determinations about trends. For example, in 2017-18, there were about 56,000 school counselors; in 2020-21, there were more than double that number (127,000). This is a significant shift, but the figure must be considered alongside what is known about how district leaders prioritized supporting students’ mental health needs during the pandemic and whether they will continue those supports as Elementary and Secondary School Emergency Relief (ESSER) funds expire in September.

The new CRDC is an essential tool that advocates, policymakers, and educators can use to better understand the 2020-21 educational landscape. While I look forward to digging into this year’s data to learn more about student experiences and school conditions, I will also keep the data’s limitations in mind and use it responsibly.