Supporting Harvard’s Ranking of State Charter School Performance

Students participate in a writing class at KIPP Memphis Collegiate Middle School in Tennessee.

In November 2023, Harvard’s Program on Education Policy and Governance released a new state-by-state ranking of charter school students’ performance on the National Assessment of Educational Progress (NAEP), also known as the Nation’s Report Card. This ranking is based on charter students’ scores on 24 NAEP tests in math and reading from 2009 to 2019. It is the first ranking to compare charter student performance to the performance of all students in the United States.

The ranking has received positive feedback overall. According to one news report, the head of the KIPP Foundation, the nation’s largest charter school network, says the results “confirm our experience.” The National Alliance for Public Charter Schools comments that the new data are “sobering in many respects,” indicating that charter schools in many areas have room for improvement. Charter leaders in high-ranking states such as Alaska, Massachusetts, New Hampshire, and Oklahoma have likewise embraced the results. Some critics have not attacked the ranking’s methodology directly but have instead objected to their home state’s placement. One charter school advocate from Arizona, for example, dismissed the ranking on the grounds that the NAEP is administered only to a randomly selected sample of students, even though NAEP’s sampling methods are comparable in reliability to those used by the U.S. Census Bureau.

In a recent blog post on NextSteps, Matthew Ladner expressed doubts about Harvard’s findings. Ladner argues that the ranking did not take into account the proportion of charter students in special education programs or those who are English Language Learners. He also suggests that charter students should have been ranked on state proficiency tests rather than the NAEP. The Harvard researchers respond to these criticisms in the technical version of their paper, published in the Journal of School Choice.

The Harvard researchers counter that they did account for special education and English Language Learner status in their analysis. They adjusted NAEP scores for factors such as test-taker age, parents’ education levels, gender, ethnicity, English proficiency, disability status, eligibility for free and reduced-price school lunch, student-reported access to books and computers at home, and location. They also point out that using multiple tests over an 11-year period enhances the reliability of the results. Ladner suggests using the Stanford Education Data Archive (SEDA) instead, but the researchers argue that the NAEP tests are preferable because of their consistency and their freedom from high-stakes testing pressures.
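For readers who want a concrete picture of what this kind of covariate adjustment involves, the sketch below is a deliberately simplified illustration, not the model used in the Journal of School Choice paper. It builds synthetic student-level data and regression-adjusts scores for a handful of the background characteristics named above; the variable names, the data, and the single-equation setup are all assumptions made for illustration only.

```python
# Illustrative sketch only: a simplified covariate adjustment of test scores
# on synthetic data, not the actual model from the published paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# Hypothetical student-level records with a few of the covariates named above.
df = pd.DataFrame({
    "score": rng.normal(250, 30, n),                   # NAEP-style scale score
    "state": rng.choice(["AK", "MA", "NH", "OK"], n),  # student's state
    "age_months": rng.normal(170, 4, n),               # test-taker age
    "parent_ed": rng.choice(["hs", "some_college", "ba"], n),
    "frl": rng.integers(0, 2, n),                      # free/reduced-price lunch
    "ell": rng.integers(0, 2, n),                      # English Language Learner
    "iep": rng.integers(0, 2, n),                      # disability status
})

# Regress scores on student background plus state indicators; the state
# coefficients then reflect differences net of those background factors.
model = smf.ols(
    "score ~ age_months + C(parent_ed) + frl + ell + iep + C(state)",
    data=df,
).fit()

print(model.params.filter(like="state"))  # covariate-adjusted state contrasts
```

The real analysis pools 24 tests across 11 years and many more controls; the point of the sketch is only to show how “adjusted” comparisons differ from raw score comparisons.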

The researchers acknowledge that their rankings are not the final verdict on charter school quality, but they hope that accurate, straightforward criticism will contribute to the ongoing improvement of charter school assessments.

Paul E. Peterson is a professor of government at Harvard University, director of its Program on Education Policy and Governance, and senior editor at Education Next. M. Danish Shakeel is professor and director of the E. G. West Centre for Education Policy at the University of Buckingham, U.K.

