California’s misleading K-12 dashboard could lead to closure of the wrong schools

Ms. Hampton, an English Language Arts teacher at Ánimo Mae Jemison Charter Middle School, discusses a text with students.
CREDIT: Green Dot Public Schools

Students at Ánimo Ellen Ochoa Charter Middle School in East Los Angeles are learning at one-and-a-half to two times the pace of their grade-level peers, based on a comparison of their scores on the state's standardized (CAASPP) tests over the last three years with the statewide average.

But the California Department of Education has labeled Ochoa a “low performer,” based on how it ranks on various color-coded indicators on the California School Dashboard.

The department’s report of school performance — the state’s “dashboard” — is deeply flawed. For the 341 kids enrolled at Ochoa — 96% of whom are socioeconomically disadvantaged, almost all of whom are Latinx, and 24% of whom are still learning English — a flawed dashboard could lead to disaster. That’s because the school district could close the school based on its ranking.

Ochoa is part of the highly respected Green Dot Public Schools, a Los Angeles nonprofit educational organization, which was recognized by the U.S. Department of Education as a high-quality charter school operator during the Obama administration.

But in high-poverty middle schools such as Ochoa, students often arrive several years behind grade level. Few of them are "proficient" in math or reading. Ochoa's students, while far behind, are making exceptional gains compared with students statewide. Yet the dashboard blends their current test scores with a year-to-year change measure, concealing both their high rate of growth and their low starting scores. These results only make sense when reported separately. They make no sense when blended together.

All but two states have viable measures of academic growth, designed to show whether students are catching up or falling further behind grade level. California does not.

Instead, it measures "change." It takes a school full of kids and subtracts last year's students' test scores from this year's students' scores, as if kids in this year's class were more or less just like kids in last year's class. But in a middle school where kids in grades 6, 7 and 8 are tested, one grade level of kids departs and another arrives each year, so roughly a third of the students being compared are not the same children.

The dashboard also combines current test scores and the change from last year's class to this year's class into a single measure, another fatal logic error. These are two entirely different things: one describes where students stand now, the other describes how much the school's average moved from one group of students to the next.

Consider a middle school whose students were three years behind grade level, on average, last year. If this year they are only one year behind grade level, then that school would be performing spectacularly well. But its dashboard score for academics still would be orange or yellow — the equivalent of a D or C. If a school’s status is low on any indicator but it has made huge improvements, the best it can earn is yellow.

When the State Board of Education considered its change measure at a July 2018 meeting, a group of 14 respected academic experts warned in a letter that “The state’s current ‘change’ model is unacceptable — it profoundly fails the validity test, and therefore it does not accurately represent schools’ contribution to student achievement. Indeed, it is not clear what it represents at all.”

Later, in a paper for the "Getting Down to Facts" project, University of Southern California scholars Morgan Polikoff, Shira Korn and Russell McFall found that the incorrect conclusions the experts had anticipated were indeed occurring as a result of the dashboard's flaws: the change measure misidentifies school performance most of the time.

Fortunately, the California Department of Education has a group working on a new student growth measure. Unfortunately, the coronavirus pandemic forced cancellation of spring testing, which has put the 2020 dashboard on hold and delayed work on revamping the growth measure. The state should take the opportunity provided by this hiatus in dashboard reports to develop and put in place a credible measure of real growth and give it a prominent place of its own in the dashboard.

If it doesn’t, misleading ratings could give school boards license to close charter schools that are actually performing well. Some will be even more tempted to do so now that district funding will be cut sharply by the pandemic-induced recession, because many of the students at those closed charters will return to district schools, bringing state money with them.

We have pointed to other problems with the dashboard, which readers can explore on their own. We support accountability for charter schools, but like all schools, charters deserve to be evaluated based on sound evidence. The California Department of Education has not yet produced it.

•••

Steve Rees is founder of School Wise Press and leads its K-12 Measures team, which helps district and school planning teams make smarter use of their numbers. David Osborne, author of "Reinventing America's Schools: Creating a 21st Century Education System" (Bloomsbury, 2017), directs the K-12 education work of the Progressive Policy Institute.

The opinions in this commentary are those of the authors. Commentaries published on EdSource represent diverse viewpoints about California's public education systems. If you would like to submit a commentary, please review our guidelines and contact us.
