CREDIT: Green Dot Public Schools
Ms. Hampton, an English Language Arts teacher at Ánimo Mae Jemison Charter Middle School, discusses a text with students.

Students at Ánimo Ellen Ochoa Charter Middle School in East Los Angeles are learning at one-and-a-half to two times the pace of their grade-level peers, based on three years of their scores on the state's standardized tests (the CAASPP) compared with the statewide average.

But the California Department of Education has labeled Ochoa a “low performer,” based on how it ranks on various color-coded indicators on the California School Dashboard.

The department’s report of school performance — the state’s “dashboard” — is deeply flawed. For the 341 kids enrolled at Ochoa — 96% of whom are socioeconomically disadvantaged, almost all of whom are Latinx, and 24% of whom are still learning English — a flawed dashboard could lead to disaster. That’s because the school district could close the school based on its ranking.

Ochoa is part of the highly respected Green Dot Public Schools, a Los Angeles nonprofit educational organization, which was recognized by the U.S. Department of Education as a high-quality charter school operator during the Obama administration.

But in high-poverty middle schools such as Ochoa, students often arrive several years behind grade level. Few of them are “proficient” in math or reading. Ochoa’s students, while far behind, are making exceptional gains compared with students statewide. Yet the dashboard blends their test scores together with a year-to-year change measure that conceals both their high rate of growth and their low starting scores. These results only make sense when reported separately. They make no sense when blended together.

All but two states have viable measures of academic growth, designed to show whether students are catching up or falling further behind grade level. California does not.

Instead, it measures “change.” It takes a school full of kids and subtracts last year’s students’ test scores from this year’s students’ scores, as if kids in this year’s class were more or less just like kids in last year’s class. But for a middle school where kids in grades 6, 7 and 8 are tested, one grade level of kids departs and one arrives each year.
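
To make the distinction concrete, here is a minimal sketch in Python using made-up scale scores, not real CAASPP data. The first calculation mimics the dashboard's cohort-to-cohort "change"; the second follows the same students from one year to the next, which is what a genuine growth measure does.

```python
# Hypothetical numbers for illustration only (not real CAASPP data).

# The dashboard's "change": this year's class vs. last year's class,
# even though they are different children.
last_year_cohort = [2380, 2410, 2395]   # scale scores, last year's students
this_year_cohort = [2375, 2405, 2390]   # scale scores, this year's students

change = (sum(this_year_cohort) / len(this_year_cohort)
          - sum(last_year_cohort) / len(last_year_cohort))
print(f"Dashboard-style 'change': {change:+.1f} points")  # -5.0 -- looks like decline

# A growth measure follows the SAME students from one year to the next.
last_year_scores = {"student_a": 2300, "student_b": 2350, "student_c": 2320}
this_year_scores = {"student_a": 2375, "student_b": 2405, "student_c": 2390}
gains = [this_year_scores[s] - last_year_scores[s] for s in last_year_scores]
print(f"Average growth of the same students: {sum(gains)/len(gains):+.1f} points")
# +66.7 -- real gains that the cohort-to-cohort number hides entirely
```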

The dashboard also combines current test scores and change from last year’s class to this year’s class into one measure, another fatal logic error. These are two entirely different things.

Consider a middle school whose students were three years behind grade level, on average, last year. If this year they are only one year behind grade level, then that school would be performing spectacularly well. But its dashboard score for academics still would be orange or yellow — the equivalent of a D or C. If a school’s status is low on any indicator but it has made huge improvements, the best it can earn is yellow.
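
To see how that cap arises, consider a simplified status-by-change lookup grid of the kind the dashboard uses. The five status and change levels below are the Dashboard's own labels, but the individual cell colors are illustrative rather than the official cut scores; the outcome they demonstrate matches the example above: a school with "Very Low" status tops out at yellow no matter how much it improves.

```python
# A simplified, illustrative status-by-change color grid (not the official
# lookup table). Rows are status levels; columns are change levels.

STATUS_LEVELS = ["Very Low", "Low", "Medium", "High", "Very High"]
CHANGE_LEVELS = ["Declined Significantly", "Declined", "Maintained",
                 "Increased", "Increased Significantly"]

COLOR_GRID = [
    ["Red",    "Red",    "Red",    "Orange", "Yellow"],  # Very Low status
    ["Red",    "Orange", "Orange", "Yellow", "Yellow"],  # Low
    ["Orange", "Orange", "Yellow", "Green",  "Green"],   # Medium
    ["Yellow", "Yellow", "Green",  "Green",  "Blue"],    # High
    ["Yellow", "Green",  "Green",  "Blue",   "Blue"],    # Very High
]

def dashboard_color(status: str, change: str) -> str:
    """Look up the color for a given status level and change level."""
    return COLOR_GRID[STATUS_LEVELS.index(status)][CHANGE_LEVELS.index(change)]

# A school that was three years behind and is now only one year behind:
# status is still "Very Low," but change is "Increased Significantly."
print(dashboard_color("Very Low", "Increased Significantly"))  # -> Yellow
```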

When the State Board of Education considered its change measure at a July 2018 meeting, a group of 14 respected academic experts warned in a letter that “The state’s current ‘change’ model is unacceptable — it profoundly fails the validity test, and therefore it does not accurately represent schools’ contribution to student achievement. Indeed, it is not clear what it represents at all.”

Later, in a paper for the “Getting Down to Facts” project, University of Southern California scholars Morgan Polikoff, Shira Korn and Russell McFall found that the dashboard’s flaws were producing exactly the incorrect conclusions the experts had anticipated: its change measure identifies schools inaccurately most of the time.

Fortunately, the California Department of Education has a group working on a new student growth measure. Unfortunately, the coronavirus pandemic forced cancellation of spring testing, which has put the 2020 dashboard on hold and delayed work on revamping the growth measure. The state should take the opportunity provided by this hiatus in dashboard reports to develop and put in place a credible measure of real growth and give it a prominent place of its own in the dashboard.

If it doesn’t, misleading ratings could give school boards license to close charter schools that are actually performing well. Some will be even more tempted to do so now that district funding will be cut sharply by the pandemic-induced recession, because many of the students at those closed charters will return to district schools, bringing state money with them.

We have pointed to other problems, which you can explore further on your own. We support accountability for charter schools, but like all schools, charters deserve to be evaluated based on sound evidence. The California Department of Education has not yet produced it.

•••

Steve Rees is the founder of School Wise Press and leads its K-12 Measures team, which helps district and school planning teams make smarter use of their numbers. David Osborne, author of “Reinventing America’s Schools: Creating a 21st Century Education System” (Bloomsbury, 2017), directs the K-12 education work of the Progressive Policy Institute.

The opinions in this commentary are those of the authors. Commentaries published on EdSource represent diverse viewpoints about California’s public education systems. If you would like to submit a commentary, please review our guidelines and contact us.

Comments (4)


  1. Vincent Curtis Hunter, 4 months ago

    Flawed; excuse the simple minded parsimony here but the most compelling dependent variables probably ought to target longitudinal “parental/customer satisfaction”. Isn’t that the construct of greatest relevance to potential families looking to educate their kids?

  2. Gary Rubinstein, 5 months ago

    It seems like California has taken your suggestion and created a new growth model that aligns more with the rest of the country. They are going to roll it out officially for the 2023-2024 school year so the results will be in the beginning of the 2024-2025 school year.

  3. Jenny Grant Rankin, Ph.D., 3 years ago

    Thank you Steve Rees, David Osborne, and EdSource for being such outspoken advocates of the need to get education data reporting right. Taking feedback like this into account can drastically improve our state’s (and others’) monitoring of student progress and thus our ability to help schools and the students they serve.

  4. Brenda lebsack, 3 years ago

    Mr Rees, you make excellent points. It seems if traditional schools were under the same scrutiny as charter schools, many would not “cut the mustard” to continue their existence. As a former school board member, I can tell you the bias against charters is real. This is due to CSBA’s one-sided training of board members. I believe CSBA and CTA work closely together to maintain that monopolization of institutional control. It’s not in the best interest of students or families, but it is definitely in the best interest of bureaucratic power. I look forward to reading “Reinventing America’s Schools” because we are in desperate need of reform.