One district’s faulty data shouldn’t drive California’s math policy

Photo: A student practices graphing in Algebra I at Rudsdale Newcomer High School in Oakland, California. (Anne Wernikoff for EdSource)

Why does the revised math framework for the entire state of California abandon peer-reviewed research in favor of magical thinking?

That’s a question we’ve been asking for more than a year now — with no meaningful answer.

As an organization of concerned citizens working for better governance, Families for San Francisco is sounding the alarm over California’s proposal to rely on San Francisco Unified School District’s exaggerated success claims as it revises our statewide math framework.

These claims have made their way into the introduction of the state’s proposed math framework (see page 17), have been widely cited by proponents of the new framework and have influenced the policy position of the National Council of Teachers of Mathematics.

We believe strongly that incorporating social justice into math education is a no-brainer. Culturally relevant curriculum leads to fuller participation, deeper comprehension and better outcomes for all.

But as our research into San Francisco Unified’s results has shown, these efforts will be doomed if they’re not grounded in facts and reality.

The proposed revisions to the state curriculum rely heavily on the district’s seven-year experiment in overhauling its high school math pathway. By delaying Algebra 1 by one year for all students and by mandating that all students be in the same course sequence through eighth, ninth and 10th grades, the district promised to revolutionize equitable math instruction.

Unfortunately, the evidence does not bear out these claims.

We took a deep dive into SFUSD’s data. Through California Public Records Act requests and other publicly available data, our team investigated SFUSD’s metrics for improved outcomes.

And when we looked closely, we found that these initiatives had not only failed to work; they had made existing inequities much worse.

Although our full report covers many misleading and unsubstantiated claims, two examples will suffice here.

Our first shock came when we compared San Francisco Unified’s claims about long-term outcomes with its actual data. The district’s hunch had been that delaying Algebra 1 to ninth grade would keep everybody in the same classrooms through the end of 10th grade, empowering more diverse students to access advanced mathematics successfully.

But that’s not what actually happened. Far from keeping everybody in the same course pathway, San Francisco Unified’s mandate has led to an inequitable patchwork scheme of costly and hidden workarounds — enriched for-pay options that affluent students could access but that their less privileged counterparts could not. We found that after seven years, the number of Black and brown students in the district reaching Algebra 2 by the end of 10th grade has declined, not risen.

The second troubling inconsistency is the most celebrated and also the most misleading. For the past few years, the district has proclaimed in presentations that its new pathway reduced its Algebra 1 repeat rate from 40% to 7%. Not only has the district produced no evidence to support this claim; our public records requests and analysis revealed that the district knew student outcomes under the new system remained unchanged, yet it continued to promote the claim anyway.

In the district's most widely cited presentation, we found a speaker note for slide 3 that accidentally tells the truth: the drop from 40% to 7% was not a result of the changed sequence at all, but rather "a one-time drop," a fluke that occurred when the district canceled its gatekeeping exit exam.

But the slide itself, and the district's presentations, do not include this critical insight. That the district failed to explain its own analysis of the effect of dropping this test tells you everything you need to know about its standards for research.

There's so much more in our report that we hope will help launch a statewide conversation on the math framework. In fact, the New York Times just published a front-page story, citing our report, that questions California's proposed approach. To our surprise, in the three weeks since we first published our report, the district has neither contacted us to request corrections nor made any other public response to our work beyond its comments in the New York Times. We find this silence perplexing.

But it shouldn’t take a scrappy local citizens organization and public records requests to determine whether research promises have been kept.

That’s what the academic peer review process is for.

We believe it’s irresponsible to base the future of math education in California on this flawed program without full peer review first. The state’s math framework should be where we come together to improve on what actually works, so we can produce diverse and mathematically well-educated high school graduates for generations to come.

•••

Seeyew Mo is the executive director of Families for San Francisco, a San Francisco-based 501(c)(4) focused on issues of good city governance, quality education and affordable housing.

The opinions in this commentary are those of the author.
