California’s math framework lacks sound research evidence to justify its progressive agenda


The California Department of Education has released a new draft of its curriculum framework for K-12 mathematics. While the draft is notably improved in its treatment of opportunities for advanced work, it remains woefully laden with dogma about politics and about how to teach math.

The framework promotes only the progressive-education approach to teaching math, calling it “student-led” instruction, “active learning,” “active inquiry,” and “collaborative” instruction. But evidence from the 1950s through recent times shows that this way of teaching math is ineffective. That evidence comes from carefully designed studies: randomized controlled trials and what are called quasi-experiments, which approximate the effect of randomly assigning students to different groups. Quasi-experiments examine cases where, for example, two adjoining districts with similar populations, or two adjoining similar schools, adopt different policies. Both sorts of studies provide much stronger evidence than the case studies that progressive educators rely on.

In the spring 2012 issue of American Educator, the magazine of the American Federation of Teachers, top educational psychologists Richard E. Clark, Paul A. Kirschner and John Sweller summarized “decades of research” that “clearly demonstrates” that for almost all students, “direct, explicit instruction” is “more effective” than inquiry-based progressive education in math.

Clark, Kirschner and Sweller conclude that after “a half century” of progressive educators advocating inquiry-based teaching of math, “no body of sound research” can be found that supports using that approach with “anyone other than the most expert students.” Evidence from the best studies, they emphasize, “almost uniformly” supports “full and explicit” instruction rather than an inquiry-based approach. Yet when explicit, direct instruction is discussed in the proposed math curriculum (chapters 3 and 6), it is deprecated.

To be more specific, the framework uses the term “struggle” (or “struggling”) more than 75 times, typically in phrases such as, “Students learn best when they are actively engaged in questioning, struggling, problem solving, reasoning, communicating, making connections, and explaining,” or “Teachers should also underscore the importance and value of times of struggle.” The former is a mouthful that throws in everything but the kitchen sink (with the notable exception of “practice”); the latter is a direct pitch for “struggle.” It is as if the authors were guided by Mao Zedong’s old exhortation, “Dare to Struggle, Dare to Win.”

Is it true that student struggle is such a critical component of learning that it should be singled out and treated as primary? The framework offers a variety of cherry-picked citations supporting this idea. Yet, it carefully avoids mentioning that research warns against excessive struggle as time-wasting and discouraging, often leaving students with incorrect understanding. In the absence of such cautions, teachers are likely to walk away convinced that the more they let their students struggle—and struggle is common with the inquiry-based pedagogy promoted by the framework—the more they will learn. This is like saying a child should be tossed in the water rather than taught to swim.

This illustrates two related major flaws that underlie the draft framework: its loose notion of what “research-based” means, and the poor quality of its citations.

State-adopted education programs and recommendations are supposed to be “research-based.” This does not just mean an article or two in a peer-reviewed journal. It means there is a consensus or strong evidence of effectiveness in the published research. If no strong evidence exists, a practice should not be broadly recommended. If there is no consensus, both pro and con evidence should be cited. An example of that can be seen in the Institute of Education Sciences Practice Guides, which identify practices as backed by strong, moderate or minimal evidence.

None of these standards are met for “struggle”; for the framework’s push for “inquiry learning” over explicit instruction, which goes effectively unmentioned; for its disregard of highly effective engagement with worked-out examples; or for its failure to recommend spaced (or distributed) practice, the proven technique of deliberately spreading homework and quizzes over a period of weeks after a topic is taught, to maximize retention. Inquiry learning, which relies heavily on student struggle, is discouraged by strong research, while distributed practice and the use of worked-out examples are well supported by published studies; yet the framework ignores both. Instead, it offers us “trauma induced pedagogy” and teachers held up as exemplary for promoting “sociopolitical consciousness,” taking a “justice-oriented perspective,” and embedding “environmental or social justice” in the math work given to children. This is not even a weakly research-based pedagogical framework; it is an ideological manifesto.

In fact, poor and selective research citations undermine many of this framework’s recommendations. Dozens of citations refer to unpublished works on the website of Jo Boaler, one of the framework’s authors. The framework cites her published works more than five dozen times, far more than anyone else’s, yet only one of those references appeared in any of the top 100 most influential education journals. Her 2008 study, cited seven times in the framework, had its accuracy and methodology called into serious question in an analysis by two California math professors and a statistician.

If the framework writers had wanted solid evidence, they would have relied on the final report and subgroup reports of the 2008 federal National Mathematics Advisory Panel. They would have made even more use of the federal Institute of Education Sciences practice guides, which are designed for teachers and curriculum writers. Instead, the framework’s writers pretend this high-quality evidence doesn’t even exist.

•••

Ze’ev Wurman is a research fellow at the Independent Institute, chief software architect with MonolithIC 3D, and former senior policy adviser with the Office of Planning, Evaluation and Policy Development at the U.S. Department of Education.

Williamson M. Evers is a senior fellow and director of the Center on Educational Excellence at the Independent Institute in Oakland, California. He is a former assistant secretary for planning, evaluation and policy development at the U.S. Department of Education.

The opinions in this commentary are those of the authors.
