Here’s some news from California’s first year of Smarter Balanced testing: The 2015 results show that high school juniors did about as well in demonstrating their readiness for college-level mathematics as they did during other recent years.

But you might not know it from the Early Assessment Program (EAP) data published by the state, which *seem* to show far higher percentages of students being “college ready” in math during prior years, before students took the Smarter Balanced tests.

The EAP is the California State University program that gives high school juniors early feedback about their college readiness and a chance to sidestep, at CSU and many community colleges, taking placement tests that could send them to remedial or developmental classes.

To understand why Early Assessment Program data don’t tell the whole story, you need to know how California’s new Smarter Balanced testing system – which students participated in for the first time in 2015 – differs from the state’s former STAR system when it comes to high school math.

With Smarter Balanced, there’s *only one* state math test for 11th graders. Most 11th graders – 92 percent of the nearly half-million high school juniors in California – took it during the spring of 2015. So, the results provide a reasonably fair overall picture of how many high school juniors were “college ready” or “conditionally ready” in math.

“Conditionally ready” means that students are regarded as being ready for college-level math if they do well enough in an approved math course during their senior year of high school.

But California’s old STAR testing system in high school math was organized completely differently.

Back then, and most crucially, high school juniors took *different* math tests depending on what math courses they took. In addition, only 11th graders who were enrolled in or had already completed Algebra II could choose to participate in the Early Assessment Program, which added material to the STAR math tests those students took that year. Most students who could participate did so.

These may sound like minor technical differences between the two testing systems, but their implications for how we understand the overall readiness of California high schoolers for college-level math are huge.

Consider 2013, the last year for which full STAR testing data are available. In that year, only a little more than half of California’s high school juniors had taken at least Algebra II. Virtually all the rest were unprepared for college-level math by definition: to be eligible for admission to CSU, students must complete *at least* Algebra II.

Thus, the results on the Early Assessment Program tests that students took before 2015 did *not* give an overall picture of the college readiness of California high school juniors in mathematics. Far from it. The results told us only about the college readiness of *some* 11th graders: those who had taken at least an Algebra II course and chose to participate in the EAP.

To get a broader picture, we need to put the old Early Assessment Program math results into proper context, as follows. Again, consider 2013.

- Without question, many students who took the old EAP in math did well: 60 percent qualified as “ready” or “conditionally ready” for college-level math in 2013.
- But taken as a proportion of *all* high school juniors in the state, only about 27 percent tested as “ready” or “conditionally ready” for college-level math in that year.
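The arithmetic behind that 27 percent figure can be sketched in a few lines. The 60 percent pass rate among participants comes from the article; the roughly 45 percent participation share is a back-calculated assumption for illustration, not a published figure:

```python
# Back-of-the-envelope check of the 2013 EAP math results.
# Assumption: roughly 45 of every 100 juniors took the EAP math
# questions (back-calculated for illustration; not a published figure).
juniors = 100            # per 100 high school juniors statewide
eap_participants = 45    # assumed share who participated in the EAP math test
ready_rate = 0.60        # 60% of participants scored ready or conditionally ready

ready_overall = eap_participants * ready_rate / juniors
print(f"{ready_overall:.0%} of all juniors tested ready")  # → 27%
```

The point of the sketch is simply that a strong pass rate among a self-selected half of the class translates into a much smaller share of the whole class.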

From this broader perspective, we see that the 2013 math results were pretty much the same as the ones from the 2015 Smarter Balanced tests, when 29 percent scored high enough to be deemed ready for college-level math, whether outright or conditionally.

Thus, the Smarter Balanced scores do *not* reveal a large dip in readiness for college-level math among high school juniors compared with prior recent years, as initial reports seemed to indicate.

This should serve as a warning that we must take care to understand the structural quirks and underlying priorities of the testing systems that produce the data we use, lest we mislead ourselves.

California still has a lot of work to do to improve students’ math readiness and help more students transition successfully to community college and university. But we don’t need to underestimate the abilities of our students in the process.

•••

*Matthew S. Rosin was EdSource’s senior research associate from 2007-2012. He holds a Ph.D. in Education from Stanford University.*

The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.


## Comments (3)


JMK · 3 years ago

I wrote this back when the tests came out.

https://hypersensitivecranky.wordpress.com/2015/09/10/smarter-balance-tests-californias-juniors-did-well/

Having had a chance to look at my students’ scores, I think the reason the math scores held steady has to do with the improved Algebra 2 results. In the old version, Algebra 2 students had to take the much harder A2 CST. Students who were beyond A2 had to take the much easier pre-summative test. Then both groups took the EAP packet. So every year, I had pre-calc kids who weren’t terribly good qualify as “Conditional” and other much better kids fail the test because they were in Algebra 2.

Now, my strongest algebra 2 kids scored as Conditional (level 3) while the weakest of my pre-calc kids got a 2. The first group is bigger than the second.

Don · 3 years ago

I understand the excellent clarification the author, Mr. Rosin, is making when assessment results are viewed through the appropriate lens. However, I don’t understand how those results can identify 2 percent more students as ready or conditionally ready, when 11th grade math scores were predictably low in the first year of SBAC administration (unlike the ELA scores, which were aberrantly high).

Clarification?

Replies

navigio · 3 years ago

11th grade math proficiency rates were only about 25% on the CST(s).

If you consider the top two SBAC levels to be proficient, then they aren’t really ‘low’, predictably or otherwise.