Average scores flat in 3rd year of California’s Common Core-aligned tests

September 27, 2017

Statewide student results on the standardized tests measuring knowledge of the Common Core standards were essentially flat in 2016-17, after a year of strong gains.

The California Department of Education released scores for the third year of the Smarter Balanced assessment on Wednesday, about a month behind schedule. The results for all of California’s nearly 1,000 school districts, 11,000 schools and individual student groups can be found and compared on EdSource’s Smarter Balanced database.

About 3.2 million students in grades 3 to 8 and grade 11 took the tests, as required by federal law.

Slightly fewer than half of students — 48.56 percent — tested proficient in English language arts; that was a drop of about half a percentage point from last year.

In math, 37.56 percent of students were proficient, about a half-percentage point increase over a year ago. Proficiency is defined as meeting or exceeding standards, the top two of four achievement levels on the Smarter Balanced grading scale.

California State University campuses and many community colleges use the results of the 11th grade tests to gauge students’ readiness for college-level work. In 2017, 28 percent of juniors were considered ready and 32 percent conditionally ready in reading, writing and research skills; 13 percent were considered ready and 19 percent conditionally ready in math.

The combination of minor gains and losses put students 4.6 percentage points above 2015, the first year of the test, in both math and English language arts. But flat scores in 2016-17 compared to the year before also meant there was little progress in narrowing a cavernous achievement gap between the lowest and highest performing student groups.

Only 31 percent of African-American students and 37 percent of Latino students met or exceeded standards for English language arts in 2016-17. That compares with 76 percent of Asian students and 64 percent of white students (see chart for full breakdown).

The gap slightly widened for the state’s 1.2 million English learners, whose English language arts results declined about 1 percentage point, to 12 percent meeting or exceeding standards. This group contains current English learners only; once English learners become proficient in English, they are no longer included in this classification.

In a statement, State Superintendent of Public Instruction Tom Torlakson characterized a lack of improvement this year as maintaining progress from the year before. “I’m pleased we retained our gains, but we have much more work to do,” he said.

Others were more critical or said it was too early to judge.

“Math remains a huge issue,” said Arun Ramanathan, executive director of Pivot Learning Partners, a nonprofit organization that works with schools on improvement strategies. “There has been more attention on expanding STEM and computer coding, but if early elementary grade math is languishing, all of the other stuff is meaningless.”

And with scores of English learners lagging so far behind, he said, “we should think about more native language instruction in math; let’s teach them English in English.”

“You should not make big policy judgments based on limited data, and test scores are limited data,” said David Plank, executive director of Policy Analysis for California Education, or PACE. “It would be an error to conclude after three years that academic standards have failed or the Local Control Funding Formula is not accomplishing its purpose” of directing additional resources to the state’s low-income students and English learners, he said.

Ramanathan agreed, saying “blaming LCFF for student performance is a bit of a stretch.” If there is some blame, it’s that the funding formula creates too many priorities to address, he said, “and core academics is not emphasized enough in the current accountability system.”

Looking ahead, Ramanathan said, “we all know funding is going to be tighter, so it will be important to figure out what strategies have worked in the state and where. Right now, we have no information about that.”

Noah Bookman, chief strategy officer for the CORE districts, a partnership of eight districts, said, “At year three, we cannot point to a major policy shift in practice that would result in big gains. Every district has schools where remarkable things are happening but not consistently.”

But some superintendents who did see improvement this year attributed progress to specific actions that their districts took.

“We believe the work we’ve done around early literacy is starting to pay off by 3rd and 4th grade, in terms of comprehension and reading endurance,” said Matt Navo, superintendent of Sanger Unified in the Central Valley, where 90 percent of students are low-income. Reading scores increased 1.5 percentage points to the state average of 48 percent meeting or exceeding standards. Math scores rose 2.5 percentage points to 2 percentage points above the state average. But 5th grade math remains “conceptually difficult,” he said. Teachers have expressed frustration that the “robust conceptual understanding and reasoning” the tests demand is well beyond what’s been asked of students before. Only 31 percent of 5th-graders met or exceeded standards.

In Long Beach Unified, which increased 3 percentage points in English language arts and 5 percentage points in math, Assistant Superintendent of Research and School Improvement Chris Lund said, “There is never a simple fix or golden key,” but specific efforts are bringing results.

McKinley Elementary, where about 92 percent of students are low-income and 36 percent are English learners, made all of those efforts and others, said Principal Scott Tardibuono. In 2017, the school raised the percentage of students meeting or exceeding standards by 24 percentage points in reading, from 24 to 48 percent, and by 14 percentage points in math, to 38 percent.

Tardibuono used money he controls under the funding formula to lower class size in 5th grade over the past several years, and students working in small groups developed problem-solving skills in math. But another difference, said Kate Pekar, McKinley’s teacher on special assignment, involved changing students’ mindsets, based on the writings of Stanford University Professor Carol Dweck, to help students believe in themselves and their ability to try harder.

“McKinley had been a low-performing school, but we convinced families and students it didn’t have to be that way,” Tardibuono said.

Plank and Bookman noted that the new accountability system had not yet kicked in to prod districts to improve. But that’s about to change. This year’s test results will be among the factors used to determine whether districts with low-performing student groups, such as low-income students and English learners, will receive assistance from county offices of education.

Before the adoption in 2013 of the Local Control Funding Formula, standardized test scores were the sole gauge of a school’s and district’s performance as measured by the Academic Performance Index. The funding formula established a broader set of measures of school performance, including students’ readiness for college and careers, school climate as measured by suspension rates and student engagement as measured by chronic absenteeism and graduation rates.

Test scores and other performance indicators will be included in the color-coded “dashboard” ratings that the state Department of Education will release in early December. However, the percentage of students who meet or exceed the standards, while easy to explain, won’t be the measure the state uses to determine dashboard performance. Instead, it will use students’ average distance on the scoring scale above or below the Standard Met threshold. The state has not yet made that calculation.

Plank acknowledged that by year three of testing, the time has come to see positive results, but “the missing piece is additional support from the state to help local educators.” Mastering the new standards will take time and require patience, he said. To abandon that effort “is to pretend that we know what is going to work to accelerate improvement for kids who need it most. And we don’t.”

Test issues?

The percentage of students testing proficient in English language arts also declined in all 13 of the other states administering the Smarter Balanced tests, with drops ranging from 0.6 to 2.3 percentage points. For math, there were both gains and declines among those states (see state-by-state breakdowns).

Edward Haertel, professor emeritus at the Stanford Graduate School of Education and a specialist in assessment, wrote in an email that uniform declines are unusual and raise questions about the test itself.

“The fact that 14 out of 14 states show changes in the same direction pretty much confirms that these are not merely chance fluctuations. The obvious question that arises is whether there’s some reason the numbers for this year versus last year are noncomparable,” he said. “It’s possible there was in fact some slight overall decline in students’ proficiency and the test results are accurate, but as a psychometrician, I’d want to work hard at eliminating rival hypotheses before settling on that conclusion.”

Tony Alpert, executive director of the state-run Smarter Balanced Assessment Consortium, said in response, “There is no reason to believe there was anything wrong with the administration of the test. We are still getting data from states and will do a rigorous review of the data.” There was a big gain in scores last year, he said, and no substantial loss of ground this year. “We have every reason to believe the data represent what students know and can do.”
