Iwunze Ugo, a research associate who focuses on K–12 education at the nonprofit Public Policy Institute of California, or PPIC, says it’s too early to determine whether the state’s Local Control Funding Formula is working. But a recent report he co-authored shows that many districts and schools have a long way to go to achieve the goals envisioned when the radically different funding policy was approved by the Legislature in 2012-13.
The formula gives districts with more low-income students, English learners and foster youth a greater share of money than those with predominantly wealthier, English-fluent students. One goal of the formula is to help districts close or narrow persistent achievement gaps between students considered disadvantaged (English learners and low-income students) and their more affluent, English-speaking peers.
The report, “Student Achievement and Growth on California’s K-12 Assessments,” co-authored with senior PPIC fellow Laura Hill, focused on these achievement gaps between low-income students and English learners and their wealthier, English-fluent peers on standardized tests in math and English language arts, specifically looking at progress made by 4th-graders in 2015 and 5th-graders in 2016. The report did not look at achievement gaps based on race or ethnicity.
Ugo and Hill found that low-performing schools and districts where students posted small gains from one year to the next on the Smarter Balanced tests tended to have larger percentages of “high-need students” than schools and districts where students earned higher scores and posted larger annual gains. They called these results “especially troubling because they indicate that disadvantaged students are falling further behind.” However, the report did identify some schools and districts that were beating the odds based on their demographics, as well as some that were doing worse than expected, given their relatively small percentages of English learners and low-income students.
Ugo spoke about the report’s findings and recommendations during a Public Policy Institute event in Sacramento, noting that English learners had an especially hard time on the tests and encouraging policy-makers to take that into consideration as they come up with new rules for reclassifying English learners as English-fluent.
Although high-need students are falling further behind their wealthier, English-fluent peers overall, Ugo said there was one bright spot: high-need students at schools with a greater concentration of low-income students and English learners were keeping up with disadvantaged students at schools with lower concentrations of these students.
This, Ugo said, suggests that LCFF concentration grants are helping to prevent students in schools with predominantly low-income and English learner students from falling further behind students with similar demographics at schools where they do not make up the majority – since schools with high concentrations of disadvantaged students are generally believed to face more challenges overall than those with fewer high-need students.
In fact, this notion of leveling the playing field for schools with a greater proportion of challenges was the rationale behind awarding concentration grants in the first place, he said.
Ugo also cautioned that if the 11th-grade test were to become a replacement for the now-defunct California High School Exit Exam, or CAHSEE, that could affect the state’s dropout and graduation rates by establishing an additional barrier that may be difficult for some students to overcome.
He discussed his report and Sacramento presentation during an interview with EdSource. Below are excerpts from that conversation.
Q. Why did you think it was important to compare student test scores from one year to the next?
A. My background in research has been in education finance. But, with all the changes the state has been going through – with Local Control Funding and Common Core standards – an important part of seeing whether the reforms are working is looking at test scores.
We did a report last year, called “High-Need Students and California’s New Assessments.” After the first year, we compared gaps between students meeting the standards on the old tests to the new tests. After the second year, we wanted to look at two years of data.
Q. What did you find in your studies that you considered particularly significant?
A. English learners have had a harder time with the new tests than the old tests. It could just be a matter of adjustment, since the format of the new tests is different. They’re computer-based.
Over time, the old California Standards Test scores got better and better each year as students learned how to take the tests. With Smarter Balanced, starting from square one three years ago, a lot of those gains were taken away.
One thing we saw in the report from the first year was that schools with fewer disadvantaged students (English learners and low-income students) had an easier time adjusting and being more nimble. But schools with more disadvantaged students had a harder time with the tests and adjusting.
Q. Please elaborate on your finding that districts where students scored low and did not show significant growth from one year to the next tended to have higher percentages of high-need students, which you said indicates that “disadvantaged students are falling further behind.”
A. The first part of our report looked at achievement and growth differences. We saw how growth varied. Among some students with low achievement, there was low growth. But, some were growing pretty quickly, which was a good thing because they were catching up to their peers and could be closing the achievement gap.
We were troubled by districts that are low-achieving and also show low growth. That’s a sign they’re already behind, but are not making that much progress. Some of the higher-achieving districts are continuing to show higher growth because they’re adjusting to the Smarter Balanced tests, picking up on the Common Core and excelling. But the lower-achieving students and districts could be underperforming into the future.
Q. Your report showed that most districts had similar growth in achievement, regardless of their percentages of disadvantaged students. How does this relate to the effectiveness of the Local Control Funding Formula, or LCFF, which directs more money to districts with low-income students, English learners and foster youth?
A. The rationale for giving districts more money in the first place – the concentration funds (for districts with high concentrations of low-income students and English learners) – is that beyond issues that your average disadvantaged student has, students at a school where there’s a large share of these students require even more additional resources.
What we saw was that disadvantaged students at schools with a concentration of high-need students were doing as well as disadvantaged students at schools with smaller shares of these students.
The way that could be interpreted is that whatever those additional challenges are that arise, that extra funding was working in the sense that students at those more disadvantaged schools weren’t falling further behind due to those challenges.
Q. Still, your report notes that the percentage of disadvantaged students in schools can vary within a district and that some schools with more disadvantaged students than others score lower than the district’s overall average scores. Might this reflect on LCFF spending priorities?
A. It’s an important issue to raise that districts aren’t homogeneous and there can be different demographics throughout that vary from one school to another. But, that’s mainly a point of caution, not (an indication) that districts will fall down on the job and not direct funds to schools that have the highest amount of needs. It’s a concern, but not necessarily a problem yet. We haven’t been able to (study that).
Q. How might the overall low achievement of English learners relate to the way English learners are reclassified as English-proficient in the future?
A. About three-quarters of English language learners are also low-income (so are disadvantaged according to two state criteria). Under the old California Standards Tests, or CSTs, reclassification was based in part on how students did on the CST. Now that the state has gotten rid of the CST and put in new (Common Core) standards, reclassification was expected to also include Smarter Balanced scores. But on the new tests, English learners do relatively worse, so you can’t just apply the same cutoff scores from the CSTs to the Smarter Balanced tests because the tests themselves are harder. You’d be getting away from students’ fluency in English and moving toward proficiency on English language arts standards, which is different.
The state is in the process of changing to a new system and it’s not clear what components will be part of it.
Q. During your presentation in Sacramento, you mentioned that the previous California High School Exit Exam, or CAHSEE, prompted some students who couldn’t pass it to drop out of school. You said the state might consider using the Smarter Balanced 11th grade test as a high school exit exam, which could also affect students’ ability to graduate. Do you think that could impact dropout rates?
A. I didn’t mean to say that it would cause students to drop out. I was making the point that it has been a problem in the past that the end-of-high school test – whether it’s the CAHSEE or if it becomes Smarter Balanced – has been at times the last thing that students don’t meet, and then they don’t graduate. So, you’ll have students that have completed everything else (class requirements). As the state shifts away from the CAHSEE to possibly using Smarter Balanced, it could be that that issue persists – but in the form of the Smarter Balanced tests in math and English language arts (instead of the CAHSEE).
Q. Your report concludes that districts with a large number of struggling students may need more guidance from the state and suggests they look to schools and districts that have had greater success with high-need students as models for improvement ideas. Do you know if this is happening now?
A. The CORE districts in California (Fresno, Garden Grove, Long Beach, Los Angeles, Oakland, Sacramento, San Francisco and Santa Ana Unified), which include many of the largest districts, have an initiative that pairs demographically similar schools from opposite ends of the achievement spectrum. That’s an example of how you can have districts that are similar in students but vastly different in achievement and growth – and they can lean on each other to share best practices. That would be a direct way for districts to bring each other up, where some are excelling and others aren’t.
Q. What significance will the results from the third year of testing now underway have on the trends you have been studying?
A. With other institutions around California taking a closer look at how individual schools are doing and how practices vary, we may be able to get a better look at how the reforms are doing.
The first two years lent themselves to reports, with year one looking at the tests and the second year looking at growth.
Going forward, studies will likely be more focused on program evaluation and it will be more important to see how students continue to grow.
As LCFF (Local Control Funding Formula) and Smarter Balanced and all the other reforms mature, I don’t think PPIC (Public Policy Institute) will be looking at annual progress.
Q. But, many people say you can’t really see trends until you have three years of data. Do you agree with that?
A. I do think it’s incredibly important to look at these results year-in and year-out. That’s why the LCAP (Local Control Accountability Plan) progress is so important.
So, I would agree it’s still early to tell whether the new standards and LCFF are working. There have been massive changes in the state’s education system and getting a sense of whether these reforms are working will take some time.
Q. Your report names the outlier schools and districts that performed better than expected based on their demographics, as well as those that performed worse than would be expected with their student populations. As the state moves toward fulfilling the requirements of the Every Student Succeeds Act, or ESSA, it will need to identify the bottom 5 percent of schools for interventions. Do you think your lists could provide a sense of urgency to some of the lowest-performing schools and prompt them to make significant changes to help students succeed?
A. We named schools both at the high and low end. Our intention wasn’t to name and shame (the low-achievers). We thought it was useful to have numbers and names to illustrate some of the scenarios we discussed, such as low growth and low achievement.
I think with the state’s accountability system, they do measure how schools are doing, but it is more focused on continuous improvement, rather than on punishing low-performing schools and districts.
To be at the red level (lowest level on the state’s dashboard) is a cause for concern, but I think the state is moving away from that being seen in a punitive light and instead is seeing it as sort of a call for additional assistance.
Q. What kinds of assistance do you think low-performing schools and districts need?
A. I know the state is in the process of identifying which characteristics will qualify for additional interventions. One of the things they’re working on is the CCEE (California Collaborative for Educational Excellence). They’re still getting off the ground. But as they work to put together resources and materials, more help will be available.
And more informally, you have the possibility that similar districts in terms of demographics could serve formally or informally as models of best practices. Certainly, districts like Long Beach Unified could (work with others in this way).
One of the goals for the local control process is allowing districts to take that on themselves and look around the state and find (model programs and best practices) rather than the state telling them what to do.
Q. Do you think districts have the time and resources to identify model programs and best practices, without any guidance from the state?
A. Having that menu of options is part of the goal for local control.
Districts may have difficulty taking on the improvement goals themselves. That’s been one of the biggest questions about the LCFF – whether districts are going to be able to execute these new responsibilities. It’s early and we will see if they’re up to the task.
Q. When will you be able to say for sure whether or not LCFF is working?
A. It’s hard to tell. There will be milestones. LCFF is on track to be fully funded by 2020. In the most recent (Governor’s proposed 2017-18) budget, LCFF is funded at 97 percent. So, there are things like that on the horizon.
Although a lot of the major laws were passed two or three (or more) years ago – such as LCFF and adoption of the Common Core standards – teachers needed to be retrained and teachers have been getting new instructional materials, so there’s still a lot of transition going on.
I think we need to wait at least until that’s done. Then, it will be time to take stock of whether all this is working.
Q. What are you working on now?
A. With the big funding and accountability pieces in the state, broadly you could describe where we’re looking next as how these new reforms are working for special populations, such as English learners, students with disabilities and students at charter schools. These are small niches that are a feature of the state’s accountability system.
Our next report should be released in September.