California has always been a state of dreamers and idealists. That’s part of our legacy and a reason for our success. Yet, as our state’s long trail of innovators has shown us, success takes more than ideas — it also requires careful implementation. California’s school funding model is based on a powerful idea: improve outcomes by directing more resources to high-need students and using a multiple-measure accountability system that supports local decision-making. Unfortunately, the reality of the system doesn’t match the vision.
This week, the State Board of Education will hear feedback about the California School Dashboard — an online tool that shows how schools and districts are performing. The state says it’s “the next step in a series of major shifts in California K-12 schools, changes that have raised the bar for student learning, transformed testing and placed the focus on equity for all students.”
Yet, when student performance was worse than expected on the most recent state assessments, the State Board of Education didn’t raise the bar. Instead, it moved the goalposts. When flat average performance on statewide tests placed an increased number of schools in the lowest performance band (red) on the multi-colored dashboard, the state board changed the classification criteria to pull them back out. Despite the reality that many of these schools are struggling to serve one or more student groups, the board was apparently more concerned that the state would be required to provide assistance to a larger number of low-performing schools than anticipated.
This “technical fix,” as it was described, sends the wrong message to the public. It suggests the state would rather sweep these schools under the rug than take responsibility for helping them. More concerning, sugar-coating these schools’ performance could prevent educators from receiving the additional supports available to low-performing districts — supports they likely need to accelerate progress for historically underserved students.
The dashboard does an inadequate job of defining good or acceptable performance and of providing guidance on how districts might improve. The dashboard uses five color-coded categories (from highest to lowest: blue, green, yellow, orange and red) to display test scores, graduation rates, suspension rates, success in educating English learners, college and career preparation, and chronic absence. Many schools and districts fall into the yellow band, which is so broad that it includes schools where the average student is several years below grade level and schools where the average student is above grade level.
The Dashboard Equity Report — another important feature of the dashboard — also misses the mark, if one of its main goals is to highlight the achievement gaps of student subgroups. Its landing page displays only a school district’s overall performance, so users who want to view performance for specific student groups must drill down to find information that should be more readily available.
The target audience for the dashboard is also unclear. In our view, it’s not sophisticated enough to provide detailed guidance to school districts that, by and large, have more finely calibrated internal systems to measure student progress. Yet, it’s also not user-friendly enough to help most families analyze their school’s performance or compare it to neighboring schools. So whom exactly does it serve?
In order to uphold California’s tradition of innovation and our responsibility to the state’s 6.2 million public school students, we must do better. That begins with refining the California School Dashboard to increase its focus on equity. Prioritizing equity would mean being upfront about how many schools need assistance and making equity more prominent on the dashboard display — possibly by placing the names of student groups on the top level of the dashboard or by creating an equity rating for each indicator. Leading with equity doesn’t mean having important information one click away from the dashboard landing page. We say this not as opponents of the dashboard, but as “critical friends” who are willing to speak difficult truths on important subjects.
In trying to make sense of the dashboard, various observers have referred to it as “a report card” for California schools, school districts and county offices of education. If so, at the moment we think it deserves an incomplete grade. In order to preserve the original principles of equity in education that we intended when we adopted the Local Control Funding Formula, and to give a true picture of where California schools stand, the state should make a number of key changes to the dashboard. First, we should more clearly communicate goals and define performance levels on the dashboard, especially for historically underserved student groups facing opportunity and achievement gaps. Second, the state must address equity more clearly, by providing more guidance and a more robust and defined system of supports for struggling districts. Lastly, the dashboard must be refined to better display data, including student growth and year-to-year change, to make it both accessible and actionable.
Advocates of the current dashboard say it’s still in the early stages of design. We say that’s the best time to make changes: before the problems become entrenched and institutional inertia takes hold. It’s time to correct course and develop a school accountability system that is innovative not only in theory, but also in practice.
Vernon M. Billy is CEO & executive director of the California School Boards Association. Ryan J. Smith is executive director of The Education Trust–West.
The opinions expressed in this commentary represent those of the authors.
David B. Cohen (5 years ago)
The authors take for granted that the test scores successfully measure what they were intended to measure, which is always a debatable point, and more so in the earliest stages of a new assessment system. They also omit the fact that there actually was good reason to question this year’s scores, with 14 out of 14 states using the Smarter Balanced English language arts tests showing no gains — a significant statistical curiosity. See EdSource.
I’m not well-versed enough in the details of this technical fix to vouch for its merits and efficacy, but the authors don’t even mention the issue, favoring the dismissive rhetoric “moving the goalposts.”
I also question the importance of simplifying comparisons of schools. I know that serves real estate agents well, but the more we simplify in pursuit of ratings and rankings the more we distort. When it comes to this kind of data, compression is loss.
I do agree with the authors that the state should continue to work on the ease of use, clarity of presentation, and focus on equity in the dashboard.
And as a bonus suggestion, how about placing something at the bottom of every report so that we the citizens can hold legislators and our fellow voters accountable for the public school system they create? Provide a link to a page that offers the following data:
* District’s per pupil spending compared to state average;
* California per pupil spending compared to national average;
* Staffing ratio comparisons (for all types of staff);
* Library access and staffing comparisons;
* Your city and county voting results on the last three bonds or parcel taxes on ballots;
* Your legislators’ votes on education bills in the two most recent legislative sessions;
* The governor’s signed/vetoed education bills and education budget line-item vetoes.