
There is a deeply rooted impulse in American society — perhaps any society — to rank everything from restaurants and refrigerators to athletes and colleges.

That may help explain why pressures continue in California to rank its schools based on a single score of some kind, despite a major thrust in the state to move in the opposite direction.

Great Schools, the hugely popular parent-oriented website, has combined several indicators to come up with a score between 1 and 10 for every school in the state.

Another notable push is coming from Los Angeles, where the state’s largest school district has been working over the past year on a plan to rank schools on a numerical 1-to-5 scale, a number that would be reached by combining students’ improvement on test scores and other factors.

But that plan is far from being implemented. This week, newly elected school board member Jackie Goldberg introduced a harshly critical resolution effectively calling on the district to abandon the idea in its current form. Among other assertions, she argued, “the value of a public school cannot be quantified in a single, summative rating, which can shame, penalize, or stigmatize schools, education professionals, students and entire communities.”

L.A. Unified’s efforts have echoes of California’s old Academic Performance Index, a single number usually ranging from 600 to over 800 that was assigned to every school and district based almost entirely on student scores on state and federally mandated tests. Using those numbers, schools were ranked on a 1-to-10 scale; to take into account the impact of students’ economic, racial and ethnic backgrounds, schools were also ranked on a second scale comparing them to schools with similar student populations.

But in 2014 the state scrapped the API and the 1-to-10 rankings and in their place instituted an accountability system of “multiple measures” that include not only test scores, but also graduation rates, chronic absenteeism rates, suspension rates and harder-to-define indicators such as school climate and parent engagement.

According to a survey by the Education Commission of the States, all but five states have come up with a “summative” rating of some kind. In education testing parlance, that’s a single rating that combines multiple indicators in the form of an A-F grade, a star rating or a tiered rating, like Massachusetts’ “Tier 1” to “Tier 4” scale.

With its multiple measures approach, California is definitely “bucking that trend,” Morgan Polikoff, a professor at USC’s Rossier School of Education, and Kate Kennedy, a research associate there, wrote in a paper last year.

Instead, the state has developed the California School Dashboard, a relatively new system that rates schools using a complicated set of color codes, ranging from red (the lowest rating) to blue (the highest), on various indicators assigned to the school as a whole and to numerous subgroups of students. But California schools are not assigned a single color to reflect their or their students’ overall performance.

Former State Board of Education President Michael Kirst, one of the chief architects of the new system, said there were many reasons not to do so — not least of which is that there is no scientific way to know what weights to assign to the different factors that make up a final rating. “What’s happened in the past and in other states is that they just grabbed numbers out of the air,” said Kirst, a nationally renowned Stanford University education scholar. “There’s no scientific basis for making these weightings and we couldn’t think of any way to do it.”

One of the main reasons for abandoning the API was that it didn’t capture everything happening in a school. As any parent or educator knows, far more goes on in a school than how students do on an annual standardized test. An overall ranking masked how subgroups of students were doing and didn’t necessarily measure how students improved over time. Nor did it measure what schools were doing to help students improve.

When the L.A. school board approved a resolution 18 months ago setting in motion a procedure to come up with an overall school ranking, its goal was to come up with “one comprehensive picture of school success.”

To its credit, the district is trying to come up with a measure that includes much more than test scores. In its draft form for high schools, 40 percent of L.A. Unified’s proposed ranking would be based on how much students’ scores improved over the previous year and 25 percent on the actual test scores. A measurement of “school climate” would make up another 20 percent of the ranking, and a school’s success in preparing students for college and careers the remaining 15 percent. Elementary and middle schools would follow a similar approach.

Kirst said that this amalgam of measures would be an improvement over the old one-dimensional Academic Performance Index. But, he said, it still does not capture the nuances of what is happening in a school, or resolve the fundamental problem of how to weight the different measures that make up the ranking. “It’s just a number that is pulled out of the air in terms of its internal components,” he said.

L.A. Unified board member Kelly Gonez, a former teacher who is one of the ranking’s main backers, explained that its purpose is not to place blame or to be punitive, but to come up with a clearer view of schools with the greatest need, and to guide the district on where to best invest its resources. “The goal is for us as a district and for the broader community to get a sense of how our schools are doing, and how we can make sure that all schools are helping kids be prepared for college, careers and life,” she said in an interview on KPCC’s AirTalk.

For now, if only because of a change in the make-up of the board, the outlook for implementation of a new ranking system in Los Angeles is uncertain at best.

In her resolution attacking the whole idea, fellow board member Goldberg counters that “school ratings promote unhealthy competition between schools, exacerbate community antagonisms by producing artificial ‘winners’ and ‘losers’ and penalize schools that serve socio-economically disadvantaged students.”

Whatever happens in Los Angeles, Kirst points out that the California School Dashboard underwent a major revision last year, making it far more user-friendly than its first iteration. Other than in Los Angeles, “I don’t see any groundswell of objection,” he said.

California has put an enormous amount of effort and resources over the past five years into creating the dashboard and the accountability system it visually represents. The challenge for educators and elected officials — in Sacramento and at the local level — is whether they have the patience to give the new system time to work before jumping in to make changes, while still assessing when and if reforms of the multi-layered system are necessary.


Comments (8)

  1. CarolineSF 2 months ago

    The L.A. school activists I talk to say the proposal for a single school rating is very much designed exactly to “place blame (and) be punitive.” They view that as a tactic of a faction of the LAUSD board that includes Gonez and that is associated with the charter sector’s now-open goal of placing all students in charter schools. You can read about that in this L.A. Times article: https://www.latimes.com/local/lanow/la-me-edu-charter-leaders-confidential-planning-20190702-story.html

  2. SD Parent 2 months ago

    Well, this parent thinks that the state’s dashboard/color-coding system is a pretty worthless school evaluation system. First, it’s overly complicated. So many “metrics” make it hard to separate the information parents deem more important from the less important. Second, it gives more weight to improvement or decline in any given measure than to the absolute score. For example, the state should have given two scores for such things as standardized test results: one for the actual score and another for the rate of improvement from the prior year. The current system just obscures how poorly most students are performing/learning.

    Third, the system allows school districts to use worthless “metrics” for some criteria. Parents in San Diego Unified will tell you that parent engagement measures nothing of value – namely whether parents log into an online grading portal and how many eblasts the district sends out – which negates its value and has caused even the parent leaders of district advisory groups to feel frustrated and unappreciated.

    Fourth, some metrics have unintended (I hope!) consequences that do not help students. For example, high school students in SDUSD are now being strong-armed into CCTE “pathways” limited to what the local school offers since the district doesn’t provide transportation that provides for meaningful school choice. But, hey, it looks good on the dashboard when students complete three years in culinary arts or automotive tech or broadcast journalism (the only options at one high school, which offers almost no AP classes and no college courses), so you gotta choose one…

    Bottom line, as far as colleges are concerned, the only criteria on the dashboard that measure student success are the actual scores on standardized testing and graduation, and that’s what matters to parents, too. That’s not to say that the other factors aren’t important, but only if meaningful metrics are used and the issues are addressed in a way that actually helps students.

  3. Eric Premack 2 months ago

    I can’t think of anything that I value in life that I would ever gauge with a single number. Not my spouse. Not my health. Not a good meal. Not a glass of wine. Not a mountain sunset. Not a good book. Certainly not a school nor a college/university.

    The Dashboard is still in its infancy and severely stunted by the absence of meaningful statewide data, and by obscuring the absence of growth data with “change” data, but its limitations are no excuse for a single number approach.

  4. Manuel 2 months ago

    I don’t think that the School Performance Framework includes “much more than test scores.” The last version (presented to the Parent Advisory Council on May 23, 2019) states that 80% will be derived from SBAC scores for elementary and middle schools, while 65% will be used for high schools. This is only because its proponents insisted that a “college and career readiness” component be included. It is “worth” 15% and includes 4-year graduation rate, A-G completion rate, AP pass rate on more than two exams, and EAP results. How any of the components are weighted has not been made public. This lends credence to Prof. Kirst’s assertion that the final number is pulled out of thin air. (There is another major criticism: there is no inclusion at all of parent participation and/or satisfaction of any stakeholders in the “school climate” component.)

    There is another problem with the SPF: 40% of the final number is supposed to come from “growth” in the SBAC. Various presentations have alluded to this value as the difference in SBAC score as a student advances from one grade to the next. However, a presentation by CORE Districts to the LAUSD Board on November 13, 2018, explicitly stated that the growth is actually derived by “predicting” what a student’s score should be based on a calculation that uses the same mathematical methods as Value Added Models. The only difference here is that what is solved for is not the “influence” of a teacher’s “effectiveness” but the “expected” score the student should have as determined by “demographic and other adjustments.” As Prof. Kirst points out, is there any scientific basis for making these adjustments? Of course, what is “behind this curtain” was never presented to the parents who participated in focus groups where the latest “interface” to SPF was discussed.

    Those are the nuts-and-bolts of how this particular sausage was made. Whether it will be approved by the Board in its current form is another matter, given Ms. Goldberg’s resolution.

    I do, however, have an issue over something that has never been acknowledged by the testing mavens: if the distribution of scaled scores for the entire state cohort (by grade, of course) does not change significantly year to year, is it fair to expect any “growth” in any school’s SBAC scores? If it is not mathematically possible, then how can SPF live up to its premise?

  5. Frances O'Neill Zimmerman 2 months ago

    Largely due to lobbying efforts from the powerful California Teachers Association, the state abandoned carefully calibrated API reporting to the public because it got rid of all the standardized tests whose results supported it. At the same time and for the same reason, California dropped the benchmark High School Exit Exam. For a while, there was no reporting of California public school students’ academic standing of any kind.

    Finally, when the new “dashboard” system was put in place, no comparisons could be made from one year to the next because there was no data from the preceding year. The original new “dashboard” was unintelligible and has been re-worked, but it still reveals almost nothing for a family who wants to know where to send their kids for a solid K-12 public education. Maybe the Governor will fix this.

    No transparency, no accountability but a lot of smoke and mirrors.

  6. Bill Conrad 2 months ago

    All one needs to do is take a quick look at the 5-year pattern of LAUSD academic performance at http://sipbigpicture.com to see a failed school system! No amount of one number, two numbers, 10 numbers, or 100 numbers will be required to come to that realization.

    The state wants to broaden accountability to all manner of other indicators besides academic achievement which is and should be the main focus of the school districts! Just more fog of education!

    Until we seriously address the fundamental flaws in professional practices primarily through a T-Ball Color in the Lines Colleges of Mis-education system, we will continue to wander about within the fog looking for magical numbers!

    We need more chemistry and less alchemy in the magical world we call K-12 education. As long as the Organized Crime Network of Superintendents, state gumbas, and administrators keep us distracted on magical numbers, they can continue to maintain their 6-figure salaries, kickbacks, and perks!

    As it is all about the adults at the expense of children and families!

  7. Chris Bertelli 2 months ago

    Please, please, please stop referring to the dashboard as an “accountability system.” It is no such thing. It is also not used to “rank” schools or districts.

  8. Eleanor Sledgewick 2 months ago

    Great article! But I think it goes even beyond that. In our culture anything less than a perfect score is considered a failure. I noted this when after giving multiple Lyft drivers a 5 star rating, I gave a driver a four star rating because he was still very good, just not as good as some of the other drivers. The app immediately asked me what went wrong and gave me a list of critical elements that I could select.
    This inability to assess nuance has, unfortunately, provided grist for certain elements in our society to argue that California schools are a failure when that is far from the case.