The Legislature has given the State Board of Education an extra year to complete the next phase of a new school accountability system required by the state’s 2-year-old funding law.

The state board had requested more time, which legislators included in Assembly Bill 104 (section 22), the catch-all “trailer bill” that enacts the state budget details into law. The trailer bill also is a way to expedite non-controversial issues that need quick action.

For months, the board has been struggling with an Oct. 1 deadline for adopting “evaluation rubrics,” a set of uniform student and school performance standards. The Legislature mandated that the state board establish the standards and ultimately hold districts accountable for meeting them.

An example would be a statewide rate for high school graduation. The board would decide whether the target should be, say, 80, 85 or 90 percent and then require that districts report rates by school and student subgroups. Other likely metrics would be dropout rates and the percentage of students who graduate qualifying for admission to a four-year state university. The current rates in all of the measures vary widely by school and district, and by race and ethnicity. The state board would set not only realistic overall targets but also ways to measure progress toward the target for schools, districts and subgroups.

In writing the new finance law – the Local Control Funding Formula – the Legislature set eight academic priorities and said there should be evaluation rubrics for each of them, without specifying the number of standards. The Legislature also required districts to use the same metrics to determine annual performance goals in their Local Control and Accountability Plans, which they are supposed to write after consulting with parents and teachers.

Some of the eight priorities, like raising student achievement, have many metrics to choose from, including annual scores of the new Common Core exam, which will be reported this summer, or results on statewide science tests or high school Advanced Placement exams. Other priorities, such as parent involvement, student engagement and a district’s implementation of the new Common Core and science standards, have metrics that either may be difficult to quantify and compare statewide or, in the state board’s view, may be best left to schools and districts to determine, based on their own circumstances. One school’s priority for student engagement in an LCAP might be to increase participation in after-school clubs, while another school’s focus might be to reduce fighting and bullying.

“For too many years, policy makers have picked numbers out of the air, like the 100 percent proficiency” in English language arts and math that Congress adopted under the No Child Left Behind law, said state board President Michael Kirst. “We need to know how valid and reliable the metrics are and the research base behind them.”

In their discussions, board members reached a consensus that the goal of the evaluation rubrics should be self-improvement, not punishment, with outside intervention as a last resort. They don’t want to re-create the federal system, with its many trip wires leading to sanctions, which encouraged gamesmanship to avoid them, such as weeks of narrowly focused test prep. “There are a lot of lessons from NCLB we do not want to repeat,” Kirst said.

At the same time, a coalition of civil rights and children’s advocacy groups have been pressing the state board to keep the focus on equity and closing gaps in achievement among underperforming groups of children: African-Americans, Hispanics, low-income children and English learners. In a February letter to the state board, the coalition said the board should adopt both performance targets – the bar that all districts and schools must shoot for – and improvement targets, how much they need to grow within a set time to close the achievement gap.

In coming months, Kirst said, the state board will deal with a range of issues. Among them, it will explore:

  • The balance between standards that measure growth and those that set a single target;
  • The need for leading indicators that can serve as predictors or warning signs of future student performance, and trailing indicators that reflect students’ and schools’ longer-term performance and show progress – or the lack of it. Examples of the former are rates of student absenteeism, 3rd-grade reading scores and perhaps, if measurements improve, attitudinal and non-cognitive skills like perseverance. Examples of trailing or lagging indicators are the high school dropout rate and the percentage of English learners who have tested proficient in English after seven years of help in language acquisition;
  • Which standards should be determined locally and which should be set by the state, using common data;
  • Standards for career readiness and the world of work, which have proved more challenging to develop than those measuring college readiness.

Evaluation rubrics will serve as guides for districts in setting goals in their LCAPs and for county offices of education, which review the LCAPs, in measuring progress. After the state board has established the rubrics, it will move to the final phase: establishing timelines for schools and districts to meet the standards and the consequences for failing to do so. That could take the form of pairing low-performing schools with high-performing ones or, ultimately, for chronically failing schools, a state-directed takeover.

The delay with the rubrics will extend the hiatus between the old and new state accountability systems. In switching to the Smarter Balanced standardized tests, which are aligned with the Common Core State Standards, the Legislature has suspended for at least two years the Academic Performance Index, a three-digit number that rated schools and districts based on test scores. The state board must decide whether to re-create the index or absorb it into a multiple-measure system.

In the interim, Kirst said that the public will continue to measure districts’ and schools’ performance through the LCAPs and by the results of the Smarter Balanced tests in English language arts and math, which will be released in late summer.

Kirst said board members agreed that a one-year extension might be necessary, and some of the advocacy organizations wrote the Legislature supporting it. Ted Lempert, president of Children Now, confirmed that in an email. “It makes sense to provide more time to make sure the rubrics are done right, but it’s critical that the Board stay focused on providing transparency on where students and schools are succeeding and struggling,” he wrote.

The legislation gives the board until Oct. 1, 2016, to adopt the rubrics. That will require the state board to present the package at its July 2016 meeting – a deadline that Kirst said the board will meet.

Comments (9)

  1. Doctor J 9 years ago

    Does anyone know anything about a budget line item to fund 10 school districts $4.6 million that don’t qualify for concentration grants, to assist them with Class Size Reduction because of the loss of QEIA funds? At Monday night’s board meeting, the local teachers’ union president was taking credit for bringing it to the attention of the Legislature and saying MDUSD’s share is $2 million.

    Replies

    • navigio 9 years ago

      The Feb. LAO budget analysis mentioned that there was extra QEIA money being diverted to ERP. Perhaps that is where this came from? That report also suggested being careful about simply undoing QEIA class sizes.
      There is also the possibility of QEIA funds being returned to the state. If that happened, I’d expect they’d try to be used in a similar manner later.
      I didn’t see mention of QEIA in the trailer bill.
      That said, it seems odd that a school that qualified for QEIA would not qualify for LCFF grants (although it’s possible).

  2. Paul Muench 9 years ago

    Interesting comment from Linda Darling-Hammond on shifting the focus to improvement for all children and not just students deemed just below proficient. That will be an interesting topic to follow.

  3. Gary Ravani 9 years ago

    From the above article:

    “At the same time, a coalition of civil rights and children’s advocacy groups have been pressing the state board to keep the focus on equity and closing gaps in achievement among underperforming groups of children: African-Americans, Hispanics, low-income children and English learners.”

    As mentioned in the comments to Doug McRae, voluminous research shows that the “achievement gap” is largely in place when students arrive at the Kindergarten door. The gaps tend to increase over time, the more disadvantaged students are out of school on vacations and breaks. In fact, between K and 12th grade students spend about 17% of their waking hours in school and the remainder, 83%, with families and communities. If the “advocates” are truly interested in “equity,” they will cease the obsession with the 17% “tail” of the achievement gap and focus more on the 83% “dog” of the achievement gap. I believe they will find that those groups and politicians that remain steadfast in opposing any kinds of demands for equity in the economy, the true source of equity gaps for families and communities, will be far less amenable to constant scrutiny and criticism than are the SBE, schools, and teachers. Go on folks, try taking on the real bad guys. Maybe, just maybe, you can move the needle on this nation’s status as the wealthiest while simultaneously having the highest rate of child poverty of almost any industrialized nation. And CA is the wealthiest state in that nation with the most children living in poverty. And then there is CA’s status, even with recent improvements, of having fragile and very low funding per child for K-12 education.

  4. Doug McRae 9 years ago

    "We need to know how valid and reliable the metrics are and the research behind them" -- Michael Kirst, SBE President. I think we all can agree with Kirst's statement for any metrics used for a new statewide accountability system. My question is -- Why isn't this standard being used for the new Smarter Balanced test scores? The Smarter Balanced scores won't be valid reliable or fair until 2017 or 2018 at best [or useful for … Read More

    “We need to know how valid and reliable the metrics are and the research behind them” — Michael Kirst, SBE President.

    I think we all can agree with Kirst’s statement for any metrics used for a new statewide accountability system. My question is — Why isn’t this standard being used for the new Smarter Balanced test scores? The Smarter Balanced scores won’t be valid, reliable or fair until 2017 or 2018 at best [or useful for measuring the Common Core until 2019, according to Kirst in a Jan 2015 SacBee article]. But, at the end of the post, Kirst says that for the interim the public can continue to measure LEA and school performance through the LCAPs [without documented valid or reliable metrics] and Smarter Balanced test results [again, without documented valid, reliable, fair metrics].

    Methinks we have more than a little speaking out of both sides of the mouth going on here, using the lack of valid, reliable metrics as a justification for delay, but then saying we can go ahead anyway, using invalid, unreliable, unfair metrics for accountability in the interim.

    It is also curious that the SBE used a budget trailer bill to get legislative approval for the delay, rather than a standard policy bill with vetting by education policy committees in the legislature. Making or changing policy via budget trailer bills raises a red flag, usually indicating the sponsor doesn’t want to have an open discussion of the policy issue under consideration.

    The bottom line is that California has discontinued collection of valid reliable fair statewide metrics for student achievement for at least 3 or 4 or 5 years, and without centerpiece usable metrics for student achievement it will be virtually impossible to implement a valid reliable fair revised accountability system for at least 3 to 4 to 5 years. It should not be lost on the reader that this is exactly the outcome desired by anti-assessment anti-accountability folks, short of outright repeal or ban of any statewide accountability system at all.

    Replies

    • Gary Ravani 9 years ago

      Doug:

      When we talk about “metrics” and “accountability systems,” let’s be clear that those “mistakes from NCLB” that “we don’t want to repeat” include the fact that there was never a research base supporting the use of a “metrics-based accountability system” that would improve education in meaningful ways in the first place. Not only did the advocates of such systems come up with ludicrous metrics, like “100% of students will be proficient in math and ELA in xx years,” by picking them out of the air, the whole concept was picked out of the air. And now we do have the metrics of NCLB, as analyzed by the National Research Council as well as other legitimate researchers, which show that not only was learning not improved by the accountability systems, it was actually hurt by them.

      Ultimately, ETS, as well as other researchers, has come to a general consensus that school-related factors account for about one third of the variability in test scores (aka, the achievement gap) and family and community factors account for the other two thirds. Those family and community factors are directly related to conditions of poverty. Who is to be held accountable for the vast majority of factors related to the achievement gap? Volunteers line up to the right.

  5. Paul Muench 9 years ago

    Is the goal for college preparedness actually under consideration? That will be walking a fine line between student choice and the aspirations to have all schools prepare students for the future of their choice. Seems to me this issue is a good reason to overfund schools. Children need flexibility and choices on their path to building a future.

    Replies

    • John Fensterwald 9 years ago

      Paul: I have updated the piece to include source information on the evaluation rubrics, including presentations to the State Board in May and the webcast of the meeting. You may find David Conley’s presentation and testimony worthwhile. He is an expert on career readiness and has shared his views with the state board over the past year. He has relocated from Oregon to the Bay Area where he is continuing his work.

      • Paul Muench 9 years ago

        Yes, that was interesting. He seems to be saying that if only one indicator is used for school success, it should be A-G course participation, but multiple measures are far better. He also acknowledges that, at least so far, student aspirations have been left out of the school improvement (accountability) discussion.