New California School Dashboard comes under scrutiny from workshop participants

Juwen Lam, executive director of research, assessment, accountability and partnerships for the Alameda County Office of Education, discusses local indicators for the new California School Dashboard.
Credit: Theresa Harrington/EdSource Today

As school district staff, parents and community members learn to use the new California School Dashboard launched last week, many are raising questions about its usefulness in its current version.

Several participants at a workshop presented this past Saturday in Oakland by the California Collaborative for Educational Excellence, or CCEE, expressed concern about how current the data used in the dashboard are, how school districts will measure indicators like school climate, and how this information should be incorporated into districts’ Local Control Accountability Plans, or LCAPs.

Even those presenting the workshop were not able to answer all the questions, but they noted that the dashboard is expected to improve through a cycle of “continuous improvement” based on feedback. The State Board of Education is calling the initial release a field test to familiarize districts and the public with how to use the data.

“How do we ensure that districts are really taking this seriously?” asked Monica Scott Green, a parent advocate in Oakland who works with some district schools.

The dashboard includes color-coded data related to several indicators the State Board of Education has decided are priorities in measuring school and district success. Colors range from red and orange – at the bottom end of the spectrum – to yellow, green and blue at the top end. Green is the state’s target.

But it is a work in progress, with some data available now and some to be added later. The dashboard is expected to be fully functional next fall, when the state will begin using it for accountability purposes. The state data are available online for the public to look up individual schools and districts. EdSource has also created a School Data Dashboard with a comparison clipboard that allows users to compare results among schools or districts.

By next fall, the state website is expected to include six state indicators of progress for schools and districts, along with four local indicators. Data for the state indicators will be gathered and reported by the state, while districts will determine and report data for the local indicators.

State indicators include: chronic absenteeism; graduation rates; suspension rates; academic performance for grades 3-8 on standardized math and English language arts tests; English learner progress; and college and career readiness. Local indicators include: providing basics such as qualified teachers, appropriate instructional materials and facilities; implementation of academic standards; parent engagement; and school climate.

Districts or schools with indicators that show up red or orange for two years in a row for specific subgroups of students will be targeted for interventions. They will be required to highlight these as areas of greatest need and performance gaps in district accountability plans, said Michelle Magyar, CCEE senior manager for training.

“This allows districts to tell their story,” she said. “But we think it’s really important to go beyond state indicators. We’ve been describing it as a first entry point.”

Currently, the dashboard does not include color ratings for college and career readiness, chronic absenteeism or the local indicators. It also doesn’t include “detailed reports,” which are expected to be added in April to allow the public to compare state indicators to local indicators, Magyar said.

Data for foster youth are also missing. Magyar said she expected those data to come next year, but added that she wasn’t sure whether data from previous years would be added.

“The state Department of Education is realizing as they’re updating their technical guide that they need to provide some of those details,” she said.

During a session about the local indicators, questions were raised about the reliability and validity of the results, since districts are free to develop their own measurements. And some questioned why the State Board of Education is allowing districts to say they “met” the requirements for these indicators simply by reporting the data to their boards and the public, without ensuring that they actually make progress toward improvement.

“This is where a lot of confusion came about in the last few weeks,” said Juwen Lam, executive director of research, assessment, accountability and partnerships for the Alameda County Office of Education, who presented the workshop. She noted that the state board changed from “met/not met” to “reported/not reported,” then back to “met/not met.”

“I’ve heard a lot of frustration from a lot of districts because it’s very difficult to plan if we’re shifting things around,” Lam said. “The meaning behind the indicator doesn’t seem like it has shifted too much, but we are still in conversations about this.”

Districts can use “self-reflection” tools or questionnaires to measure their local indicators – or they can come up with their own. But Lam said measuring “met or not met” for two years in a row “is somewhat misleading,” because it implies a district set a goal and then either met it or fell short of it.

“In this model, how I meet that criteria is I have reported that data to my stakeholder,” she said. “So, as long as I report my data, I have met the goal.”

Several participants also raised questions about the best way to measure parent engagement.

“What about determining validity and reliability?” Green asked. “If the district develops it, is that to be accepted?”

Lam said she has heard many concerns about this.

“How do you ensure it’s valid and reliable?” she said. “The short answer is: you don’t.”

She suggested that local indicators should be compared to state indicators to assess whether the total picture makes sense.

“If school climate is rated high, but the suspension rate is red, that is a red flag to me,” she said, referring to red as the lowest level. “Ask: ‘How is it I’m measuring something locally here, but not getting consistent results over here?’”

The student advocacy group Children Now, which sent a representative to the workshop, has also expressed concerns about the dashboard’s content and functionality. Samantha Tran, the group’s senior managing director for education policy, said the fact that much of the data are not current is a problem. Data related to graduation rates, suspension rates and English learner progress are all from 2014-15, with part of the color-coded calculation based on the change from prior years.

“There’s a pretty significant data lag,” she said. “If we’re really going to use this data for planning and for accountability, it needs to be more timely.”

Tran was also concerned that there is no link on the dashboard to the 5×5 report cards, which plot each indicator’s current status against its change over time and include informative breakdowns by student subgroups and schools for the state indicators.

In addition, she said, it would be helpful to be able to compare several schools in a district side-by-side across all indicators.

“I think the nice thing about this moment is that we actually have something to play with and work through,” Tran said of the dashboard. “It will make the conversation a lot more real. We’ve been raising these issues – and now there’s something to get your head around.”
