For nine of the last 10 years, scores on California’s standardized tests rose steadily, suggesting the testing system was doing what it was designed to do: raise student achievement. That streak ended with Thursday’s release of the 2013 Academic Performance Index, or API, which also exposed a persistent achievement gap.
The share of schools meeting the statewide target of 800 on the API fell from 53 percent to 51 percent. Each year the state and individual schools get a base API from the prior year’s tests and a growth target calculated from that base. This year, the overall API declined from a base of 791 to 789. Elementary schools drove the decrease: 56 percent met the target, down from 59 percent last year. Half of all middle schools hit 800, and 31 percent of high schools made it.
The results also mark the last time the API will be based solely on the narrow measurement of the California Standards Test, as the state’s testing system is undergoing a radical change over the next few years while Common Core State Standards are implemented.
The API measures a school’s academic growth based on students’ scores on California’s standardized tests, known as STAR exams. Not surprisingly, when the State Department of Education released those results earlier this month, the number of students scoring proficient or better on the 2013 exams also fell slightly for the first time in a decade.
Passage rates on the California High School Exit Exam are also factored into the API for grades 10 through 12, although those scores account for only about 9 percent of a school’s overall API.
The API ranks schools on a scale from 200 to 1000. The target score of 800 means that about 60 percent of students scored proficient or better on the tests and about 40 percent met the basic level.
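That 800 target can be sanity-checked with a small calculation. The sketch below is a simplification (the actual API formula also applies subject-area and test-type weights): each performance band carries a point value, and a school’s score is roughly the weighted average of those points. The 875 (proficient) and 700 (basic) values are cited later in this article; the other band values are the commonly reported ones and are an assumption here.

```python
# Rough sketch of the API as a band-weighted average (simplified:
# the real formula also weights by subject and test type).
BAND_POINTS = {
    "advanced": 1000,        # assumed (commonly reported value)
    "proficient": 875,       # cited in the article
    "basic": 700,            # cited in the article
    "below_basic": 500,      # assumed (commonly reported value)
    "far_below_basic": 200,  # assumed (commonly reported value)
}

def api_estimate(band_shares):
    """Weighted average of band points; band_shares maps band -> fraction."""
    assert abs(sum(band_shares.values()) - 1.0) < 1e-9
    return sum(BAND_POINTS[band] * share for band, share in band_shares.items())

# The article's example: ~60% proficient, ~40% basic lands near 800.
print(api_estimate({"proficient": 0.6, "basic": 0.4}))  # 805.0
```

With 60 percent of students at proficient and 40 percent at basic, the weighted average comes out to 805, which is why that mix is described as roughly corresponding to the 800 target.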
State Superintendent of Public Instruction Tom Torlakson, during a phone call with reporters Thursday morning, described the statewide downturn as a “slight dip” and attributed it to years of budget cuts and the move to Common Core standards.
“All of this is despite very real challenges of budget cuts over the last five years,” Torlakson said, referring to the loss of about $20 billion and 30,000 teachers, as well as what he described as the “heavy lifting underway” as schools shift away from the old standards and the old testing to Common Core standards, a nationwide set of academic guidelines that have been adopted by 45 states.
The current academic year is the first that districts are required to implement Common Core, but some started earlier. Deputy Superintendent Deb Sigman said a survey by ETS, which administers the STAR exam program for California, found some correlation between those districts that were implementing Common Core and drops in their scores.
It’s the opposite case in the state’s largest district. In a news release, Los Angeles Unified said it’s “a hopeful sign for the future” that grades 6 and 9, which implemented Common Core standards last year, showed the highest gains in test scores with proficiency rates rising by 2 to 5 percentage points in English and math.
On the surface, a two-point drop may not seem worrisome, but the decline follows years of double-digit increases.
“That’s a pretty big drop off; it’s never been negative before,” said Doug McRae, a retired test developer who helped design the STAR exams.
Still, McRae said that based on actual numbers since the 2002-2003 academic year, when the standards-based tests were put in place, the state’s accountability system had a strong impact.
At that time, just 20 percent of all schools reached 800 or above.
“They showed gains,” McRae said. “The API attempts to measure whether academic performance is going up or down, and by that measure, academic achievement in California has gone up very nicely in 10 years.”
Still, California’s improved achievement hasn’t been reflected in the state’s scores on the National Assessment of Educational Progress, known as “The Nation’s Report Card.” The NAEP tests are given periodically to a sample of students across the country in reading, writing, math, science, civics and other subjects. On the most recent exams, taken in 2011, California’s fourth grade math scores were below those of 36 other states, and reading scores were behind 39 other states. In eighth grade, math and reading were both lower than in 44 other states. Forty-five states exceeded California on the eleventh grade science test.
Sigman attributed the discrepancy to differences in the tests.
“NAEP doesn’t assess California standards or the Common Core standards, so there’s a bit of a disconnect between what NAEP assesses and what is taught in our public schools in California,” Sigman said.
Despite the poor showing on NAEP and the overall drop in this year’s API, the results included positive signs.
“There are some (racial and ethnic) subgroups that actually made progress this year, which is a glimmer of something good,” said Carrie Hahnel, research director for EdTrust-West, an Oakland-based research and advocacy group.
API scores jumped by 5 points each for low-income students and students with disabilities. Scores for English learners grew by 1 point. That trend didn’t hold for students from ethnic and racial groups often considered at risk in school. API results dropped by 2 points for black and African American students, by 3 points for Native Americans and by 4 points each for Filipino and Pacific Islander students. Scores for white students fell as well, by 3 points.
“I don’t think anybody’s satisfied with where we are and what we need to do for our students in poverty and our students who traditionally don’t do as well,” Sigman said.
That should change with implementation of Gov. Jerry Brown’s Local Control Funding Formula, which gives schools additional money for high-needs students, Torlakson said.
Between the LCFF, Common Core standards and the new tests developed by the Smarter Balanced consortium that are aligned to Common Core, California’s API will be based on an entirely different set of weights and measures in the next few years. It’s likely to include graduation and discipline rates as well as harder to quantify matters, such as school climate and student attitudes about learning.
Since its inception, the state’s standardized testing system has been heavily criticized by those who say it encouraged a “teach to the test” culture at the expense of critical thinking and innovation. Regardless of whether the STAR system was loved or hated, it has been the driver for the more sophisticated accountability system in the works.
“Straightforward, easy-to-understand summaries of performance will always have some value and have some place,” said Eric Crane, a senior researcher at nonprofit educational research firm WestEd, who helped develop the API system when he worked at the state Department of Education. “However, more of a whole school view on what students and teachers and, by extension, administrators are doing, can only be helpful.”
Before the API, data collection and analysis was the province of principals and superintendents, Crane said.
“What the API did was really open that conversation to parents and the community to have discussions about school performance that were data based,” Crane said. “In policy work sometimes just the sunlight is a big thing.”
Comments (10)
Manuel 11 years ago
What is worse is that this tragedy is based on a decision made in 2002 without the benefit of data analysis. Or at least data analysis that the State Board of Education was made aware of.
This is what I have been able to find out: While discussing how to use the accountability measures taken by California under the mandate of the Public School Accountability Act of 1999 to meet the requirements of NCLB, this was recorded by the secretary of the State Board of Education in the Final Minutes of the December 11-12, 2002 meeting:
Please note that no proof is offered on how this “goal” of 800 was determined to be “reasonable.” Also note that given that the CST had only been administered officially for one year, there was no way of determining whether this goal was achievable by any of the other schools staff looked at. Please also note that, as Doug has told us, the base year for the CSTs was defined to be 2002. How many schools met the goal that year? For example, I looked at LAUSD’s API base for 2002 and it was 595 (2002-03 growth was 626). Given this, how could the staff then under Mr. Padia have determined that 800 was a reasonably attainable goal for all schools when a district with roughly 10% of the state’s students is so far off this “reasonable” goal?
Since I have no idea how this conclusion was arrived at, I can only conclude that staff pulled this one out of thin air. Or maybe from some recondite cavity. Or from under a rock. What do you guess, dear reader?
Replies
Doug McRae 11 years ago
Manuel: The goal of 800 for the API was not pulled out of thin air. The design for the API came primarily from the Public Schools Accountability Act (PSAA) Advisory Committee, which held frequent meetings from 1999 thru 2001 to review and recommend how the API would be constructed, with most of the technical work done by its Technical Design Group (TDG) chaired by Ed Haertel from Stanford, who also was a member of the PSAA Adv Gp. This advisory structure made recommendations to the SPI and the SBE, who tinkered with their recommendations (at times) before they became policy. The goal of 800 was their considered collective judgment that setting the API goal at 875 (which was the point value assigned “proficient” for test scores) would be too high at that time, but the goal could be moved up in future years if necessary. The point value assigned “basic” was 700, which was considered too low for a goal. So, they compromised at 800. Over time, the PSAA has considered recommending increasing the goal gradually up to 875 over a period of (say) a half dozen years, but with the budget cuts and financial dips from 08 thru 12 this consideration never was forwarded to the SPI or SBE for action. Bill Padia was the chief CDE staff person who worked with the PSAA Adv Comm when it was designed and constructed, and the reference to Bill in the SBE minutes reflected that he was reporting out to the SBE the recommendations from the PSAA Adv Comm and the SPI.
The API is a complex data system, and all complex data systems have their glitches and anomalies, and are subject to misinterpretations and overinterpretations, especially the longer they are operating. All and all, the API has done a good job of tracking academic achievement over time for CA, which is what it was designed to do, in my opinion. For the most part, controversies over API scores have been over interpretations rather than the mechanics of its construction, and of course everyone has a right to their own opinions on the “meaning” and “effects” of API scores. But, as a designer of large scale data systems for the past 40+ years (assessment systems rather than accountability systems), I give the API system good marks overall, much better than accountability systems in most other states and unquestionably far better than the federal AYP accountability data system that has now been widely discredited. Doug
el 11 years ago
This is kind of a deep in the weeds question, and I ask it out of a desire to increase my understanding.
All and all, the API has done a good job of tracking academic achievement over time for CA, which is what it was designed to do, in my opinion.
Doug, can you explain to me how you are testing this hypothesis, that API is tracking academic achievement effectively? Are you comparing it to other measures? Are you deconstructing the components of scores for some particular schools that you track and verifying that the amalgamated score makes sense given the makeup and change of test results for those schools?
I’d like to know more about how this number is being validated from people who have more visibility into the data than I do. Thanks as always for your willingness to share your knowledge.
Manuel 11 years ago
Doug, thank you for sharing the history of how it came about since there is no “paper trail” in the InterTubes.
Having said that, to say that it was the “considered collective judgment” that begat what we deal with is an enormous leap of faith on the correctness of this decision. It doesn’t seem to me that it has ever received much scrutiny over the years, and I think it is time to ask if such a number is reasonable considering the behavior of the CST over the last eleven years as well as the importance given to this number by the politicians and educrats when “accountability” is used as a convenient cudgel against communities, students, teachers, and administrators.
To be blunt, the actions of the high priests of educational policy need to be questioned by the masses because “trust me” doesn’t cut it anymore.
navigio 11 years ago
“Straightforward, easy-to-understand summaries of performance will always have some value and have some place,” said Eric Crane, a senior researcher at nonprofit educational research firm WestEd, who helped develop the API system when he worked at the state Department of Education. “However, more of a whole school view on what students and teachers and, by extension, administrators are doing, can only be helpful.”
Before the API, data collection and analysis was the province of principals and superintendents, Crane said.
“What the API did was really open that conversation to parents and the community to have discussions about school performance that were data based,” Crane said. “In policy work sometimes just the sunlight is a big thing.”
If you have 50% of your students scoring in the proficiency band and 50% in the basic band, your API will be slightly above 800. If you have 50% proficient and 50% in the below basic band, your API will be slightly above 700. If you have 50% proficient and the rest in the far below basic band, your API will be well below 600. All three of those are examples where half the students are proficient (or are at ‘grade-level’ as the media often terms it).
On one hand, API could be given credit for being sensitive to such differences (they are, after all, relevant if the performance bands have any meaning). On the other hand, an approximately 250 point API difference between those 3 scenarios would indicate a potential over-sensitivity to factors that have nothing to do with learning. The majority of YoY changes in API at a school level are a result of demographics (i.e., who is taking the test this year). And obviously CST results are highly correlated with those demographics. It is not uncommon for schools to have variations in their proficiency rates that are even opposite of their API results (a bit ironic, as proficiency rates are used as an AYP metric).
Anyway, I strongly disagree that API is a window into the inner workings of a school. I do agree it is simplistic, and I think it’s even inappropriately misleading at a school level, unless all people care about measuring is the color of the student body and how much their parents earn (which are admittedly more important factors for many than we might want to admit).
And there is no ‘discussion’ happening about API ‘in the community’. What goes on is moving schools based on this number. That is a tragedy.
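The three scenarios above can be worked through numerically. The sketch below uses the band point values cited in this thread (875 for proficient, 700 for basic); the 500 and 200 values for the lower two bands are the commonly reported API band values and are an assumption here. The exact results come out a bit lower than the comment’s round numbers, but the roughly 250-point spread between schools with identical proficiency rates is the same:

```python
# Band point values: 875 and 700 are cited in the thread; 500 and 200
# are the commonly reported values for the lower bands (assumed here).
BAND_POINTS = {"proficient": 875, "basic": 700,
               "below_basic": 500, "far_below_basic": 200}

def api_estimate(shares):
    """Band-weighted average; shares maps band -> fraction of students."""
    return sum(BAND_POINTS[band] * frac for band, frac in shares.items())

# Half proficient, half in one other band -- the same proficiency rate
# produces very different API-style scores.
for other in ("basic", "below_basic", "far_below_basic"):
    print(other, api_estimate({"proficient": 0.5, other: 0.5}))
# basic            787.5
# below_basic      687.5
# far_below_basic  537.5
```

All three schools are “50 percent proficient,” yet the band-weighted scores span 250 points, which is the over-sensitivity the comment describes.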
navigio 11 years ago
very interesting… so one cause of the drop is common core, yet ETS says common core is causing the increase. is there a link to the ETS survey? The 9th grade increase wasn’t just in LAUSD, so maybe 9th grade cc implementation was a state-wide phenomenon… ?
Replies
Kathryn Baron 11 years ago
Navigio, I have asked for a copy of the ETS survey, but, so far, have not received it.
navigio 11 years ago
Thank you Kathryn.
navigio 11 years ago
Ok, I had a bit of time on my hands. I took the 2013 CST ELA results for each grade in every district in CA (excluding indirect charters, because the way they’re implemented in the database file makes including them difficult) and looked at the number of districts that had improvements in scores (proficiency rates) compared to last year, to see whether it seems likely that LAUSD’s improvement was an anomaly (i.e., specific to its own curricular decisions).
I still need to do some tweaking and more validation, but it appears over 80% of districts had score improvements (or held steady) in 9th grade (compared to about 48% for all grades in the state, and below 45% for elementary and middle school grades).
I did not take the size of the district into account, rather this is just the number of districts that improved, regardless of size (since the claim was district-wide policy that created the effect). This seems to indicate that if in fact CC was the reason for LAUSD’s improvement at 9th grade (6th and 10th grade had a similar dynamic, just not as extreme), then it must have been done at most of the districts in our state as well; specifically in 9th grade.
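The comparison described above can be sketched in a few lines. The district names and rates below are made-up stand-ins (the real figures would come from the state’s CST research files); the logic is the point: count districts whose grade-9 ELA proficiency rate held steady or improved year over year, ignoring district size.

```python
# Hypothetical per-district grade-9 ELA proficiency rates (percent).
# Real values would come from the CDE's STAR research files.
rates_2012 = {"District A": 48.0, "District B": 52.5,
              "District C": 61.0, "District D": 39.5}
rates_2013 = {"District A": 50.1, "District B": 52.5,
              "District C": 59.8, "District D": 41.0}

# A district counts if its 2013 rate is at or above its 2012 rate.
improved_or_steady = [d for d in rates_2012
                      if rates_2013[d] >= rates_2012[d]]
share = len(improved_or_steady) / len(rates_2012)
print(f"{share:.0%} of districts held steady or improved")  # 75%
```

Run over every district and every grade, the same count per grade is what produces the comparison in the comment: over 80 percent of districts improving in 9th grade versus under half for the other grades.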
I am going to go out on a limb here and posit that the reason for the increase was not CC implementation (especially as other leaders mentioned that as the reason for their declines), rather something very specific related to either the tests or the students. Given the relative consistency of the state’s demographics compared to last year, I’m going to lean toward the test. Especially since ETS hasn’t been able to provide any evidence of the explanation they hinted at…
.. not that anyone cares anymore..
el 11 years ago
Thank you for doing this work, navigio. I have found it extremely intriguing.