Last spring more than 3 million students in California, the largest number ever to take an online test in the state, took field tests of new assessments aligned to the Common Core state standards without major technical breakdowns or system crashes, according to state officials.

Just as California avoided the kind of massive online breakdowns that plagued the federal healthcare.gov website, education leaders here are now optimistic that when the full battery of tests is administered this spring for the first time, the process will go relatively smoothly.

The field tests of the assessments produced by the Smarter Balanced Assessment Consortium were intended to be a practice run for the full rollout this spring, when all of California’s 3rd- through 8th-grade students, along with 11th graders, will take the assessments in both English language arts and math for the first time. They will replace the multiple-choice, pencil-and-paper tests, known as the California Standards Tests, that students had taken each spring for 15 years until 2013.

After the field tests were administered, some news reports documented a range of problems, including students struggling to master the technicalities of taking a test online instead of filling in bubbles with a pencil. But state and Smarter Balanced officials interviewed by EdSource believe that some of the problems that occurred last spring have been dealt with, or can be averted by this spring with additional preparation and planning.

California Department of Education officials say their preliminary conclusions about the field test process are based on online surveys of school districts and eight focus groups of key education constituencies, including parents and students. Two of the groups focused on English learners and special education. The department will present an official report of its findings to the State Board of Education in time for its meeting in November.

What is still not known is how well students will perform on the most important part of the new assessments – the academic content. No scores on last spring’s field tests in either math or English language arts were published, and how well students do will be known only when their scores are published for the first time after the full assessments are administered this spring.

Because California has by far the largest number of students of any state – more than 6 million – what happens here will have an impact on the overall implementation of the most prominent reform now underway in the nation’s schools.

“People were very nervous to begin with, and through our partners with the Education Testing Service, county offices and everyone else involved, things went remarkably well,” said Sue Burr, a member of the State Board of Education, at its meeting in Sacramento on Sept. 3, in response to a presentation by California Department of Education officials.

Even in Los Angeles Unified, which issued a detailed report documenting a range of problems at individual school sites, officials say the field tests went well for the nearly half million students who took them.

“It was a major challenge, but it went better than expected,” said Cynthia Lim, executive director of the district’s Office of Data and Accountability. “A year ago, if you had told me that 450,000 students would take this test online, I wouldn’t have believed it, but it actually happened.”

Leading up to the tests, districts were provided with detailed instructions about how to gain access to the online testing system. For months beforehand, districts could participate in workshops or webcasts on any number of issues related to the new tests.

Unlike other states that administered the field tests to a sample of students, California chose to administer them to all eligible students. Of 8.9 million test sessions – some students logged in for two or three sessions to finish the various parts of the new assessments – 97 percent were completed.

At the local level, school districts are still working through a range of technical problems.

In Los Angeles, principals and test coordinators have identified problems such as not having enough iPads, laptops or desktop computers at some schools for students to take the test in a timely manner. In some instances, Lim said, the field tests were spread out over a six-week period so students could take the tests on a staggered schedule. School personnel said that lengthy period of time was too disruptive of school routines, and that the testing period should be shorter. Officials also reported that students experienced “log-in issues” with Smarter Balanced software, and students “were regularly kicked off.”

Diane Hernandez, director of the Assessment Development and Administration Division at the California Department of Education, said that the report to be presented to the State Board of Education in November will give a fuller picture of problems at the school site level. She said there were “some gaps” in broadband access at some schools, but mostly in small rural districts. To fill those gaps, the department last month announced a fund of $26.7 million, known as the Broadband Infrastructure Improvement Grant program. The state last week released a preliminary list of 300 schools – many in remote locations – that may be eligible to apply for the money.

Hernandez said the biggest problem encountered by districts was resetting the passwords they needed to gain access to the testing system, known as TIDES, now renamed TOMS (Test Operations Management System). But districts having difficulty were able to call the California Technical Assistance Center using an 800 number to get immediate help, Hernandez said. The state has contracted with the Educational Testing Service to run the center.

This year districts have received updated – and detailed – instructions for administering the tests in the spring. Those are posted online on a website dedicated to the new assessment system, known as the California Assessment of Student Performance and Progress (CAASPP).

Smarter Balanced officials also said they experienced few major problems with the field tests. “We did not have any interruption of service in 55 days of administering the field test,” Joe Willhoft, the executive director of the Smarter Balanced consortium, said in a webinar earlier this month.

He said some students had difficulties logging in due to unclear instructions that were given by test administrators. Some also had difficulties with “text to speech,” zooming, audio and other technical features of the online assessments, but he said the Smarter Balanced help desk was able to respond to those concerns, and those technical issues have been fixed. Willhoft said there were disruptions due to inadequate bandwidth at some schools, but that in general the feedback has been “overwhelmingly positive.”

Later this fall, districts will have another chance to prepare for the spring administration of the new assessments when the Smarter Balanced consortium provides them with “interim assessments” to gauge how students are doing. Along with other districts, Los Angeles Unified’s Lim said that the district learned a great deal from the field tests, including “how to respond quickly to schools” experiencing difficulties administering the test. The district is currently hosting focus groups of test coordinators to ensure that things go more smoothly in the spring.

Last year, 95 percent of field test sessions that students started at LA Unified were completed, without students being bumped off due to password malfunctions, bandwidth problems or other technical glitches. This year, Lim said, the target is for all students to do so.

She said some teachers and principals said that despite the new technical challenges of having to administer online assessments, they prefer them to the more cumbersome pencil-and-paper tests. The online tests have eliminated the need to collect test booklets, sort and bundle them, and then take them to a test center for the results to be collated. Under the old system, the district needed a warehouse where, for a full month before the annual testing period, employees packaged and distributed test materials to the schools; it no longer does.

“This is much more manageable,” Lim said. “It is the wave of the future.”



Comments (3)


  1. Robert Nacario 9 years ago

    While Smarter Balanced reports few major problems we have to consider the difference between a computer administered test and a computer adaptive test. There were problems reported with the computer administered test that was largely linear in design. That is, a problem was presented to the student and students provided their answer, really little if any processing of data.

    The live assessment will be computer adaptive. That is, the problem is presented to the student, the student provides an answer, and the host computer accepts and processes the answer before providing the next suitable problem. Many, many, many more data packets must travel over these lines, multiplied by as many students taking the assessment – which will be many more, because not all consortium states administered the assessments to all of their students.

    Over the past 2 years our district has administered computer adaptive assessments, and does so in an environment that establishes a national norm group composed of hundreds of schools. While we encountered challenges similar to those described in the article early on with computer adaptive assessments, many have been resolved and interruptions have been few this current year.

    While we all hope the Spring assessment goes smoothly, the optimism many feel should be conveyed with caution and hopes that bandwidth is adequate on both sides, and that data processing capacity on the host side is as adequate.

  2. Doug McRae 10 years ago

    This post focuses on only the process used to collect the “test” information, i.e., the new technology involved for computer-administered tests. It masks the substantial issues involved in generating meaningful credible actual test scores . . . whether students have sufficient actual instruction on common core standards prior to operational testing to permit valid test scores, or the fact that while Smarter Balanced may have generated enuf qualified test questions to deliver an actual full test spring 2015 [the so-called “field” test was not an actual test, but rather an exercise to try-out test questions for potential inclusion in an actual test], it still needs actual test scores from spring 2015 to generate credible scoring rules for the test (i.e., “cut scores”) for valid reliable fair individual student and aggregate group scores. Such scoring rules will not be available until probably Sept 2015, to release actual results probably 4th quarter 2015.

    The technology issues are important, and CA’s experience this last spring went better than expected. Yet to be shared by Smarter Balanced are data on how many qualified questions were developed, or data on whether computerized test administration revealed fairness or inequity for major subgroups such as socio-economic disadvantaged students or English learners. We need these data to fully evaluate the results of the Smarter Balanced 2014 test development exercise. Addressing technology issues is a necessary thing to do before implementing a new statewide testing regime, but not sufficient for passing judgment on whether new Smarter Balanced tests are ready for full implementation spring 2015. Data from the issues not addressed in this post are needed for the latter judgment.

    Louis, can you provide the links for the website mentioned in the section of the post quoting CDE personnel and the webinar mentioned in the section of the post quoting SBAC personnel?

    Replies

    • el 10 years ago

      It’s pleasantly surprising to hear that the technical aspect went relatively smoothly… but honestly I’m a bit skeptical too. People are a lot mellower about the problems they encounter when the test doesn’t count.

      97% of 8.9 million “successful” sounds great, but that means that 267,000 kids weren’t able to complete. How does that compare to the paper and pencil tests? What will the procedure be for those kids? Will they have to/get to take it again? Will the score stand as is? If they’re evenly distributed the situation is different than if that means that whole schools were affected. (And, of course, it matters if we’re using these scores for diagnostics or to decide who to fire and which schools to close.)

      I’m glad to hear that there will be funding for more broadband. I hope that sooner rather than later every school in California has a true broadband connection and that there will also be money for more devices in schools so that we’re not testing over a 6-8 week period. When you have statisticians trying to explain differences in test scores as being equivalent to N days in school, one imagines that a 40 day difference in test date would have some impact.