California plans to roll out computerized testing aligned to Common Core standards in 2015. Image from Flickr

A majority of school districts said they’re ready for online tests aligned to Common Core. Credit: Flickr

A majority of the school districts and charter schools responding to a state survey indicated they have the technology to offer computer-based testing for the new Common Core standards. But the one-third to 40 percent of districts that expressed only some or little confidence they can pull it off could signal trouble complying with a new state law requiring all districts to give the Common Core math or English language arts field test six months from now.

That testing requirement, stipulated under the recently signed Assembly Bill 484 – which directs that the field, or practice, test be given instead of the state tests required under federal law – is also putting the state in conflict with U.S. Secretary of Education Arne Duncan.

The number of districts and charter schools not capable of giving the Common Core tests could be larger, since only about two-thirds of districts responded to the state survey, which was taken between June and September. However, the charters and districts that responded enroll 87 percent of the state’s students and include the 25 largest districts, the state Department of Education said in a memo released Wednesday. And seven out of eight districts also indicated they plan to use some of the one-time money they’re receiving for Common Core implementation to fill in gaps in technology necessary to offer the online tests. The Legislature approved $1.25 billion for Common Core and left it to districts to decide how to spend their share of it over the next two years.

The field test next spring is intended to help the federally funded collaborative of states creating the assessment, called the Smarter Balanced Assessment Consortium, choose the test questions and determine the preliminary grading scale for the official Common Core tests. Those tests will first be given in the two dozen states in the consortium, including California, in spring 2015. Districts that can't administer the assessments by computer will be able to give the paper-and-pencil version for up to three years.

The field tests will afford districts a run-through on giving computer-based assessments, so that they can work out technical kinks a year ahead of time. Teachers will see the kinds of questions that will be on the formal test.

Results from the survey indicate that the biggest hurdle many districts anticipate is human, not technological. While 70 percent of respondents expressed confidence they have the Internet bandwidth required and 58 percent said they have computers that meet the test specs, only 46 percent expressed complete or considerable confidence they have adequate support personnel to handle the testing and 20 percent expressed little confidence. Many districts cut back on tech support during the recession and remain understaffed. Most districts acknowledged they need additional peripheral equipment, such as headsets and keyboards.
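
The bandwidth question, at least, lends itself to a back-of-the-envelope check. As a rough sketch – the per-student figure below is an illustrative assumption, not an official Smarter Balanced specification – a school's concurrent testing capacity is simply its connection speed divided by the bandwidth each test taker consumes:

    # Rough sketch of a bandwidth capacity check. The per-student figure is an
    # illustrative assumption, not an official Smarter Balanced specification.
    PER_STUDENT_KBPS = 20        # assumed sustained bandwidth per test taker
    SCHOOL_CONNECTION_MBPS = 25  # hypothetical school connection

    concurrent_test_takers = (SCHOOL_CONNECTION_MBPS * 1000) // PER_STUDENT_KBPS
    print(concurrent_test_takers)  # -> 1250 students testing at once

Under those assumptions, even a modest connection supports a large number of simultaneous test takers, which is consistent with bandwidth being the area where districts expressed the most confidence.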

Deputy Superintendent Deb Sigman

“The big takeaway for us, affirming what we’ve been hearing in conversations with districts, is the issue about having staff on the ground,” said Deb Sigman, deputy state superintendent of public instruction and a member of the Smarter Balanced executive committee. “We’re worried that districts have had cutbacks.”

Sigman said that, in response to the survey, the Department of Education plans to use savings from not administering state assessments to start a help desk for districts’ testing issues and will offer more training for district personnel. The department will bring a plan to the State Board of Education at its meeting next month.

The survey didn’t deal with the question of whether students, particularly in elementary school, will be competent enough to take a computer-based test. They will need keyboarding skills and familiarity with computers for the tests to accurately reflect their knowledge of Common Core standards.

The field test will be given between early March and early June 2014. Two-thirds of the 880 districts and charter schools expressed solid confidence in their ability to give the test during that 12-week window, and only 8 percent had little confidence they could meet the deadline. But that time frame may shrink considerably a year later, when the actual test is first given. A standard test period is five weeks, to preserve test security and to reduce advantages to districts that give the test after additional weeks of instruction. But a tighter window would pose additional equipment and capacity challenges for some schools and districts.
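
The capacity arithmetic behind that concern is easy to sketch. As a rough, hypothetical illustration – the per-school numbers below are invented for the example, not drawn from the survey or from Smarter Balanced specifications – the number of testing devices a school needs rises sharply as the window shrinks:

    # Back-of-the-envelope sketch of why a shorter testing window raises a
    # school's device needs. All numbers are hypothetical illustrations,
    # not Smarter Balanced requirements.
    def devices_needed(students, sessions_per_student, window_weeks,
                       school_days_per_week=5, sessions_per_device_per_day=3):
        """Minimum devices needed to cycle every student through testing."""
        total_sessions = students * sessions_per_student
        sessions_per_device = window_weeks * school_days_per_week * sessions_per_device_per_day
        return -(-total_sessions // sessions_per_device)  # ceiling division

    # A hypothetical 500-student school, two testing sessions per student:
    print(devices_needed(500, 2, window_weeks=12))  # 12-week window -> 6 devices
    print(devices_needed(500, 2, window_weeks=5))   # 5-week window -> 14 devices

Under those assumed numbers, cutting the window from 12 weeks to five more than doubles the hardware the same school would need to have on hand.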

The Smarter Balanced tests will be given to all students in grades 3 through 8 and grade 11.

AB 484 mandates that districts offer either the math or English language arts Common Core field test – not both – to all students. Sigman said the survey results validated the Legislature's decision not to overstress districts by requiring both field tests. But that decision also put the state at odds with Duncan, who had indicated a willingness to grant states a one-year waiver from federal testing requirements if each student took both the English language arts and math tests. Duncan could fine the state $16.5 million – 1 percent of Title I money coming to California used to operate the state Department of Education – or potentially more for being out of compliance.

 


Comments (35)


  1. Jerry Heverly 10 years ago

    I often feel very dumb when I read these reports. API suspended? I read as much as I can about Cal. education and yet I had no clue. API suspended for two years? Which throws a monkey wrench into AYP? I pray all this is true. But, alas, I’m certain my district bureaucrats will carry on as if they both were unchanged. After all, how to justify those salaries if there is no data to track? Our CCSS training says pacing guides are outmoded for the new methods, yet when I ask about dropping our pacing guides I get blank stares (are you nuts? drop our beloved pacing guides?)

    Replies

    • navigio 10 years ago

      Hi Jerry. It hasn't actually been suspended yet, but a law was passed recently to modify testing behavior, and as part of that law, the discretion was given to the state super and BoE to suspend it if they felt the API wasn't going to be accurate. The relevant part of the law appears to be:
      (2) A school or school district annually shall receive an API score, unless the Superintendent determines that an API score would be an invalid measure of the performance of the school or school district for one or more of the following reasons:
      […]
      (F) A transition to new standards-based assessments compromises comparability of results across schools or school districts. The Superintendent may use the authority in this subparagraph in the 2013–14 and 2014–15 school years only, with approval of the state board.

      Obviously, I expect we'll hear a whole lot more about this if/when it actually happens. John appears to think it's a given. It would be interesting to hear whether there was some indication by the board/super that implied that.

      • John Fensterwald 10 years ago

        navigio: Because of the field test for Common Core in grades 3-8 and 11, it’s clear to me and consistent with the word I’m hearing that API will be suspended for 2014. You could do a high school API, but it would consist of only 10th grade science and the high school exit exam. Not much value there.
        My hunch is that the State Board will not want to use the first year scores for Smarter Balanced in 2015 for the API with implications for accountability, but there will no doubt be some discussion. And suspending API a second year risks further issues with the feds. We’ll see.

        • Doug McRae 10 years ago

          John — Yup, suspending API for 2014 is pretty much a done deal with the only potential hitch being ADuncan’s reaction to CA’s “double testing” waiver request. I agree with your analysis for suspending API for 2015 also. In addition to questionable validity of potential SB scores in 2015, there are no plans to develop equivalency between paper/pencil tests and computer-adaptive tests before the tests are administered spring 2015, which means we’ll have apples and oranges for scores in 2015 not suitable for API calculations. And, of course, derived scores (basic/proficient/advanced) for the spring 2015 testing cycle are not likely to be available until fall 2015, so that also throws at least a delay wrench into any plans to use 2015 SB data for API calculations. There has been no discussion at all, as far as I know, how CA plans to finesse federal AYP accountability requirements in 2015 . . . . I guess they’ll be figuring out how to fly that part of the SB airplane after it gets off the ground.

        • navigio 10 years ago

          Thanks for the clarification, John.

  2. Stephanie 10 years ago

    Does anyone know why we are still required to give the science CST in grades 5, 8 and 10?

    Replies

    • John Fensterwald 10 years ago

      I’ll be writing more about this issue this week, Stephanie. I know it sounds crazy to continue testing on old standards, while districts are supposed to begin teaching the new Next Generation Science Standards, but federal law — NCLB – requires testing annually in grades 5, 8 and 10. Unlike the tests for Common Core standards in English language arts and math, which are in the works, there’s no timeline for new tests for the new science standards, so there is no alternative to offering the old tests. Since API will be suspended in 2014, the scores on the science tests won’t count toward a school’s API score next year. However, I am assuming that individual student scores will be made available to parents, and the school will get results as well.

      • navigio 10 years ago

        Was the API officially suspended already? Or are you just expecting that to happen given the bill that gave leaders the option to do that at some point?

        • Doug McRae 10 years ago

          Navigio — The bill that gives the SBE authority to suspend the API in 2014 and 2015, AB 484, does not become law until Jan 1, 2014, so the SBE has to wait until 2014 before they can officially suspend the API. Also, I might note that the API is a portion of our federal AYP calculation, and hence suspension of the API for 2014 and/or 2015 also factors into the mix for the pending “double testing” waiver that presumably will be addressed between now and January 2014, though even if CA submits their waiver by the Nov 22, 2013, deadline, it is questionable when the feds will reply . . . . especially given the early Oct 2013 status of federal government responsiveness. It’s tough to respond quickly when nobody’s home in DC . . . . .

          • navigio 10 years ago

            Nobody’s ever home in DC..

            What you described aligns with my memory. John's wording made it sound like he knew something we didn't. Just trying to make sure.

      • Doug McRae 10 years ago

        John — your statement "it sounds crazy to continue testing on old standards while districts are supposed to begin teaching the new NGSS" assumes that the old standards are not aligned to the new standards . . . I haven't seen anything yet on the content alignment of the old 1997 CA science standards to the new NGSS standards, but it would not be surprising if the alignment analysis is pretty much the same for science as it was for E/LA and Math, where both state and national analysts concluded that the content and rigor of our old 1997 standards were relatively comparable to the content and rigor of the new standards, and that the differences between the two sets of standards are primarily how the material is expected to be taught rather than what is expected to be learned. This analysis was fairly widely accepted prior to the early Sept AB 484 amendment surprise designed to dump the STAR tests, and the SSPI presser rhetoric proclaiming vast differences between the old and new standards as the justification for dumping the old STAR tests immediately. It sounds like you are drinking the same Kool-Aid the SSPI was drinking in early Sept with your "sounds crazy" statement . . . . . a sounder perspective is to wait for some credible analysis on old vs new content and rigor before jumping to conclusions.

        • John Fensterwald 10 years ago

          Strictly coffee this morning, Doug, no Kool-Aid.

          From what I have read about the new science standards – particularly their emphasis on concepts and the cross-disciplinary approach to topics – they are very different from the California science standards and will be a heavier lift for the state's science teachers than the new Common Core standards will be for math teachers. I plan to look into the issue more closely before the State Board picks up the science standards again next month, but my understanding — and others with more knowledge can jump in — is that there is not close alignment.

          • Doug McRae 10 years ago

            John — You make my point when you talk about the differences between old and new science standards being emphasis on concepts and the cross-disciplinary approach to topics . . . . those are "how to be taught" factors, not "content to be learned" factors. And those will be much heavier lifts for science teachers, no question. But if one looks at the alignment between old and new standards in terms of content expected to be learned, and that is the perspective from which tests look at the standards they attempt to measure, then you find a substantial overlap for the content and rigor between the old and new E/LA and math standards, and I'm not gunna be surprised if the same comes from an analysis for old and new science standards. If so, since the old and new content standards have substantial alignment, and the tests are designed to measure those standards, the old tests will yield usable data to measure the new content standards. I'm not arguing to retain the old tests indefinitely, rather I'm saying the data from the old tests are usable for a rational transition from old to new tests with a new computerized protocol for statewide assessments.

            It’s easy to get caught up in the Reading Wars [phonics vs whole language] and Math Wars [basic skills and algorithms vs mathematical power] and Science Wars [basics/facts vs hands-on learning] that Peter Schrag wrote about 15+ years ago when the 1997 standards were being debated. Those are curriculum wars, not assessment wars. Assessments need to be above those wars, designed to measure expected content regardless of the curriculum and instruction used to teach the content standards. If we allow tests to be biased toward only one approach to teaching the standards, we are narrowing local district, local school, and classroom teacher options for choosing the best way to teach CA’s extremely diverse student body in the best way teachers deem possible to maximize individual student learning. That will be a narrowing of curriculum far more extensive than the narrowing of content [E/LA and Math emphasized more than other content areas] that folks complain about now.

            • navigio 10 years ago

              But haven't you said that the new tests (not science) are intended more to measure the how than the what?

            • Doug McRae 10 years ago

              Navigio: Yup — at the SBE meeting Sept 4, in a public comment I quoted Dan Koretz from Harvard as saying “The consortium tests don’t measure what students know . . . . they are designed to push instruction in a certain way.” I tend to agree with Koretz, and this is one of the reasons I’ve not been a fan of the federally funded consortium test development effort (either consortium) from the git go. Statewide assessment data will widely be interpreted as measuring what students know, and it seems to me pretty basic that therefore the tests should be designed to measure what students know rather than for some other purpose.

  3. Jennifer 10 years ago

    This is interesting. School districts are scrambling to get ready for the computerized Smarter Balanced test that will be used to check the validity of their future computer-based test. Crazy thing is educators do not have the resources necessary to teach the common core standards.
    (Shhh…. Don’t tell anyone you are not supposed to know this information.)
    Many teachers do not have the pedagogy developed to support the shift that needs to occur regarding common core implementation and we don’t have the curriculum, because it is in the process of being adopted as we Facebook.
    Just this week, my super star students took a math test that they put two hours into (several are still not done). Some of the issue is that their teacher, yours truly, who has been diligently researching and finding curriculum, has not taught all of the standards on the interim assessment, because I am trying to prepare my students to actually learn skill sets for the manipulation and understanding of numbers, not get them ready for the interim test.
    Our site has 36 computers in our computer lab (wahoo!!! – that was a long time coming) that about 500 students will need to use to take the Smarter Balanced test. Taking a test by pencil and paper vs. by computer is also a distinct skill set that needs to be developed. Imagine how many educational hours are going to be devoted to such a task.
    (Perhaps school should go until 5pm that way I can still get in art, PE, Science, Social Studies, and behavioral training. Yup, I said behavioral training).
    The common core standards were adopted in 2010. The tests were developed and will be implemented, in a pilot phase, this year 2014. The funny thing is we won't adopt math curriculum until next year 2015 and Language arts a year or two after that 2016-17. (Anyone see the flaw?)
    There has not been professional development in regards to best practices for classroom implementation of common core. Why? Because, “when we adopt our curriculum we will get it”. Ummm… 4-6 years AFTER the standards were adopted we will be officially given training. The sad prediction I am making is the training will be how to use the curriculum, and not how to support our understanding of the math skills by providing support for our pedagogical development. To develop a new approach to teaching and learning, one needs an opportunity to be exposed to the new approach.
    The unorganized implementation of this new educational journey is frustrating. I can't even get fed up with those in charge at our district office because they too are blindly wallowing through the muck created by our government.
    The crazy thing is the standards are exciting. The possibilities the CCSS open up are fantastic. And the fact that our standardized tests do not count this year is fantastic!!!!!!!! The stress relief from that info is replaced, sadly, by the fact that in order to truly do a good job educating my students, I now spend hours looking for resources on-line developed by states such as NC and UT.
    The title of this article is MOST districts. The title should be students and the adults who teach them do not have the necessary equipment for ALL students to succeed. 40% of the children who make up the classrooms in CA is a significant amount. People in charge have to stop worrying about how something looks and get real about what our children need to be successful, contributing citizens. Skills you don’t take a standardized test to measure.
    (Sunday soapbox COMPLETE).
    PEACE

  4. el 10 years ago

    “Teachers will see the kinds of questions that will be on the formal test.”

    Does this mean teachers will be allowed to see the items on this test, perhaps even take it themselves?

    Honestly, I’d like to see as a pilot this November that we install the test software and have all staff, all board members, every member of the legislature :-), and any interested parents sit down and take one of the tests. It’s going to be a wholly different experience than what we grew up with, and (a) we all need to understand what it is we’re asking the kids to do and (b) what better way to see what kinds of issues that will come up than with volunteer adults?

    Also: is there a standard time in school curriculum where we are teaching touch-typing?

    Replies

    • Pat Kaplan 10 years ago

      Totally agree!! Exactly what I'd like to see: the governor, Arne, the state board of education and all the local boards and supers taking that test!! The practice tests I've seen are totally biased toward kids familiar with keyboarding and computer skills. It's ridiculous to think our students are going to be ready for this test, much less the technology in schools such as in the Oakland district.

  5. Jerry Heverly 10 years ago

    Forgive me for these unrelated and somewhat extraneous questions:
    Could a district test math in one grade and English in another?
    Do we know yet if there will be tests for 9, 10?
    Test security for 12 weeks for a computerized test? Maybe we could get the 500,000 kids who hacked those LAUSD iPads to attempt a break in of the tests?
    I thought only districts with “capability” would do the CCSS in 2014. If paper and pencil is allowed aren’t we all capable?
    And re: LCFF. Our district now routinely refers to LCFF money as a “giant categorical”. They are paranoid about spending a dime of the money lest the county or state rebuke them retroactively for misallocation. Is this a common attitude around the state?

    Replies

    • Doug McRae 10 years ago

      Jerry: A few observations for some of your Q’s —

      Re tests for grades 9, 10 — my understanding is there will be a small sample of 9th & 10th graders taking SBAC field tests in 2014, fewer than the samples needed for grades 3-8 & 11. For new "common core" tests for grades 9, 10 in the future, AB 484 calls for an SSPI recommendation on what to do for non fed-required grades and content areas by March 2016, and presumably some additional tests will then be added to the statewide assessment program during the 2017 to 2020 period.

      Re needing 500,000 kids to hack into the new computerized tests, I’d observe probably that 500 to 1000 kids on hacking patrol will do the job . . . . no need paying for overkill, need to be efficient with our statewide testing dollars . . . . (grin).

      Re only districts with tech capability doing the tests in 2014, there won't be any paper/pencil SBAC field tests in 2014 so the only way to participate will be via computerized test administration. AB 484 itself says nothing about tech capability; rather it says districts "shall" participate in one (and only one) content area SBAC field test in 2014. But, at the SBE meeting 9/4, it was acknowledged verbally during CDE staff and SBE member exchanges that districts without tech capability wouldn't be expected to participate in SBAC field testing . . . . common sense trumps legislative language, in effect. But, exactly what constitutes insufficient tech capability to generate an excused absence from 2014 field testing is yet to be determined . . . . districts need to stay tuned on that aspect of 2014 testing.

  6. navigio 10 years ago

    The survey link seems to point to private content.

    SBAC has a bandwidth check tool. According to that tool I could give the math test to about 1250 students at one time using my 4G cell phone data connection.

    I think I know what I’d do if I were a district CTO.


  7. Doug McRae 10 years ago

    The post points out that the CDE's technology survey asked LEAs for current tech capability based on a 12-week test administration window, which is the window for the SBAC field test spring 2014, but that the test administration window for an operational test in 2015 is likely to be much shorter [with the 5-week window mentioned in the post being standard operating practice for statewide tests due to test security reasons]. Thus, schools will need more technology capability for a shorter operational test administration window in 2015. It also should be pointed out that the operational test in 2015 will involve two content areas (E/LA and Math) rather than only one content area per AB 484 for the SBAC field test in 2014, so that factor will also increase the technology required in 2015.
    The SBAC field test sampling design may request only one random grade per selected school in 2014, similar to the sampling spec for the SBAC pilot test in 2013, and that spec would reduce the technology load further for 2014 field testing . . . but, the SBAC sampling specs for 2014 haven't been made public yet (as far as I know) and the AB 484 provision would trump the SBAC sampling spec anyway. In any event, the tech preparedness data does indicate that a majority of CA schools do now have the hardware and bandwidth to respond to at least the SBAC field test request for participation, with the AB 484 provision for one content area for all schools within a 12-week window more of a dicey proposition for 2014. Physical technology demands for 2015 will be substantially higher with two content areas and a shorter test administration window.
    Finally, the CDE report gives very short shrift to the human capabilities needed, both teacher capabilities and student capabilities, to use the hardware and bandwidth needed to conduct computerized testing on a statewide basis. The only mention of this capability is a single line indicating survey respondents said "the greatest need (is for) professional development for staff related to technology." That reflects the need for increased teacher capabilities to use technology for testing (and, more importantly, for instruction) purposes; no mention whatsoever of the kid capacity to use the technology to generate valid/reliable E/LA and Math scores that are not compromised by lack of kid familiarity with the technology and the protocols needed to respond to the test questions. The bottom line is there are many levels of concern that need to be addressed for a meaningful discussion of technology preparedness for a new computer-adaptive statewide testing program.

    Replies

    • SoCal Teacher 10 years ago

      Just because someone down at the district office claims their district is ready…

      • Manuel 10 years ago

        It’s no different than when they claim that the ratio of students to teachers in a classroom is 25…

  8. SoCal Teacher 10 years ago

    Our computer lab doesn’t have enough computers to test a full class. The computers themselves are over 10 years old. Keys are missing from the keyboards. If the wind shifts in the wrong direction, the wireless goes down. Keep your fingers crossed that the district servers don’t go down. Sure, we are sooooo ready for testing.

  9. Paul Muench 10 years ago

    Our district has already warned parents to expect test scores to be lower with the Smarter Balanced tests. So even if the technology is ready the clear message is that students will not be ready.

    Replies

    • John Fensterwald 10 years ago

      That's probably the gist of it, Paul. It's certainly useful to start telling parents now that the scores on the Smarter Balanced assessment should not be compared with the CST results of the past, because they reflect different standards. Might also be useful to remind parents that initial CST scores were low after they were introduced.
      What's worrisome is that some students won't have the keyboarding skills for a computer-based test, which requires manipulating figures with a mouse and typing written answers, to show what they actually know.

      • Paul Muench 10 years ago

        Districts are still talking as if students will get a summary result similar to the CST (Basic/Proficient/Advanced), probably because the districts don't know how the results will be named. If it's not already the case, then maybe this is a good time to come up with a different naming scheme for the summaries. It might help to ward off any confusion.

        • Doug McRae 10 years ago

          Paul: My understanding is that districts/schools will not get a summary result (like basic, proficient, advanced) from 2014 field testing. And, for 2015, my understanding is that cut scores for performance levels will not be determined until several months after consortium-wide test administration is complete, perhaps in September 2015. That means districts/schools/teachers/parents/students will not have interpretable results from spring 2015 testing until sometime in the fall of 2015. Same timing (fall 2015) for any aggregate and disaggregate assessment data, and any contribution to accountability calculations like federal AYP and CA API. This delayed availability of meaningful scores from spring 2015 test administration is why I've questioned whether it is fair to call SBAC 2015 an "operational" test. It won't be "operational" until it can be scored and turned into interpretable data, which will be fall 2015 at best . . . . If anyone with a better understanding of SBAC plans for setting cut scores has conflicting information, please let us all know.

          • Paul Muench 10 years ago

            I took our district's comment to mean the "lower" summary scores would last for some number of years. Otherwise parents will not even see the scores. We didn't get any predictions on why this would happen or how long it would last. But with any system of testing I can imagine it takes time for teachers and students to adapt to its prompts. And then we can see how students fare in terms of the new definition of proficiency or whatever it is called. I'm sure an expert could tell a much more detailed story than I just did 🙂

            • Doug McRae 10 years ago

              Paul: Oh, I misread your comment. Re whether scores will go up or down from old tests (STAR CSTs) to new consortium tests, we’ll never know for sure because we won’t have equivalency checks from the old cut scores to the new cut scores . . . . The SSPI recommendations to the legislature Jan 2013 included having these equivalency checks (Rec # 11) but when the legislature amended AB 484 in early Sept it not only excluded these checks but banned local districts from doing such checks. Equivalency checks from old to new tests are standard operating good large scale assessment practice, needed to explain the new test results, and I haven’t heard any reasonable explanation for why the equivalency checks were excluded from AB 484 — the only explanation has been the SSPI presser rhetoric saying CA needs a “clean break” from old to new testing systems. My view has been that CA’s new common core tests should yield results roughly equivalent to the old STAR CSTs, since the content and rigor of the two sets of standards (CA’s 1997 standards and new common core standards) being measured heavily overlap. This result is what Massachusetts released several weeks ago based on their 2013 MCAS results. For CA, the eventual SBAC results will reflect not only the E/LA and Math content being measured, but also the change to new types of items and new computer-adaptive test administration formats, and these new elements may well totally confound any attempts to interpret changes in E/LA and Math achievement from old tests to new tests. So, while CA will eventually see new test results that look similar to the old CST results (i.e., %’s basic, proficient, advanced), we will have no way of saying whether the new results are in actuality lower, about equal, or higher than the old test results.

            • navigio 10 years ago

              To be honest, I don’t think we even have that now. No one can explain why the results were what they were this year. In fact, different leaders are using the same excuse to ‘explain’ opposite moves. The way things are going, my expectation is that state leaders will look at field results and decide not to release anything at all. Not only are tests and standards different but the sample size will be reduced by up to 75% if I read correctly.

              Regarding comparability with Mass. (getting similar results across varying variables), my impression was that you thought we needed to change something about our tests in order to achieve that. Have you changed your mind?

            • Doug McRae 10 years ago

              Navigio: We never have universal agreement on explanations for why test results are what they are; test results are just numbers, and different folks will always have different views on why the numbers are what they are. Re field test results, especially the 2014 SBAC field test that is primarily an item-tryout exercise to generate a sufficient number of empirically qualified items to populate an adequate computer-adaptive item bank, I totally agree that neither individual student nor aggregate results should be released; those results are techy stuff for test developers to determine whether individual items are qualified for a large scale assessment program, and kids/schools will be taking different sets of items (probably only about 50 items of the overall 20,000 items included in the field test for any given kid) with no comparability at all for item sets being given to other kids. Also, any total scores for a kid from the field test will not be meaningful, and no cut scores to translate into basic, proficient, advanced will be available for the 2014 field test exercise. Comparability across kids, schools, districts, subgroups, etc., as well as meaningful interpretations of total scores, will have to await the 2015 administration of SBAC computer-adaptive tests, and as explained previously those elements of test development won't be available for use until fall 2015.

              Re your question about my previous comments on changing something to achieve comparability, yes I submitted a proposal to the legislature on how to transition from STAR CSTs to new common core consortium tests over a multi-year period (I suggested 4-years) to make the changes bit-by-bit, first changing to computer-based fixed form tests, then adding computer-enhanced items, and finally going to complete computer-adaptive tests, with the transition designed to generate reasonable comparability checks for each stage of the transition. [I recall John posted a link to this proposal in the comment section of a previous post on statewide assessment issues.] But that proposal did not fly in legislative circles, and instead we had the end-of-session surprise the first week in September to flush STAR CSTs almost entirely for the 2014 testing cycle and instead go “all in” for the SBAC computer-adaptive tests in 2015, a plan essentially to dive into the deep end of the pool with no swimming lessons on either the common core itself or the new elements proposed for the test administration protocols . . . . that was a very dubious thing to do, in my opinion [I’m trying to keep my language suitable for this public forum . . . . . .].

            • Manuel 10 years ago

              I have to say I feel Doug’s pain.

              The implementation of SBAC is being done for political reasons and not because it follows a proper testing strategy. But that's the world we live in, no?

              It may come to you all as a surprise that I say there is a right way to do testing. The fact of the matter is that I don’t have a problem with testing. I have a problem on how it is implemented and then how it is sold to the public.

              I was once told by someone in the know that California did not fare well in the Stanford 9 and the CAT 6 because the norm population was very different than California’s and that is why the CSTs were developed. There has been much back-and-forth from my part claiming that the CSTs are set in stone and we should not expect much variability given what we know of the black box they are.

              Whether this is true or not is now moot because the SBAC tests represent a new world we know nothing about and are jumping into the deep end of the pool, as Doug puts it, and without even floaties, I might add.

              I think that the best we in California can do is simply wait it out. Our schools will continue to operate as they have always done, producing enough qualified students that fill all the seats available at CSU and UC. It won’t be the end of the world if the feds take away some of their funding because we don’t comply with a law that has not even been reauthorized. What are they going to do? Repo our schools? Prevent California graduates from being admitted to schools in other states?

              The only ones who are not happy about this are those whose livelihood and/or derived power depends on this relentless “accountability” binge they’ve been on for more than 13 years. Enough already. Let communities take care of their own problems without the state or the feds breathing down their necks while muttering “tests with rigor, tests with rigor…”

              For all we know SBAC might turn into a bust…