Alaska Succumbs

I used to think of Alaska as one of the hero holdouts, because that state, along with Texas, Virginia, and others, once flatly rejected Common Core. I remember reading, with a mixture of awe and envy, how Alaska had opted out of the standards project in June 2009.

An Alaska Dept. of Ed spokesman, Eric Fry, once explained in a Heartland.org article that “We wanted to formulate our own plan… [Alaska] would like to be the entity that declares its own standards.” http://news.heartland.org/newspaper-article/2010/03/25/alaska-texas-reject-common-core-standards

That was then. This is now.

See the Stop Common Core Alaska Facebook page: https://www.facebook.com/StopCommonCoreAK

Alaska has now succumbed to the federal pressure and has officially and quite enthusiastically jumped into the nationalized education control trap.

Alaska will no longer be “the entity that declares its own standards.”

How did it happen? Well, Alaska decided to join the Common Core testing group called the Smarter Balanced Assessment Consortium (SBAC).

(See sbac_april2013.pdf.)

Membership in SBAC demands that Alaska obey the decisions made by the other “governing” and “lead” states of the SBAC.

Of course, there was no vote by the Alaska legislature to decide to join Common Core. It’s an underhanded business, education reform. And what does it mean?

If you read the “Cooperative Agreement” between the SBAC and the Dept. of Ed, you will learn that despite the 10th Amendment to the U.S. Constitution and GEPA law, the SBAC members have agreed to obey every whim of the Department of Education and must:

“Provide updated, detailed work plans and budgets for all major activities identified in the recipient’s application, including but not limited to:
• development, quality control, use and validation of artificial intelligence for scoring;
• selection of a uniform growth model consistent with test purpose, structure, and intended uses;
• development of performance tasks (addressing items such as technical challenges of scoring, reliability, and large-scale administration of performance-based items);
• development of a research and evaluation agenda (addressing items such as validity, reliability, and fairness);
• development and delivery of the technology platform for assessment.
3) Actively participate in any meetings and telephone conferences with ED staff to discuss (a) progress of the project, (b) potential dissemination of resulting non-proprietary products and lessons learned, (c) plans for subsequent years of the project, and (d) other relevant information, including applicable technical assistance activities conducted or facilitated by ED or its designees, including periodic expert reviews, and collaboration with the other RTTA recipient.
4) Be responsive to requests from ED for information about the status of the project, project implementation and updated plans, outcomes, any problems anticipated or encountered, and future plans for the assessment system, including by providing such information in writing when requested.
5) Comply with, and where applicable coordinate with the ED staff to fulfill, the program requirements established in the RTTA Notice Inviting Applications and the conditions on the grant award, as well as to this agreement, including, but not limited to working with the Department to develop a strategy to make student-level data that results from the assessment system available on an ongoing basis…” (page 3, Cooperative Agreement.)

But citizens of Alaska are speaking out.

An Alaska economist, Dr. Barbara Haney, put together the following list of questions:

1) What elected officials were involved in the process to opt into SBAC?

1a) Upon what authority did the state of Alaska put our state’s education system under the authority of the state of Washington and the SBAC consortium? Doesn’t this violate the Alaska Constitution?

1b) Isn’t SBAC an example of an Agenda 21-style regional board? In fact, isn’t this Agenda 21?

2) Isn’t it true that the real reason the State of Alaska (SOA) entered into agreement with SBAC is to get the Race to the Top (RTTT) money and the No Child Left Behind (NCLB) waiver? How much money exactly are we getting from RTTT? To whom will those funds be disbursed?

3) The Race to the Top grant defines “college- and career-ready” as follows:
According to the USDOE “College- and career-ready standards: Content standards for kindergarten through 12th grade that build towards college- and career-ready graduation requirements (as defined in this document) by the time of high school graduation. A State’s college- and career-ready standards must be either (1) standards that are common to a significant number of States; or (2) standards that are approved by a State network of institutions of higher education, which must certify that students who meet the standards will not need remedial course work at the postsecondary level.”
http://www.ed.gov/race-top/district-competition/definitions
In other words, if you adopt the Common Core standards, you have college- and career-ready standards.

How do these new standards meet the needs of Alaska’s employers? (Specific references, specific industries, not platitudes.) What career codes in Alaska’s economy are these standards keyed to? How does the SBAC test demonstrate this to Alaskan employers? How do these standards fit in with the manpower forecasts from the Alaska Department of Labor (AKDOL)?

4) “Smarter Balanced is grounded in the notion that putting good information about student performance in the hands of teachers can have a profound impact on instruction and—as a result—on student learning.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html
Isn’t this teaching to the test?

Further, if that is so, then how will Alaska students perform well on the Common Core tests if they are not using the Common Core curriculum?

Isn’t this just the state’s way of bullying local districts into adopting the Common Core curriculum?

5) Another statement was made by SBAC to the State of Missouri on May 14, 2013: “This spring we are pilot testing the first 5,000 items and tasks we have developed with about a million students, engaging more than 5,200 schools drawn from all 21 of our governing states. The pilot test also serves as a beta test for our test delivery software. In addition to testing out our items, performance tasks, and software, the pilot test also gives us an opportunity to evaluate a variety of accessibility features for students with disabilities and English language learners.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html

Why is the state of Alaska not looking at established tests like ITBS and the ACT? Why are we using a test that doesn’t exist yet? Why are we using an experimental test?

How can SOA even argue that this is a test superior to other tests when the test hasn’t even been used anywhere?

Why was this test selected rather than ASPIRE, the ITBS, or Alaska’s past NCLB test? Since that test was written for Alaska, why couldn’t we continue to use it?

6) When SBAC was asked about its own cost structure on May 14, 2013, it stated:
“One element dominates the cost: approximately 70 percent of the vendor cost for summative assessments is tied to hand-scoring. Measuring the deeper learning required by the Common Core requires that students write extensively and much of that writing cannot yet be scored by technology. Paying teachers, faculty, and other content experts to score student responses is costly, but it is currently the only effective way to measure important elements of the Common Core.”

a) Will Alaska teachers be employed to grade Alaskan students?
b) Isn’t this essentially what the original Alaska test was before we went to SBA testing? Didn’t we leave SBA testing due to this cost and the allegedly capricious nature of the grading system?
c) How, then, is the SBAC writing test actually cheaper than the Digitcorp writing test?
Isn’t it true that the SOA adopted this for the NCLB waiver and not because it is a superior test?
How does this test then become a superior instrument for evaluating student success?

7) In the area of English Language Arts (ELA), Smarter Balanced places these capabilities within its claims for both writing and for speaking and listening. In rural village schools, some English-speaking conventions are radically different from those in road-system communities. There is no way to avoid the obvious outcome that this test could discriminate against certain ethnic groups.
Has there been any effort to prepare these schools for the speaking component? Further, given that Hanley’s office indicates these schools will likely have a paper-and-pencil version of the test, how will the speaking component be evaluated?

8) SBAC funding ends in Sept. 2014. In its comments to the State of Missouri on May 14, 2013, SBAC stated:
“At the conclusion of the federal grant, Smarter Balanced will transition to being an operational assessment system supported by its member states. The consortium does not plan to seek additional funds from the U.S. Department of Education.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html

How much of its own funds will Alaska be expected to commit in the future? How does this break out on a per-pupil basis? (Vermont was told it would be $300 per student for the test alone.) Where will this money come from?

Why did the state subject its residents to what amounts to a new taxing authority?

Given Governor Parnell’s commitment to SB21 (now signed) and the short-term revenue fall, where will the revenue come from in 2014 to pay for SBAC?

9) A Pioneer Institute study on implementation shows a staggering acceleration in the costs of SBAC. On average, the costs are four times the amount provided by the RTTT grant monies.
http://pioneerinstitute.org/education/study-estimates-cost-of-transition-to-national-education-standards-at-16-billion/

Will borough governments be expected to pay a share to SBAC? If so, have borough governments been informed for budgetary purposes?
How much will property taxes have to increase to meet these costs?

10) A CRESST study of SBAC and PARCC by UCLA and the CA Board of Regents, dated May 2013 (http://www.cse.ucla.edu/products/reports/R823.pdf), states on page 9, second column:
“Smarter Balanced plans to refine its specifications as it develops items and tasks, a contract for item development has been established, and item and task development are currently underway, as is a contract for specifying the test blueprint (see http://www.smarterbalanced.org/smarter-balancedassessments/ for the preliminary blueprints).”

Why did the state of Alaska sign on to a test that is not yet written or tested? When there are clearly other tests available that are cheaper (by SBAC’s own admission) and comparable (according to Washington State’s own Washington Policy Center), why are we going with this far more expensive assessment?

11) The CRESST Report by UCLA on page 10 states, “However, collaboration may be incorporated into Smarter Balanced performance tasks, and metacognition may well be required in solving the complex, extended problems that both consortia plan as part of their performance task components.”

The use of group answers is a radical departure for Alaska state testing. How will group answers be used in scoring individual students? Will Alaska students be denied a diploma because they did not pass a group answer? Has the use of group answers been vetted against national testing norms? How will group answers be received by parents? Why does the SOA DOE consider group answers a superior measure of student performance over traditional methods of assessing individual students?

12) The CRESST study further states on page 18 (http://www.cse.ucla.edu/products/reports/R823.pdf):
“Both consortia have been optimistic about the promise of automated constructed-response and performance task scoring and have incorporated that optimism into their cost estimates for the summative assessment. Both are estimating summative testing costs at roughly $20 per student for both subject areas. In the absence of promised breakthroughs, those costs will escalate, there will be enormous demands on teachers and/or others for human scoring, and the feasibility of timely assessment results may be compromised.”

(My note: “optimistic” is the academic way of saying full of excrement…) How will these escalating costs be met by the state of Alaska, particularly given that the full results of SB21 may not be realized?
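A back-of-envelope check, using only the figures SBAC itself supplied above (my own arithmetic, not an official estimate): if hand-scoring continues to account for anything like 70 percent of summative vendor cost, then of the estimated $20 per student, roughly $14 (0.70 × $20) rests on scoring work that the promised automation has not yet replaced. Any shortfall in those breakthroughs puts the bulk of the per-student cost, not a sliver of it, in line to escalate.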

13) Continuing on page 17 (http://www.cse.ucla.edu/products/reports/R823.pdf), the study states:
“In addition to costs, extended performance tasks also offer a challenge in assuring the comparability of scores from one year to the next. Without comparable or equitable assessments from one year to the next, states’ ability to monitor trends and evaluate performance may be compromised.”

What this is saying is that this year’s scores cannot be compared to last year’s scores (of course, there is no test yet, either). So if there is no ability to make time-series comparisons, how can you tell whether a school is doing better or worse over time? This is a radical departure from past assessments used by the SOA, where there has been some degree of comparability over time. How can a school then look at last year’s results and this year’s results to measure improvement?

14) Continuing on page 19, the CRESST study (http://www.cse.ucla.edu/products/reports/R823.pdf) specifically states that SBAC is going against the grain of deeper-learning assessment in its methodology:

“For example, Smarter Balanced content specifications include a relatively large number of assessment targets for each grade—on average 29 targets in mathematics and 35 targets in ELA. The claims, in contrast, reflect a reasonable number of major learning goals and represent the broad competencies that students need for college and career readiness. History suggests that focusing on discrete, individual standards is not the way to develop deeper learning, yet this is the strategy that states, districts, schools, and teachers have typically followed.”

Why is the State of Alaska then using an assessment of “deeper learning” that is designed in a way that history has shown will not reflect that deeper learning? Further, how will the curriculum used in schools reflect the acquisition of this deeper learning?

15) The CRESST Study on page 19 states, “Smarter Balanced has been very transparent in posting all of its plans and the results of its contracts. Yet, because its computer adaptive testing approach essentially individualizes test items to every student, it may be difficult to ascertain how well deeper learning is represented for every student or overall. The test blueprint will provide rules for item selection and presumably, those rules will include those for representing higher levels of depth of knowledge, but this is yet to be seen.”

If test questions are not the same for each student, then how can results be compared across students? Further, since the adaptive technology for the test does not yet exist, why is the state investing in it? Doesn’t this represent a radical departure from the traditional type of test given in the SOA? Why does the state want to engage in this experimental test over other, proven testing methods?
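To make the comparability question concrete, here is a minimal sketch of how a computer-adaptive test individualizes items. It is a generic illustration under a simple one-parameter (Rasch) IRT model, emphatically not Smarter Balanced’s actual algorithm; the item bank, the ability-update rule, and the fixed ten-item length are all simplifying assumptions of mine.

```python
import math
import random

# A generic sketch of computer-adaptive item selection under a
# one-parameter (Rasch) IRT model. This is NOT Smarter Balanced's
# actual algorithm: the item bank, the ability update, and the
# fixed 10-item length are simplifying assumptions for illustration.

def prob_correct(ability, difficulty):
    """Rasch model: chance a student at `ability` answers an item
    of `difficulty` correctly."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def next_item(bank, administered, ability):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate (the most informative item under the Rasch model)."""
    unused = [d for d in bank if d not in administered]
    return min(unused, key=lambda d: abs(d - ability))

def adaptive_test(bank, true_ability, n_items=10, rng=None):
    rng = rng or random.Random(0)
    ability = 0.0               # every student starts at the mean
    administered = []
    for _ in range(n_items):
        item = next_item(bank, administered, ability)
        administered.append(item)
        answered_right = rng.random() < prob_correct(true_ability, item)
        # Gradient step on the Rasch log-likelihood: nudge the estimate
        # by the residual (real systems use ML or Bayesian updates).
        ability += 0.5 * ((1.0 if answered_right else 0.0)
                          - prob_correct(ability, item))
    return ability, administered

if __name__ == "__main__":
    bank = [d / 4.0 for d in range(-12, 13)]  # difficulties -3.0 .. +3.0
    for true_ability in (-1.5, 0.0, 1.5):
        est, items = adaptive_test(bank, true_ability,
                                   rng=random.Random(int(true_ability * 10)))
        print(f"true ability {true_ability:+.1f} -> estimate {est:+.2f}; "
              f"items administered: {items}")
```

Run as written, the three hypothetical students each see a different sequence of items, and any comparison between them happens on the estimated ability scale rather than on raw counts of correct answers. That is precisely the design choice the CRESST quote above flags as “yet to be seen” in the blueprint rules.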

16) Many of the state’s schools do not have the equipment to offer this test online. Who will pay the cost of upgrading the schools’ computer lines? Software? The purchase of additional computers?

In sum…
The test hasn’t been field-tested, validated, or normed. The test will not offer a result that is comparable from one year to the next for a given institution. The adaptive technology isn’t available yet. Many of the districts in Alaska do not have the technology to offer this test. The consortium is out of money in Sept. 2014. The test is using a strategy that history has shown will not reflect the sort of knowledge it claims to test (deeper learning). The $20.00-per-test estimate is considered overly optimistic, and costs are expected to escalate. In contrast, there are instruments that have already been validated and that have a known cost. Further, as the study states on page 18, “… while built-in accommodations may be easier to accomplish, there will still be the validity challenge of establishing the comparability of accommodated and non-accommodated versions of the test.”

17) Further, if the state is not using the Common Core curriculum, then why are we using an assessment that reflects the Common Core curriculum?

Great questions. Thank you, Dr. Haney.

Good luck, Alaska.

5 responses to “Alaska Succumbs”


  1. I fear for the day that Nebraska goes this route, as the board of education stated that they are afraid of losing federal money. But they did say that before anything is decided (which isn’t even near up for discussion yet, thank goodness) they would have an open discussion for the public to voice their opinions. I hope they listen, and not to the money.

  2. Thank you very much. The questions you post look eerily familiar. Thank you for all your hard work in fighting in Utah! You have been an inspiration for us. During your battle down in Utah, I watched with great interest in order to help give guidance to other states. I never dreamed I would be fighting this in my own state, and in my own back yard.

    Please stop by http://www.stopalaskacommoncore.com and like us on Facebook. We need all the help we can get.

  3. Reblogged this on The Arctic Conservative and commented:
    The State of Alaska recently entered into an agreement with SBAC. Those in Utah published questions posed by an economist on this matter.

  4. Pingback: An Open Letter to Governor Parnell Requesting Withdrawal from SBAC | Restoring Liberty

  5. Pingback: An Open Letter to Governor Parnell Requesting Withdrawal from SBAC – COMMON CORE
