PARCC re ducks

Many textbooks and other materials are lightly edited and rebranded by their creators as Common Core aligned, but because there is no central ministry of education, as in Singapore, that reviews materials and issues an official government seal of approval, anyone can make such claims with impunity.  Some education departments are making their own determinations: NYC chose “Houghton Mifflin Harcourt's ‘Go Math’ program for elementary students, and Pearson's ‘Connected Math Program 3’ for the middle grades”, while Louisiana last year rejected “every math and reading textbook submitted by publishers”.

The precise wording of the 93-page Common Core State Standards for Mathematics notwithstanding, a lack of consistency in interim assessments, independently developed and posed to students in states such as Kentucky, New York, Illinois and North Carolina, raises the question of whether these test questions accurately reflect the Standards and manifest Common Core’s intent.  But no matter: states are barreling ahead with no independent oversight.

Carol Burris, a principal at a high school on New York’s Long Island whose essays are often published in the Washington Post blog The Answer Sheet, recently critiqued several sample questions from a math test for first graders.  Lest we ourselves become completely overwhelmed by the myriad Common Core offerings that run the gamut, we declined to pass specific comment on those independently written questions, and instead continue to focus on states’ sample and actual assessments and, to date, the sample-only questions designed by the two “official” consortia, SBAC and PARCC.

This preamble brings us to PARCC’s latest batch of sample items, twelve in total, released in early November, for Grades 3-6 (nothing new for Grades 7 or 8) and high school.  Fasten your seat belt…


Computer-based assessments

Our ongoing review of Common Core sample tasks from the SBAC and PARCC consortia and the State of Illinois, as well as questions that New York this year actually posed to students in the interim period before consortia assessments take over, has exposed issues with standards alignment, poor wording, incorrect mathematics, and odd interfaces, but no issue stands out more than this: none of the SBAC or PARCC extended tasks yet takes advantage of technology’s capabilities in a way that justifies the transition to computer-based assessments.

Jason Becker, on his blog, characterized it this way:
[The SBAC tasks] represent the worst of computerized assessment. Rather than demonstrating more authentic and complex tasks, they present convoluted scenarios and even more convoluted input methods. Rather than present multimedia in a way that is authentic to the tasks, we see heavy language describing how to input what amounts to multiple choice or fill-in the blank answers. What I see here is not worth the investment in time and equipment that states are being asked to make, and it is hardly a ‘next generation’ set of items that will allow us to attain more accurate measures of achievement.


Illinois joins the procession

The Chicago Tribune on Halloween 2013 reported “Illinois grade school test scores plunge — especially in poor communities”.  Following on the heels of New York’s testing debacle, in which performance gaps widened as well, we’re beginning to detect a pattern.  

The Illinois State Board of Education, on its student assessment page, reports that it had to “adjust the performance levels on the ISAT for Reading and Mathematics to better align with the more rigorous standards of the Common Core”.  A separate document entitled “2013 ISAT Mathematics Assessment”, dated December 5, 2012, and authored by “Rachel Jachino, ISBE Mathematics Principal Consultant”, states that “[a]pproximately twenty percent (20%) of the operational items on the Reading and Mathematics ISATs were written to Common Core Standards and will be included as part of students’ scores/results for the 2013 ISAT.”  It remains unclear, however, what portion of the drop in 2013 scores is attributable to the fact that the “state increased the scores required to pass ISAT math and reading tests by 13 to 30 points, depending on the test and grade” (Tribune), and what portion to an actual change in test content.

Is any improvement in store for this year?  Cut scores are out of teachers’ hands, but instruction is not: we’re well into the 2013-2014 school year, and the 2014 ISAT Mathematics “Roadmap” (pdf), intended to guide teachers in their instruction, has yet to be released.

The ISBE recently removed sample questions for the 2013 tests on the premise that the “sample items displayed were not necessarily representative of the material that will appear on the 2014 ISAT”, which, incidentally, will be a one-year deal, because Illinois will “replace the ISAT with the Partnership for Assessment of Readiness for College and Careers (PARCC) assessments during the 2014-2015 school year.”

It’s a foggy road ahead indeed for Illinois teachers and their students.


Fractions are numbers, too – Part 3

The National Assessment of Educational Progress (“The Nation’s Report Card”) in 2007 posed the following question:

Sidestepping for now the issue that only half of eighth graders, calculator allowed, correctly solved an elementary school problem, we find the real intrigue in the distribution of wrong answers:


A stroll through the PARCC

While the Smarter Balanced Assessment Consortium has already administered a pilot exam to more than one million students, the other state consortium, the Partnership for Assessment of Readiness for College and Careers, seems well behind in what is becoming a consortia war, and has already had at least one state jump ship.

Adding to our sense that PARCC is in disarray, information releases from the consortium are few and far between and have decreased in frequency this year.  Nevertheless, with little fanfare, PARCC on August 19 released a new series of item prototypes for public review.  Two companies, ETS and Pearson, have contracted with PARCC “to develop the items and tasks for the PARCC mid-year, performance-based and end-of-year assessments”, but PARCC does not make clear the source of its latest batch of questions.


NYSED's released 2013 exam questions

The New York State Education Department clearly tried to play it safe in designing questions for the Grades 3 to 8 exams, with a lot of calculation and solve-the-equation problems, which would have been the same pre- or post-Common Core.  Where NYSED gets into trouble is when it tries to be creative, particularly in word problems.

So without further ado, here’s a look at some of the released questions from the 2013 New York State Testing Program’s Common Core Mathematics Tests.


SBAC practice tests run-through – Part 1

At the end of May 2013, following its first pilot test for students, the Smarter Balanced Assessment Consortium supplemented its previous releases of sample Common Core assessment questions with new batches for grades 3 through 8 and grade 11, in the form of online practice exams.

SBAC has since tweeted (read: crowed) unceasingly about these practice exams...
...and retweeted as well anyone else who’s mentioned the exams...
...in a positive light.


Good luck, New York's children

With mathematics exams to be administered starting next week, New York is jumping the Common Core gun by offering its own version of Common Core-aligned assessments for grades 3-8.  Sample questions have been provided on the New York State Education Department’s web site.

Here’s a quick look at some of those questions, and our Roger Ebert-style final evaluation for each:

Grade 3


Godzilla vs. Consortia

We previously examined in detail the content of a sample question coming out of Smarter Balanced Assessment Consortium, and it would be only fair to have a look at one of PARCC’s high school questions as well.  At the high school level, PARCC to date has played it safe by only releasing three official questions, all of Type I, its most basic level, which in the following example requires students to perform a mathematical procedure and enter numbers into empty boxes:
To work out this math problem, students must still resort to rewriting the entire equation on a piece of paper, with the very real possibility of a transcription error.  So exactly what advantage for students has been gained by moving such assessments onto a computer?


The Life of Pi

Beware the ides of March and a Happy Saint Patrick’s Day t'ya, but as winter snows abate and the vernal equinox comes nigh, a young mathematics teacher’s fancy turns to thoughts of...π.

Spreading like the “Harlem Shake” meme, it seems every classroom across the land now observes an in-school holiday called Pi Day.  For the uninitiated, Pi Day is cleverly celebrated on March 14, or 3/14.  A quick trip over to Pinterest regales you with endless examples of the hackneyed puns, cartoons, song lyrics, decimal expansions, and other non-activities that grace school walls to mark this annular (HA!) event.


Fractions are numbers, too – Part 2

We have a lot to say about CCSSI’s treatment of fractions, which starts tentatively with 1.G.3, but we’ll initially home in on Grade 3, which is where Common Core begins its big push.  We’ll discuss Common Core’s sequence, compare or contrast it with our own preferences for how fraction concepts should be introduced, and, where we differ, provide a (hopefully justified) rationale for our choices.

3.NF.1 states, “Understand a fraction 1/b as the quantity formed by 1 part when a whole is partitioned into b equal parts; understand a fraction a/b as the quantity formed by a parts of size 1/b.”
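To make the standard’s two clauses concrete, here is our own illustration (with b = 4 and a = 3; the numbers are ours, not an example drawn from the Standards):

```latex
% Partition a whole into b = 4 equal parts; each part is the unit fraction 1/4.
% Then a/b with a = 3 is the quantity formed by three such parts:
\frac{3}{4} \;=\; \underbrace{\frac{1}{4} + \frac{1}{4} + \frac{1}{4}}_{a\,=\,3 \text{ parts of size } \frac{1}{b}}
```

Note how the standard builds a/b from the unit fraction 1/b, rather than treating a/b as a division or a ratio; we will return to the consequences of that choice below.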


Fractions are numbers, too – Part 1

We hold these truths to be self-evident, that all numbers are created equal...
(Well, Abraham Lincoln or Thomas Jefferson could have written this.) 
On February 8–9, 2013, while much of the northeastern US was getting socked with a blizzard, a symposium was held at Educational Testing Service headquarters in Princeton.  The meeting between ETS and the National Urban League was entitled "Taking Action: Navigating the Common Core State Standards and Assessments," and its purpose was to “discuss [the] impact of Common Core State Standards on underserved communities” and “consider strategies to succeed with the new standards and assessments.”

We stumbled across the live-Twitter feed by accident, but immediately recognized the meeting's significance, as David Coleman, Joe Willhoft, and Doug Sovde, three Common Core “biggies”, were all featured speakers.  For them, it offered an opportunity to “sell” CCSSI to important community groups: in addition to the NUL, representatives of the NAACP, NCLR and SEARAC were also in attendance.


Which Super Bowl is this, again, dear?

Another pop quiz!

What do the Queen of England, the Super Bowl and the United States Constitution have in common?


Consorting with consortia

Common Core sets the standards, and two consortia, PARCC and SBAC, will write and grade the tests that assess whether those standards have been attained.  It’s a very cozy, profitable arrangement in which the heaviest burdens and risks fall on educators and students.

UCLA’s National Center for Research on Evaluation, Standards, and Student Testing (CRESST) this month published a report called On the Road to Assessing Deeper Learning: The Status of Smarter Balanced and PARCC Assessment Consortia.

The report gives its imprimatur to ongoing progress (what did you honestly expect?) cloaked in the usual hedged language:
Study results indicate that PARCC and Smarter Balanced summative assessments are likely to represent important goals for deeper learning, particularly those related to mastering and being able to apply core academic content and cognitive strategies related to complex thinking, communication, and problem solving.
Any challenges in implementation, the report foresees, will not be substantive but rather “technical, fiscal, and political”.

The CRESST report sounds one note of caution: “Absent strong representation in the new assessments, students’ deeper learning likely will be compromised.”  Therein lies the rub: will the “assessments call for deeper learning and reflect 21st century competencies”?  (CRESST report, p. 5)

As we at ccssimath.blogspot.com are also interested in the assessments being developed by PARCC and SBAC, we thought we’d follow CRESST’s lead and release our own status report.


Graphs and data analysis – Part 2

What happens to 12 pizza pies at a school party?  POOF, before you know it, they’ve disappeared.

We begin, as we often do, with questions from the National Assessment of Educational Progress (“The Nation’s Report Card”).  In 2005, students were asked to study a pie chart (alternately called a circle graph):


Graphs and data analysis – Part 1

In 2007, the National Assessment of Educational Progress (also known as “The Nation’s Report Card”) gave the following task to 4th graders:

This question demonstrates that even at the elementary school level, it is possible to usefully integrate several aspects of problem posing: that the problem be lengthy, that it bring together multiple math skills, and that the path to the solution not be readily apparent.  (We have previously coined the phrase “length, connectivity and dimensionality” to describe this triumvirate of features.)

As our regular readers may surmise, we wouldn’t lead off with this question if there weren’t something more to it.  The specific task the NAEP required students to perform barely scratches the surface of the issues and learning possibilities contained in this fact pattern.