Wednesday, March 19, 2014

Grant Wiggins Actually Kind of Nailed the CC in 2011

Grant Wiggins:

I know, this is a little dry – but it matters greatly. The poor quality of local assessment and the mismatch with state tests is explained by it. The locally-designed items/questions/tasks used are often too low level and not valid measures of the goals in question. (This has been shown for decades using Bloom’s Taxonomy). So, students and teachers are often shocked when test scores come back; ironically, state tests are much harder than typical school tests. This issue can only be solved by clarity about the performance demands stated and implied in the Standards – typically via verbs and adverbs – as well as by sample valid (and invalid) performance indicators and performance tasks being added to the Standards.

So, what do we find in the Common Core? Not much help at all: no glossary or discussion of why those verbs were chosen; and we see inconsistency in how the verbs are used across grade levels. And zero help on performance standards from the math group. ...

And that’s the point: rhetoric seems to be driving the work, not intellectual clarity. This lack of attention to clarity and precision completely undermines the idea of Standards. You can bet dollars to doughnuts that some well-intentioned local educators are going to misread the Standards not because they read poorly but because the Standards are too vague and arbitrary in their language, especially across grade levels. (Yes, I know Standards are inherently general; that’s no excuse for shoddy language use or unclear terms and no Glossary).

As I mentioned above, I do not even understand why there have to be grade-level differences at all as long as there is a degree-of-difficulty-of-text standard – which there is – AND if there are rubrics and anchors for the scoring of work against the Standard over time on a continuum of sophistication. Why not just use only the Anchor Standards, then show samples of work to show what increasingly-sophisticated work against that same Standard looks like? That would greatly simplify the whole enterprise and clarify that the point is increased rigor on the ‘same’ standard rather than spurious changes in the same Standard.

I know the answer, alas: the writers of the Standards and their guides didn’t think through the relationship between content standards, process standards, and performance standards. Good Lord, at least the ELA document included an Appendix with sample performance tasks. The math people provided us with absolutely no guidance as to what counts as appropriate performance tasks and appropriate levels of performance in terms of meeting the Standards.

All of this is fixable. But who, now, is in charge of these Standards? How will needed edits get done, and on a timely basis? Beats me. How will the two assessment consortia develop a valid test of these Standards without such clarification? Beats me. Write your local state people and demand better.

Nonetheless, he was a fan then and remains one now. This is what makes my head explode.
