Thursday, September 30, 2010

Looking at NYC Charter Grades

Angus Davis:

Congrats to kids & teachers @ Democracy Prep on becoming top-ranked middle school in all of New York City!

Congratulations indeed. The interesting thing is that both Angus's tweet and my earlier post on Democracy Prep's latest test scores are accurate.

That is, the charter middle school with the highest overall score of any middle school in the city — a city which arguably has the best crop of charter middle schools in the country — has ELA proficiency rates below the city and state averages.

Which is to say, it is a good school! But it isn't eliminating the ELA achievement gap, even after students spend three years in the school.

However, Democracy Prep still rates an "A," unlike some other well-regarded NYC charters (just going through the ones I can think of off the top of my head):

  • Harlem Children's Zone/Promise Academy Charter School: B
  • Harlem Children's Zone/Promise Academy II: C
  • Harlem Success Academy 1 Charter School: A
  • KIPP Infinity: A
  • KIPP S.T.A.R.: B
  • KIPP Academy: A
  • Achievement First Crown Heights: C
  • Achievement First East NY: C
  • Achievement First Endeavor: C
  • Achievement First Bushwick: B

One thing that's peculiar about Democracy Prep's scores is the change in their relationship to other city schools. Last year its proficiency rate was about 20% above the city average; now it is about the same. The report card gives each school a somewhat cryptic percentile score compared to other schools in its "peer horizon" (schools with similar demographics, etc.). Last year, on ELA level 3 or 4 proficiency, this came out to 136.6%. This year, it is just 34.4%. That's a pretty steep drop! Compared to the whole city, the score went from 78.7% to 35.3%.
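Spelled out in points, using only the figures from the report cards quoted above:

```python
# Democracy Prep's ELA level 3/4 percentile scores from the NYC
# progress reports, as quoted in this post.
peer_2009, peer_2010 = 136.6, 34.4  # vs. "peer horizon"
city_2009, city_2010 = 78.7, 35.3   # vs. the whole city

# Rounding to one decimal place to match the report card's precision.
peer_drop = round(peer_2009 - peer_2010, 1)
city_drop = round(city_2009 - city_2010, 1)

print(f"Drop vs. peer horizon: {peer_drop} points")
print(f"Drop vs. whole city:   {city_drop} points")
```

In other words, roughly a 100-point drop relative to the peer horizon, versus about a 43-point drop relative to the city at large.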

I have no particular insight into what that might mean...


Anonymous said...

It means damn little.

The "scores" are not comparable year-to-year, because the formula changes each year. This version is a big change from last year's, except that this time no school fell more than two letter grades: any school that would have fallen further was rounded back up to a two-letter-grade drop.

The tests they are based on are not comparable year-to-year either (given last year's big adjustment).

The comparison "peer groups" are artificially constructed, and the scores rest in large part on those "peer groups."

Sorry, really nothing to analyze here, just move along.


Tom Hoffman said...

Yeah, I'm not trying to make any subtle analysis here, but a 100-point drop caught my eye...

Jason said...

The main difference from the NYC DOE webpage, if you read the material, is the use of a far better growth model plus updated peer groups for the schools.

So what happened was likely two-fold: Democracy Prep looked better on the student-level growth model, which actually accounts for previous achievement, and either Democracy Prep now has a more challenging population than it used to, or its old peer group had a less challenging population than its new one, since the schools are now being compared to a different set of peers.

All in all, I would say the new metrics are being calculated in a more accurate and fair way than in the past. The result is that NYC charter schools on the whole look a lot worse than they used to.

That's not surprising to me. Charters have far greater incentives to learn how they're being assessed and do whatever they can to look best by that assessment. Whenever the way we judge success changes, charters are likely to lose the most because they're the most likely to be geared toward doing all the right things to look good. I would also bet they look just as good as they used to in 2-3 years as they adjust practice to the newer methodology.

This doesn't mean the schools have actually gotten better or worse, just that they're responsive to quality assessments. That's why the trick has to be to ask the right questions: questions that push charters to improve in ways that realize true gains in actual student achievement, rather than measuring success in ways that are easily manipulable and do not require true student success for apparent school success.