One of the quirks of NECAP testing is that, since the tests are administered in October, school-level reports are given for both the "testing year" (the kids actually taking the test in the school) and the "teaching year" (the kids who attended the school the previous year). This is supposed to track kids both leaving and entering different schools. For example, last year I could track the scores of Feinstein High School students the year after the school closed by looking at teaching year data.
This gives us a peek into how last year's closures and restructuring affected school-level scores. One thing we don't know is how many students left the PPSD schools discussed below, perhaps anticipating the influx of new students.
At Martin Luther King, there were 313 testing year kids vs. 264 from the 2010-2011 teaching year. Testing year proficiency was 1 point lower than teaching year proficiency in ELA, 4 points lower in math, and 2 points lower in writing. However, the school's 2011 testing year proficiency was down 11 points in English and 4 points in math from the 2010 testing year. Digging in a little more, the school seems to have picked up 19 4th graders (18% of the class) and 12 5th graders (12%) at level 1 in reading. One gets the impression at MLK that everyone was dragged down by the change.
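As a back-of-envelope check on how much an influx of low-scoring students can move a school-wide rate, here's a quick sketch. Only the enrollment counts (264 teaching year, 313 testing year, so 49 apparent arrivals) come from the reports; the proficiency rates plugged in are entirely hypothetical, just to show the weighted-average dilution.

```python
# Dilution arithmetic: combined proficiency when a cohort of n students
# at rate p absorbs m new students at rate q.
def combined_rate(n, p, m, q):
    """Weighted-average proficiency of the merged cohort."""
    return (n * p + m * q) / (n + m)

# 264 continuing students at a hypothetical 60% proficient, plus
# 49 arrivals at a hypothetical 20% proficient (313 - 264 = 49).
print(round(combined_rate(264, 0.60, 49, 0.20), 3))  # 0.537
```

Under those made-up rates, 49 arrivals drag the school-wide rate down about 6 points, which is roughly the scale of the swings we're looking at.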
Looking at this a day later... this data is strange because the decline in math seems to be more or less directly attributable to new arrivals because the year over year and the testing/teaching differences are the same. In reading the effect is stranger -- the big year over year difference but lack of teaching/testing spread... weird. Peer effects?
At Asa Messer, the year over year drop in testing year scores was 18 points in reading and 14 points in math. The teaching/testing year gaps were 13 points in reading and 10 points in math. This reflects 38 more 3rd graders, 32 more 4th graders, and 10 more 5th graders. In this case the change in population seems to more directly explain the change in scores. Again, the school seems to have picked up a particularly large number of 4th graders with poor reading skills: 18 at level 1 (17%).
Blackstone Valley Prep (mayoral academy) has 104 6th grade students listed in the testing year and 84 in the teaching year (that's the only applicable year). I would guess that someone didn't do a very good job of tracking where leaving students ended up. Or perhaps some of them left the RI public school system entirely, or the school simply increased the size of the class without telling me. Regardless, the 19% change in population lowered their testing year reading proficiency rate by 6 points and the math rate by 3 points.
This is consistent with what I'd expect in a school trying to apply an authoritarian model designed for low-income minority students to a more diverse population.
Actually, I guess we can test that hypothesis...
- Fall 2010 5th grade testing year: 101 students, 41 white, 35 not-low-SES, 11 w/IEP
- Fall 2011 6th grade teaching year: 84 students, 36 white, 32 not-low-SES, 5 w/IEP
- Fall 2011 6th grade testing year: 104 students, 42 white, 41 not-low-SES, 8 w/IEP
Huh, that's not it. Based on this possibly misleading, inaccurate data, it looks like they lost 17 kids, 14 of whom were low-SES, and replaced them with 20 kids, 11 of whom were low-SES. So, to be fair to BVP, the most obvious interpretation is that their testing year scores are lower because they're bringing in kids from lower-achieving schools and that's bringing down their scores.
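That churn arithmetic follows directly from the three report lines above; here's the subtraction spelled out (all figures come from the bulleted data, nothing else assumed):

```python
# Cohort churn implied by the three BVP report lines above.
# (n, not_low_ses) figures come straight from the bulleted data.
prev_testing = {"n": 101, "not_low_ses": 35}  # fall 2010 5th grade testing year
teaching     = {"n": 84,  "not_low_ses": 32}  # fall 2011 6th grade teaching year
curr_testing = {"n": 104, "not_low_ses": 41}  # fall 2011 6th grade testing year

left = prev_testing["n"] - teaching["n"]          # students who left
left_low_ses = left - (prev_testing["not_low_ses"] - teaching["not_low_ses"])
arrived = curr_testing["n"] - teaching["n"]       # students who arrived
arrived_low_ses = arrived - (curr_testing["not_low_ses"] - teaching["not_low_ses"])

print(left, left_low_ses, arrived, arrived_low_ses)  # 17 14 20 11
```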
What's odd is that we don't seem to have the full set of teaching year data, which should include the kids who left the school. I guess it is possible that those 17 students either left the state or switched to private schools, but that seems fairly unlikely to me. Other charters I've looked at don't show the same phenomenon.
Incidentally, this year's 5th graders at BVP had proficiency rates 5 points lower in reading, 10 points higher in math, and 6 points lower in writing than the previous year's. That's more or less the scores of the incoming students, although it probably includes repeaters. The BVP teaching year report cryptically lists 8 5th graders who aren't given scores on that year's test. Presumably they are repeaters, but I don't know why they wouldn't be reported. Or perhaps that number has some significance beyond my comprehension.
Anyhow, keep the jump in 5th grade math scores in mind when considering BVP's overall 25 point jump in math. Certainly the above would be consistent with a school that was good at raising the math scores of all students but held back several because of reading or other issues.
Consider this just an exploration of a small and unusual publicly available data set.