I've not seen as much comment about the second article in the LA Post's series on value-added analysis as we did for the first. This one, "LA's Leaders in Learning," focuses on school-level analysis, and with a change in framing it would serve as a blistering critique of No Child Left Behind's accountability regime. It turns out that, under value-added analysis, many high-poverty schools with low absolute scores, schools identified as failing under NCLB, are actually achieving greater growth, that is, more learning, than many other schools with more affluent student bodies and higher scores.
This is, of course, not particularly surprising, but the article presents it in a rather muddy fashion. On one hand, there is too much casting about for blame, when the fact of the matter is that districts have had to focus on NCLB-style accountability because that is the federal law, and while the Obama administration professes to want change, its actions, up through the latest round of mandated school restructurings, have maintained the same system. Getting too wound up about the availability of school-level growth data in elementary school is also a bit of a show. The kids are tested every year, so even a single year's data shows fairly clearly where students start and finish at a given school, and to be honest, it isn't clear to me that it is a more accurate evaluation of a school to look at, say, where this year's fifth graders started six years ago, since it is fairly likely the school has been extensively reorganized at least once in that span of time.
My experience in the Golden State is not that a single "major" reform happens in that five-year period, but rather a series of small reforms that contribute to a lack of consistency in curricular programs. Some schools do have major staff and/or administrative turnover in that period, though the range is more like 5-7 years. What is constant is the high level of transiency in the student population (staff at Central Falls cited this as a factor in the school's difficulties). My district calculates the "transiency" rate at my school at something like 95%, but I question their methodology, because a check of students from one year to the next shows that about 30% of the students have turned over (roughly 30% leave and are replaced by new students). Based on observation, many of these students return, and in some cases leave again, over the course of their elementary career. THIS is a significant problem, because many of these students are the ones most in need, and their coming and going impedes our ability to assess them and provide needed services.
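That year-over-year check amounts to a simple roster comparison. A minimal sketch, using made-up student IDs rather than real enrollment data (the 30% figure here is just the illustrative result of these hypothetical rosters):

```python
# Hypothetical rosters of student IDs for two consecutive school years.
# A real check would pull these sets from actual enrollment records.
year1 = {"s01", "s02", "s03", "s04", "s05", "s06", "s07", "s08", "s09", "s10"}
year2 = {"s01", "s02", "s03", "s04", "s05", "s06", "s07", "s11", "s12", "s13"}

departed = year1 - year2   # enrolled in year 1, gone in year 2
arrived = year2 - year1    # new to the school in year 2

# Turnover: departures as a share of the year-1 roster.
turnover_rate = len(departed) / len(year1)
print(f"{turnover_rate:.0%} of students turned over")  # 30% with these rosters
```

A rate near 95% would require nearly the whole roster to change between years, which is easy to refute (or confirm) with exactly this kind of comparison.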
In general, students who spend all or most of their elementary years with us in one consistent block do better on testing and the like, but that may be a sign of a more stable home life as much as of the job we are able to do by having them here over time.