Consider reading fluency, for example. (I'm not saying that fluency is more important than comprehension. I just have the experience with this to imagine what I'd do as a principal.) Teach a paraprofessional to have every first- and second-grade student in the school read to them one minute a week on a sample reading passage (there are sets of roughly equivalent passages one can purchase for this purpose). Have them enter the data through a Google Docs form, a SurveyMonkey survey, or some other tool that will send the data to a spreadsheet. Get someone to program the results so that you can show data per child with trend lines and sort by grade, classroom, etc. For a few extra lines of code, you could add locally-weighted regression trends to be really fancy, but that's beside the point.
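The "get someone to program the results" step really is a few lines. Here's a minimal sketch, with invented student names and data, that computes a simple linear trend (words-correct-per-minute per week) for each child from the kind of rows a spreadsheet-backed form would collect; swapping in a locally-weighted regression would be the fancy version.

```python
# Hypothetical sketch: per-student fluency trends from spreadsheet rows.
# Rows are (student, week, words_correct_per_minute); names and numbers
# are invented for illustration.

def slope(points):
    """Least-squares slope of (x, y) points -- the simple linear trend."""
    xs, ys = zip(*points)
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in points)
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def trend_per_student(rows):
    """Group rows by student and return {student: weekly trend}."""
    by_student = {}
    for student, week, wcpm in rows:
        by_student.setdefault(student, []).append((week, wcpm))
    return {s: slope(pts) for s, pts in by_student.items()}

rows = [("Ana", 1, 40), ("Ana", 2, 44), ("Ana", 3, 47),
        ("Ben", 1, 60), ("Ben", 2, 59), ("Ben", 3, 61)]
trends = trend_per_student(rows)
print(trends)  # Ana gains ~3.5 words/minute per week, Ben ~0.5
```

A positive slope means fluency is improving; sorting the dictionary by slope surfaces the kids who aren't.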
Here's the point: this is not rocket science, this does not require a gazillion-dollar software package from TestPublisher Inc., and it's very different from the type of quarterly testing that superintendents are buying into in a big way (including that gazillion-dollar software package from TestPublisher Inc.).
Or perhaps use a free, customizable student information system.
I used to use a simple Excel grid. I expected my band students to be working on an array of individual skills. Every student's profile and goals were different--a totally individualized assessment program. My students could easily keep track of what goals were reached, by checking the program on the lone classroom computer. Assessment happened in tiny odd bits of time, and was recorded and updated fluently. Even better, because I had students for two years, I had records of what they had mastered over a long stretch of time, making it easier to assign new and more complex individually tailored challenges.
In 1999, we were the pilot building for on-line, parent-accessible grading. We were all trained to use a hideously complex Novell program. Because parents started visiting our on-line gradebooks daily, teachers were instructed to post at least one grade per week, preferably more. And each grade had to be converted to points or percentages, and weighted. My perfect system was incompatible. It wasn't a matter of assigning points--there was no way to record the data, because the Novell program didn't recognize my symbols. I had, at the time, over 300 students and an average of 19 data points for each. The technology coordinator from the district came to "help" me. The conversation went something like this (he's looking over my shoulder at the Excel grade program I've devised):
TC: What's this G-B? G's not a grade.
Nancy: No, G's a key signature. And it's G-flat.
TC: How many points is it worth? And what's this goal thingie with the six?
Nancy: He's set a goal of six memorized scales this period. No points.
TC: What do you mean, no points? There have to be points! And this kid has four goals.
Nancy: Right. He just switched to bassoon, so four is a challenge.
TC: You mean the kids aren't all learning the same things?
Nancy: No. They set goals. There's a minimum requirement. I push them if I think they can do more.
TC: But that's not fair. And where are the grades?
Nancy: No grades. It's either mastered or not mastered. And it's infinitely fair.
TC: Well, how do you total up the points and figure a grade?
Nancy: No points. Grades based on whether they met their goals.
TC: This is a disaster. How do kids know where they stand? (as kids are looking at their personal data three feet from where he's standing)
And so on. In the end, I got a memo from the principal, distributed all over the district (the tech coordinator has a big mouth), saying that I had to have points and grades, and post at least once a week.
Ah yes, we had some heated discussions about this at the SchoolTool sprint last weekend, as we're finishing up our 1.0 gradebook. We originally designed SchoolTool to handle point- and non-point-based grading. That is, we have the data structures to model wacky grading systems.
Then you run into questions like "what's the average of 'B,' 'check,' 'pass,' '87,' and 'G-B'?" So SchoolTool Gradebook 1.0 will be point-based.
OTOH, reading your post, it occurs to me that we can reasonably easily create a different kind of "worksheet" that allows different grading systems but makes no attempt to generate a mathematical average.
It is trying to accommodate both in the same space that drives one insane.
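The tension can be made concrete. This is a sketch of the idea, not SchoolTool's actual code: model each score as either numeric (averageable) or symbolic (display-only), and refuse to compute an average for any worksheet that contains a symbol. The class and function names are invented for illustration.

```python
# Hypothetical data model: a score is either numeric or symbolic.
# A symbolic score ("B", "check", "pass", "G-flat mastered") has no
# meaningful place in a mathematical average.
from dataclasses import dataclass
from typing import Union, List, Optional

@dataclass
class Numeric:
    points: float

@dataclass
class Symbolic:
    label: str

Score = Union[Numeric, Symbolic]

def worksheet_average(scores: List[Score]) -> Optional[float]:
    """Average only if every score is numeric; otherwise return None
    and let the reader eyeball the row of symbols instead."""
    if scores and all(isinstance(s, Numeric) for s in scores):
        return sum(s.points for s in scores) / len(scores)
    return None

print(worksheet_average([Numeric(87), Numeric(93)]))         # 90.0
print(worksheet_average([Numeric(87), Symbolic("G-flat")]))  # None
```

The design choice is simply that averaging is defined per worksheet, not per score: one symbol in the row and the software stops pretending there's a number to report.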
Also, my sister is having the exact same experience you did as an adjunct teaching art at Pitt this year -- except it is the other art professors telling her this, not a tech. It is driving her nuts.
Tom: "It is trying to accommodate both in the same space that drives one insane."
Nancy: Well, I actually did that, using Excel. In addition to individualized no-points goals (scales, technical skills, contest solos and ensembles, etc.), I also had assignments that every kid had to do. They all did a specific, themed musical composition every year, for example, that was graded. I used a narrative-based 4-point rubric (because it made so much sense and wasn't attached to percentages, which skew grading). I also let kids who got 1s and 2s revise their projects once I gave them feedback. Most of them chose to do that, expanding their learning.
I put the non-mathematical data and the mathematical evaluations on the same grid, in separate shaded columns. Jessica can see that she met 5 of her 6 goals, and got a "3" on her composition. What's that? It's a B, from my perspective, unless her six goals included playing the Mozart horn concerto with the youth symphony, and the reason her composition (a 3-minute movement for woodwind quintet) got a 3 was because it wasn't finished. Then, it's an A-.
This kind of holistic, non-standardized evaluation of student work drives some people crazy, of course. They want "scientific" numerical comparisons. Because what matters most is not demonstrating what you've learned, or how high you set your goals, but how you compare to other kids. Right? Letting kids set their own goals (with a default suggestion for the clueless) works pretty well, surprisingly, and not just for highly capable kids.
In any event, technology often drives assessment, when it ought to be the other way around. You've heard the old saying "to a man with a hammer, every problem is a nail"? To a man with a computer, every problem can be solved with more data.
I'm not saying that using such a system, particularly one of your own design, will drive you insane. What drives one insane is trying to write one piece of software that anticipates how everyone will expect it to work.
For some people, the whole point of having a computerized gradebook is that it will calculate a final grade automatically, so everything has to be numbers underneath. For others, it's enough to eyeball a row of symbols and get the gestalt.
But yes, trying to write software that gets out of the way of innovation in school design is the entire reason I do what I do.
Here's something I forgot to say:
I just found this blog. It's great--I'll be back. You're invited to come look at my playground, too: