David Thornburg and Bill Kerr offer some "Web 2.0" in schools pushback.
We have lots of new technologies that have the power to transform education (e.g., MIT's "mesh" networking for the OLPC) but these topics have yet to make it into many presentations, and when I talk about these topics at conferences, you can drive a truck through the room without hitting anyone.
And yet, stick "podcasting" into a speech title, and it virtually screams "cutting edge!" even though some of us were posting audio files in MP3 format on the web years before Apple introduced the iPod...
So, to me, newness implies just that - something that hasn't been done (in education) before. There are tons of innovations waiting to be shared with educators. But until we see blogs, wikis, etc. as extensions of old technologies, we don't give ourselves time to explore the truly new. That, I fear, will hold us back from bringing the benefits of our new tools to all children.
Web 2.0 has become the new conventional wisdom of those who see themselves as radical reformers of the education system. Flashing bells and lights, gee whizz. Web 2.0 dominates educational technology conferences just like Logo used to dominate educational conferences (without being deeply understood) in the late 80s and early 90s. This is a new majority within a minority. Let's sit around and self-righteously criticise other educators because we get it and they don't.
I think a lot of the phenomenon that David in particular is pointing to in his piece is due to the still unintegrated nature of the "ed" and "tech" in "ed-tech." Most "edubloggers" just have a limited technical background. Most people on the "tech" side of "ed-tech" aren't in this conversation at all, and basically approach their work like MCSEs in a generic enterprise, whose job security depends on minimizing risk by limiting access.
To the "ed" person, a new technology is irrelevant until they can access it at no cost (because they have no software budget) without having to install anything on their hardware (because they can't). On the tech side, new technologies are irrelevant until they are installed by default, interoperably, on the OS image of every OS they support. Whatever explicitly educational software is purchased for teachers seems so irrelevant as to not even bear mention.
You don't even have to reach far for a perfect example: Zeroconf, aka Bonjour or Rendezvous networking. It is a mature, open, cross-platform standard, long supported by Apple, with free implementations for Windows and Linux, and support from lots of printer and other peripheral manufacturers. There are many, many applications for this technology in schools, but until OLPC started baking it into Sugar, these possibilities went almost completely unexplored. And if you can't install any software, or if you have no incentive whatsoever to open up the computers and/or network you're responsible for, then experimentation is barely possible, and it is hardly worth even talking about.
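Part of what makes Zeroconf so approachable is that its service discovery is just multicast DNS: an ordinary DNS query sent to the well-known group 224.0.0.251 on UDP port 5353. As a rough sketch of the idea (the `_ipp._tcp.local.` printer service name and the two-second timeout are illustrative choices, not anything from the post), a minimal service browser needs nothing beyond the Python standard library:

```python
import socket
import struct

# Link-local multicast address and port defined for mDNS (RFC 6762).
MDNS_GROUP = "224.0.0.251"
MDNS_PORT = 5353

def encode_name(name: str) -> bytes:
    """Encode a dotted DNS name as length-prefixed labels plus a root byte."""
    out = b""
    for label in name.strip(".").split("."):
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"

def build_ptr_query(service: str) -> bytes:
    """Build a one-question DNS packet: ID 0, no flags, QDCOUNT 1."""
    header = struct.pack("!HHHHHH", 0, 0, 1, 0, 0, 0)
    # QTYPE 12 = PTR, QCLASS 1 = IN.
    question = encode_name(service) + struct.pack("!HH", 12, 1)
    return header + question

def browse(service: str = "_ipp._tcp.local.", timeout: float = 2.0) -> None:
    """Send the query to the mDNS group and print who answers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_ptr_query(service), (MDNS_GROUP, MDNS_PORT))
        while True:
            data, addr = sock.recvfrom(9000)
            print(f"response from {addr[0]} ({len(data)} bytes)")
    except socket.timeout:
        pass
    finally:
        sock.close()
```

Real implementations (Avahi, Bonjour) of course handle response parsing, caching, and conflict resolution, but the point stands: the protocol is simple and open enough that experimenting with it on a school network costs nothing but permission.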
Of course, this is a vicious cycle.
Nonetheless, there are interesting things to do on the web these days. I'm just reminded of John Gruber's line about the iPhone's (lack of) an API:
If all you have to offer is a shit sandwich, just say it. Don’t tell us how lucky we are and that it’s going to taste delicious.
A huge portion of "Web 2.0" innovation in schools is work-arounds for the closed and broken architecture schools have been saddled with. It's good that we've got work-arounds, but let's not confuse that with innovation.
Unfortunately, so-called "experimental" or "cutting-edge" technology often gets associated with "breaks a lot". Or more accurately: "breaks our system". This is due to any number of factors, including unimaginative system structures where you only have a limited number of operating regions that are really just clones of each other (development, pre-production, production, etc.). If you break one of these regions, people tend to go apeshit and the lockdown snowball begins.
I long for the day when schools/universities/corporations will give users each their own virtual machine to play with. To experiment with, break, repair, etc. without affecting the greater "production" environment that runs mission-critical apps.
In a more general sense, it's going to take a change in leadership thinking for much of this to happen. Leaders would have to shift away from a "just make it work" and "don't break the system" mindset towards "let the users experiment". Can that happen on a widespread basis? I have my doubts...
Some of the early pioneers of the 60s and 70s clearly understood the issues of making the technology accessible and bridging the gap between ed and tech: Ted Nelson. Seymour Papert. Alan Kay. The historical perspective is a good one to take too.
With regard to technology and education getting to the point where powerful ideas can be assimilated by a lot of children (not just the advanced element), it seems that it has taken 30+ years for this to develop. It's hard. E.g., programmes such as Scratch and Etoys incorporate the best elements of Smalltalk, Logo, *Logo, and HyperCard. On the other hand, Kay argues that the commercialisation of the 80s killed off innovation, that the youth were sucked away from open exploration of new ideas.
The recent NSF proposal by Alan Kay and team, "Steps Toward the Reinvention of Programming" is worth a look.