Close readers of this blog have probably noticed more than one post about how I think technology in education is overhyped and that we could all benefit from stepping back a bit and thinking a little harder about how we use it. In higher ed, more so than in K-12, there’s a tendency to use technology to automate, to gain efficiencies, choose your business-speak term here. CMSs are no longer about changing the game of teaching; they’re about 24/7 access to materials, managing grades and assessment, making it “easy” for faculty to post things online. Video isn’t for thinking about material in a different, more visual way, nor is it something that students create; it’s a way to distribute lectures to ever larger classes. Technology, at the institutional level, isn’t about teaching and learning. It’s about a bottom line, somewhere. That’s not to say that there aren’t individual faculty and students out there using technology to transform the way they teach and learn. Most of those people, for the record, are not using a CMS for those purposes.
At all levels of education, there is sometimes a tendency to throw technology into the classroom because it’s there, because it’s good to say that every classroom has a Smartboard or a set of laptops or iPod touches. And then administrators, parents, etc. want those expensive things to be used. The problem is that many people put the cart before the horse. Technology should be used to solve a problem; a problem shouldn’t be created in order to use technology. Let me give a personal example. In teaching writing, I had a problem getting students to understand what it means to write for an audience. Writing teachers everywhere learn to recognize when a paper is written “for the teacher.” The 5-paragraph theme comes to mind. I turned to blogging, a technology, to solve this problem. As it turned out, it solved a whole host of other problems as well, and it turned out to be a fabulous tool for teaching, imo, almost anything. The idea of writing to learn has been around for a long time. Blogging to learn takes that to another level.
The question that often gets asked, then, is, “What can I use x (technology) for?” The real question should be, “I have x problem with my pedagogy; how can I solve it?” And if technology integration is important, one might start with some ideas around technology, but non-technological solutions should always be considered as well.
So, I still believe in the power of technology to transform education, but I see the implementation of technology in many educational settings as wrong-headed. There are lots of reasons for that–from resistance to change to being in crisis mode (21st century skills, OMG!) to slick sales presentations. And I see educational technologists as sometimes part of the problem. A while back, I wrote:
I would say the same thing to the technology people out there whining about how people won’t use technology, how they don’t understand the changes it’s bringing, etc. First, I’d say think a little more critically about the technology you’re espousing. Too many technologists out there really sound more like evangelists, trying to convince people to use the snake oil. I understand. I was there. I felt the frustration, the worry. But I think technologists need to acknowledge the fear and the skepticism, not dismiss it as ridiculous. Yes, it’s a barrier, but not one that you knock down with a bulldozer. It needs to be dismantled bit by bit, and it needs to be done with the help of the people who put it up in the first place. And we need to acknowledge that sometimes technology isn’t the answer and that some technology is being used in ways that are counterproductive to teaching and learning. Not everyone needs to blog and twitter and create multimedia presentations. Too often faculty see us as pushers of tools rather than as partners in education. And sometimes that’s because we project that attitude as often as that attitude is projected onto us.
In our zeal to get people as excited as we were about technology, we sometimes scare people away or cause them to dig their heels in.
My second shift in thinking about technology is similar to the first, but a little different. Among the tech-savvy educators I hang out with virtually, in person, and otherwise, I’m seeing a trend of not going beyond the applications. Now that I’ve spent my summer thinking about computing as opposed to technology, I’m starting to have a different kind of skepticism toward some kinds of technology. My goal now is to get computing into as many classes as possible. What does that mean? It means data visualization, not creating charts in Excel. It means creating art with a computer program (like Processing), not Illustrator. It means coding up HTML, CSS, and PHP, not using Blogger. It means creating maps with GIS or Google Earth, not using Google Maps.

Don’t get me wrong. I love the way that many applications have lowered the barrier to entry for doing things like blogging or editing images, but I see it as my job now to get under the hood, to learn something about the code underneath, so that we have students who can create the next Facebook or Angry Birds. So I’m starting to think, when someone wants to incorporate blogs or Google Docs or video, that’s great, but how can we take that to the next level? And it doesn’t have to be that particular teacher who takes it there. I’m not expecting your average history teacher to know HTML or Python or some other programming tool, but I’m hoping that the students will, and that they might be given the opportunity to create a historical map using some of the computing skills they have. And this is where my thinking is going.
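To make the distinction concrete, here’s a minimal sketch of what “data visualization, not creating charts in Excel” can mean at its very simplest. The numbers are made up for the example; the point is that a few lines of code turn raw data into a chart directly, with no application standing between the student and the data.

```python
# Hypothetical data: word counts for a set of student essays.
essay_word_counts = [512, 760, 430, 905, 688, 540, 810]

max_count = max(essay_word_counts)

for i, count in enumerate(essay_word_counts, start=1):
    # Scale each bar to a maximum width of 40 characters.
    bar = "#" * round(40 * count / max_count)
    print(f"essay {i:>2}: {bar} {count}")
```

A student who writes this owns every step: where the data comes from, how the bars are scaled, how the output looks. That’s the “under the hood” understanding that clicking a chart wizard never requires.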