I’ve had this post open in my browser for days. I read it, and then let it sit, and I just now went and read it again, and the comments. I’ve written many, many times about how frustrating I find it that people think Computer Science = Teaching Excel or how to use the Internet. Computer Science is a very, very broad field, and in fact, I would argue that it can encompass Digital Literacy. The writer of the post I linked to is frustrated by the lack of distinction, too, which she argues takes away from the importance of Digital Literacy by focusing more on Computer Science. So she’s on the other side of this issue from me:
It’s dismaying then, to see in a week where we are seeing a huge move forward in the promotion of technology and a fresh look at how ICT as a subject area is designed and implemented in schools, to see digital literacy being used as an interchangeable term for computer science skills.
Her focus is on the British Government’s announcement earlier in January to revamp the ICT curriculum so that its focus is more on computing and computer science, including coding. That announcement left CS teachers here salivating, as they’ve been fighting to get any kind of computing into the curriculum. ICT, or Educational Technology as it’s often called here in the States, is “integrated” into the curriculum, sometimes fabulously, sometimes not. In some schools, it’s taught as a separate class, sometimes not so well.
Yes, I think being able to blog and tweet and build documents together online and Skype is all good. And if, as Josie says, it’s about critical thinking and lifelong learning, why is Computer Science not about those things, but Digital Literacy is? There are people who assume that things are done on computers the way they are because it would be too hard to do them some other way. Facebook and Google are the way they are because someone programmed them to be that way, and if we don’t understand that, then we have a big problem.
Rushkoff’s book, Program or Be Programmed, is an apt mantra for today’s world. We don’t have enough Computer Scientists, not just serving as programmers, but working in other fields. And while I don’t believe that there’s such a thing as a Digital Native, or that we can just let the kids take care of their own digital literacy, I don’t think we can say that teaching DL is more or less important than teaching CS. I’m watching us all latch onto devices that can’t be easily hacked. Can you write a script for your iPad on your iPad? We’re dependent on software developers to create tools just to allow us to view Flash on them. We’re letting huge companies dictate what we can do with our tools. We need more people who are, yes, digitally literate, but who can also participate in the development of tools that allow us the freedom to work in the world in whatever way we need to. That’s what attracted everyone to the Internet in the first place. The Internet would not exist if we didn’t have coders.
Sorry, but I’ve grown increasingly frustrated by this focus on “21st Century Learning” and “Digital Literacy” without anyone recognizing that without Computer Scientists, we would not have those terms. I’m watching fellow CS teachers being asked to teach digital literacy classes when they could be teaching Python or Java, or helping a kid develop an app. Many of us feel that we’re being shoved out by the call for “21st Century Learning.” What’s more 21st Century than knowing how to code, or having a deep understanding of how computers work? Or having people able to harness the power of computing to solve our biggest problems: cancer, global warming, famine, transportation? That’s where we’re headed. Those problems will be solved by people plus computing.