Remystifying “Digital Literacy”?

In the comments, my dear friend Everett remarks on a recent piece (http://nyti.ms/qhON3m) appearing in the NY Times.

It is a lament and a diatribe about the decline of the thinker and the rise of the information junkie in an increasingly “post-idea” and “post-Enlightenment” world where our capacity for rational thought has allegedly diminished, despite all of our technological advances. Neal Gabler contends that information itself might be partially to blame: “It may seem counterintuitive that at a time when we know more than we have ever known, we think about it less.” He remains skeptical about the possibilities afforded by social media and the Internet. They are part of the problem. While the online world excels at facilitating countless micro-discussions and exchanges on almost every conceivable topic, this hyperactive space tends to crowd out avenues for the slow churning of grand arguments and theorizations.

In one way, his regular commenting on my blog (and, as you’ll see, at least one element of the piece itself) provides a quirky counter-current to Gabler’s position.

The argument of the piece (somewhat ironically, if Gabler is right, given its appearance in a print publication) is muddled on what the problem is, what its sources are and what its implications might be, but his point about digital media can probably be summed up in his claim that “you can’t think and tweet at the same time…” His big idea is that short-form media is bad for big ideas.

While it is true that the average blog post is shorter than the average book, the problem with critiques like this is that they are stuck using a metric of information density uncritically borrowed from the age of Gutenberg. Sure, it’s impossible to summarize big ideas in 140 characters. But a huge portion of people use Twitter not as a way to communicate directly, but as a way to encode other kinds of communication. Why does the NY Times have its own dedicated micro-URL? Because people were so frequently using Twitter to link to articles in the Times. So when Gabler claims that Twitter is bad “because tweeting…is largely a burst of either brief, unsupported opinions or brief descriptions of your own prosaic activities…a form of distraction or anti-thinking…” he’s providing an unfairly narrow image of how social media is used.

Another example. It’s true that I waste some amount of time on facebook watching videos of cats chasing lasers (though my favourite online video remains this classic of cats who shoot lasers). But most of my time there is spent following links posted by friends, reading the comments they write on these articles, commenting on their positions, and, when I’m lucky, getting into an even more extended conversation on these topics. The reality is, the majority of my discussion of “ideas” now happens not IRL, but on facebook. This concerns me, certainly. But not because it heralds the doom of thought itself.

One could respond that, among those using online technologies, my network of friends is anomalous, and that, though Gabler’s vision of Twitter may be narrow, he’s right about the majority of online content. Fair enough, but then the only important question is: are people talking about big ideas more or less than they did before Twitter? Because I am willing to wager, at even odds, that most people in Western societies have always talked about the mundane details of their lives, most of the time. Were the biggest celebrities in 1899 intellectuals, actors, or war heroes?

But let me get back to Gutenberg: no doubt, reading a lot of articles online is different from reading an entire book. But it’s not clear to me which form of reading allows more thinking. As I read Gabler’s piece, I stumbled on his use of “Gresham’s Law” (which is sad, because I spent much of August reading political economy). So I looked it up on wikipedia. It turns out that, basically, Gresham observed that bad money will always drive out good where both are available in the market. Which also implies that my bothering to provide a hard link to the wiki page on Gresham is kind of silly, because, as my experience indicates, if people stumble over ideas in their traipsing through the blogosphere, they will do the legwork to find out more. Indeed, a lot of my online activity leads into a web of related readings, some followed links, some watched videos. It’s not deep reading, but rather networked reading.

What are the implications of this change in the nature of reading for thinking, for ideas and for culture? A fascinating question, no doubt, and one which is being addressed obliquely in the literary sphere. But knowing the answers, like knowing whether we talk to each other less (or more) about ‘things that matter’ than we might have in 1899, would require actual research, rather than just rehashing, whenever new communications technologies emerge, the warning Plato gave in the Phaedrus against the written word (it’s also online). I suppose communications analysis suffers from its own Gresham’s Law.

Now, here’s a fascinating idea: I noticed in a public presentation yesterday that the habit people have of looking up terms or sources they aren’t familiar with isn’t limited to reading online. They do it in public, too. I wonder what my blogosphere will think about that.

Demystifying “Digital Literacy”

Over at her New York Times blog, Virginia Heffernan quotes some pretty hyperbolic claims about the future of work in the United States, inter alia that 65% of the jobs that will be held by today’s grade-school kids will be unrecognizable to us, though admittedly, the claim may turn on how exacting a standard of ‘recognizable’ we apply. Any exaggeration is due to Cathy Davidson, a Duke scholar whose research focuses include the impact of technology on learning and higher education, and whose new book, Now You See It, turns on questions of attention and technology in learning.

What’s most hopeful, and surprising, about the collection of findings Heffernan cribs from Now You See It is this:

Online blogs directed at peers exhibit fewer typographical and factual errors, less plagiarism, and generally better, more elegant and persuasive prose than classroom assignments by the same writers.

That finding has now been quoted hundreds of times by bloggers, some presumably delighted that their particular medium, often the target of neo-Luddite laments regarding the prospects for digital-age literacy, shows real promise as a mode of written communication (at least, it should be noted, among engaged top-tier undergrads).

The implications are more complex. A friend, now completing her PhD in rhetoric at the University of Waterloo, had intended to investigate the process by which students learn academic practices related to the use of sources. Yet one of the key lessons of her research is just how poorly most undergraduate assignments are designed. At best, such assignments – generally in the form of the poorly defined ‘review paper’ – require students to practice skills which will be useful to them neither in “the real world” nor in the academic practice of the professor who is teaching the class.

At first, Heffernan uses these and other results drawn from Davidson’s book to take somewhat arbitrary potshots at Thomas Pynchon and Michael Ritchie’s film The Candidate. But attacking the content of critique and analysis in the undergraduate classroom is somewhat beside the point. Luckily, at the end of her post, Heffernan gets back on track, suggesting that higher education should be tied to the task of improving, not deriding, digital literacy. What my friend’s research highlights is that this is not simply a matter of insufficient room for collaboration, “web accountability” or multimedia savvy. Improving learning outcomes may instead be a matter of designing assignments that allow students to write in a register that seems, and is, relevant: like writing a blog post.