There is no ethical progress, and no strategy, in silence.

So I love–love–Freddie de Boer. There is, given the defensiveness in his writing, obviously a big slice of the American left-liberal blogosphere who absolutely hates him for his politics, or for the way he expresses his politics, or for the timing of his expression of his politics or…something. But I find his engagement with questions of ethics and strategy, his resistance to the fetishization of American machine politics as the sole locus of social change in the directions of justice, his earnest, forthright, sometimes fearless articulation of his own take on various moments: I find all of it inspiring, energizing, so often just on-the-nose. The fact that he is willing to say “maybe browbeating young people isn’t the best way to get people thinking about class and intersectionality on university campuses” while also having the capacity to powerfully express the essential and sublime irrationality of human generosity in the face of a culture addicted to stories starring homo economicus, is enough to give me some hope about the future of Western civilization.

But I want to talk about something else. I want to talk about jealousy. I simply cannot understand how Freddie writes so much. He is one, maybe two, years younger than me. He has published papers, has other papers in the works, and is almost done with his dissertation. Unlike me, he has had to spend most of the six years of his doctorate handling teaching responsibilities alongside his own research, plus attending to various on-campus commitments. It’s clear, from his writing, that he doesn’t succeed in his professional life by unplugging from popular culture, either. Quite to the contrary. His blog posts indicate that he is active on facebook, scouring his friends’ feeds for signs of the American pop-liberal zeitgeist, that he still finds time to read some fiction, that he has movies and kinds of movies that he likes.

I don’t know how he does it. But I have an inkling. Let’s set aside for a moment whether I am as smart as Freddie, whether I have his analytical capacity. Give me the benefit of the doubt for a moment that I’m a smart guy, that I can tackle and manipulate ideas with the best of them. Okay, so the question is: why am I not producing?

There are, to my mind, two ways to put the answer. On the one hand, I am tormented, haunted, by the breadth and depth of my ignorance. There is this old joke chart that points out that the real gift of learning isn’t so much knowledge as it is ignorance: you may increase the number of “things” you know over time, but the horizon of things of which you are ignorant also expands. Getting how something works, how it really works, always seems nearly within grasp, so that just one more article, one more book, will be all that’s needed to settle the questions that you set off with. It is, beyond this, extremely hard work for the curious mind to remember that not every point of confusion can be explained or explored now; that the journey into the wilderness started with a purpose, and that trekking through it without leaving a trail may mean adventure for you, but is ultimately of no use to anyone else.

The other way of putting it is in terms of fear rather than distraction. I often feel that the things I want to express are, if not complicated, at least a bit out of left field. It feels to me like it will be a waste of time, or an embarrassment (I can’t even spell embarrassment without a spell check), to write things online, or even in print, that I haven’t fully thought out. Objectively, I think this is garbage: the world is generally full of generous, thoughtful people who want to check their own prejudices and intuitions against those of others. It’s also laughably narcissistic: how many people would really care enough about what I have to say that the relative quality of what I put out matters? Nonetheless, it’s the psycho-cognitive situation in which I find myself.

So wish me luck. I am going to try and put out more rough drafts–more missives from the wilderness. But I am also going to try sending stuff that feels incomplete out for potential publication. It’s like my master’s supervisor always said: academic work is an iterative process.

In the end, there is no ethical progress, and no strategy, in silence.

Good advice for weekends (or dead astronauts and hot metal)

In a post on touchscreens, Edward Tufte pleads for us to spend less time having a 2-d experience of our 3-d world. Thus, as interesting as it may be to learn something about how PowerPoint kills astronauts, we shouldn’t forget there are richer things to do:

Plant a plant, walk the dogs, read a real book, go to the opera. Or hammer glowing hot metal in a blacksmith shop.

Remystifying “Digital Literacy”?

In the comments, my dear friend Everett remarks on a recent piece (http://nyti.ms/qhON3m) appearing in the NY Times.

It is a lament and a diatribe about the decline of the thinker and the rise of the information junkie in an increasingly “post-idea” and “post-Enlightenment” world where our capacity for rational thought has allegedly diminished, despite all of our technological advances. Neal Gabler contends that information itself might be partially to blame: “It may seem counterintuitive that at a time when we know more than we have ever known, we think about it less.” He remains skeptical about the possibilities afforded by social media and the Internet. They are part of the problem. While the online world excels at facilitating countless micro-discussions and exchanges on almost every conceivable topic, this hyperactive space tends to crowd out avenues for the slow churning of grand arguments and theorizations.

In one way, his regular commenting on my blog (and, as you’ll see, at least one element of the post itself) goes some way toward providing a quirky counter-current to Gabler’s position.

The argument of the piece (ironically, if Gabler’s argument is right, given its appearance in a print publication) is a bit muddled on what the problem is, what its sources are and what the implications might be, but his point about digital media can probably be summed up in his claim that “you can’t think and tweet at the same time…” His big idea is ‘short form media is bad for big ideas.’

While it is true that the average blog post is shorter than the average book, the problem with critiques like this is that they are stuck using a metric of information density uncritically borrowed from the age of Gutenberg. Sure, it’s impossible to summarize big ideas in 140 characters. But a huge portion of people use Twitter not as a way to communicate directly, but as a way to encode other kinds of communications. Why does the NY Times have its own dedicated micro-URL? Because of how frequently people were using Twitter to link to articles in the Times. So when Gabler claims that Twitter is bad “because tweeting…is largely a burst of either brief, unsupported opinions or brief descriptions of your own prosaic activities…a form of distraction or anti-thinking…” he’s providing an unfairly narrow image of how social media is used.

Another example. It’s true that I waste some amount of time on facebook watching videos of cats chasing lasers (though my favourite online video remains this classic of cats who shoot lasers). But most of my time there is spent following links posted by friends, reading the comments they write on these articles, commenting on their positions, and, when I’m lucky, getting into an even more extended conversation on these topics. The reality is, the majority of my discussion of “ideas” now happens not IRL, but on facebook. This concerns me, certainly. But not because it heralds the doom of thought itself.

One could respond that, among those using online technologies, my network of friends is anomalous, and that, though Gabler’s vision of Twitter may be narrow, he’s right about the majority of online content. Well fine, but then the only important question is: are people talking about big ideas more or less than before Twitter? Because I am willing to wager, at 1-to-1 odds, that most people in Western societies have always talked about the mundane details of their lives, most of the time. Were the biggest celebrities in 1899 intellectuals, actors, or war heroes?

But let me get back to Gutenberg: no doubt, reading a lot of articles online is different from reading an entire book. But it’s not clear to me which form of reading allows more thinking. As I read Gabler’s piece, I stumbled on his use of “Gresham’s Law” (which is sad, because I spent much of August reading political economy). So I looked it up on Wikipedia. It turns out that, basically, Gresham found (by accident) that bad money will always replace good where both are available in the market. Which also implies that my bothering to provide a hard link to the wiki page on Gresham is kind of silly, because, as my experience indicates, if people stumble over ideas in their traipsing through the blogosphere, they will do the legwork to find out more. Indeed, a lot of my online activity leads into a web of related readings, some followed links, some watched videos. It’s not deep reading but, rather, networked reading.

What are the implications of this change in the nature of reading, for thinking, for ideas and for culture? A fascinating question no doubt, and one which is being addressed obliquely in the literary sphere. But knowing the answers, like knowing whether we talk to each other less (or more) about ‘things that matter’ than we might have in 1899, would require actual research, rather than just rehashing the warning Plato gives in the Phaedrus against the written word (it’s also online) whenever a new communications technology appears. I suppose communications analysis suffers from its own Gresham’s Law.

Now, here’s a fascinating idea: I noticed in a public presentation yesterday that the habit people have of looking up terms or sources they aren’t familiar with isn’t limited to reading online. They do it in public, too. I wonder what my blogosphere will think about that.

Demystifying “Digital Literacy”

Over at her New York Times blog, Virginia Heffernan quotes some pretty hyperbolic claims about the future of work in the United States, inter alia, that 65% of the jobs which will be held by today’s grade-school kids will be unrecognizable to us – though admittedly, the claim may turn on how exacting a standard of ‘recognizable’ we apply. Any exaggeration is due to Cathy Davidson, a Duke scholar whose research focuses include the impact of technology on learning and higher education, and whose new book, Now You See It, turns on questions of attention and technology in learning.

What’s most hopeful, and surprising, about the collection of findings Heffernan cribs from Now You See It:

Online blogs directed at peers exhibit fewer typographical and factual errors, less plagiarism, and generally better, more elegant and persuasive prose than classroom assignments by the same writers.

That finding has now been quoted hundreds of times by bloggers, some presumably delighted that their particular medium, often the target of neo-luddite laments regarding the prospects for digital-age literacy, shows real promise as a mode of written communication (at least, it should be noted, among engaged top-tier undergrads).

The implications are more complex. A friend, now completing her PhD in rhetoric at the University of Waterloo, had intended to investigate the process by which students learn academic practices related to the use of sources. Yet one of the key lessons of her research is just how poorly most undergraduate assignments are designed. At best, such assignments – generally in the form of the poorly defined ‘review paper’ – require students to practice skills which will be useful to them neither in “the real world” nor in the academic practice of the professor who is teaching the class.

At first, Heffernan uses these and other results drawn from Davidson’s book to take somewhat arbitrary potshots at Thomas Pynchon and Michael Ritchie’s film The Candidate. Attacking the content of critique and analysis in the undergraduate classroom is, of course, somewhat beside the point. Luckily, at the end of her post, Heffernan gets back on point, suggesting that higher education should be tied into the task of improving, not deriding, digital literacy. What my friend’s research highlights is that this is not simply a matter of insufficient room for collaboration, “web accountability” or multimedia savvy: instead, improving learning outcomes may be simply a matter of designing assignments which allow students to write in a register which seems – and is – relevant: like writing a blog post.

The Cat in the Hat and tests in the bag

Who are these people?

It is eminently logical that the reading comprehension test scores of children and adults alike increase with the time they spend reading for pleasure. Moreover, it is not surprising that children with more than 100 books in their home score markedly better on standardized tests, including math tests, than children whose parents own fewer than 10 books. And, especially for those of us who have attended college, there is nothing breathtaking about this final conclusion, which, along with the first two bits of trivia, comes from a study by the United States National Endowment for the Arts discussed here in the New York Times:

students who lived in homes with more than 100 books but whose parents only completed high school scored higher on math tests than those students whose parents held college degrees … but who lived in homes with fewer than 10 books.

Thus my opening: who are these people? The correlation is remarkable only in the plainest, “notable”, sense of the word. What is breathtaking is not the conclusion, but the fact that there is a significant cohort of college-graduate parents who own fewer than 10 books.

I can think of only two even vaguely believable explanations for the existence of these people. Completing college is no guarantee of economic success. Perhaps, facing the slouching, lurching beast which is poverty in the United States, and specifically caught between its twin jaws of a pitiless labour market and an increasingly toothless welfare system, some college graduates belong to a sizable cohort priced out of books, as if books were only a disposable luxury.

More likely, in response to a rising tide of unnecessary credentialism, these parents participated in four years of post-secondary schooling so mind-numbing that, instead of feeding a flickering flame of passion for learning, the experience so thoroughly smothered whatever academic spark they left high school carrying that they respond to books in their home not with an inflamed literary temperament, but a more literally pyromaniacal urge.

Critics of the study suggest that the authors under-measure internet reading – though if my writing is any example of what is available, then we can be sure that writing on the internet is no substitute. Books are, the critics argue, a thing of the past. However, such correlations show that books still matter, and that for children they matter at home.

There is a bright side, in the form of a clear lesson for all of us: if one wants to avoid a child so precocious that she corrects grammar and regales with trivia, the choices are to sell off one’s Hawking, Hemingway, Chomsky and Chaucer, or to put them, along with The Cat in the Hat, under lock and key. At least until standardized test day. (via Arts & Letters Daily)