This talk was presented today at the Middlebury Institute of International Studies in Monterey, California, as part of the “Domain of One’s Own” initiative that the school is launching. The complete slide deck is here.

I’m often asked to give a title for a talk months in advance, certainly weeks before I’ve ever actually planned what I’m going to say or written a word. So when I finally turn to do so, I frequently find that I’m confounded – what did I want to say? What did I mean to say? What will I say?

Something about attending to the digital, reclaiming the Web, it looks like…

When I’m spinning around, grasping for ideas, grasping at how to structure a talk, I read books. Yeah. I know. Weird. Books. Those old things. But ideas are developed more slowly and thoroughly in books. That’s something that’s desperately lacking in the steady stream of information flow online. There’s something about the pace of print – in reading and in writing. Deliberate. Deliberation. There’s something too about the pace of a keynote (and a sermon and a lecture) that I think draws on print. This type of speaking is perhaps, with a nod to Walter Ong, a “printed orality.”

And in particular, I like to turn to the Oxford English Dictionary. To be honest, I turn to it quite often – keynote or not. I find etymology – the history of words’ origins, their changing meanings – to be quite useful in situating my own language, my own ideas, and to ground these ideas not just in whatever’s on my mind at the moment but in their historical origins. I find the OED to be quite useful in thinking about the history of technologies, particularly communication technologies. And perhaps it seems silly or redundant or obvious to say this: but communication technologies do predate computing technologies. Our communication practices might change – might – because of new computing technologies. But new practices tend not to be invented out of whole cloth. The legacies of language and culture persist.

“Attending to the digital,” the title of this talk, is not meant to signal entirely new forms of reading or writing; the digital does not signal entirely new forms of attention.

But I want to pause there and explore some of the meanings of that word “attention,” in part because we seem to be in the middle of a moral panic of sorts about attention, particularly about the detrimental effects some contend that technology has on attention.

According to the OED, “attention” – derived from the Latin “attendere,” to attend – means “The action, fact, or state of attending or giving heed; earnest direction of the mind, consideration, or regard; especially in the phrase to pay or give attention. The mental power or faculty of attending; especially with attract, call, draw, arrest, fix, etc.” “Attention” is a noun, but it refers to an action and/or a state of being. Attention is a mental activity. An earnest activity – which I particularly like. Attention is a military activity. It refers to how we stand and how we see and how we think.

According to the OED, the word’s first recorded usage in English came in 1374, in Chaucer’s translation of The Consolation of Philosophy, a sixth-century tome, from Latin; and then “attention” was not used again until the 16th century. In Shakespeare’s Richard II:

O, but they say the tongues of dying men
Enforce attention like deep harmony:
Where words are scarce, they are seldom spent in vain.

The word “attention” is a common word, with fairly steady usage in literature throughout the last hundred or so years, its meaning fairly consistent – to consider, to heed. That is, until the early twentieth century, when we started talking about “attention seekers” and “attention getters” and “attention grabbers” – phrases that reflected changes in media and advertising. The development of the field of psychology around the same time also introduced the concept of the “attention span.” Another new phrase originated in the 1960s, at first via articles in educational psychology journals: “attention deficit.”

We’re doing “attention” wrong now, we’re told. We seek it too much; we hold it too little. There’s a deficit, a lack, a pathology even. But doing “attention” wrong how, I’d ask (well before I’d ask about doing it wrong why). After all, if you look through these definitions and usages, you can see that the noun is accompanied by all sorts of verbs. We pay attention. We give attention. Attract attention. Draw attention. Call attention. Fix attention. At which noun-verb combination are we failing? Surely not all of them. What and how are we not attending, not attending to? What and how are we not seeing? What role do technologies play in what we see, what we attend to, what we forget, what we ignore?

Let me pause here and reassure you: this is not going to be a talk that functions as a screed against “digital distractions.” These have become incredibly formulaic. You know the arguments by now: new technologies – most recently the culprit is the cellphone – are making us un- or anti-social. They are shortening our attention spans. They are nudging us to pay attention to all the wrong things – checking Twitter, for example, at the dinner table or texting while driving. We can’t sit still. We don’t have empathy. We don’t look at people, engage with people. Yet we can’t handle solitude. We can’t handle the despair of the human condition. “And that’s why I don’t want to get a cellphone for my kids,” says Louis C.K., whose comedy routine is frequently referenced in essays on “digital distractions.” You can almost predict when these articles and arguments are going to invoke his bit with Conan O’Brien, when they turn to argue that somehow digital technologies foreclose meaningful contemplation, foreclose our experiences of existential angst.

And then there are the responses, the counter-arguments to “digital distraction,” which are often just as predictable. These often point dismissively to what’s almost a caricature of the work of MIT science studies professor Sherry Turkle, sneering at her claim in Alone Together that “we expect more from technology and less from each other.” Technologies make us more social, these arguments insist. Technologies broaden our understanding and expand our capacity for empathy. We have never paid attention to one another in certain settings, these articles claim. And cue the requisite black-and-white photo of a train car full of men commuting to work, immersed in the solitude of their newspapers.

I find neither of these types of essays, neither of these arguments, very satisfying.

In part, I find that those who want to dismiss such a thing as “digital distraction” tend to minimize the very real impact that new technologies do have on what we see, what we pay attention to. It’s right there in that phrase – “pay attention.” Attention has costs. It is a resource – one involving time and energy, a resource of which we only have a limited amount. Attention has become a commodity, with different companies and technologies bidding for a piece of it. As Matthew Crawford wrote in a New York Times op-ed last year,

…We’ve auctioned off more and more of our public space to private commercial interests, with their constant demands on us to look at the products on display or simply absorb some bit of corporate messaging. Lately, our self-appointed disrupters have opened up a new frontier of capitalism, complete with its own frontier ethic: to boldly dig up and monetize every bit of private head space by appropriating our collective attention. In the process, we’ve sacrificed silence – the condition of not being addressed.

This is “the attention economy,” we are told, where our attention is thoroughly monetized, where everything we do and think – and are urged to do and urged to think – is reduced to a financial transaction. And it’s not just about our attention, of course; it’s about our data. It’s about a manufacturing of distractions – many, many distractions – so we are always clicking but rarely contemplative.

This crisis of attention we face today is often linked to an overabundance of information. But this is hardly a new or unprecedented circumstance. This is not the only time in history in which we’ve experienced “information overload.” This is not the first time we have struggled with “too much information.” The capacity of humans’ biological memory has always lagged behind the amount of information humans have created. Always. We have created a variety of technologies to help us manage information and memory – writing most obviously, but also codices, indices, tables of contents, libraries.

“In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes,” cognitive psychologist and computer scientist Herbert Simon wrote in 1971. “What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” 1971.

So it’s not accurate – not remotely accurate – to say that our current (over)abundance of information began with computers or was caused by the Internet. “Distraction” cannot simply be a result of the digital, even if digital technology companies seem perfectly adept at encouraging and exploiting that distraction.

Essays that stoke fears about “digital distraction” and essays that assuage them tend towards the ahistorical because their assertions almost always focus on the digital, on new technologies, as the cause. And again, this is why resources like the OED can be so valuable. As I said at the start of this talk, at the turn of the 20th century – well before the smartphone – the English language already reflected anxieties about attention, particularly about those who deliberately seek attention, those who seek notoriety, those who disrupt the social order. (Women.)

I do wonder how much anxieties about a disrupted social order are at the core of our anxieties about attention and distraction, our anxieties about technological change. I don’t say this dismissively. Nor do I want to suggest that all disrupting and re-ordering is necessarily progressive. We too often confuse technological advancement with political progress or with socio-economic justice – they aren’t the same thing.

But technological changes do alter and reflect the social and economic and political order.

Amusing Ourselves to Death by media theorist Neil Postman is often described as a polemic against the corrosive effects of television, and perhaps for that reason some might be quick to dismiss the relevance of its insights to a “digital world.” But the communication technologies that have been developed alongside and since television are not, again, a new language. They are built on the language of TV. As Postman writes,

On television, discourse is conducted largely through visual imagery, which is to say that television gives us a conversation in images, not words. The emergence of the image-manager in the political arena and the concomitant decline of the speech writer attest to the fact that television demands a different kind of content from other media. You cannot do political philosophy on television. Its form works against the content.

“You cannot do political philosophy on television” – that’s a prescient and damning statement now that one of the major parties’ Presidential candidates, in a campaign that some are calling the most corrosive to American democracy, is a reality TV star. Writing about television some thirty years ago, Neil Postman gets so much right about attention, about attention to public knowledge, attention to the public discourse.

This public piece is important, I want to reiterate. This isn’t simply about attention or distraction on an individual level – whether or not your teen or your partner or your student is looking at you or looking at a screen – this is about public attention and public distraction. This is about public discourse – democracy, really. What we pay attention to shapes us. Collectively.

You would be mistaken to think that, because it predates the World Wide Web and mobile phones, Postman’s Amusing Ourselves to Death has no insights to offer us about technology today. After all, the book isn’t simply about television. It’s about electronic communications, something that Postman traces through the developments of photography and telegraphy.

“The telegraph made a three-pronged attack on typography’s definition of discourse,” Postman writes, “introducing on a large scale irrelevance, impotence, and incoherence. These demons of discourse were aroused by the fact that telegraphy gave a form of legitimacy to the idea of context-free information; that is, to the idea that the value of information need not be tied to any function it might serve in social and political decision-making and action, but may attach merely to its novelty, interest, and curiosity. The telegraph made information into a commodity, a ‘thing’ that could be bought and sold irrespective of its uses or meaning.”

Irrelevance. Impotence. Incoherence. Information as a commodity, and attention as a commodity.

Surrounded by these informational conditions, what are we giving and paying attention to? What do we see? What do we contemplate?

There’s another common trope when writing about the dangers of “digital distraction” – the admonition to unplug, go offline, disconnect. Of course, this has been commodified too, with expensive “digital detox” retreats and the like that promise to help you become more mindful (so that you can return to your job, reinvigorated, of course). The problem with this framework – I loathe the use of that word “detox” – is that it pathologizes, making the problem of technology usage, attention and distraction, an individual one rather than a systemic one.

So I’m going to refer to another book here, and not to finger-wag about “digital distraction” – hell, this particular book was published in 1974 – or to set up some false dichotomy between humans in Nature and humans with computers. But I recently reread Pilgrim at Tinker Creek by nature writer Annie Dillard in preparation for this talk because it is fundamentally, I believe, a book about attention. The book – a latter-day Walden of sorts, as it’s often described – chronicles a year of exploration and observation and contemplation around Tinker Creek, in the Blue Ridge Mountains of Virginia.

So what does it mean to attend to the world around us – immediately around us? A sustained and compassionate and curious attention? Annie Dillard writes about this beautifully:

It’s all a matter of keeping my eyes open. Nature is like one of those line drawings of a tree that are puzzles for children: Can you find hidden in the leaves a duck, a house, a boy, a bucket, a zebra, and a boot? Specialists can find the most incredibly well-hidden things.


…If I can’t see these minutiae, I still try to keep my eyes open. I’m always on the lookout for antlion traps in sandy soil, monarch pupae near milkweed, skipper larvae in locust leaves. These things are utterly common, and I’ve not seen one. I bang on hollow trees near water, but so far no flying squirrels have appeared. In flat country I watch every sunset in hopes of seeing the green ray. The green ray is a seldom-seen streak of light that rises from the sun like a spurting fountain at the moment of sunset; it throbs into the sky for two seconds and disappears. One more reason to keep my eyes open. A photography professor at the University of Florida just happened to see a bird die in midflight; it jerked, died, dropped, and smashed on the ground. I squint at the wind because I read Stewart Edward White: ‘I have always maintained that if you looked closely enough you could see the wind – the dim, hardly-made-out, fine débris fleeing high in the air.’ White was an excellent observer, and devoted an entire chapter of The Mountains to the subject of seeing deer: ‘As soon as you can forget the naturally obvious and construct an artificial obvious, then you too will see deer.’


But the artificial obvious is hard to see.

The artificial obvious. The naturally obvious. How much of what we are compelled to pay attention to with various digital technologies is precisely the latter? How much of this natural obviousness is manufactured and elevated to a level of immediate and unnatural importance? You get a push notification on your phone to tell you Kim Kardashian was robbed at gunpoint in her exclusive Paris hotel room. What are we supposed to do with that information? How do we learn to see differently and not just react to what’s “obvious” about these sorts of stories?

The idea of the “news of the day,” according to Neil Postman, is a result of the telegraph, “which made it possible to move decontextualized information over vast spaces at incredible speed.” Information, telegraphed, is stripped of its context and of its relevance, and because of the distance – literal, metaphorical – those consuming the information are stripped of their ability to act in response. The telegraph, says Postman,

brought into being a world of broken time and broken attention, to use Lewis Mumford’s phrase. The principal strength of the telegraph was its capacity to move information, not collect it, explain it or analyze it. In this respect, telegraphy was the exact opposite of typography. Books, for example, are an excellent container for the accumulation, quiet scrutiny and organized analysis of information and ideas. It takes time to write a book, and to read one; time to discuss its contents and to make judgments about their merit, including the form of their presentation. A book is an attempt to make thought permanent and to contribute to the great conversation conducted by authors of the past. Therefore, civilized people everywhere consider the burning of a book a vile form of anti-intellectualism. But the telegraph demands that we burn its contents. The value of telegraphy is undermined by applying the tests of permanence, continuity or coherence. The telegraph is suited only to the flashing of messages, each to be quickly replaced by a more up-to-date message. Facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.

“A world of broken time and broken attention” – the telegraph, the television, and, of course, the Internet.

Last fall, my friend Mike Caulfield, Director of Blended and Networked Learning at Washington State University Vancouver, gave a brilliant keynote titled “The Garden and the Stream: A Technopastoral.” He didn’t refer to Postman directly, but in his talk, he made some similar observations about how technologies – “Web 2.0” technologies specifically – shape public discourse in part by privileging this particular brokenness of time and attention. Caulfield argues that we rely on two powerful metaphors to describe contemporary Internet technologies. The Garden. The Stream.

The stream privileges a rapid flow of information; this is “the feed” on Facebook and on Twitter. It is a serialization of information that you can wade in and out of, but the data always rushes by. The stream demands a certain kind of chronology – the presentation of information in reverse chronological order, with the latest updates at the top. Thus, these technologies command we pay attention to the newest information – via push notifications and counters that tell us the number of unread messages, for example.

The garden, Caulfield argues, helps us imagine the Web as a place, as a topological space. It’s deliberately designed. (So is “the stream” of course.) But we can walk through the garden along different paths. We aren’t forced into a stream that rushes by us. We can stroll. We can experience the garden in many different ways. We move through it; the garden does not move but it does change. We choose the pace and the direction we navigate. And we tend to the garden. We pay sustained attention. We deliberately plant. We carefully cultivate. We propagate. We plow. We dig up from the roots. We find the best place – location, water, soil – for growth. We trim back. We weed. We graft. We fertilize. We harvest. We care.

Some of those who imagined and developed the Web once talked about the technology in these terms. (Sometimes we still do.) Vannevar Bush’s Memex, outlined in his 1945 Atlantic article “As We May Think,” is often the example cited here – he envisioned a personal “memory machine” where you could store and annotate all sorts of texts and images. And I think we like to imagine that that’s what the Web is. But it’s not. It never really was – due to both its infrastructure and intellectual property, for starters. Increasingly, the Web is even less of a garden.

Instead of cultivation and contemplation – growing a garden takes time – we are swept up in the stream. Of course we are, I imagine Neil Postman saying. Here’s Caulfield’s description:

The “conversational web”. A web obsessed with arguing points. A web seen as a tool for self-expression rather than a tool for thought. A web where you weld information and data into your arguments so that it can never be repurposed against you. The web not as a reconfigurable model of understanding but of sealed shut presentations.

This isn’t simply about technologies of distraction. This is about technologies of a fragmented discourse, one that privileges “comments” – never read the comments – over a deeper, critical commentary. “But comments are a conversation,” some will say, extolling the virtues of “Web 2.0” that encouraged – purportedly at least – a readable, writable web. But as Postman observed, a far earlier technology, the telegraph, had “introduced a kind of public conversation whose form had startling characteristics: Its language was the language of headline – sensational, fragmented, impersonal.”

Sensational, fragmented, impersonal – these are the characteristics that I think we should look at when we talk about distraction and attention. And I think we should contemplate how we can build technologies that foster a deep and sustained attention to ideas, to knowledge, and yes, to public discourse.

“What is television?” Postman asked. “What kinds of conversations does it permit? What are the intellectual tendencies it encourages? What sort of culture does it produce?” What is the Internet, we should ask now. What kinds of conversations does it foster, and what kinds does it foreclose? What are the intellectual tendencies the Internet encourages? What sort of culture does it produce?

As Middlebury rolls out MiddCreate, its initiative that will provide domains to students and staff, I urge you to think carefully about the metaphors and, importantly, the infrastructure of the Internet and of the Web. What are you going to attend to? How will you use your domain – a word with multiple meanings, referring to place and control and knowledge – to cultivate ideas carefully, thoughtfully, beautifully, collectively? These are questions of design – we can design differently. These are questions of intention. These are questions of attention. These are questions of incredible political significance right now. We need only look at the Presidential campaign, at an embrace of factlessness and conspiracy theories fostered by Facebook, to see the dangers of attending to technology at the expense of attending to democracy.

I want to turn here, to close, to the second part of my title – a phrase I haven’t referred to yet: “reclaiming the Web.” I want to invoke the speaker’s prerogative to change the title of my talk here as I come to its conclusion. I’ve used the word “reclaim” a lot in my work. I’ve done so in part because the word does mean to bring back. Reclamation is to reassert, to protest, to heal, to restore. But again, I don’t really believe the tale that the Web was once something pristine that we must now rescue and restore from wasteland. Yes, we need to engage in a reclamation. But it’s not the Web per se that we must rebuild. It’s broader and deeper than that. Broader and deeper than technology. Broader and deeper than “the digital.”

If there’s something to reclaim – or for many voices, to get to claim for the very first time – it is public discourse. It is, I hope, one that rests on a technological commons. I think we start towards that commons by thinking very carefully, by thinking very slowly and deeply, by cultivating very lovingly our spaces and places and own domains.

Works cited: “The Garden and the Stream: A Technopastoral” by Mike Caulfield. Amusing Ourselves to Death by Neil Postman. Pilgrim at Tinker Creek by Annie Dillard. “The Cost of Paying Attention” by Matthew Crawford. “Louis C.K. Hates Cell Phones” on Conan. “Designing Organizations for an Information-Rich World” by Herbert Simon.

Audrey Watters

