
Here is the transcript from my talk today at NYU, as part of the ECT Colloquium Series. No slides. Because, ugh, slides. Many thanks for the invitation, as coordinated by Sava Saheli Singh.

I’m very excited and honored to be here to talk to you today, in part because, obviously, that’s how you’re supposed to feel when you’re invited to speak at a university. And in part, honestly, I’m stoked because I’m reaching the end of what has been a very long year of speaking engagements.

Initially, I’d planned to spend 2014 working on a book called Teaching Machines. I’m absolutely fascinated by the history of education technology — its development as an industry and a field of study, its connection to scientific management and educational psychology and Americans’ ongoing fears and fascinations with automation.

I call myself a freelance education writer. But I’ve spent the year traveling around the world acting more like an education speaker.

I don’t really fit in in the education technology speaking circuit. I mean, first off, I’m a woman. Second off, I don’t tend to talk about ed-tech revolution and disruptive innovation unless it’s to critique and challenge those phrases. I don’t give ed-tech pep talks, where you leave the room with a list of 300 new apps you can use in your classroom. Third, I’m not selling a product, not selling consulting services, and because I’ve spent so much time this year traveling and speaking, I’m still not selling a book. And I don’t have a schtick. I don’t have a script. There isn’t actually a TED Talk that you can watch and see almost 100% word-for-word what I’m going to say over the course of a keynote. 

That’s what you do, I’m told. You write a talk. You give that talk again and again and again and again. You hone your delivery. You hone your jokes; perhaps you localize them.

I do it wrong. I try to write something new each time I talk. I use the opportunity of a public speaking engagement to spend some time crafting an argument, which I do write out in advance, as you can see now, to deliver to each audience.

Somewhere along the way — mid-September, I guess — I realized that, while I did not finish writing Teaching Machines this year, I did actually write tens of thousands of words on ed-tech. I’ve got a couple more talks scheduled, but by the end of 2014, I’ll have delivered over 14 unique presentations. That’s 14 chapters. Why, that’s a book! So I’ve decided that I’m going to collect all the talks I’ve written and self-publish them. It’s like I did everything backwards: I did the book tour, and then I published the book.

Of course, once I decided to publish my talks as a book, I had to spend some time thinking about how they’d be ordered and grouped. I didn’t want to simply present them in chronological order: what I said in January, then what I said in February, then what I said in March. So I needed to group my talks into sections, by theme.

And then I realized too that, if I was going to publish my talks as a book, I needed to think a bit more strategically about what I wanted to say in my final few presentations.

See, I wanted the book to have an arc. You have to have an arc.

If you’re familiar with my work, you know that I’m pretty critical of the shape that education technology takes, has taken, is taking. Perhaps it’s because I describe myself as a “recovering academic,” but I get a lot of snarls in response to my criticism: that all I know how to do is “what grad students do,” and that’s “criticize.” I hear this a lot, particularly from entrepreneurs who proudly proclaim that they “build” while I “tear things down.”

I think that’s bullshit, frankly — often a cheap anti-intellectualism that posits markets as making and scholars as destructive. 

Nevertheless, as I’ve weighed how I’d pull together my 2014 “book tour” into a book, I figured, heck, I should probably not send my readers into this downward spiral of education technology despair. I live there, people, and it’s gloomy. So I figured I should find something, some way to wrap things up — not necessarily on an “up” note, but on an activist note, on a note that says that we can, at the very least, resist some of the dominant narratives about what education technology can or should do.

“Say something positive about ed-tech, Audrey.” Easier said than done.

But when Sava asked me to give a title and an abstract for this talk today, I decided to try.

Or at least, what I want to talk about today is how we can push back on the hype surrounding ed-tech disruption and revolution, how we can ask questions about whose revolution this might be — to what end, for whose benefit — and how we can, should, must begin to talk more seriously about education technologies that are not built upon control and surveillance. We must think about education technologies in informal learning settings, and not simply in institutional ones. We need to talk about ed-tech and social justice, and not kid ourselves for a minute that Silicon Valley is going to get us there.

So, I decided to invoke in the title of this talk Ivan Illich’s notion of “convivial tools.”

The phrase comes from his 1973 book Tools for Conviviality, published just two years after the book he’s probably best known for, Deschooling Society.

These are just two of a number of very interesting, progressive if not radical texts about education from roughly the same period: Paul Goodman’s Compulsory Mis-education (1964). Jonathan Kozol’s Death at an Early Age (1967). Neil Postman’s Teaching as a Subversive Activity (1969). Paulo Freire’s Pedagogy of the Oppressed (first published in Portuguese in 1968 and in English in 1970). Everett Reimer’s School is Dead (1971).

These books — loosely — share a diagnosis: that our education system is controlling, exploitative, imperialist; that despite all our talk about democratization and opportunity, school often neatly reinforces the hierarchies of our socio-economic world — categorizations based on race and class and gender and nationality. 

(Let me stress “gender” there. I can’t but notice that this list, much like the list of those on the education speaking circuit today, is full of men.) 

During roughly the same period as the publication of these books challenging traditional education, traditional schooling, there was a growing interest in what the still fairly nascent field of computing could do to hasten some of this change — progressive change, I should be clear. Education technology, as I hope my book Teaching Machines will eventually make clear, has a history that stretches back into the early twentieth century and has much more in common with Edward Thorndike than with John Dewey, more in common with multiple choice than with student choice and agency. But in the Sixties and Seventies, we saw progressive education and ed-tech start to coincide. For example, drawing on Seymour Papert’s constructionist theories of learning, Daniel Bobrow, Wally Feurzeig, Cynthia Solomon — aha! a woman! — and Papert developed the programming language Logo in 1967: a way for children to learn computer programming but, more importantly, a way of giving them a powerful object, a powerful tool to think with. And in 1972, along a similar line of thinking, Alan Kay published his manifesto “A Personal Computer for Children of All Ages.”
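To give a flavor of that “tool to think with”: the heart of Logo is the turtle, which a child steers with a handful of simple commands. Here is a minimal sketch, in Python rather than Logo (my own illustration, not Papert’s implementation), of the two most basic ones, FORWARD and RIGHT:

```python
import math

def run_turtle(program):
    """Interpret a tiny Logo-like program: a list of (command, value) pairs.

    FORWARD moves the turtle in its current heading; RIGHT turns it
    clockwise by the given number of degrees. Returns the final (x, y)
    position, rounded to avoid floating-point noise.
    """
    x, y, heading = 0.0, 0.0, 90.0  # start at the origin, facing "up"
    for command, value in program:
        if command == "FORWARD":
            x += value * math.cos(math.radians(heading))
            y += value * math.sin(math.radians(heading))
        elif command == "RIGHT":
            heading -= value  # turning right decreases the heading angle
        else:
            raise ValueError(f"unknown command: {command}")
    return round(x, 6), round(y, 6)

# The classic first Logo program: four repetitions of FORWARD-then-RIGHT
# trace a square and bring the turtle back home.
square = [("FORWARD", 50), ("RIGHT", 90)] * 4
```

The point isn’t the geometry; it’s that the child writes the program and the machine obeys, which is exactly the relationship Papert wanted.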

It’s perhaps worth reminding you that in the late Sixties and early Seventies, computers were still mostly giant mainframes, and although there was a growing market for minicomputers, these were largely restricted to scientists and the military. Alan Kay was among those instrumental in pushing forward a vision of what we now call “personal computing.” Not business computing. Not cryptography. Personal computing.

Kay argued that computers should become commonplace and should be in the hands of non-professional users. He believed this would foster a new literacy, a literacy that would bring about a revolution akin to the changes brought about by the printing press in the 16th and 17th centuries. And key: children would be the primary actors in this transformation.

In “A Personal Computer for Children of All Ages,” Kay describes his idea for a device called the Dynabook — he describes his underlying vision for this piece of technology as well as its technical specifications: no larger than a notebook, weighing less than 4 pounds, connected to a network, and all for a price tag of $500, which Kay explains at length is “not totally outrageous.” ($500 was roughly the cost at the time of a color TV, Kay points out.)

“What then is a personal computer?” Kay writes. “One would hope that it would be both a medium for containing and expressing arbitrary symbolic notations, and also a collection of useful tools for manipulating these structures, with ways to add new tools to the repertoire.” That is, it would be a computer, but one that is completely programmable by the user, by “children of all ages.”

“It is now within the reach of current technology to give all the Beths and their dads a ‘Dynabook’ to use anytime, anywhere as they may wish,” Kay continues. Again, this is 1972, almost 40 years before the iPad. “Although it can be used to communicate with others through the ‘knowledge utilities’ of the future such as a school ‘library’ (or business information system), we think that a large fraction of its use will involve reflexive communication of the owner with himself through this personal medium, much as paper and notebooks are currently used.”

The personal computer isn’t “personal” because it’s small and portable and sits on your desk at home (not just at work or at school). It’s “personal” because you pour yourself into it — your thoughts, your programming. And as a constructionist framework would tell us, a device like the Dynabook wouldn’t be so much about transmitting knowledge to a child but rather it would be about that child building and constructing her own knowledge on her own machine.

Despite looking a lot like today’s tablet computers (like an iPad, even), Kay insists that his idea for the Dynabook was something very, very different. He told Time Magazine last year that the primary purpose of the Dynabook was “to simulate all existing media in an editable/authorable form in a highly portable networked (including wireless) form. The main point was for it to be able to qualitatively extend the notions of ‘reading, writing, sharing, publishing, etc. of ideas’ literacy to include the ‘computer reading, writing, sharing, publishing of ideas’ that is the computer’s special province. For all media, the original intent was ‘symmetric authoring and consuming’.”

Consumption and creation — a tension that's plagued the iPad since it was unveiled, but that the Dynabook was designed to handle at all levels. The hardware, the software, all editable, authorable, tinkerable, hackable, remixable, sharable. 

“Isn’t it crystal clear,” Kay continued, “that this last and most important service [authoring and consuming] is quite lacking in today’s computing for the general public? Apple with the iPad and iPhone goes even further and does not allow children to download an Etoy made by another child somewhere in the world. This could not be farther from the original intentions of the entire ARPA-IPTO/PARC community in the ’60s and ’70s.”

For Kay, the Dynabook was meant to help build capacity so that children (so that adults too) would create their own interactive learning tools. The Dynabook was not simply about a new piece of hardware or new software — but, again, about a new literacy. 

A similar analysis could be made of the programming language Logo. The ed-tech market is now flooded with applications and organizations that promise to teach kids programming. But they aren’t Logo, despite some of them using very cute turtle graphics. Many of these new “learn to code” efforts are about inserting computer science into the pre-existing school curriculum. Computers become yet another subject to study, another skill to be assessed.

In Papert’s vision, and in Kay’s as well, “the child programs the computer, and in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intense contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building.” But as Papert wrote in his 1980 book Mindstorms, “In most contemporary educational situations where children come into contact with computers the computer is used to put children through their paces, to provide exercises of an appropriate level of difficulty, to provide feedback, and to dispense information. The computer programming the child.”

The computer programming the child.

The computer isn’t some self-aware agent here, of course. This is the textbook industry programming the child. This is the testing industry programming the child. This is the technology industry, the education technology industry programming the child.

Despite Kay and Papert’s visions for self-directed exploration — powerful ideas and powerful machines and powerful networks — ed-tech hasn’t really changed much in schools. Instead, you might argue, it’s reinforcing more traditional powerful forces, powerful markets, powerful ideologies. Education technology is used to prop up traditional school practices, ostensibly to make them more efficient (whatever that means). Drill and kill. Flash cards. Just with push notifications and better graphics. Now in your pocket and not just on your desk.

Increasingly, education technology works in concert with efforts — in part, demanded by education policies — for more data. We hear these assertions that more data, more analytics will crack open the “black box” of learning. Among those making these claims most loudly — and wildly — is the CEO of Knewton, a company that works with textbook publishers to make content delivery “adaptive.” Knewton says that it gathers millions of data points on millions of students each day. CEO Jose Ferreira calls education “the world’s most data-mineable industry by far.”

“We have five orders of magnitude more data about you than Google has,” Ferreira said at a Department of Education “Datapalooza.” “We literally have more data about our students than any company has about anybody else about anything, and it’s not even close.” He adds, “We literally know everything about what you know and how you learn best, everything.”

(Knewton knows everything except apparently the meaning of the word “literally.”)

Education technology has become about control, surveillance, and data extraction. Ivan Illich, Neil Postman, Paulo Freire, Paul Goodman — none of them would be surprised to hear that, having already identified these tendencies in the institutions and practices of school.

But to say this — education technology has become about control, surveillance, and data extraction — runs counter to the narrative that computer technologies are liberatory. That they will open access to information. That they will simplify sharing. That they will flatten hierarchies, flatten the world. 

Not so.

I’ve heard it suggested often that the World Wide Web is an example of what Ivan Illich called “convivial tools” — although his book predates the Web by 15+ years, Illich speaks of “learning webs” in Deschooling Society. I grow less and less certain that the Web is quite “it.” But of this, I am:

Education technology is not convivial.

Some explanation of what Illich meant by this term, recognizing of course that it’s part of his larger critique of modern institutions:

He argued that, “As the power of machines increases, the role of persons more and more decreases to that of mere consumers.” In order to build a future society that is not dominated by machines or by industry, then, we need to “learn to invert the present deep structure of tools; if we give people tools that guarantee their right to work with high, independent efficiency, thus simultaneously eliminating the need for either slaves or masters and enhancing each person’s range of freedom. People need new tools to work with rather than tools that ‘work’ for them. They need technology to make the most of the energy and imagination each has, rather than more well-programmed energy slaves.”

What are convivial tools? They are those that are easy to use. They should be reliable. They should be repairable and durable — and already we can see here how the “planned obsolescence” of so much of technology veers away from conviviality. Convivial tools should be accessible — free, even. They are non-coercive. They should, according to Illich, support autonomy and agency and enhance the “graceful playfulness” in our social relationships.

“Oh, that sounds like user-centered design!” you might say. Or “the free software movement does this.” And again, I have to say: not quite.

Again, the title I gave this talk was “Convivial Tools in an Age of Surveillance.” And perhaps that makes it easier to see the challenges in reconciling the conviviality of user-centered design or free software with the power of the state and of industry.

I could have easily chosen a different prepositional phrase. “Convivial Tools in an Age of Big Data.” Or “Convivial Tools in an Age of DRM.” Or “Convivial Tools in an Age of Venture-Funded Education Technology Startups.” Or “Convivial Tools in an Age of Doxxing and Trolls.”

It’s that last one that’s been in my mind a lot lately, particularly in the wake of GamerGate, the ongoing harassment and threats against women in gaming, and more broadly the culture of the tech sector that claims to be meritocratic but is assuredly not. 

What would convivial ed-tech look like? 

The answer can’t simply be “like the Web” as the Web is not some sort of safe and open and reliable and accessible and durable place. The answer can’t simply be “like the Web” as though the move from institutions to networks magically scrubs away the accumulation of history and power. The answer can’t simply be “like the Web” as though posting resources, reference services, peer-matching, and skill exchanges — what Illich identified as the core of his “learning webs” — are sufficient tools in the service of equity, freedom, justice, or hell, learning.

“Like the Web” is perhaps a good place to start, don’t get me wrong, particularly if this means students are in control of their own online spaces — their content, their data, their availability, their publicness. “Like the Web” is convivial, or close to it, if students are in control of their privacy, their agency, their networks, their learning. We all need to own our learning — and the analog and digital representations or exhaust from that. Convivial tools do not reduce that to a transaction — reduce our learning to a transaction, reduce our social interactions to a transaction.

I’m not sure the phrase “safe space” is quite the right one to build alternate, progressive education technologies around, although I do think convivial tools have to be “safe” insofar as we recognize the importance of each other’s health and well-being. Safe spaces where vulnerability isn’t a weakness for others to exploit. Safe spaces where we are free to explore, but not to the detriment of those around us. As Illich writes, “A convivial society would be the result of social arrangements that guarantee for each member the most ample and free access to the tools of the community and limit this freedom only in favor of another member’s equal freedom.”

We can’t really privilege “safe” as the crux of “convivial” if we want to push our own boundaries when it comes to curiosity, exploration, and learning. There is risk associated with learning. There’s fear and failure (although I do hate how those are fetishized in a lot of education discussions these days, I should note).

Perhaps what we need to build are more compassionate spaces, so that education technology isn’t in the service of surveillance, standardization, assessment, control.

Perhaps we need more brave spaces. Or at least many educators need to be braver in open, public spaces — not brave to promote their own “brands” but brave in standing with their students. Not “protecting them” from education technology or from the open Web, but also not leaving them alone, and not opening them to exploitation.
 
Perhaps what we need to build are more consensus-building, not consensus-demanding, tools. Mike Caulfield gets at this in a recent keynote about “federated education.” He argues that “Wiki, as it currently stands, is a consensus *engine*. And while that’s great in the later stages of an idea, it can be deadly in those first stages.” Caulfield relates the story of the Wikipedia entry on Kate Middleton’s wedding dress, which, 16 minutes after it was created, “someone – and in this case it probably matters that it was a dude – came and marked the page for deletion as trivial, or as they put it ‘A non-notable article incapable of being expanded beyond a stub.’” Debate ensues on the entry’s “talk” page, until finally Jimmy Wales steps in with his vote: a “strong keep,” adding “I hope someone will create lots of articles about lots of famous dresses. I believe that our systemic bias caused by being a predominantly male geek community is worth some reflection in this context.”

Mike Caulfield has recently been exploring a different sort of wiki, one developed by Ward Cunningham, the creator of the original wiki. This one — called the Smallest Federated Wiki — doesn’t demand consensus like Wikipedia does, at least not off the bat. Instead, entries — and these can be any sort of text or image or video; they don’t have to “look like” an encyclopedia — live on federated servers. Instead of everyone collaborating in one space on one server like a “traditional” wiki, the work is distributed. It can be copied and forked. Ideas can be shared and linked, co-developed and co-edited. But there isn’t one “vote” or one official entry that is necessarily canonical.
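To make the contrast concrete, here is a minimal sketch of what forking means in a federated wiki. This is my own hypothetical model, not Cunningham’s actual code; the server names are made up.

```python
class FederatedPage:
    """A toy model of a federated-wiki page.

    Each page lives on one server. Forking copies the content to a new
    server while recording where it came from, so divergent versions
    coexist instead of competing for one canonical entry.
    """

    def __init__(self, content, server, forked_from=None):
        self.content = content
        self.server = server
        self.forked_from = forked_from  # provenance, not authority

    def fork(self, new_server):
        # A fork is a full, independent copy: edits to it never
        # overwrite the original, and no vote decides which survives.
        return FederatedPage(self.content, new_server,
                             forked_from=self.server)

# Alice publishes a page; Bob forks it to his own server and extends it.
original = FederatedPage("Kate Middleton's wedding dress", "alice.example")
copy = original.fork("bob.example")
copy.content += ", with details Alice never added"
```

Alice’s page survives untouched; Bob’s copy records its provenance and diverges freely. No deletion debate, no single canonical entry.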

Rather than centralized control, conviviality. This distinction between Wikipedia and the Smallest Federated Wiki echoes what Illich argued: that we need to be able to identify when our technologies become manipulative. We need “to provide guidelines for detecting the incipient stages of murderous logic in a tool; and to devise tools and tool systems that optimize the balance of life, thereby maximizing liberty for all.”

Of course, we need to recognize, those of us who work in ed-tech and adopt ed-tech and talk about ed-tech and tech writ large, that convivial tools and a convivial society must go hand-in-hand. There isn’t any sort of technological fix to make education better. It’s a political problem, that is, not a technological one. We cannot come up with technologies that address systemic inequalities — those created by and reinscribed by education — unless we are willing to confront those inequalities head on. Those radical education writers of the Sixties and Seventies offered powerful diagnoses of what was wrong with schooling. The progressive education technologists of the Sixties and Seventies imagined ways in which ed-tech could work in the service of dismantling some of that drudgery and exploitation.

But where are we now? We find ourselves with technologies working to make that exploitation and centralization of power even more entrenched. There must be alternatives — both within and without technology, both within and without institutions. Those of us who talk and write and teach ed-tech need to be pursuing those alternatives, not promoting consumption and furthering institutional and industrial control. In Illich’s words: “The crisis I have described confronts people with a choice between convivial tools and being crushed by machines.”

Sorry. That’s the best I can do for a happy ending: remind us that we have to make a choice.

Audrey Watters

Hack Education

The History of the Future of Education Technology