
Here are the notes and slides from my talk this morning at ALT-C 2014.

Ed-Tech's Monsters


On Monday, on our way up here to Coventry, we — that is, my mum, my boyfriend, and I — stopped at Bletchley Park, the site of the British government’s Code and Cypher School during the Second World War and the current location of the National Museum of Computing.

When we were planning our trip, I mentioned to my mum that I wanted to stop at Bletchley Park, and she said “Oh! Your grandfather did some work there” — a bit of family history I’d like to have known, as someone who writes about computers, but a bit of family history that I hadn’t considered until that moment. It makes sense: during the war my grandfather was the station commander at Chain Home Low, an early warning radar base, and later became Air Officer Commanding-in-Chief at Signals Command. Although he was knighted for his work on the development of radar, I’m not sure how much he really talked about that work with the family. My granny said that during the war she never actually knew what he did. She never asked. And he passed away before many of his stories were declassified.

I am, as some of you know, a folklorist by academic training. Not an instructional designer. Not an education psychologist. Not an entrepreneur. Not an investor. Not a computer scientist. Not much of a journalist. 

I am — insomuch as our disciplinary training is a proxy for our intellectual and our political interests — fascinated by storytelling, particularly by these sorts of hidden histories and lost histories and forgotten histories: my grandfather’s involvement at Bletchley Park, for example, and more broadly, the role of computer science in surveillance and war.

What stories do we tell? Whose stories get told? How do these stories reflect and construct our world — worlds of science, politics, culture, and of course, education?

I try in my work to trace and retrace the connections through narratives and counter-narratives, through business and bullshit. My keynote this morning is going to try to string together a number of these stories (actually, a lot of stories. I hope you’ve had coffee, and my apologies to everyone with a hangover. If this keynote doesn’t make sense, blame the booze, not me) — from history and from theory and from literature and from science.

See, when I heard that the theme of the conference was “Riding Giants,” I confess: I didn’t think about waves (even though I live in Southern California, in the heart of its surfer culture). I didn’t think about Isaac Newton’s saying about “standing on the shoulders of giants.”

I thought about giants the way, I suppose, a folklorist would.

And as such I want to talk this morning about ed-tech’s monsters and machines.

I want us to think about Bletchley Park on the road to where we find ourselves today, knowing that there are divergent paths and other stories all along the way.

No doubt, we have witnessed in the last few years an explosion in the ed-tech industry and a growing, a renewed interest in ed-tech. Those here at ALT-C know that ed-tech is not new by any means; but there is this sense from many of its newest proponents (particularly in the States) that ed-tech has no history; there is only now and the future. 

Ed-tech now, particularly that which is intertwined with venture capital, is boosted by powerful forms of storytelling: a disruptive innovation mythology, entrepreneurs’ hagiography, design fiction, fantasy.

A fantasy that wants to extend its reach into the material world. 


Society has been handed a map, if you will, by the technology industry in which we are shown how these brave ed-tech explorers have conquered and will conquer and carve up virtual and physical space.

Fantasy.


We are warned of the dragons in dangerous places, the unexplored places, the over-explored places, the stagnant, the lands of outmoded ideas — all the places where we should no longer venture.

Hic Sunt Dracones. Here be dragons.

Instead, I’d argue, we need to face our dragons. We need to face our monsters. We need to face the giants. They aren’t simply on the margins; they are, in many ways, central to the narrative.

I’m in the middle of writing a book called Teaching Machines, a cultural history of the science and politics of ed-tech. An anthropology of ed-tech even, a book that looks at knowledge and power and practices, learning and politics and pedagogy. My book explores the push for efficiency and automation in education: “intelligent tutoring systems,” “artificially intelligent textbooks,” “robo-graders,” and “robo-readers.” 

This involves, of course, a nod to “the father of computer science,” Alan Turing, who worked at Bletchley Park, and to his profoundly significant question “Can a machine think?”

I want to ask in turn, “Can a machine teach?” 

Then too: What will happen to humans when (if) machines do “think”? What will happen to humans when (if) machines “teach”? What will happen to labor, and what will happen to learning?

And, what exactly do we mean by those verbs, “think” and “teach”? When we see signs of thinking or teaching in machines, what does that really signal? Is it that our machines are becoming more “intelligent,” more human? Or is it that humans are becoming more mechanical? 

Rather than speculate about the future, I want to talk a bit about the past.

Machines have spoken in binary — ones and zeroes — since long before Bletchley Park or Alan Turing or the Colossus. Quite recently I literally etched this into my skin with two tattoos that “speak” to me while I write.

My left forearm, in binary, a quotation from Walt Whitman’s “Leaves of Grass”: “Resist much, obey little.” 

My right forearm, in binary, a quotation from Lord Byron’s “Song of the Luddites”: “Down with all kings but King Ludd.”

Poetry. Bodies. Resistance. Machines.
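For the curious, here is a minimal sketch, in Python, of one common way to render a phrase as the ones and zeroes of such a tattoo: each character’s code written out as eight bits. (The eight-bit ASCII encoding is my assumption for illustration, not a detail from the talk.)

```python
# A minimal sketch: render text as binary, one byte per character.
# Assumes 8-bit ASCII; the tattoos' actual encoding isn't specified.
def to_binary(text: str) -> str:
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("Resist much, obey little."))
```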

Lord Byron was one of the very, very few defenders of the Luddites. His maiden speech in the House of Lords challenged the 1812 Frame Breaking Act, which made the destruction of mechanized looms punishable by death.

Ah, the Luddites, those 19th century artisans who protested against the introduction of factory-owned “labor-saving” textile machines. And the emphasis, let’s be clear, should be on “labor” here, less on “machine.” The Luddites sought to protect their livelihoods, and they demanded higher wages in the midst of economic upheaval, mass unemployment, and the long Napoleonic Wars. They were opposed to the factories, the corporations owning the means of production, the mechanized looms.

The Luddites were not really “anti-technology” per se, although that’s what the word has come to mean. From the Oxford English Dictionary:

Luddite: A member of an organized band of English mechanics and their friends, who (1811–16) set themselves to destroy manufacturing machinery in the midlands and north of England.

The etymology: from the proper name Ludd with the suffix -ite. “According to Pellew's Life of Lord Sidmouth (1847) Ned Lud was a person of weak intellect who lived in a Leicestershire village about 1779, and who in a fit of insane rage rushed into a ‘stockinger's’ house, and destroyed two frames so completely that the saying ‘Lud must have been here’ came to be used throughout the hosiery districts when a stocking-frame had undergone extraordinary damage. The story lacks confirmation. It appears that in 1811–13 the nickname ‘Captain Ludd’ or ‘King Lud’ was commonly given to the ringleaders of the Luddites.”

Ludd was, as this image shows, often portrayed as a giant.

Today we use the word “Luddite” in what the OED calls the “transferred sense”: One who opposes the introduction of new technology, especially into a place of work.

The sample usage the OED offers, from The Economist in 1986: “By suggesting...that the modern world has lost control of its technology, both [accidents] help to strengthen the hands of Luddites who would halt technology and therefore a lot of economic growth.”

To oppose technology or to fear automation, some like The Economist or venture capitalist Marc Andreessen argue, is to misunderstand how the economy works. (I’d suggest perhaps Luddites understand how the economy works quite well, thank you very much, particularly when it comes to questions of “who owns the machinery” we now must work on. And yes, the economy works well for Marc Andreessen, that’s for sure.)

In 1984 the American novelist Thomas Pynchon asked “Is it O.K. to be a Luddite?” suggesting that, in the new Computer Age, it may well be that we have mostly lost our “Luddite sensibility.” We no longer resist or rage against the machines. But he adds that we might some day need to. He writes:

"If our world survives, the next great challenge to watch out for will come -- you heard it here first -- when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy. It will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to if, God willing, we should live so long.”

And here we are, 30 years after Pynchon’s essay, facing pronouncements and predictions that our jobs — and not just the factory jobs, but the white collar jobs as well — are soon to be automated. “We are entering a new phase in world history—one in which fewer and fewer workers will be needed to produce the goods and services for the global population,” write Erik Brynjolfsson and Andrew McAfee in their book Race Against the Machine. “Before the end of this century,” says Wired Magazine’s Kevin Kelly, “70 percent of today’s occupations will…be replaced by automation.” The Economist offers a more rapid timeline: “Nearly half of American jobs could be automated in a decade or two,” it contends.

We are, some say, on the cusp of a great revolution in artificial intelligence and as such a great revolution in human labor. (Of course, the history of AI is full of predictions that are “two decades” away… But there you go. Like I said earlier, our technological storytelling is fantasy, fantastic.)

So thank you, Alan Turing, for laying the philosophical groundwork for AI. And thank you — ironically — Lord Byron. 

Lord Byron was the father of Ada Lovelace, who worked with Charles Babbage on his Analytical Engine. Ada Lovelace is often credited as the first computer programmer. (See, I love these sorts of connections.)

As we celebrate — probably the wrong verb — 200 years of Luddism, we should recall too another bicentenary that’s approaching. Lord Byron was there for that as well, when a small group of friends — Percy Bysshe Shelley, John William Polidori, Claire Clairmont, Mary Godwin — spent the summer of 1816 on Lake Geneva, in Switzerland — “a wet, ungenial summer” — when they all decided to try their hands at writing ghost stories. There, Mary Godwin, later Mary Shelley, wrote Frankenstein, published in 1818, arguably the first work of science fiction and certainly one of the most important and influential texts on science, technology, and monsters.

Monsters, mind you, not machines.

"However much of Frankenstein's longevity is owing to the undersung genius James Whale who translated it to film,” writes Pynchon in his essay on Luddites, "it remains today more than well worth reading, for all the reasons we read novels, as well as for the much more limited question of its Luddite value: that is, for its attempt, through literary means which are nocturnal and deal in disguise, to deny the machine.”

While the laboratory visualized in Whale’s 1931 film is full of electrical and mechanical equipment, machines are largely absent from Mary Shelley’s novel. There are just a few passing mentions of the equipment necessary to cause that great “Galvanic twitch,” a couple of references to lightning, but that’s it. Pynchon argues that this absence is purposeful, that this aspect of the Gothic literary genre represented “deep and religious yearnings for that earlier mythic time which had come to be known as the Age of Miracles.”

"To insist on the miraculous,” argues Pynchon, "is to deny to the machine at least some of its claims on us, to assert the limited wish that living things, earthly and otherwise, may on occasion become Bad and Big enough to take part in transcendent doings."

But even without machines, Frankenstein is still read as a cautionary tale about science and about technology; and Shelley’s story has left an indelible impression on us. Its references are scattered throughout popular culture and popular discourse. We frequently use part of the title — “Franken” — to invoke a frightening image of scientific experimentation gone wrong. Frankenfood. Frankenfish. The monster, a monstrosity — a technological crime against nature.

It is telling, very telling, that we often confuse the scientist, Victor Frankenstein, with his creation. We often call the monster Frankenstein.

As the sociologist Bruno Latour has argued, we don’t merely mistake the identity of Frankenstein; we also mistake his crime. It “was not that he invented a creature through some combination of hubris and high technology,” writes Latour, “but rather that he abandoned the creature to itself.”

The creature — again, a giant — insists in the novel that he was not born a monster, but he became monstrous after Frankenstein fled the laboratory in horror when the creature opened his “dull yellow eye,” breathed hard, and convulsed to life.

"Remember that I am thy creature,” he says when he confronts Frankenstein, "I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed. Everywhere I see bliss, from which I alone am irrevocably excluded. I was benevolent and good— misery made me a fiend.”

As Latour observes, “Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.”

Our “gigantic sin”: we failed to love and care for our technological creations. We must love and educate our children. We must love and care for our machines, lest they become monsters.

Indeed, Frankenstein is also a novel about education. The novel is structured as a series of nested narratives — Captain Walton’s story, told in the letters he sends to his sister as he explores the Arctic, which frames Victor Frankenstein’s story, through which we hear the creature tell his own story, along with that of the De Lacey family and the arrival of Safie, “the lovely Arabian.” All of these are stories about education: some self-directed learning, some through formal schooling.

While typically Frankenstein is interpreted as a condemnation of science gone awry, the novel can also be read as a condemnation of education gone awry. The novel highlights the dangerous consequences of scientific knowledge, sure, but it also explores how knowledge — gained inadvertently, perhaps, gained surreptitiously, gained without guidance — might be disastrous. Victor Frankenstein stumbles across the alchemists, and his father’s outright dismissal of their work only stokes his curiosity. The creature learns to speak by watching the De Lacey family, learns to read by watching Safie do the same, then finds and reads Volney’s Ruins of Empires and Milton’s Paradise Lost.

"Oh, that I had forever remained in my native wood, nor known or felt beyond the sensations of hunger, thirst, and heat!” the creature cries. 

In his article “Love Your Monsters,” Latour argues that Frankenstein is a “parable for political ecology.” Again, the lesson of the novel is not that we should step away from technological innovation or scientific creation, but rather that we must strengthen our patience and our commitment to all of creation — a capital-C Creation that now includes, Latour suggests, our technological creations, our machines.

Is Frankenstein a similarly useful parable for education technology? What are we to make of ed-tech’s monsters, of our machines? Is there something to be said here about pedagogy, technologies, and an absence of care? 

200 years of Luddites, 200 years of Frankenstein and — by my calculations at least — 150 some-odd years of “teaching machines.”

To be clear, my nod to the Luddites or to Frankenstein isn’t about rejecting technology, but it is about rejecting exploitation. It is about rejecting an uncritical and unexamined belief in progress. The problem isn’t that science gives us monsters, it’s that we have pretended it is truth, divorced from responsibility, from love, from politics, from care. The problem isn’t that science gives us monsters, it’s that it does not, despite its insistence, give us “the answer.”

And that is the problem with ed-tech’s monsters. That is the problem with teaching machines.

In order to automate education, must we see knowledge in a certain way, as certain: atomistic, programmable, deliverable, hierarchical, fixed, measurable, non-negotiable? In order to automate that knowledge, what happens to care?

Although teaching machines predate his work by almost a century, they are most often associated with the behaviorist Harvard psychology professor B. F. Skinner. 

An excerpt from Ayn Rand’s review of B. F. Skinner’s 1971 book Beyond Freedom and Dignity:

"The book itself is like Boris Karloff's embodiment of Frankenstein's monster,” Rand writes, "a corpse patched with nuts, bolts and screws from the junkyard of philosophy (Pragmatism, Social Darwinism, Positivism, Linguistic Analysis, with some nails by Hume, threads by Russell, and glue by the New York Post). The book's voice, like Karloff's, is an emission of inarticulate, moaning growls — directed at a special enemy: 'Autonomous Man.'"

I quote Rand’s stinging criticism of Skinner because of the Frankenstein reference, clearly: the accusation of a misbegotten creation of a misbegotten science. B. F. Skinner as Frankenstein. Rand implies here, with a fairly typical invocation of the film, that Skinner’s work is an attempt to “play God.” And we might see, as Rand suggests, Skinner’s creations as monsters — with a fixation on control, a rejection of freedom, and an absence of emotion or care.

To be clear, I quote Ayn Rand here with a great deal of irony. The Silicon Valley technology industry these days seems full of those touting her objectivist, laissez-faire version of libertarianism, her radical individualism. (Monstrous in its own right.) 

Rand uses Skinner as an example of the ills of federally funded research. She insists she does not want to “censor research projects” but instead to “abolish all government subsidies in the field of the social sciences.” A “free marketplace of ideas” where things like behaviorism will lose.

But the “free marketplace of ideas” that a lot of libertarian tech types now want, too, actually values behaviorism quite a bit.

Rand criticizes Skinner for arguing that there is no freedom, that we are always controlled, that we should hand over our lives to scientific management full of “positive reinforcers.” For this behaviorist control, Rand will not stand. 

But behaviorist control mechanisms run throughout our technologies: gamification, notifications, nudges. 

The Turing Test — that foundational test in artificial intelligence — is, one might argue, a behaviorist test. The question isn’t, Alan Turing argued, “can a machine think?” but rather “can a machine exhibit intelligent behaviors and fool a human into thinking the machine is human?” 

Again, monsters and machines.

Before developing teaching machines, Skinner had worked on a number of projects, inventing, as part of his graduate work around 1930, what’s now known as “the Skinner Box.” The “operant conditioning chamber,” as it is formally called, was used to study and to train animals to perform certain tasks. Do it correctly; get a reward (namely food).
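To make the mechanism concrete, here is a toy simulation of that operant conditioning loop (my own illustration, not Skinner’s apparatus): an action that gets rewarded becomes more likely to be chosen again.

```python
import random

# Toy operant conditioning: each time the rewarded action occurs,
# its weight grows, so the "animal" performs it more and more often.
weights = {"press_lever": 1.0, "wander": 1.0}

for trial in range(200):
    actions = list(weights)
    action = random.choices(actions, weights=[weights[a] for a in actions])[0]
    if action == "press_lever":   # do it correctly...
        weights[action] += 0.5    # ...get a reward (reinforcement)

print(weights)  # "press_lever" now far outweighs "wander"
```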

During World War II, Skinner worked on Project Pigeon, an experimental project to create pigeon-guided missiles. 

I cannot begin to tell you how much I wish I could have talked with my grandfather about Bletchley Park. Even more, how much I wish I could have asked him his thoughts about pigeons and radar.

The military canceled and revived Project Pigeon a couple of times. “Our problem,” said Skinner, “was no one would take us seriously.” By 1953, the military had devised an electronic system for missile guidance, and animal-guided systems were no longer necessary.

That same year, Skinner came up with the idea for his teaching machine. Visiting his daughter’s fourth grade classroom, he was struck by the inefficiencies. Not only were all the students expected to move through their lessons at the same pace, but when it came to assignments and quizzes, they did not receive feedback until the teacher had graded the materials — sometimes a delay of days. Skinner believed that both of these flaws in school could be addressed through a machine, and built a prototype which he demonstrated at a conference the following year.

All these elements were part of Skinner’s teaching machines: the elimination of the teacher’s inefficiencies, the delivery of immediate feedback, the ability for students to move through standardized content at their own pace.
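As a minimal sketch (assuming nothing about Skinner’s actual hardware), those three elements reduce to a very small program: standardized items, feedback delivered at once, and no advancement until the student gets the item right.

```python
# A teaching-machine loop in miniature: standardized content,
# immediate feedback, self-paced advancement. Illustrative only.
lessons = [
    ("2 + 2 = ?", "4"),
    ("7 x 3 = ?", "21"),
]

for prompt, correct in lessons:
    # Self-pacing: the student advances only after answering correctly.
    while True:
        answer = input(prompt + " ")
        # Immediate feedback: the machine responds at once, not days later.
        if answer.strip() == correct:
            print("Correct.")  # the positive reinforcer
            break
        print("Try again.")
```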

Today’s ed-tech proponents call this “personalization.”

Addressing social problems — including problems like school — for Skinner, meant addressing behaviors. As he wrote in Beyond Freedom and Dignity, “We need to make vast changes in human behavior. . . . What we need is a technology of behavior.” Teaching machines are one such technology.

Teaching — with or without machines — was viewed by Skinner as reliant on a “contingency of reinforcement.” The problems with human teachers’ reinforcement, he argued, were twofold. First, the reinforcement did not occur immediately; that is, as Skinner observed in his daughter’s classroom, there was a delay between students completing assignments and quizzes and their work being corrected and returned. Second, much of the focus on behavior in the classroom had to do with punishing students for “bad behavior” rather than rewarding them for good.

“Anyone who visits the lower grades of the average school today will observe that a change has been made, not from aversive to positive control, but from one form of aversive stimulation to another,” Skinner writes. But with the application of behaviorism and the development of teaching machines, “There is no reason,” he insisted, “why the schoolroom should be any less mechanized than, for example, the kitchen.”

But maybe there are reasons.

Maybe monsters and Luddites can help us formulate our response.

According to Google Ngrams, a tool that tracks the frequency of words in the corpus of books that the company has digitized, as society became more industrialized, we steadily and increasingly talked about Luddites, a reflection, dare I say, of longstanding concerns about the changing nature of work and society. Increasing, that is, until the turn of the 21st century, when, according to Google at least and to paraphrase Dr. Strangelove, we learned to stop worrying and love the machine.

By “love” here, I mean fascination. An enchantment with the shiny and the new. Acquiescence, not engagement, be it political, scientific, or sociological. 

This is not what Bruno Latour meant when he told us to “love our monsters.”

As our interest in Luddites seemingly declines, I fear, we face what Frankenstein counseled against: a refusal to take responsibility. We see technology as an autonomous creation, one that will move society (and school) forward under its own steam and without our guidance. 

Wired Magazine’s Kevin Kelly offers perhaps the clearest example of this in his book What Technology Wants. Technology, he writes, “has its own wants. It wants to sort itself out, to self-assemble into hierarchical levels, just as most large, deeply interconnected systems do. The technium also wants what every living system wants: to perpetuate itself, to keep itself going. And as it grows, those inherent wants are gaining in complexity and force.”

That, I think, is monstrous. That is Frankenstein’s monster.

Kelly later tells us, “We can choose to modify our legal and political and economic assumptions to meet the ordained trajectories [of technology] ahead. But we cannot escape from them.”

Throw up our hands and surrender, this argument suggests. Surrender to “progress,” to the machine.

But it is a sleight of hand to maintain that technological changes are “what technology wants.” It’s an argument that obscures what industry, business, systems, power want. It is intellectually disingenuous. It is politically dangerous.

What does a “teaching machine” want, for example? Or to change the sentence slightly, “what does a ‘teaching machine’ demand?” 

I’ll echo Catherine Cronin, who said yesterday that education demands our political interest and engagement. I insist too that technology demands our political interest and engagement. And to echo Latour again, “our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.” Political interest and engagement is love; it is love for the world. Love, and perhaps some Ludditism.

I’ll leave you with one final quotation, from Hannah Arendt, who wrote:

"Education is the point at which we decide whether we love the world enough to assume responsibility for it and by the same token save it from that ruin which, except for renewal, except for the coming of the new and young, would be inevitable. And education, too, is where we decide whether we love our children enough not to expel them from our world and leave them to their own devices, nor to strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.”

Our task, I believe, is to tell the stories and build the society that would place education technology in that same light: “renewing a common world.”

We in ed-tech must face the monsters we have created, I think. These are the monsters in the technologies of war and surveillance a la Bletchley Park. These are the monsters in the technologies of mass production and standardization. These are the monsters in the technologies of behavior modification a la B. F. Skinner.

These are the monsters ed-tech must face. And we must all consider what we need to do so that we do not create more of them.

Audrey Watters

