Below are the notes and the slides from my talk today at Open Education 2013. David Kernohan and I shared the morning keynote slot today, and we were asked by David Wiley to offer a critique of open education. And so we did. You can find more details about Kernohan's talk here. Be sure to watch the documentary he made.
The Education Apocalypse
A couple of years ago, the Christian radio broadcaster Harold Camping predicted that Jesus would return to earth on May 21, 2011. The Rapture would occur, as alluded to in 1 Thessalonians 4:17 — when the "dead in Christ" and "we who are alive and remain" will be "caught up in the clouds" to meet "the Lord in the air.” That is, the souls of all the righteous — living or dead — would be lifted into Heaven.
When Camping emerged from his home on May 22, the morning after the date he’d set — “flabbergasted” — he revised his predictions. Initially, he’d stated that the May 21 Rapture would be followed by five months of fire and brimstone before the world ended on October 21.
His revision: the Rapture and the end of the world would both occur on October 21, 2011. Camping never had a huge following. But his radio station, Family Radio, did broadcast in over 150 markets. And rather than relying solely on the airwaves, the station bought billboards all over the country, warning people of the impending Judgment Day.
It was a successful marketing campaign. Camping raised millions of dollars in donations — because that’s what you do to prepare for the apocalypse, I guess: you write a check to a preacher. However, as far as end-times predictions go, Camping’s was a failure.
Doom-filled or salvation-filled, predictions about the end of the world have always been — well, up ’til now, as here we stand today — wrong. Again and again and again. Wikipedia lists about 170 dates predicting the end of the world that have come and gone, and about a dozen more dates still yet to come.
One of those dates: 2045. The Singularity.
The Singularity is that moment when, as futurist, AI researcher, and Google’s director of engineering Ray Kurzweil notes in the subtitle to his book The Singularity is Near, humans transcend their biology. Google’s director of engineering. Let me restate that. Not some guy who runs a Christian radio show in a handful of markets.
The Singularity marks the creation of a technological super-intelligence, the replacement of our frail human flesh with more advanced machines — ever more complex prosthetics in ever more complex roles — hearing implants, eye implants, limb replacements, the human brain itself. The Singularity, some say, could bring about vast self-replicating nanotechnology, resulting in a grey goo that will eventually cover the entire planet. But we need not worry, according to Kurzweil, if the environment is altered or destroyed. We will, in decades to come, be able to upload our minds into computers.
Much like the Rapture has long promised, once we abandon the limitations of our earthly existence, we will be able to live forever.
We will live forever inside the machine. Technological salvation. “Rapture of the nerds.”
No surprise, many futurists balk at the phrase “Rapture of the nerds” to describe the Singularity. They object to the comparison to religious eschatology, arguing that there are no significant parallels between these two predictions about the end of the world as we know it. The Singularity is rational, they insist. Religious faith is not. Unlike other failed end-times predictions, the Singularity is coming. Really, they insist. That’s because the Singularity is science, while Judgment Day is a myth.
Now, folklorists bristle at the common usage of the word “myth” to imply “a lie.” A myth, by folklorists’ disciplinary definition, is quite the opposite. A myth is a culture’s most sacred story. It involves supernatural or supreme beings — gods. It explains origins and destinies. A myth is the Truth.
I want to talk to you today about narratives of the education apocalypse, about eschatology and mythology and MOOCs and millennialism, and I do so not just as a keen observer of education technology but as someone trained as a folklorist. As much as being an ed-tech writer compels me to pay attention to the latest products and policies and venture capital investment, I am fascinated by the stories we tell about all of this. I am fascinated by what I see as some of the dominant end-times myths of the business world, of the tech industry. I am fascinated by how these myths — these sacred stories — are deployed to talk about the end of the world —or at least “the end of the university as we know it,” as Techcrunch puts it with the fervor of a true believer.
And in the great American tradition of Harold Camping and Ray Kurzweil and Cotton Mather, “the end of the university as we know it” — the education apocalypse — has a date. Or several dates, depending on whose prediction you heed.
Take Sebastian Thrun’s MOOC millennialism, for example. An AI researcher, a faculty member at Stanford and, incidentally, at Singularity University (an education for-profit co-founded by Ray Kurzweil), and co-founder of the MOOC startup Udacity, Thrun has predicted that in 50 years, “there will be only 10 institutions in the world delivering higher education and Udacity has a shot at being one of them.”
50 years. That’s 2062 — about 17 years after the date Kurzweil has given for the Singularity.
A planet covered in grey goo with 10 university-teaching-machines delivering educational content to student-machines. That would be one helluva education apocalypse.
While the Singularity is listed on Wikipedia’s page of end-of-the-world predictions, the MOOC-apocalypse is not.
Nor is the “campus tsunami” — the phrase used by Stanford president John Hennessy and subsequently by pundit David Brooks to describe the coming end-times for higher education. Nor is the “education avalanche” — the phrase used by Pearson’s education consultant Sir Michael Barber to describe the cataclysm that’s about to bury us.
Nor is “disruptive innovation,” which I believe is one of the most influential millennialist myths of the contemporary business world, particularly of the tech industry. Of course, Clayton Christensen, the originator of the phrase, does not offer a specific date when all of humankind will be disrupted; rather, it's an industry-by-industry transformation — an industry-by-industry salvation by market forces, this time. Not God, but that other “invisible hand.”
Again, I don’t mean by “myth” that Clayton Christensen’s explanation of changes to markets and business models and technologies — “disruptive innovation” — is a falsehood.
Rather, my assigning “myth” to “disruptive innovation” is meant, in part, to highlight the ways in which this narrative has been widely accepted as unassailably true. No doubt, as a Harvard professor, Christensen has faced very little skepticism or criticism about his theory of the transformation of industries — why, it’s as if his book The Innovator’s Dilemma were some sort of sacred text.
Helping to enhance its mythic status, the storytelling around “disruptive innovation” has taken on another, broader and looser dimension as well, as the term is now frequently invoked in many quarters to mean something different from Christensen’s original arguments in The Innovator’s Dilemma.
In this same vein, almost every new app, every new startup, every new technology — if you believe the myth-making-as-marketing at least — becomes a disruptive innovation: limo-summoning iPhone apps (e.g. Uber), photo-sharing iPhone apps (e.g. Path), online payments (e.g. Stripe), electric vehicles (e.g. Tesla), cloud computing (e.g. Amazon Web Services), 3D printers (e.g. Makerbot), video-based lectures (e.g. Khan Academy), MOOCs (e.g. Coursera), social search (e.g. Facebook Graph Search), and so on.
The companies I just named might very well be innovative — in their technologies and their business models. That’s beside the point if you’re looking for "disruptive innovation,” which is a pretty specific sort of “creative destruction” — ah, the unexamined millennialism in our political theorists, eh? — defined by Christensen as "a process by which a product or service takes root initially in simple applications at the bottom of a market" — that is, with customers who are not currently being served — "and then relentlessly moves up market, eventually displacing established competitors."
Per Christensen’s framework, there are other sorts of innovations that aren’t “disruptive” in this manner. There are, for example, “sustaining innovations” — that is, products and services that strengthen the position (and the profits) of incumbent organizations.
But that’s not the mythology embraced by the tech industry, which, despite its increasing economic and political power, continues to see itself as an upstart, not an incumbent.
And as a self-appointed and self-described disruptor, the tech industry seems to have latched on to the most millennial elements of Christensen’s theories — that is, the predictions about the destruction of the old and the ascension of the new — all at the hands of technology: The death of the music industry. The death of newspapers. The death of print. The death of the library. The death of Hollywood. The death of the video rental store. The death of the university. The death of the Web. The death of RSS. All predicted to be killed suddenly or eventually by some sort of “disruptive innovation.”
The structure of this sort of narrative is certainly a well-known and oft-told one in folklore — in tales of both a religious and a secular sort. Doom. Suffering. Armageddon. Then paradise.
People are drawn to “end of the world as we know it” stories — for reasons that have to do with both the horrors of the now and the heaven promised in the future. Many cultures (and Silicon Valley is, despite its embrace of science and technology, no different here) tell stories that predict some sort of cataclysmic event that will bring about a radical cultural (or economic or political) transformation and, eventually — if things work out well — some sort of better world.
The Book of Revelation. The Mayan Calendar. The Shakers. The Ghost Dance. Nuclear holocaust. The Singularity. MOOCs.
I’ll be the first to admit that the data in folklore professor Dan Wojcik’s book The End of the World As We Know It is dated (and full confession: he was my advisor for my Master’s Thesis, circa 2000). He published his book on end-times and American culture in 1997 — interestingly, the same year that The Innovator’s Dilemma hit store shelves.
Wojcik’s analysis of a sweeping societal belief in “the end of the world” was well-timed, coming just a few years before the year 2000, a date for many end-times predictions, from Edgar Cayce to Sun Myung Moon. But 2000 wasn’t just a date for religious millennialism. There were also substantial technological anxieties surrounding Y2K — perhaps you recall: the fear that computers would not be able to roll the year over from 99 to 00 to mark the new millennium, and that there would be widespread technological and thus economic and political collapse as a result. That makes Wojcik’s book about millennialism and American culture an interesting and contrasting companion to Christensen’s, with its contention that we will witness “the end” of certain organizations thanks to technological “innovation.”
How pervasive is the belief in these end-times myths? Wojcik noted that, according to a survey by Nielsen — and again, these are dated statistics — some 40% of Americans believed that there was nothing we could do to prevent nuclear holocaust. 60% believed in Judgment Day. 44% in the Battle of Armageddon. 44% in the Rapture.
Around 20% of Americans believed that the Y2K bug would disrupt their lives, for what it’s worth.
Subsequent surveys found that belief in the end of the world among Americans rose slightly after 9/11. One in 10 believed that the world would end, as the Mayan calendar predicted, in 2012. One in 5 believes that the world will end in their lifetime.
How does this pervasive belief that we’re living in the End Times shape how we view the future? How does it shape how we view education, which is, after all, one of the primary means by which we prepare for that future?
How many Americans believe in “the Singularity”? We hear more and more headlines every day about robots taking over our jobs. How does this shape the way in which we plan for the future — again, how does it shape how we think about teaching and learning?
I’m curious to know those numbers, not necessarily among the American public at large, but among those in Silicon Valley, particularly those in the field of artificial intelligence who are building education startups.
And I can’t find any polls, so this is a guess, sure, but I’d wager that many in Silicon Valley — most, even — believe in the doctrine of “disruptive innovation.”
But whatever the polling tells us, I’m interested in the power and the influence of storytelling.
I think that Christensen’s “disruptive innovation” story taps into these same powerful narratives about the end-times, told, as always, by the chosen ones — be they Americans, Christians, Shakers, Heaven’s Gate followers, survivalists, Java programmers, venture capitalists, Techcrunch, or Harvard Business School professors. Folks do seem drawn to these millennial stories, particularly when they help frame and justify our religious, moral, economic, political, cultural, social, and technological worldviews.
Harvard Business School might seem like a surprising place of origin for a millennialist framework. To use Christensen’s own terminology, Harvard seems like a place that offers “sustainment,” not “disruption.” But it’s worth pointing out that Christensen is a member of the Church of Latter-day Saints. He attended BYU here in Utah. He describes himself, in a keynote last month at the National Summit on Education Reform, as “a religious man” who “believe[s] in the doctrines that are taught.” In that keynote, on the topic of the history of business, measurements, and innovation, Christensen invokes a number of religious metaphors to explain how business practices work. He talks about “a Church of New Finance” where “the people who belong to that organization believe that the doctrines that are taught about finance are actually true.” “The high priests in this church,” he suggests, “are business professors like me.”
What do these high priests preach?
Well, here are a couple of education-related end-times predictions from Christensen:
- In 15 years, half of US universities may be bankrupt.
- By the year 2019, half of all classes for grades K–12 will be taught online.
- And, just last weekend, in a feature in The New York Times on “The Disruptors,” he wrote that “a host of struggling colleges and universities — the bottom 25 percent of every tier, we predict — will disappear or merge in the next 10 to 15 years.”
Disruptive innovation will be, as the acolytes among the technology press are happy to echo, the end of school as we know it.
Such is its inevitability, so the story goes, that new players can enter the education market and, even though their product is of lower quality and appeals to those who are not currently “customers,” oust the incumbent organizations. (Incumbents, in this case, are publicly funded, brick-and-mortar schools.) As Christensen and his co-authors argued in their book Disrupting Class in 2008, “disruption is a necessary and overdue chapter in our public schools.”
But as many millennialist prophets are wont to do when their end-times predictions don’t quite unfold the way they originally envisioned — like Harold Camping and his predictions about May 21 and October 21, 2011 — Clayton Christensen and his disciples at the Clayton Christensen Institute have tweaked their forecast about (public) education’s future. Five years after Disrupting Class, “disrupting class” will look a bit different, they now say.
Earlier this year, the organization released a new white paper, detailing a new path for transformation that rests somewhere between disruptive and sustaining innovations: they call it “hybrid innovation.”
"A hybrid is a combination of the new, disruptive technology with the old technology and represents a sustaining innovation relative to the old technology."
It’s an interesting revision of the organization’s predictions in Disrupting Class, the book that first applied “disruptive innovation” to education technology and argued that online learning would be a way to “modularize the system and thereby customize learning.”
Not so fast, the organization now says. Hybrid innovation. “Blended learning.” A little bit online and a little bit offline. And while middle- and high schools (and colleges, although they aren’t the subject of this particular white paper) might offer opportunities for “rampant non-consumption” — that is, classically, an opportunity for “disruption” — “the future of elementary schools at this point is likely to be largely, but not exclusively, a sustaining innovation story for the classroom.” Computer hardware and software and Internet access in the classroom, as those of us who've been thinking about education technology for decades now keep saying, won't necessarily change “everything.”
Of course, even in Disrupting Class, the predictions of the ed-tech end-times were already oriented towards changing business practices, not necessarily the pedagogy or the learning. And the promise of a thriving education technology eschatology was already muted in Christensen's earliest formulations by the “restrictions” placed upon the education sector — restrictions by virtue of education being a public and not a private institution, of education not being beholden to market forces in quite the same way as the other examples the mythology of “disruptive innovation” has used to explain itself.
“People did not create new disruptive business models in public education, however,” Christensen writes. “Why not? Almost all disruptions take root among non-consumers. In education, there was little opportunity to do that. Public education is set up as a public utility, and state laws mandate attendance for virtually everyone. There was no large, untapped pool of non-consumers that new school models could target.”
This latest Christensen Institute white paper clarifies, then, that the future of education isn't necessarily (or utterly or easily) “disrupted.” There are limits to the predictions, to the predictive models, to the business school approach to education change and such.
And so, like so many millennialist entities faced with the harsh realities of a faltering prediction, the Christensen Institute offers us a new prediction instead.
But, let's be clear, the organization doesn't just predict the future of education. The Clayton Christensen Institute does not just offer models — business models — for the future. It does not simply observe an always-changing (education) technology market. It has not simply diagnosed the changes due to technological advancements. It has not simply prophesied or predicted what future outcomes might be.
It has actively sought to shape the future to look a certain way. It has lobbied governments for certain aspects of its agenda, becoming a vocal proponent for its particular vision of a disrupted future.
"Over time," the new white-paper reads, "as the disruptive models of blended learning improve, the new value propositions will be powerful enough to prevail over those of the traditional classroom." And so, according to the Christensen mythology, disruption will prevail.
So it is written. So it is told.
That’s the mythology at least.
And again. I was trained as a folklorist. I appreciate, I respect the power of sacred stories. But I am also a recovering academic. So I think it’s worth asking ourselves about what we’re taking on faith here.
Where in the stories we're telling about the future of education are we seeing salvation? Why would we locate that in technology and not in humans, for example? Why would we locate that in markets and not in communities? What happens when we embrace a narrative about the end-times — about education crisis and education apocalypse? Who’s poised to take advantage of this crisis narrative? Why would we believe a gospel according to artificial intelligence, or according to Harvard Business School, or according to Techcrunch, or according to David Brooks or Thomas Friedman? What is sacred when it comes to the stories we tell about teaching and learning? And what — despite being presented to us as holy and unassailable — might actually be quite profane?