Here are the notes and the slides from my keynote today at the Canadian Network for Innovation in Education conference.
When I first was asked a couple of months ago to let the conference organizers know the title for my keynote today, I quickly glanced at the theme of the event and crafted some lengthy, semi-provocative phrase that would, I hoped, allow me to make the argument here that I make fairly often:
There’s a significant divide — a political and financial and cultural and surely a pedagogical divide — between the technology industry (Silicon Valley in particular) and the education sector when it comes to thinking about the future of teaching and learning and also when it comes to thinking about the meaning of “innovation.” As we move forward with our adoption of educational technologies, we must be more thoughtful, dare I say more vigilant about the implications of that divide.
The original title I offered focused on the word “culture” because despite being a PhD dropout, I remain a scholar of culture of sorts and not a businessperson or a technologist (despite spending far too much time writing and thinking about business and technology). I also wanted to skew slightly what’s often the typical comparison made between technology companies and educational institutions: that the former are agile, readily pushing for and adapting to change, while the latter are ancient bureaucracies that are slow, if not utterly resistant, to change.
This comparison leads people to say things like “culture eats strategy for breakfast” — one of those pithy business guru phrases that you hear invoked to talk about change and leadership. In the case of education, “culture eats strategy for breakfast” cautions business and technology leaders, and dare I say school administrators, that the slow-shifting nature of higher education will derail any plans for “innovation” unless that underlying culture can be addressed first. Change the culture of education. Then you can innovate. Or something like that.
So I thought initially that I’d like to talk to you today about the “culture” of innovators in tech and the “culture” of innovators in education (well, to be clear, those I’d label as innovators in education). Because even if both those groups are pushing for change, their values — their cultures — are incredibly different.
One culture values openness and collaboration and inquiry and exploration and experimentation. The other has adopted a couple of those terms and sprinkled them throughout its marketing copy, while promising scale and efficiency and cost-savings benefits. One culture values community, and the other reflects a very powerful strain of American individualism — not to mention California exceptionalism — one that touts personal responsibility, self-management, and autonomy.
One gives us something like this as a site to rethink teaching and learning and technology (this is YOUmedia, a learning space within the Chicago Public Library where, among other things, teens can learn programming and multimedia production).
The other gives us something like this.
One gives us something like this:
And the other gives us something like this. Or wants us to buy something like this. Or become the product of something like this:
Google Glass, a truly ostentatious eyepiece, a heads-up display that allows you to search and dictate and record and mostly look like an idiot — and of course allows Google to glean the metadata from your activity along the way — has become a symbol of what is, I think, a growing divide surrounding technology, innovation, and ideology — a “tech culture war,” as the headline writers at Salon seem to really really like to call it.
“Glass Explorers” — that’s Google’s name for those in the pilot program. “Glassholes” is what others call the wearers. People wearing Google Glass in San Francisco — more accurately, people wearing Google Glass into punk rock bars in the Lower Haight — have had the $1500 devices knocked off their faces. “Attacked!” the Explorers gasp, for videotaping people who don’t want to be videotaped.
The private commuter buses for the employees of Google and Apple and other tech companies have become another lightning rod in this “tech culture war” — symbols of privatization and inequality, with protests and blockades occurring regularly along their routes. As San Francisco-based writer Rebecca Solnit describes it:
"The Google Bus means so many things. It means that the minions of the non-petroleum company most bent on world domination can live in San Francisco but work in Silicon Valley without going through a hair-raising commute by car – I overheard someone note recently that the buses shortened her daily commute to 3.5 hours from 4.5. It means that unlike gigantic employers in other times and places, the corporations of Silicon Valley aren’t much interested in improving public transport, and in fact the many corporations providing private transport are undermining the financial basis for the commuter train. It means that San Francisco, capital of the west from the Gold Rush to some point in the 20th century when Los Angeles overshadowed it, is now a bedroom community for the tech capital of the world at the other end of the peninsula."
As investment dollars have flooded the computer industry in recent years, the cost of living in what was already one of the world’s most expensive cities has moved out of reach for almost all its residents. The median rent for an apartment in San Francisco is now $3000 per month. The rent for a two-bedroom apartment has gone up 33% in just the last two years. “Not a Single Home Is for Sale in San Francisco That an Average Teacher Can Afford,” read a Bloomberg Businessweek headline earlier this year.
While paying lip service to “meritocracy” — the myth that anyone who works hard enough can make it — the technology industry remains quite hostile to women: just 13% of venture-capital-funded startups are founded by women (compare that to small businesses, about 30% of which are woman-owned). The industry also lacks racial diversity — 83% of VC-funded startups have all-white founding teams. Between 2009 and 2011, per capita income rose by 4% for white Silicon Valley residents and fell by 18% for black residents.
As the (largely white, male) engineers and entrepreneurs pour into Silicon Valley looking to make their fortune — a recent survey found that 56% of computer engineers believe they’ll become millionaires some day — what will happen to the culture of San Francisco?
Again, from Rebecca Solnit:
"All this is changing the character of what was once a great city of refuge for dissidents, queers, pacifists and experimentalists. Like so many cities that flourished in the post-industrial era, it has become increasingly unaffordable over the past quarter-century, but still has a host of writers, artists, activists, environmentalists, eccentrics and others who don’t work sixty-hour weeks for corporations — though we may be a relic population. Boomtowns also drive out people who perform essential services for relatively modest salaries, the teachers, firefighters, mechanics and carpenters, along with people who might have time for civic engagement. I look in wonder at the store clerks and dishwashers, wondering how they hang on or how long their commute is. Sometimes the tech workers on their buses seem like bees who belong to a great hive, but the hive isn’t civil society or a city; it’s a corporation."
As I read Solnit’s diary about the changes the current tech boom is bringing to San Francisco, I can’t help but think about the changes that the current ed-tech boom might also bring to education, to our schools and colleges and universities. To places that have also been, in certain ways, a “refuge for dissidents, queers, pacifists and experimentalists.”
Global ed-tech investment hit a record high this year: $559 million across 103 funding deals in the first quarter of the year alone. How does that shape or reshape the education landscape?
In the struggle to build “a great hive,” to borrow Solnit’s phrase, that is a civil society and not just a corporate society, we must consider the role that education has played — or is supposed to play — therein, right? What will all this investment bring about? Innovation? To what end?
When we “innovate” education, particularly when we “innovate education” with technology, which direction are we moving it? Which direction and why?
Why, just yesterday, an interview was published with Udacity founder Sebastian Thrun, who’s now moving away from the MOOC hype and the promises he and others once made that MOOCs would “democratize education.” Now he says, and I quote, “If you’re affluent, we can do a much better job with you, we can make magic happen." Screw you, I guess, if you're poor.
I’ve gestured towards things so far in this talk that might tell us a bit about the culture of Silicon Valley, about the ideology of Silicon Valley.
But what is the ideology of “innovation”? The idea pre-dates Silicon Valley, to be sure.
An aside: I have this tendency when I’m delivering a talk in Canada to be terribly inappropriate, to say things I shouldn’t. Sometimes it’s a deliberate provocation. Sometimes, accidental — just as the words spill out of my mouth, I realize that I’ve failed to filter myself adequately. So I apologize.
See, as I started to gather my thoughts about this talk, as I thought about the problems with Silicon Valley culture and Silicon Valley ideology, I couldn’t help but choke on this idea of “innovation.”
So I’d like to move now to a critique of “innovation,” urge caution in chasing “innovation,” and poke holes, in particular, in the rhetoric surrounding “innovation.” I’d like to challenge how this word gets wielded by the technology industry and by extension by education technologists.
And I do this, I admit in part, because I grow so weary of the word. “Innovation” the noun, “innovative” the adjective, “innovate” the verb — they’re bandied about all over the place, in press releases and marketing copy, in politicians’ speeches, in business school professors’ promises, in economists’ diagnoses, in administrative initiatives. Um, in the theme of this conference and the name of this organization behind it.
What is “innovation”? What do we mean by the term? Who uses it? And how? Where does this concept come from? Where is it taking us?
How is “innovation” deeply ideological and not simply descriptive?
Of course, a dictionary definition suggests that “innovation” might mean nothing more than “something new.” According to Merriam-Webster at least, innovation is “a new idea, a new method, a new device.” It is “the act or process of introducing a new idea, a new method, a new device.”
And on one hand, that means it’s probably just fine that almost everything gets the “innovation” label slapped on it. I should just get over my frustrations with the word’s over-usage. Because every day, there’s something new. Every day is new. And as long as it’s new, it’s innovation! Especially if it’s tied to technology.
Indeed, right there, embedded in this particular definition, is a nod to technology: “a new device.”
What kind of technology? Well, here are Merriam-Webster’s examples of how the word might be used in a sentence:
- “the latest innovation in computer technology”
- “Through technology and innovation, they found ways to get better results with less work.”
- “the rapid pace of technological innovation”
These examples help illustrate that, in popular usage at least, the new ideas, methods, and devices that comprise “innovation” have to do with computers. They also, let’s note, have to do with labor, and they have to do with speed and efficiency. These examples highlight how tightly bound “innovation” has become with technology, and they offer a hint perhaps as to why technologists are sometimes quick to conflate the adoption of new tools with the adoption of new ideas and practices. Technology counts as “innovation,” even when we use it to do the same old stuff.
The Merriam-Webster definition of innovation leaves out the notion of “change.” Defined this way, innovation isn’t necessarily about transformational ideas or different methods. It’s simply “a new thing.” Innovation framed this way means perpetually buying or building new things. Shiny new things, new devices that — so conveniently — become rapidly obsolete. Strangely, that’s innovation.
Using Google’s Ngram Viewer, a tool that draws on the 5+ million books digitized by the company, we can trace the frequency of usage of “innovation.” The origins of the word date back to the 16th century, but we can see that, since the 1960s, its popularity has grown substantially.
The adjective “innovative” was hardly used before the 1960s at all. The timing here of its increasing usage, not surprisingly, coincides with the blossoming of the computer industry.
The Oxford English Dictionary, which also of course does a good job — but by some definitions, I suppose, a less “innovative” job — at showing how words and usage shift over time, does offer a definition of “innovation” that involves the idea of change:
"The action of innovating; the introduction of novelties; the alteration of what is established by the introduction of new elements or forms."
Elements or forms, not devices.
Interestingly, here we have a mention of novelties, a word that connotes something that’s newly popular but only for a short amount of time. The novelty doesn’t last; nor does the “innovation.” There’s a need for constant, perpetual renewal.
Now obsolete, according to the OED, is the transitive version of the verb “innovate” — that is, “to innovate” used with a direct object — meaning “to change (a thing) into something new; to alter; to renew.”
More common now, again according to the OED, is the intransitive version of the verb — that is (a reminder for those who’ve forgotten their grammar lessons), an action verb without a direct object that receives the action — “to innovate” meaning “to bring in or introduce novelties; to make changes in something established; to introduce innovations.” “To innovate” simply is. It doesn’t require a thing to act upon.
This makes the verb “to innovate” interesting to compare with “to invent,” which does typically have a direct object. We can see that “to invent” remains far more popular.
But here’s a comparison between “innovation” and “invention.” Again in the mid-1960s, “innovations” surpassed “inventions.”
Wayne State University professor John Patrick Leary, who’s writing a very wonderful Raymond Williams-esque blog series on “Keywords for the Age of Austerity,” has looked closely at the history of the word “innovation” and suggests that we’re actually seeing a return of its transitive usage, “to innovate” with a direct object.
He points, of course, to the Harvard Business Review: “Who’s Innovating Innovation?”
Leary adds that, "The transitive construction 'innovating innovation' thus uses the word in a form that was last common in the 18th century. Then, the word referred to a process of transformation or renewal that often carried religious implications: the salvation promised through Christ, but importantly also ... offered through deceit by false prophets.”
Innovation as salvation. Or innovation as deception by false prophets. I’ll return to this point later.
Today “innovation” has come to refer to commercial interests and entrepreneurial efforts, but a religious tinge to the word remains. Leary points to the intransitive usage in, for example, descriptions of Apple founder Steve Jobs: his “constant desire to innovate and take chances.” Leary writes that, "We are no longer innovating on or upon anything in particular, which can make ‘innovate' sound like a kind of mantra, recalling the religious associations the word once had: 'If you don’t innovate every day and have a great understanding of your customers,' said a Denver processed cheese executive, 'then you don’t grow.' Innovation sounds more and more like an epiphany here."
Innovation as mantra. Innovation as epiphany.
Another obsolete meaning of the word “innovation,” according to the OED, incidentally: “A political revolution; a rebellion or insurrection” — although perhaps, like the religious origins of the word, a bit of that meaning remains as well. Certainly we see the words “revolution” and “innovation” used almost interchangeably these days when it comes to technology marketing and the promise of transformation and upheaval and disruption. Although again, thanks to Google Ngrams, we can see the trajectory of those terms as they are talked about in literature.
The technology innovation insurrection isn’t a political one as much as it is a business one (although surely there are political ramifications of that).
In fact, innovation has been specifically theorized as something that will blunt revolution, or at least that will prevent the collapse of capitalism and the working class revolution that was predicted by Karl Marx.
That's the argument of economist Joseph Schumpeter, who argued most famously perhaps in his 1942 book Capitalism, Socialism and Democracy that entrepreneurial innovation was what would sustain the capitalist system — the development of new goods, new companies, new markets that perpetually destroyed the old. He called this constant process of innovation “creative destruction.”
As this Ngram shows, the popularity of the phrase “creative destruction” follows a similar pattern to the word “innovation,” picking up usage following World War II and growing exponentially in subsequent decades. Schumpeter, a Harvard professor, has been incredibly influential — one of his students was Alan Greenspan, the former chairman of the US Federal Reserve, who was known to invoke the term “creative destruction” when he spoke on Capitol Hill.
In the technology sector, Schumpeter’s influence might best be known via the work of another Harvard professor, Clayton Christensen, who in 1997 published The Innovator’s Dilemma, popularizing the phrase “disruptive innovation.”
“Disruptive innovation” — “creative destruction.”
The precise mechanism of the disruption and innovation in Christensen’s theory differs from Schumpeter’s. Schumpeter saw the process of entrepreneurial upheaval as something that was part of capitalism writ large — industries would replace industries. Industries would always and inevitably replace industries.
Schumpeter argued this process of innovation would eventually mean the end of capitalism, albeit by different processes than Marx had predicted. Schumpeter suggested that this constant economic upheaval would eventually impose such a burden that democratic countries would put in place regulations that would impede entrepreneurship. He argued that, in particular, “intellectuals” — namely university professors — would help lead to capitalism’s demise because they would diagnose this turmoil and develop critiques of the upheaval, critiques that would be appealing and relevant to those beyond the professorial class.
That the enemy of capitalism in this framework is the intellectual and not the worker explains a great deal about American politics over the past few decades. It probably explains a great deal about the ideology behind a lot of the “disrupting higher education” talk as well.
Christensen offers a different scenario; there is no “end of capitalism” here. Christensen defines disruptive innovation as "a process by which a product or service takes root initially in simple applications at the bottom of a market" — that is, with customers who are not currently being served — "and then relentlessly moves up market, eventually displacing established competitors."
According to Christensen’s framework, there are other sorts of innovations that aren’t “disruptive” in this manner. There are, for example, “sustaining innovations” — that is, products and services that strengthen the position (and the profits) of incumbent organizations.
But that’s not how the tech industry views itself today — not as the makers of “sustaining innovations.” Despite its growing economic and political power, the tech industry continues to see itself as an upstart, not an incumbent. As a disruptor. An innovator.
The notion of “disruptive innovation” has resonated deeply with the tech industry. It is worth pointing out, perhaps, that disk drive manufacturers were one of the case studies in Christensen’s 1997 book. But Christensen himself has clarified that few technologies are intrinsically “disruptive technologies.” Disruptive innovations, in fact, can be the result of fairly crude technologies. The innovation, he argues, comes instead from the business model.
That’s why it doesn’t matter to proponents of the “disruptive innovation” framework that Khan Academy or MOOCs suck, for example. It doesn’t matter that they’re low-quality technologies featuring low-quality instruction and sometimes low-quality content. What matters is that they’re free. What matters is that they change the market — it’s all about markets, after all. Students are consumers, not learners, in this framework. What matters is that these innovations initially serve non-consumers (that is, students not enrolled in formal institutions) and then begin “over time to march upmarket.” That’s why they’re disruptive innovations, according to Christensen, who just this weekend published an op-ed in The Boston Globe insisting that “MOOCs’ disruption is only beginning.”
Innovating markets. Not innovating teaching and learning.
What interests me in Christensen’s and Schumpeter’s frameworks, I confess, isn’t the business school analyses or business professors’ case studies or predictions. What interests me, as I said at the beginning of this talk, is culture. What interests me are the stories that businesses tell about “disruptive innovation” — because this has become a near-sacred story to the tech sector. It’s a story of the coming apocalypse — destruction and transformation and redemption, brought to you by technology.
Again, these are the cultural remnants of an older meaning of “innovation” — a process of transformation or renewal that has religious implications. Perhaps salvation. Perhaps deception by false prophets. The battles of the End Times, and you must decide which side you’re on.
“The end of the world as we know it” seems to be a motif in many of the stories that we hear about what “disruptive innovation” will bring us, particularly as we see Christensen’s phrase applied to almost every industry that technology is poised to transform. The end of the newspaper. The end of the publishing industry. The end of print. The end of RSS. The end of the Post Office. The end of Hollywood. The end of the record album. The end of the record label. The end of the factory. The end of the union. And of course, the end of the university.
The structure to many of these narratives about disruptive innovation is well-known and oft-told, echoed in tales of both a religious and secular sort:
Doom. Suffering. Change. Then paradise.
People do love the “end of the world as we know it” stories — for reasons that have to do with both the horrors of the now and the promise of a better future. Many cultures (and Silicon Valley is, despite its embrace of science and technology, no different here) tell a story that predicts some sort of cataclysmic event that will bring about a radical cultural (economic, political) transformation and, perhaps eventually for some folks at least, some sort of salvation.
The Book of Revelation. The Mayan Calendar. The Shakers. The Ghost Dance. Nuclear holocaust. Skynet. The Singularity.
Incidentally, and according to Google Ngrams at least, “innovation” surpassed “salvation” in popularity some time in the early 1970s — although both seem to be ticking upwards in unison.
I don’t think this graph is an indication that “science” now trumps “religion” in late twentieth and early twenty-first century publications. As I hope I’ve illustrated for you this morning, the stories that we tell about “innovation” are still very much tinged with faith.
But if, as I’ve tried to argue here and elsewhere, “innovation,” and disruptive innovation in particular, has this millennialist bent to it — this belief in transformation through destruction (and, if the tech industry libertarians have their way, through deregulation) — what exactly does society look like on the other side of this change? What does education look like disrupted?
Again, as I’ve suggested in this talk, the answer to those questions will shape our culture. Our communities. And as such, the answers are political.
Our response to both changing technology and changing education must involve politics — certainly this is the stage on which businesses already engage, with a fierce and awful lobbying gusto. But see, I worry that when we put our faith in “innovation” as a goal in and of itself, we forget this. We confuse “innovation” with “progress,” we confuse “technological progress” with “progress,” and we confuse all of that with “progressive politics.” We forget that “innovation” does not give us justice. “Innovation” does not give us equality. “Innovation” does not empower us.
We achieve these things when we build a robust civic society, when we support an engaged citizenry. We achieve these things through organization and collective action. We achieve these things through and with democracy; and we achieve — or we certainly strive to achieve — these things through public education.