Here are the slides and notes from my keynote today at the Domain of One's Own Incubator at Emory University. Initially I wanted to talk about "Why A Domain of One's Own Matters" but it turned into more of a rant (surprise surprise) about the politics of technology.
“What technologies have you seen lately that you like?” people always ask me. It’s a trick question, I reckon. “What in ed-tech is exciting? What in ed-tech is innovative?” These questions make me sigh. Heavily.
But I am on record — several times in several places — calling “Domain of One’s Own” one of the most important and innovative initiatives in ed-tech today. The responses I get to such an assertion are always revealing.
Oh sure, I get plenty of nods and shouts that “hell yeah, Jim Groom rocks!” (which he does. and his team rocks even more).
But then too, I get a fair amount of pushback. People question, “What the hell is ‘Domain of One’s Own’?” Or “Is it like that Seinfeld episode?” Or “I’ve never heard of it. Has TechCrunch written about it?” Or more commonly, “Why is having a blog a big deal?” Or “My university gave us web space back in 1994.” Or — in a nutshell, I suppose — “I don’t get it. How is it innovative?”
I. Innovation
What’s different, what’s special about the “Domain of One’s Own” project?
It isn’t simply “a blog.” It isn’t simply slash your user name on the university’s dot edu.
The initiative represents a kind of open learning — learning on the Web and with the Web, learning that is of the Web — and all along the way, “Domain of One’s Own” offers a resistance to the silos of the traditional learning management system and of traditional academic disciplines.
It highlights the importance of learner agency, learning in public, control over one’s digital identity, and the increasing importance of Web literacies.
But I recognize that the “Domain of One’s Own” initiative does, in many ways, run counter to how the tech industry and the ed-tech industry today define and market “innovation” — and how, in turn, we teachers and students, we consumers, we “users” are meant to view and admire such developments.
This is innovation.
This is innovation.
These aren’t ed-tech products, of course. And Google Glass and Google’s Self-Driving Car — for the time being, at least — are not being heavily marketed to schools, unlike Google Apps for Education, Google Course Builder, Google Chrome, Google Chromebooks, Google Hangouts, Google Helpouts, Android tablets, YouTube, Google Books, Google MOOCs, Google Scholar, and so on.
We can probably debate whether or not the Google products pushed on schools are really that innovative. And perhaps — and sadly — that’s what we’ve come to expect from ed-tech: it is acceptably behind-the-curve.
Yet unlike technologies that are specifically geared towards classrooms, Google doesn’t really suffer from its association with ed-tech, does it? Instead, it’s credited with bringing a long-overdue technical boost to schools. And it’s free!
Of course, you could argue that Google writ large does have an education-oriented mission of sorts — “to organize the world’s information and make it universally accessible and useful.”
And yet in undertaking this mission, let’s be clear, Google’s “innovation” is not associated that closely with educational institutions — be they libraries or universities or K–12 schools. In other words, the company positions its products as a benefit for educational institutions, but Google is not really of educational institutions. Google clearly brands itself as a tech innovation, a Silicon Valley innovation, even though you could argue it’s surely a Stanford University-born one.
You could argue too that as Google has grown, so too have the implications of its efforts to “organize the world’s information.” These implications are political. Economic. Technological. Scientific. Cultural. And they are global.
Google looks less and less like a library card catalog, if you will, that helps us find what we’re searching for on the Web. That’s particularly true in light of some of its most recent acquisitions: a company that builds military robots, a company that manufactures solar-powered drones, and one that makes Internet-connected thermostats and smoke alarms, for example.
No doubt, these are new ways, new products that use “the world’s information.” Often, that means our information.
We’re told this exchange — this extraction, if you will — fosters innovation.
Google Glass and Google’s Self-Driving Car are initiatives of the company’s mysterious laboratory Google X. Although Google prides itself (and brands itself) as being an “innovative” company, Google X is even more ambitious. The work of Google X involves technology “moon shots,” as CEO and co-founder Larry Page describes them.
Here’s how an article in Bloomberg Businessweek describes the lab, a gentle reminder of the military history from which computer research emerged:
Google X seeks to be an heir to the classic research labs, such as the Manhattan Project, which created the first atomic bomb, and Bletchley Park, where code breakers cracked German ciphers and gave birth to modern cryptography.
…“Google believes in and enables us to do things that wouldn’t be possible in academia,” says Chris Urmson, a former assistant research professor at Carnegie Mellon [and now the head of the self-driving car project].
“Things that wouldn’t be possible in academia…”
Perhaps because academia doesn’t have the resources. Perhaps because of a focus (real or perceived) on theoretical, rather than applied, research. Perhaps because of a disregard or distaste for the commercial. Perhaps because academia — some parts of it at least, and some institutions — likes to minimize or ignore or distance itself from its connections to the development of war technologies — as does the current computer industry, no doubt. Perhaps because academia doesn’t do that good a job at promoting its scholarship to the public — you know, “in organizing the world’s information and making it universally accessible and useful.” Perhaps because academia doesn’t do a good job of hyping its achievements.
Perhaps because innovation is increasingly defined as something that comes from industry and not the university, something that is fostered in the private sector and not the public.
Much of what happens at Google X is secret, but among its other research projects we do know of: Google Contact Lenses, an experiment to see if tiny sensors on contact lenses can offer a non-intrusive way to monitor diabetics’ glucose levels. Project Loon, an experiment to use high altitude balloons to deliver Internet to people in remote areas. And Google Brain, a vast artificial “neural network” of thousands of connected computer processors, loosely inspired by the human brain — “deep learning,” this is called.
A couple of names of those who’ve worked at Google X in addition to Google co-founder Sergey Brin: Sebastian Thrun. Andrew Ng.
Sebastian Thrun, one of the creators of Google’s self-driving car, who also happens to be a Stanford artificial intelligence professor, who also happens to be the co-founder of Udacity, an online education startup.
Andrew Ng, one of the researchers on Google Brain, who also happens to be a Stanford artificial intelligence professor, who also happens to be the co-founder of Coursera, another online education startup.
Now, I’m disinclined to talk to you this afternoon about MOOCs. Dammit, I’m here to talk about the “Domain of One’s Own”! And God knows, we’ve spilled enough ink on the topic of MOOCs over the course of the last 18 months or so. But MOOCs are a subtext of this talk, I confess.
It’s possible Thrun and Ng’s work at Google X might just be an inconsequential line on their CVs — the ties between Stanford and Google ain’t no big thing.
But I think this connection between Google X and MOOCs is noteworthy. It points towards one vision for the future of teaching and learning with technology, one vision of what happens to the content we create as teachers and learners, one vision of who owns and controls all the data.
It’s also interesting to consider why some people balk at a “Domain of One’s Own” being innovative and yet clamor over MOOCs as the greatest and newest thing education has ever seen.
And I’ll add too, it is striking to me — it’s part of the motivation for my writing a book on the history of automation in education — that two of the leading scientists at Google X, two of the leading scientists in the field of artificial intelligence — and artificial intelligence at scale — have opted to launch companies that purport to address developing human intelligence, through instruction and content delivery at scale.
II. Scale
MOOCs as ideology. Google as ideology. “Scale” as ideology.
Scale is important vis-a-vis tech companies like Google and their global reach. It’s important in their economic impact and in the way that economic growth — particularly venture capital funded growth — gets framed.
Scale is important in the breadth of tech companies’ offerings (particularly in the case of Google, which is hardly just a search engine anymore, although it does remain — if you judge it by its revenues — an advertising company).
“Scale” is important as we see the number of sectors being transformed (or threatened with transformation) by new technologies — as media studies professor Siva Vaidhyanathan has described it in his book The Googlization of Everything.
“Scale” is important too in how many of these new technologies work — how they work practically and how they work ideologically. By that I mean that “scale,” as a business-technological lens, increasingly frames the way we view the world, so much so that we must ask “Does it scale?” about every idea or initiative, good or bad.
And when we ask “Does it scale?” we often mean, “Can we replicate this across systems in an orderly, standardized way thanks to Internet connectivity and proprietary software solutions and venture capital?” Or we mean “Is this idea ‘the next Google’?"
Our technological world necessitates thinking in and working in and expanding at scale. Or that’s the message from the tech and business sector at least: scale is necessary; technological progress demands it.
And here again, we can see that just as the “Domain of One’s Own” does not fit an industry-oriented definition of “innovation,” it does not neatly fit this view of “scale” either — even, I’d argue, as we see the project spread from the University of Mary Washington to Emory University and elsewhere. The “Domain of One’s Own” initiative grows through the hard work of community-building and capacity-building, not simply through technical replication.
But Google… Google scales.
“Scale” is important to all of Google’s efforts. Google works, of course, at “Web scale,” and scale is, if nothing else, important in the size and distribution of Google’s infrastructure — its server farms.
Take, for example, every time you type a word or phrase into Google’s search box. That query hits between 700 and 1000 separate computers. These machines scan indexes of the Web and generate about 5 million search results, delivered back to you in 0.16 seconds.
That’s the infrastructure for one search. It’s hard to fathom the complexity and, well, the scale involved in all the search queries Google handles.
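To make that fan-out a bit more concrete, here is a toy sketch of the “scatter-gather” pattern that large search systems rely on: a query is sent to many index shards in parallel, each shard returns its best candidates, and the results are merged and ranked. It is purely illustrative; the shard data, scoring, and thread pool below are my own stand-ins, not a description of Google's actual infrastructure.

```python
# Toy "scatter-gather" search: one query fans out to many index shards in
# parallel, each shard returns its matches, and the results are merged and
# ranked. Shard contents and scoring are invented for illustration; real
# web-scale search involves hundreds or thousands of machines per query.
from concurrent.futures import ThreadPoolExecutor

# Pretend each shard holds a slice of the web index: {url: set of terms}
SHARDS = [
    {"example.com/a": {"domain", "own"}, "example.com/b": {"innovation"}},
    {"example.org/c": {"domain", "scale"}, "example.org/d": {"data"}},
    {"example.net/e": {"domain", "own", "web"}},
]

def search_shard(shard, query_terms):
    """Score every document in one shard by how many query terms it contains."""
    return [(url, len(query_terms & terms))
            for url, terms in shard.items()
            if query_terms & terms]

def search(query):
    query_terms = set(query.lower().split())
    # Scatter: send the query to every shard in parallel.
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        partials = pool.map(search_shard, SHARDS, [query_terms] * len(SHARDS))
        # Gather: merge the partial results and rank by score.
        results = [hit for partial in partials for hit in partial]
    return sorted(results, key=lambda hit: hit[1], reverse=True)

if __name__ == "__main__":
    for url, score in search("domain of one's own"):
        print(score, url)
```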
And more: it’s hard to imagine the complexity and scale involved across all the products and services Google offers, all of which are now covered by one Terms of Service agreement — your data and your profile shared across them all. What you search for on Google. Your Gmail. Your Google Calendar plans. Your friends on Google+. What you’ve bought with Google Wallet. What you’ve downloaded from Google Play. What you’ve watched on YouTube. Where you head on Google Maps (and by extension, where the Google Self-Driving Car would know to take you.) What you spy with Google Glass.
So much data.
III. Data
We are all creating mind-bogglingly vast amounts of data — in increasing volume, speed, and complexity. In 2012, IBM pegged this at about 2.5 quintillion bytes of data created every day. No doubt that figure has only increased in the past 2 years.
This is, of course, “big data.” Numbers too big for your Casio calculator. Numbers too big for your Excel spreadsheet.
And this is what many entrepreneurs, technologists, technocrats, politicians, venture capitalists, and quants are quite giddy about. Big data — capturing it, processing it, analyzing it — all of which will purportedly bring about more innovation. More innovation in education. More innovation in education at scale, even.
Much of this data explosion comes from various types of sensors — indeed, Internet-connected devices in US homes now outnumber the people in the country itself. Devices like, for example, the Nest thermostat that Google just acquired.
But plenty of this data is human-generated — if not specifically as what we call “user-generated content,” then as “data exhaust,” that is, all sorts of metadata that many of us are quite unaware we’re creating.
Of course, the general public probably is a bit more aware of metadata now, thanks to the revelations last summer of Edward Snowden, the former NSA contractor who disclosed the vast surveillance efforts of the National Security Agency: the collection of massive amounts of data from telephone and technology companies. “Email, videos, photos, voice-over-IP chats, file transfers, social networking details, and more” siphoned from Apple, Google, Facebook, Microsoft, Yahoo, Skype, AOL, World of Warcraft, Angry Birds and so on. Encryption undermined. Malware spread. Our social connections mapped. Warrantless spying by governments — not just on suspected terrorists, but on all of us.
As privacy researcher and activist Chris Soghoian quipped on Twitter, Google has built the greatest global surveillance system. It’s no surprise that the NSA has sought to use it too.
Google knows a lot about us. What we search for. Who we email. And when. Where we live. Where we’re going. What we watch. What we write. What we read. What we buy.
It mines this data purportedly to offer us better products and services and, of course, to sell ads.
And again, here is where a “Domain of One’s Own” runs counter to what is a dominant trend in technology today — particularly a growing trend in education whereby all this data and all this metadata will be used to “personalize education."
A “Domain of One’s Own” asks us to consider the infrastructure. It asks us to understand the Web and our place on it. It asks us to pay attention to the content we create — as teachers and as students — and to weigh where it best resides — who has access to it, and for how long.
It prompts us to ask “what data are we creating” as learners and “who owns it.” Who tracks us. Who profits.
As our worlds become increasingly mediated by computing machines, we’re encouraged to hand over more details of our lives, more data to Google (and to other technology companies, of course.)
Most of us think little about this. We shrug. We agree to the Terms of Service without reading them, often meaning we’ve agreed to hand over our data, to give up control over what’s done with it. We surrender more and more of our privacy. In doing so, we’re assured, technology will give us access to better stuff, to more “innovation.”
IV. Magic
As Arthur C. Clarke once famously said, “any sufficiently advanced technology is indistinguishable from magic.”
Technology companies benefit when we think this is all magic. There is little incentive for them to equip us with the critical and the technical capacities to run our own servers, to build our own applications, to use and contribute to open source software, to claim our place on the open Web, and ultimately here, to challenge their business models. Because let’s be clear: for many companies, theirs is a business model predicated on monetizing the content and the data we create.
A “Domain of One’s Own” builds literacies so that the technology of the Web is distinguishable from magic, so those who understand how to manipulate its symbols are not high priests or magicians, so that carving out and operating your own little piece on the Web is manageable. From there perhaps teachers and students will feel empowered to explore more of technology’s terrain, so they feel empowered, even, to resist its “Googlization.”
One quick aside about “magic” — because I’m a folklorist by training, and I don’t want to dismiss or belittle a belief in wonder. Nor do I want us to move away from a world of wonder to a world of technocracy, to simply reduce what we do and what we make to terms like “user-generated content” or “personal data” or “code.” How cold and empty these sound. Love letters reduced to a status update; love songs, to their associated metadata. Human communication as a transaction, not an expression.
I think we’ve convinced ourselves that we’ve wrested the Internet away from its military origins to do just that, to make a space for poetic self-expression, to make a space for self-directed learning. But I’m not really certain we were successful. In fact, I’m deeply pessimistic about the path that technology, particularly education technology, is taking.
“I am a pessimist,” to quote Antonio Gramsci, ”because of intelligence, but an optimist because of will.”
I am a skeptic about much of ed-tech because I am a critic of late capitalism, of imperialism, of militarism and surveillance — I think we err when we ignore or forget the role that education and that technology play therein.
I do try to be an optimist. Those of us working in education are optimists by necessity, Gardner Campbell has suggested. As he wrote last year, “It seems to me that educators, no matter how skeptical their views (skepticism is necessary but not sufficient for an inquiring mind), are implicitly committed to optimism. Otherwise, why learn? and why teach?”
Indeed. And we in education believe in ideas, sure. We believe in knowledge. We believe, I’d add too, that through collective contemplation, intellectual reciprocity, and deliberate and wise action, the future can be better. But mostly, I think, educators are optimists because we believe in people.
I’m here today speaking to you because of people. Oh sure, no doubt, I’m here because I do own my own domain. I’ve leveraged the website I control to tell stories about education and technology, stories that are often very different from those promoted by the business and tech industries, and from those promoted by education as well.
But I mean before that. Before I bought the www dot hackeducation dot com domain. Before I bought www dot audrey watters dot com. Way before. A decade ago.
A decade ago I was a grad student, working on my PhD in comparative literature. I didn’t start a blog then to talk about academia. Rather, my husband had been diagnosed with liver cancer. He was dying. I was trying to balance caring for him and for our 12-year-old son. I was losing my world; I was losing my shit. I was trying to still teach classes. I was supposed to be writing a dissertation.
Instead, I wrote a blog.
These were the early days of academic blogging, when there appeared to be very little support for the sorts of public scholarship that we see now via blogs. Many of us wrote under pseudonyms, uncertain whether, as graduate students or junior faculty, it was really safe for us to be engaged in these public discussions under our real names.
What I found online was not just intellectual camaraderie. I found an incredible community who supported me during my most difficult times in ways that those on my campus never did.
One of those people was David Morgen. I consider him a dear friend and an amazing colleague, even though neither of us are doing now what we thought we’d be doing in academia a decade ago. And even though, until yesterday, we’d never actually met face-to-face.
And that, if anything, is what’s magic about technology.
Not how Google can mount a tiny camera onto some plastic eyeglasses. Not how all the data it’s collected from its Google Maps project can now be used to help power an autonomous vehicle. Not spam filters on emails. Not collaborative editing of documents. Not technology’s business models. Not its political and economic powers. Not the obfuscation of these.
What’s magic: the ability to connect to other people — and connect in deeply meaningful ways — even though separated by physical space.
V. Resistance
Of course, Internet companies like Google love it when we love that capability. That’s the stuff of TV ads. Talking to grandparents and grandchildren and astronauts via Google Hangouts. Real tearjerkers.
This desire for human “connection” — not simply the colder and more technical term “communication” — is perhaps our weak spot, where these companies can so easily strike, where they can so readily convince us to give up control of our data, our content, our digital identity, to trust them, to let them focus on the technology while we non-technologists focus on the rest.
Perhaps it’s what makes education so susceptible to this too — learning can be a moment of such vulnerability and such powerful connection.
But I’d argue too that this desire for connection is just as easily our strength as we grow weary of an emphasis on scaling the technology and scaling the business.
We will, I hope — again, this is why I believe "A Domain of One’s Own” is so important and so innovative — learn to seize these tools and build something for ourselves. One path forward perhaps, with a nod to Donna Haraway: cyborgs.
“The main trouble with cyborgs,” she reminds us, “is that they are the illegitimate offspring of militarism and patriarchal capitalism, not to mention state socialism. But illegitimate offspring are often exceedingly unfaithful to their origins.”
A “Domain of One’s Own” is a cyborg tactic, I reckon. Kin to the learning management system. Kin to Web 2.0. But subversive to today’s Internet technologies and today’s educational technologies, connected as these are, as Haraway’s manifesto reminds us, to command-control-communication-intelligence.
A cyborg tactic, an “illegitimate offspring," the Domain of One’s Own is fiercely disloyal to the LMS — Jim Groom and his team always make that incredibly clear. And I hope eventually too, fiercely disloyal to Google.
The Domain of One’s Own initiative prompts us to not just own our own domain — our own space on the Web — but to consider how we might need to reclaim bits and pieces that have already been extracted from us.
It prompts us to think critically about what our digital identity looks like, who controls it, who owns our data, who tracks it, who’s making money from it. It equips us to ask questions — technical questions and philosophical questions and economic questions and political questions about and for ourselves, our communities, our practices — knowing that we have a stake as actors and not just as objects of technology, as actors and not just objects of education technology.
Graffiti from May ’68 in Paris pronounced “Beneath the cobblestones, the beach.” I know it’s hokey to invoke situationist phrases. I realize that the hot new thing would be to invoke Thomas Piketty. But I love that situationist phrase. It’s so punk rock — the idea that if we dig under the infrastructure of society, we’ll find something beautiful. The idea that in our hands, this infrastructure — quite frankly — becomes a weapon.
And that’s how we resist Google, how we resist the tech industry writ large. And that’s why a Domain of One’s Own matters, that’s why it’s incredibly innovative — because it’s so wickedly subversive about the whole notion of tech and ed-tech “innovation."