
I was a guest speaker in the MA in Elearning class at Cork Institute of Technology this morning. Thanks very much to Gearóid Ó Súilleabháin for the invitation. Here's a bit of what I said...

Thank you for inviting me to speak to your class today. This is such a strange and necessary time to talk about education technology, to take a class about education technology, to get a degree in education technology because what, in the past, was so often framed as optional or aspirational is now compulsory — and compulsory under some of the worst possible circumstances. So it's a strange and necessary time to be a critic of education technology because, while I've made folks plenty angry before, now I am under even more pressure to say something, anything nice about ed-tech, to offer reassurance that — over time, by Fall Term surely — the tech will get better.

I can't. I'm sorry.

It's also a deeply uncomfortable time to be an American with any sort of subject matter expertise — it has been since well before the 2016 election, but particularly since then. I don't want to come off today as making broad sweeping statements about all of education everywhere when I'm very much talking about the education system in the US and the education technology industry in the US. So grain of salt and my apologies and all that.

One of the reasons that I am less than sanguine about most education technology is that I don't consider it an autonomous, context-free entity. Ed-tech is not a tool that exists only in the service of improving teaching and learning, although that's very much how it gets talked about. There's much more to think about than the pedagogy, too — more than whether ed-tech makes teaching better or worse or about the same, just more expensive. Pedagogy doesn't occur in a vacuum. It has an institutional history; pedagogies have politics. Tools have politics. They have histories. They're developed and funded and adopted and rejected for a variety of reasons other than "what works." Even the notion of "what works" should prompt us to ask all sorts of questions about "for whom," "in what way," and "why."

I want to talk to you a bit today about what I think is going to be one of the most important trends in education technology in the coming months and years. I can say this with some certainty because it's been one of the most important trends in education technology for a very long time. And that's surveillance.

Now, I don't say this to insist that surveillance technology is inevitably going to be more important, more pervasive. Me personally, I don't want the future of education to be more monitored, data-mined, analyzed, predicted, molded, controlled. I don't want education to look that way now, but it does.

Surveillance is not prevalent simply because that's the technology that's being sold to schools. Rather, in many ways, surveillance reflects the values we have prioritized: control, compulsion, efficiency. And surveillance plays out very differently for different students in different schools — which schools require students to walk through metal detectors, which schools call the police for disciplinary infractions, which schools track what students do online, even when they're at home. And nowadays, especially when they're at home.

In order to shift educational institutions away from a surveillance culture, we are going to have to make a number of changes in priorities and practices — priorities and practices already in place long before this global pandemic.

Historically, a good deal of surveillance has involved keeping abreast (and control) of what the teacher was up to. She — and I recognize that teachers aren't always female, but the profession is certainly feminized — is alone in a classroom of other people's children, after all. And I'll return to this notion of teacher surveillance in a bit, but keep in mind, as I talk here, that none of the technologies I talk about affect students alone.

Perhaps the most obvious form of surveillance in schools involves those technologies designed to prevent or identify cheating. Indeed, if we expand our definition of "technology" to include more than just things with gears or silicon, we might recognize much of the physical classroom layout is meant to heighten surveillance and diminish cheating opportunities: the teacher in a supervisory stance at the front of the class, wandering up and down the rows of desks and peering over the shoulders of students. (Teachers, of course, know how to shift this physical setting — move the chairs around, for example. Teachers might be less adept or even able to do the same when the classroom setting is digital.)

Despite all the claims that ed-tech "disrupts," it is just as likely going to re-inscribe. That is, we are less likely to use ed-tech to rethink assignments or assessments than we are to use ed-tech to more closely scrutinize student behavior.

Some of the earliest educational technologies — machines developed in the mid-twentieth century to automate instruction — faced charges that they were going to make it easier for students to cheat. If, as promised, these machines could allow students to move through course materials at their own pace without teacher supervision, there had to be — had to be — some mechanism to prevent deceptive behavior. As today, these technologies promised to "personalize" education; but that increased individualization also brought with it a demand to build into new devices ways to track students more closely. More personalized means more surveilled — we know this from Facebook and Amazon, don't we.

And this is key: the fear that students are going to cheat is constitutive of much of education technology. This belief dictates how it's designed and implemented. And in turn it reinforces the notion that all students are potential academic criminals.

For a long time, arguably the best known anti-cheating technology was the plagiarism detection software TurnItIn. The company was founded in 1998 by UC Berkeley doctoral students who were concerned about cheating in the science classes they taught. And I think it's worth noting, if we think about the affordances of technology, they were particularly concerned about how students were utilizing a new feature that the personal computer had given them: copy-and-paste. So they turned some of their research on pattern-matching of brainwaves to create a piece of software that would identify patterns in texts. And as you surely know, TurnItIn became a huge business, bought and sold several times over by private equity firms since 2008: first by Warburg Pincus, and then, in 2014, by Insight Partners and GIC — the price tag for that sale: $752 million. TurnItIn was acquired by the media conglomerate Advance Publications last year for $1.75 billion.

So we should ask: what's so valuable about TurnItIn? Is it the size of the customer base — the number of schools and universities that pay to use the product? Is it the algorithms — the pattern-matching capabilities that purport to identify plagiarism? Is it the vast corpus of data that the company has amassed — decades of essays and theses and Wikipedia entries that it uses to assess student work?

TurnItIn has been challenged many times by students who've complained that it violates their rights to ownership of their work. A judge ruled, however, in 2008 that students' copyright was not infringed upon as they'd agreed to the Terms of Service.

But what choice does one have but to click "I agree" when one is compelled to use a piece of software by one's professor, one's school? What choice does one have when the whole process of assessment is intertwined with this belief that students are cheaters and thus with a technology infrastructure that is designed to monitor and curb their dishonesty?

Every student is guilty until the algorithm proves their innocence.

Incidentally, one of its newer products promises to help students avoid plagiarism, and so essay mills now also use TurnItIn so they can promise their customers won't get caught cheating. The company works both ends of the plagiarism market. Genius.

Anti-cheating software isn't just about plagiarism, of course. No longer does it just analyze students' essays to make sure the text is "original." There is a growing digital proctoring industry that offers schools ways to monitor students during online test-taking. Well-known names in the industry include ProctorU, Proctorio, Examity, Verificient. Many of these companies were launched circa 2013 — that is, in the tailwinds of "the Year of the MOOC," with the belief that an increasing number of students would be learning online and that professors would demand some sort of mechanism to verify their identity and their integrity. According to one investment company, the market for online proctoring was expected to reach $19 billion last year — much smaller than the size of the anti-plagiarism market, for what it's worth, but one that investors see as poised to grow rapidly, particularly in the light of the coronavirus pandemic.

These proctoring tools gather and analyze far more data than just a student's words, than their responses on an exam. They require a student to show photo identification to their laptop camera before the test begins. Depending on what kind of ID they use, the software gathers data like name, signature, address, phone number, driver’s license number, passport number, along with any other personal data on the ID. That might include citizenship status, national origin, or military status. The software also gathers physical characteristics or descriptive data including age, race, hair color, height, weight, gender, or gender expression. It then matches that data to the student's "biometric faceprint" captured by the laptop camera. Some of these products also capture a student's keystrokes and keystroke patterns. Some ask the student to hand over the password to their machine. Some track location data, pinpointing where the student is working. They capture audio and video from the session — the background sounds and scenery from a student's home.

The proctoring software then uses this data to monitor a student's behavior during the exam and to identify patterns that it infers as cheating — if their eyes stray from the screen too long, for example, their "suspicion" score goes up. The algorithm — sometimes in concert with a human proctor — decides who is suspicious. The algorithm decides who is a cheat.
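To make concrete what that kind of scoring can look like, here is a minimal, entirely hypothetical sketch in Python — not drawn from ProctorU, Proctorio, or any vendor's actual code. The signal names, weights, and thresholds are my own invented assumptions; the point is simply that somebody picks them, and those choices determine who gets flagged.

```python
# Hypothetical illustration only: not any proctoring vendor's real algorithm.
# It shows how collected signals might be weighted into a single "suspicion" number.
from dataclasses import dataclass


@dataclass
class ExamSession:
    seconds_gaze_off_screen: float  # how long the gaze-tracker says the eyes left the screen
    window_focus_losses: int        # how many times the exam window lost focus
    face_match_confidence: float    # 0.0-1.0, how well the webcam face matched the ID photo
    other_voices_detected: bool     # whether background audio was flagged as speech


def suspicion_score(session: ExamSession) -> float:
    """Combine surveillance signals into one number a dashboard can rank students by."""
    score = 0.0
    score += min(session.seconds_gaze_off_screen / 30.0, 1.0) * 40  # look away too long and the score climbs
    score += min(session.window_focus_losses, 5) * 8                # switching windows is treated as suspect
    score += (1.0 - session.face_match_confidence) * 30             # a weak biometric match counts against you
    score += 10.0 if session.other_voices_detected else 0.0         # any other voice in the room adds to the score
    return min(score, 100.0)


# A student whose webcam struggles to recognize their face starts out "suspicious"
# before they have done anything at all.
print(suspicion_score(ExamSession(45.0, 2, 0.5, False)))  # 71.0
```

Nothing in a system like this is neutral: the threshold for "too long," the penalty for a poor face match — each is a design decision made long before any student sits the exam.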

We know that algorithms are biased, because we know that humans are biased. We know that facial recognition software struggles to identify people of color, and there have been reports from students of color that the proctoring software has demanded they move into more well-lit rooms or shine more light on their faces during the exam. Because the algorithms that drive the decision-making in these products are proprietary and "black-boxed," we don't know if or how they might use certain physical traits or cultural characteristics to determine suspicious behavior.

We do know there is a long and racist history of physiognomy and phrenology that has attempted to predict people's moral character from their physical appearance. And we know that schools have a long and racist history too that runs adjacent to this.

Of course, not all surveillance in schools is about preventing cheating; it's not all about academic dishonesty — but it is always, I'd argue, about monitoring behavior and character. And surveillance is always caught up in the inequalities students already experience in our educational institutions.

For the past few years, in the US at least, a growing number of schools have adopted surveillance technology specifically designed to prevent school shootings. In some ways, these offerings are similar to the online proctoring tools, except these monitor physical as well as online spaces, using facial recognition software and algorithms that purport to identify threats. This online monitoring includes tracking students' social media accounts, "listening" for menacing keywords and phrases. (These products are sold to schools in other countries too, not as school shooting prevention — that seems to be a grotesquely American phenomenon — but often as ways to identify potential political and religious extremism and radicalization among students.)

And there are plenty of other examples I could give you too, unfortunately, of how surveillance technologies permeate schools. Schools using iris-scanners in the lunchroom. Schools using radio-trackers on students' ID cards and monitoring students' mobile phones to make sure they're in class. And all this is in addition to the incredible amounts of data gathered and analyzed by the day-to-day administrative software of schools — from the learning management system (the VLE), the student information system, the school network itself, and so on. Like I said, not all of this is about preventing cheating, but all of it does reflect a school culture that does not trust students.

So, what happens now that we're all doing school and work from home?

Well, for one thing, schools are going to be under even more pressure to buy surveillance software — to prevent cheating, obviously, but also to fulfill all sorts of regulations and expectations about "compliance." Are students really enrolled? Are they actually taking classes? Are they doing the work? Are they logging into the learning management system? Are they showing up to Zoom? Are they really learning anything? How are they feeling? Are they "at risk"? What are teachers doing? Are they holding class regularly? How quickly do they respond to students' messages in the learning management system?

And this gets us back to something I mentioned at the outset: the surveillance of teachers.

For a very long time, the argument that many employers made against working from home was that they didn't trust their employees to be productive. The supervisor needed to be able to walk by your desk at any moment and make sure you were "gonna have those TPS reports to us by this afternoon," to borrow a phrase from the terrific movie Office Space. And much as education technology is designed on the basis of distrust of students, enterprise technology — that is, technology sold to large businesses — is designed around a distrust of workers. Again, there's a long history here — one that isn't just about computing. The punch clock, for example, was invented in 1888 by a jeweler, Willard Le Grand Bundy, in order to keep track of what time his employees came and left work. He and his brother founded the Bundy Manufacturing Company to manufacture the devices, and after a series of mergers, it became a part of a little company called International Business Machines, or IBM. Those "business machines" were sold with the promise of more efficient workplaces, of course, and that meant monitoring workers.

Zoom, this lovely piece of videoconferencing software we are using right now, is another example of enterprise technology. Zoom never intended to serve the education market quite like this. And there is quite a bit about the functionality of the software that reveals whose interests it serves — the ability to track who's paying attention, for example, and who's actually working on something else (a feature, I will say, that the company disabled earlier this month after complaints about its fairly abysmal security and privacy practices). Who's cheating the time-clock, that is. Who's cheating the boss.

Social media monitoring tools that are used to surveil students are also used to surveil workers, identifying those who might be on the cusp of organizing or striking. Gaggle, a monitoring tool used by many schools, wrote a blog post a couple of years ago in which it suggested administrators turn the surveillance towards teachers too: "Think about the recent teacher work stoppage in West Virginia," the post read. "Could the story have been different if school leaders there requested search results for 'health insurance' or 'strike' months earlier? Occasional searches for 'salary' or 'layoffs' could stave off staff concerns that lead to adverse press for your school district." In response to one wildcat strike at a US university earlier this month, the administration threatened those graduate student-instructors who had not logged into the learning management system with the loss of their stipends.

One of my greatest fears right now is that this pandemic strengthens this surveillance culture in school. And the new technologies, adopted to ease the "pivot to digital," will exacerbate existing educational inequalities, will put vulnerable students at even more risk. These technologies will foreclose possibilities for students and for teachers alike, shutting down dissent and discussion and curiosity and community.

Too often in education and ed-tech, we have confused surveillance for care. We need to watch students closely, we tell ourselves, because we want them to be safe and to do well. But caring means trusting, and trusting means being able to turn off a controlling gaze. Unfortunately, frighteningly, it seems we are turning it up.

Audrey Watters

