
This is part ten of my annual look at the year’s “top ed-tech stories.”

Perhaps it’s no surprise that there was so much talk this year about education, technology, and emotional health. I mean, 2017 really sucked, and we’re all feeling it.

As support services get axed and the social safety net becomes threadbare, our well-being – our economic and emotional well-being – becomes more and more fragile. People are stressed out, and people are demoralized, and people are depressed. People are struggling, and people are vulnerable, and people are afraid. And “people” here certainly includes students.

All the talk of the importance of “emotion” in education reflects other trends too. It’s a reaction, I’d say, to the current obsession with artificial intelligence and a response to all the stories we were told this year about robots on the cusp of replacing, out-“thinking,” and out-working us. If indeed robots will excel at those tasks that are logical and analytical, schools must instead develop in students – or so the story goes – more “emotional intelligence,” the more “human” capacity for empathy and care.

Talk of “emotion” has also been the focus of several education reform narratives for the last few years – calls for students to develop “grit” and “growth mindsets” and the like. (So much easier than addressing structural inequality.)

There has been plenty of speculation in the past few years that the latest law governing K–12 education in the US, ESSA (the Every Student Succeeds Act, signed into law by President Obama in December 2015), will promote programs and assessments that focus on “social and emotional learning,” not simply on the traditional indicators of schools’ performance – academics and test scores. Schools should “foster safe, healthy, supportive, and drug-free environments that support student academic achievement,” the law says, which might include “instructional practices for developing relationship-building skills, such as effective communication” or “implementation of school-wide positive behavioral interventions and supports.”

Of course, when all these narratives about “social emotional learning” get picked up by education technologists and education entrepreneurs, they don’t just turn into policy mandates or even TED Talks. They turn into products.

“Every great cause begins as a movement, becomes a business, and eventually degenerates into a racket.” – Eric Hoffer

Hacking the Brain


The current (and long-running, let’s be honest) fascination with AI is deeply intertwined with science and speculation about the functioning of the brain and the possibilities for that organ’s technological improvement. There’s a science fiction bent as well and a certain millennialism, an AI apocalypticism, to much of the invocation of “hacking the brain” – a religious fantasy about the impending merger of “man and machine.”

While new “neurotechnologies” are primarily being developed to help those with disabilities regain speech, sight, and mobility, there is still plenty of talk about “linking human brains to computers” as consumer-oriented “enhancements” that everyone will want to pursue. A “symbiosis with machines,” as Bryan Johnson puts it. He’s put $100 million of his own money into his company Kernel (which I guess means we’re supposed to believe it’s a real, viable thing) with the promise of developing computer chip implants that will bolster human intelligence. Elon Musk – a totally reliable source for predictions about the future of science and business – has founded a company called Neuralink that does something similar. It too will link human brains to machines. (There’s a cartoon explainer, which I guess means we’re supposed to believe it’s a real, viable thing.) In eight to ten years’ time, Musk assures us, brain implants will enable telepathy.

(I’m keeping track of all these predictions. It isn’t simply that folks get it right or get it wrong. It’s that the repetition of these stories, particularly when framed as inevitabilities, shapes our preparations for the future. The repetition shapes our imaginations about the future.)

“Neuroeducation Will Lead to Big Breakthroughs in Learning,” Singularity Hub proclaimed this fall. Singularity Hub is a publication run by Singularity University, a for-profit (non-accredited) school co-founded by Ray Kurzweil, one of Silicon Valley’s biggest evangelists for the notion that we’ll soon be able to upload human minds into computers. We’re on the cusp of being able to upload skills directly into the brain, Futurism.com wrote this spring. (Futurism.com is a website run by the New York Chapter Leader of Singularity University.) All these promises, if kept, would indeed make for breakthroughs in teaching and learning. “If kept,” of course, is the operative phrase there.

If kept. If possible. If ethical or desirable, even.

There are promises about “brain hacking pills” that will speed up learning. (Well, it turns out, they don’t actually work.) There are promises about “brain zapping” to speed up learning. (What researchers understand about the effectiveness of this procedure is pretty preliminary.) There are promises about “brain training” exercises that will keep the brain “fit” as one ages. (The research is iffy at best.)

In October, Edsurge wrote about BrainCo, a company that says its devices can monitor students’ brainwave activity in order to ascertain who’s paying attention in class. A few weeks later, Edsurge wrote about InteraXon, a company that says its devices can monitor students’ brainwave activity “for meditation and discipline.” An ed-tech trend in the making, investors hope.

But probably the biggest story in “neuroeducation” this year involved Neurocore, a company that monitors brainwaves and then, through “neurofeedback sessions,” purports to train people to activate certain brainwave frequencies. Neurocore didn’t make headlines because it worked – to the contrary. It made headlines because it was under investigation for making misleading claims about its benefits. (It’s promoted in some circles as a “cure” for autism and ADHD.) And it made headlines because one of its financial backers is US Secretary of Education Betsy DeVos, who, despite the dearth of data about Neurocore’s effectiveness, invested even more money in the company this year.

Don’t let the dearth of data fool you, though. Many people find all this a compelling story, data or no data – a long-running fantasy about “Matrix-style learning.” And when the story is accompanied by colorful images from fMRIs, it all seems incredibly persuasive.

It’s incredibly dangerous too, Stirling University’s Ben Williamson cautions, as the kind of control these devices promise should raise all sorts of questions about students’ civil rights and “cognitive liberties.” Williamson argues,

Neuroenhancement may not be quite as scientifically and technically feasible yet as its advocates hope, but the fact remains that certain powerful individuals and organizations want it to happen. They have attached their technical aspirations to particular visions of social order and progress that appear to be attainable through the application of neurotechnologies to brain analytics and even neuro-optimization. As STS researchers of neuroscience Simon Williams, Stephen Katz & Paul Martin have argued, the prospects of cognitive enhancement are part of a ‘neurofuture’ in-the-making that needs as much critical scrutiny as the alleged ‘brain facts’ produced by brain scanning technologies.

Marketing “Mindsets”


While the brainwave monitoring headsets hyped by some in ed-tech might seem too far-fetched and too futuristic for schools to adopt, they are being positioned as part of another trend that might make them seem far more palatable: “social emotional learning” or SEL.

SEL has powerful advocates in powerful places: MacArthur Foundation “Genius” grantee and University of Pennsylvania psychology professor Angela Duckworth. Stanford University psychology professor Carol Dweck, who was awarded the $4 million Yidan Prize for Education Research this fall. The Chan Zuckerberg Initiative, whose vision for the future of education involves “whole child personalized learning” and which says it plans to invest hundreds of millions of dollars in education initiatives to that end. The World Economic Forum. The OECD.

Oh, and the President of the United States, I guess. He did declare October 15–21 of this year “Character Counts Week.” Something about grit and determination and moral fiber and whatnot.

For its part, Edsurge pushed hard with its marketing of SEL in 2017: “Want Social-Emotional Learning to Work? The Careful Balance of Tech and Relationships.” “Assessing Social-Emotional Skills Can Be Fuzzy Work; SELweb Offers Concrete Data.” “How Valor Collegiate Academy is Rethinking SEL.” “Can Grit Be Measured? Angela Duckworth Is Working on It.” “How Can Educators Strike a Balance Between Blended and Social-Emotional Learning?” “Panorama Offers New Platform to Help Teachers Track Student’s SEL Growth.” “Panorama’s Student Progress Reports Show More Than Grades (Think Behavior and SEL).” “Social-Emotional Learning Is the Rage in K–12. So Why Not in College?” “ClassDojo and Yale Team Up to Bring Mindfulness to the Masses.” “We Know SEL Skills Are Important, So How the Heck Do We Measure Them?” “​How Game-Based Learning Encourages Growth Mindset” (that’s sponsored content from MangaHigh.com). “Three Ways You Can Harness Personalized Learning to Promote a Growth Mindset” (that’s sponsored content from Edmentum). “What If Students Have More Confidence in Growth Mindsets Than Their Teachers?” “Free Tech Tools Teach Social Emotional Learning in Classrooms” (that’s sponsored content from EverFi). “Panorama Education Raises $16M to Connect Emotional, Academic Wellbeing With Data.” “How to Measure Success Without Academic Achievement.”

The elements shared across many of these stories: the monitoring and measuring of students. Monitoring and measuring students’ data, that is, and then managing their emotions, sure, but more likely, their behavior.

One of the largest single rounds of funding this year was in the behavior management company Hero K12, which raised $150 million in private equity funding in June. (You can read more about who’s funding this and other trends in my year-end series at funding.hackeducation.com.)

I wrote about this company, along with its better known competitor ClassDojo, in an article in The Baffler. ClassDojo, which is also used by teachers and schools to manage student behavior, boasts that it’s been adopted in 90% of schools – a statistic that cannot be verified since this sort of data is not tracked by anyone other than the company making the claim. With this popularity, ClassDojo has done a great deal to introduce and promote “growth mindsets” and “mindfulness” to educators. (“To the masses” as Edsurge puts it.)

These apps – Hero K12, ClassDojo, and other behavior management products – claim that they help develop “correct behavior.” But what exactly does “correct behavior” entail? And what does it mean (for the future of civic participation, if nothing else) if schools entrust this definition to for-profit companies and their version of psychological expertise? As Ben Williamson observes, “social-emotional learning is the product of a fast policy network of ‘psy’ entrepreneurs, global policy advice, media advocacy, philanthropy, think tanks, tech R&D and venture capital investment. Together, this loose alliance of actors has produced shared vocabularies, aspirations, and practical techniques of measurement of the ‘behavioural indicators’ of classroom conduct that correlate to psychologically-defined categories of character, mindset, grit, and other personal qualities of social-emotional learning.” These indicators encourage behaviors that are measurable and manageable, Williamson contends, but SEL also encourages characteristics like malleability and compliance – and all that fits nicely with the “skills” that a corporate vision of education would demand from students and future workers.

Classroom Management and Persuasive Technologies


In that Baffler article, I make the argument that behavior management apps like ClassDojo’s are the latest manifestation of behaviorism, a psychological theory that has underpinned much of the development of education technology. Behaviorism is, of course, most closely associated with B. F. Skinner, who developed the idea of his “teaching machine” when he visited his daughter’s fourth grade class in 1953. Skinner believed that a machine could provide a superior form of reinforcement to the human teacher, who relied too much on negative reinforcement – punishing students for bad behavior – rather than on positive reinforcement, the kind that better trains the pigeons.

As Skinner wrote in his book Beyond Freedom and Dignity, “We need to make vast changes in human behavior…. What we need is a technology of behavior.” Teaching machines, he argued, would be one such technology.

By arranging appropriate ‘contingencies of reinforcement,’ specific forms of behavior can be set up and brought under the control of specific classes of stimuli. The resulting behavior can be maintained in strength for long periods of time. A technology based on this work has already been put to use in neurology, pharmacology, nutrition, psychophysics, psychiatry, and elsewhere.


The analysis is also relevant to education. A student is ‘taught’ in the sense that he is induced to engage in new forms of behavior and in specific form upon specific occasions. It is not merely a matter of teaching him what to do; we are as much concerned with the probability that appropriate behavior will, indeed, appear at the proper time – an issue which would be classed traditionally under motivation.

Skinner was unsuccessful in convincing schools in the 1950s and 1960s that they should buy his teaching machines, and some people argue that his work has fallen completely out of favor, invoked only when deriding something as a “Skinner Box.” But I think there’s been a resurgence in behaviorism. Its epicenter isn’t Harvard, where Skinner taught. It’s Stanford. It’s Silicon Valley. And this new behaviorism is fundamental to how many new digital technologies are being built.

It’s called “behavior design” today (because at Stanford, you put the word “design” in everything to make it sound beautiful, not totally rotten). Stanford psychologist B. J. Fogg and his Persuasive Technology Lab teach engineers and entrepreneurs how to build products – some of the most popular apps can trace their origins to the lab – that manipulate and influence users, encouraging certain actions or behaviors, discouraging others, and cultivating a kind of “addiction” or conditioned response. “Contingencies of reinforcement,” as Skinner would call them. “Technique,” Jacques Ellul would say. “Nudges,” per behavioral economist Richard Thaler, recipient of this year’s Nobel Prize in economics.

New technologies are purposefully engineered to demand our attention, to “hijack our minds.” They’re designed to elicit certain responses and to shape and alter our behaviors. Ostensibly all these nudges are supposed to make us better people – that’s the shiniest version of the story, promoted in books like Nudge and Thinking, Fast and Slow. But much of this is really about getting us to click on ads, to respond to notifications, to open apps, to stay on Web pages, to scroll, to share – actions and “metrics” that Silicon Valley entrepreneurs and investors value.

There’s a darker side still to this, as I argued in the first article in this very, very long series: this kind of behavior management has become embedded in our new information architecture. It’s “fake news,” sure. But it’s also disinformation plus big data plus psychological profiling and behavior modification. The Silicon Valley “nudge” is a corporate nudge. But as these technologies are increasingly part of media, scholarship, and schooling, it’s a civics nudge too.

Those darling little ClassDojo monsters are a lot less cute when you see them as part of a new regime of educational data science, experimentation, and “psycho-informatics.”

Personalized Learning and the Nudge


In May, The Australian obtained a 23-page document from Facebook’s Australian office detailing how the company had offered to advertisers the ability to target some 6.4 million young people (some as young as 14) during moments of emotional vulnerability – when they felt “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” or “a failure.” Facebook denied the reporting, stating that The Australian article was misleading and insisting that the company “does not offer tools to target people based on their emotional state.” Of course, five years ago, the company did conduct a mass experiment on the manipulation of users’ emotions. It published those findings in an academic journal.

Facebook might not offer others tools to identify users’ emotions, but it certainly uses such tools internally. In November, it unveiled a service to detect if users are suicidal. And earlier this year Facebook IQ, the company’s research division, did publish a paper on how marketers could utilize the emotional experiences people will have in VR. (Remember: Facebook owns Oculus.) “All participants wore EEG headsets to analyze their brain signals and measure their level of comfort and engagement.” The company also announced at its annual developer conference this spring that, like Elon Musk, it too is working on a brain-computer interface. Facebook won’t say if it plans to use brain activity for advertising.

“Is Spending Time on Social Media Bad for Us?” Facebook’s Director of Research David Ginsberg pondered in a company blog post last week. And certainly there has been a ton of ink spilled this year on this very question, much of it noting the increased depression and anxiety (particularly among teens) that some researchers are tracing to the Internet in general and to social media specifically. “Increased Hours Online Correlate With Uptick In Teen Depression, Suicidal Thoughts,” MindShift reported this fall. “Have Smartphones Destroyed a Generation?” asked Jean M. Twenge in a widely read article in The Atlantic, also published this fall. Many in education technology like to scoff at these sorts of claims, I realize. They’re prone to side with Facebook’s Ginsberg: “our research and other academic literature suggests that it’s about how you use social media that matters when it comes to your well-being.”

Facebook is the largest social network in the world. As of June, it boasted some 2 billion monthly active users. The manipulation of users’ social and emotional experiences should not be minimized or dismissed. And for educators, it’s important to recognize that interest in social and emotional experience and behavioral design is not just something that happens on the Facebook platform (or with other consumer-facing technologies).

Mark Zuckerberg and his Chan Zuckerberg Initiative have pledged to spend hundreds of millions of dollars to promote his vision of “personalized learning.” It’s a vision that, as Jim Shelton, the head of its education work, recently wrote in an article on Medium, “embraces the role of social-emotional and interpersonal skills, mental and physical health, and a child’s confident progress toward a sense of purpose.”

“More personal means more equitable and just,” Shelton insisted in that Medium essay. And I do not doubt that the Chan Zuckerberg Initiative and Mark Zuckerberg and Facebook all believe that. They believe that they have our best interests at heart, and they will guide us – algorithmically, of course – to “good” academics and “good” thoughts and “good” feelings and “good” behavior, defining and designing, of course, what “good” looks like.

Financial data on the major corporations and investors involved in this and all the trends I cover in this series can be found on funding.hackeducation.com. You can read more at 2017trends.hackeducation.com.

Audrey Watters

