
This is the third in my series The Top Ed-Tech Trends of 2015

Last year, I named one of the “Top Ed-Tech Trends of 2014” “School as ‘Skills.’” By that, I wanted to refer to the powerful narrative that the primary purpose of education – at both the K–12 and university levels – is to prepare students for the workforce. This year, I’m looking at the continuation of that trend, borrowing a phrase that George Siemens used in a keynote this summer: “the employability narrative.”

Last year’s article included a number of updates about credentialing, competency-based education, and coding bootcamps – these have become important trends in their own right and warrant more in-depth discussion in separate posts.

It’s the Economy, Stupid

The US unemployment rate fell to 5% in October of this year, the lowest level in seven years. A closer look at those numbers gives you a slightly different picture: the unemployment rate for adult men is 4.7%, and for adult women it’s 4.5%. The unemployment rate for teens is 15.9%. For whites, it’s 4.4%, but for Blacks it’s 9.2%. For Asians it’s 3.5%, but for Latinos it’s 6.3%. The unemployment rate for those with no high school diploma is 6.8%. The unemployment rate for high school graduates is 4.9%, and the rate for those with a college degree is 2.5%. (That last figure constitutes “full employment.”)

The US economy added 271,000 jobs in October, “marking the strongest three years of job creation since 2000,” the White House boasted. Jobs were created in the following sectors: professional and business services, health care, retail trade, food services and drinking places, and construction. Employment in mining fell; employment in other industries, including tech, stayed the same.

(For what it’s worth, I’m using October’s figures as that was the latest dataset released by the Bureau of Labor Statistics as I worked on this article. The November data has since been made available.)

Just to put things in perspective globally, however: “about 39 million people ages 16 to 29 across the globe weren’t employed and weren’t participating in any kind of education or training in 2013. That’s 5 million more than before the economic crisis of 2008, a new OECD report stresses, and 2014 predictions don’t look much better.”

I want to start this examination of the “employability narrative” in 2015 with a look at actual employment figures. I want to do so, in part, because of the steady drumbeat of stories that posit that schools are not adequately readying graduates for the job market and that graduates – particularly those who study the liberal arts – are unprepared for work.

Take the report released in January by the Association of American Colleges and Universities, for example, which found a discrepancy between how “career ready” soon-to-be college graduates describe themselves and how “career ready” employers think college graduates actually are. Or take the Gallup/Lumina Foundation poll from April in which just 13% of respondents “strongly agreed” that college graduates were prepared to succeed in the workforce. Or take the poll conducted by the Robert Morris University Polling Institute in June that found that only 49.2% of respondents felt colleges were paying enough attention to job market trends and just 54.8% thought there was enough emphasis in higher ed on job placement for graduates. Or take the Gallup/Purdue survey from September in which only half of college alumni “strongly agree” that their education was worth what they paid for it.

So, the debate about whether or not college is “worth it” raged on in 2015, despite strong results on the job market for college graduates, including recent ones. Well, most graduates. As the Pacific Standard observed in February, “Elite University Degrees Do Not Protect Black People From Racism.”

Often when people refer to “most graduates” who do or do not succeed, it’s a statement not about racism but a judgment about which major the graduates selected. A study published this summer and covered by Inside Higher Ed noted that fears about unemployment and uneasiness with the job market do prompt students to select certain majors that they believe to be more lucrative, even if those majors – pre-law, for example – don’t actually offer strong job prospects and even if their switching majors doesn’t actually result in that big a shift in employment or wages. A survey of students attending the for-profit university Laureate Education also found that they want higher education to focus on “career outcomes.” But none of this should really be that surprising – particularly the findings of the latter survey. “Career outcomes” have long been the focus of for-profit higher education; and students do have to be realistic about how they prepare for a future that feels – is – quite precarious.

How does one defend against that precarity? It’s probably not just a “fix” for or by or through education – well, unless you like invoking silver bullets, as ed-tech entrepreneurs and politicians sure do. But surely it isn’t up to the institution of (higher) education alone to address employability and economic precarity. (Echoing a remark I made last year, I urge you to read everything Tressie McMillan Cottom writes – there are things, she has argued, that the “education gospel” cannot address.)

Nevertheless, employers have been happy in recent decades to wag their fingers at schools and to offload the responsibility of job training elsewhere, although in all fairness, a few companies did announce programs this year to subsidize their employees’ continuing education, perhaps marking a shift in their commitments: Chipotle, McDonald’s, Starbucks, Chrysler, for example. Many schools in turn seem quite content to outsource job placement services too, along with so much of their ed-tech infrastructure, to third-party providers.

According to a report from the Georgetown University Center on Education and the Workforce in February, the US spends $1.1 trillion on college and workforce training. “However, the rate of increase for spending on formal training has been faster in higher education – an 82 percent increase since 1994 compared to a 26 percent increase by employers.” Those with college degrees receive the most formal training investment from their employers.

In March, President Obama announced a $100 million TechHire initiative that “aims to convince local governments, businesses, and individuals that a four-year degree is no longer the only way to gain valuable tech skills.” But four-year degree or not, post-secondary credentialing of some sort is expected. (Who are we kidding? Employers still specifically demand four-year degrees. I’ll explore this in more detail in the next post in this series.)

The So-Called Skills Gap

The “skills shortage” or “skills gap” is still invoked with great frequency, by politicians and tech entrepreneurs alike. This shortage has fueled a great deal of interest in “jobs training” and “learn-to-code” startups – more on those below – that promise to meet the (purported) demands of the job market. From an article I wrote this fall about coding bootcamps:

I write “purported” here even though it’s quite common to hear claims that the economy is facing a “STEM crisis” – that too few people have studied science, technology, engineering, or math and employers cannot find enough skilled workers to fill jobs in those fields. But claims about a shortage of technical workers are debatable, and lots of data would indicate otherwise: wages in STEM fields have remained flat, for example, and many who graduate with STEM degrees cannot find work in their field. In other words, the crisis may be “a myth.”

But it’s a powerful myth, and one that isn’t terribly new, dating back at least to the launch of the Sputnik satellite in 1957 and subsequent hand-wringing over the Soviets’ technological capabilities and technical education as compared to the US system.

It’s a myth that’s found some play on the presidential campaign trail as well, as several candidates have openly mocked those who major in the liberal arts (even though many of the candidates themselves have degrees in the liberal arts).

(I just want to observe in passing here that it’s probably not liberal arts majors who are the most disillusioned about their professional futures; it’s college athletes.)

At a Republican Party debate this fall, Florida Senator Marco Rubio argued that higher education “is too expensive, too hard to access, and it doesn’t teach 21st-century skills.” Graduates, he contended, end up with massive amounts of student loan debt but cannot find jobs. “I don’t know why we have stigmatized vocational education,” he also said. “Welders make more money than philosophers. We need more welders and less philosophers.” The droves of self-appointed fact- and grammar-checkers on Twitter quickly pointed out that 1) correct usage is “fewer philosophers” and that 2) philosophers actually earn more. PolitiFact also rated Rubio’s claims about welders as “false.”

According to Payscale, philosophy majors make an average first-year salary of $42,200. The average mid-career pay for philosophy majors is even better: $85,000 per year.

Additionally, the median pay for philosophy professors is nearly $90,000 per year, and the top 10 percent of philosophy professors make more than $190,000 a year.

Rubio will be hard-pressed to find a welder who makes a comparable salary. According to the Bureau of Labor Statistics, the median wage for welders, cutters, solderers and brazers is $37,420 – about $18 an hour. The top 10 percent of welders earn $58,590 or more. That's significantly less than the top 10 percent of philosophy professors, who earn $190,000 or more.
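The “about $18 an hour” figure quoted above follows directly from the annual median, assuming the standard 2,080-hour full-time work year (40 hours × 52 weeks) – the convention typically used to convert annual wages to hourly rates. A quick sanity check on the numbers:

```python
# Sanity check on the salary figures quoted above.
# Assumes a standard full-time work year of 2,080 hours (40 hours x 52 weeks).

FULL_TIME_HOURS = 40 * 52  # 2,080 hours per year

welder_median_annual = 37_420   # BLS median, welders/cutters/solderers/brazers
philosophy_mid_career = 85_000  # Payscale, mid-career philosophy majors

welder_hourly = welder_median_annual / FULL_TIME_HOURS
print(f"Welder median hourly wage: ${welder_hourly:.2f}")  # ≈ $17.99, i.e. "about $18 an hour"
print(f"Mid-career philosophy premium: ${philosophy_mid_career - welder_median_annual:,} per year")
```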

Of course, we’d be hard-pressed too to call welding a “21st century skill,” despite Rubio’s stump speech. Humans have been connecting metal with heat for, ya know, millennia. But nobody pays attention to the history of tech when you can slogan-ize the hell out of things.

And nobody pays attention to the history of ed-tech when you can act as though you’re the first person or first organization to support programming education. (Happy 35th anniversary to Seymour Papert’s Mindstorms!)

Everybody Should Learn to Code

Remember Code Year? That was Codecademy’s big push back in 2012 (and every year since) that “everyone should learn to code.” Even NYC Mayor Michael Bloomberg said he was going to learn to code.

Hey, maybe that’s why he’s not running for President. He’s too busy working on his Ruby on Rails app. Or something.

Since I first started covering this renewed “learn-to-code” trend in 2011 and 2012, we have seen more and more people – government officials, educators, entrepreneurs – jump on the bandwagon.

In January, Kentucky legislators proposed a bill that would allow computer science courses to count as a foreign language credit. (Ugh.) Washington state passed legislation in June strengthening computer science education, thankfully rejecting an earlier attempt to allow CS classes to count as a foreign language credit. In September, NYC Mayor Bill de Blasio announced a ten-year deadline for all schools in the city to offer computer science (although CS won’t become a graduation requirement). (Sociology professor Zeynep Tufekci’s response is well worth reading.) LAUSD also said it plans to expand computer science to every grade by 2020 – the district’s billion-dollar iPad investment is going to be super useful for that. And here’s an awesome headline by EdSurge from February: “‘For the Preservation of Public Peace,’ Arkansas Students Will Learn to Code.” (That’s how the legislation framed the importance of programming: “immediately necessary for the preservation of the public peace, health and safety.”) In March, the state also passed a law requiring students to learn cursive. Regardless, Wired said that Arkansas is “leading the learn to code movement,” because good grief, ed-tech journalism is terrible.

Even worse: before he left office this year, Australia’s education minister Christopher Pyne approved a new curriculum that replaced the teaching of history and geography with the teaching of programming. “The new curriculum echoes successful programs implemented in the United States such as Code.org and ‘Hour of Code’, with the support of Google and Microsoft, including the United Kingdom who introduced coding in primary schools last year,” Business Insider reported, making me question what industry folks think “successful programs” actually look like.

As Georgia Tech professor Mark Guzdial recently noted, “Computing education is being discussed today because of the technology industry. We would not be talking about CS in K–12 without technology industry needs. It’s the NYC tech industry who pushed for the initiative there (see their open letter). It’s the tech industry funding (see funders here). That’s not necessarily a bad thing to have the tech industry funding the effort to put computer science in schools, but it is a different thing than having a national consensus about changing public school education to include computer science.”

While the legislative and school-based efforts to push the “learn to code” mantra ostensibly open up this skill to everyone, there are still many inequalities when it comes to learning computer science and joining the tech sector (more on that below) – something that’s arguably exacerbated by the push to make “coding” into a consumer-oriented product. That is, it’s still something affluent parents can more readily offer their kids (even at the preschool level) through expensive hardware, software, toys, and private classes. For example, the learn-to-code startup Tynker announced it would be offering classes at some 600 Sylvan Learning locations.

The push for more computer science education has also become a nice marketing label for the educational toy industry. Disney’s all over it, of course, partnering with Google for a new cartoon series, Miles from Tomorrowland, that aims to inspire kids to code. Disney’s also pushing Star Wars-branded learn-to-code stuff. Disney’s pushing Star Wars-branded everything though, let’s be honest.

More “learn to code” startups did release school-focused marketing and products this year: Sphero, MakerBot, LittleBits, and the Micro Bit, for example. But these “learn to code” companies’ business models don’t always make sense for schools or libraries, as librarian David Lee King wrote about this fall as his library struggled with its Treehouse subscription.

Nevertheless, the push for more computer science has been a boon for companies offering this sort of education. A small sample of the “learn to code” and “skills training” startups that have raised venture capital this year: ($186 million). LittleBits ($44.2 million). Thinkful ($4.25 million). CodeHS ($1.75 million). Globaloria ($995,000). Flatiron School ($9 million). Kano ($15 million). Wonder Workshop ($6.9 million). One Month ($1.9 million). Udacity ($105 million). General Assembly ($70 million). Andela ($10 million). Udemy ($65 million). (This doesn’t include the boom in funding for private student loan providers, many of which are targeting students attending coding bootcamps. I’ll write more about that disheartening trend in my upcoming post on for-profit higher ed. And I’ll have a lot more to say about these bootcamps, “the new for-profit higher ed,” in that post too.)

A sample of the “learn to code” and “skills training” startups that were acquired this year: Hack Reactor bought MakerSquare. Pluralsight bought Code School. LinkedIn bought The Apollo Group bought The Iron Yard. And in a follow-up from one of last year’s big ed-tech acquisitions, it appears that Microsoft hasn’t fucked up Minecraft. Yet.

What Do We Mean by “Coding”?

What exactly do we mean when we say “everyone should learn to code” or “everyone should make”? (The latter question seems particularly relevant as, according to the latest K–12 Horizon Report, makerspaces (among other things) are “on the horizon.”)

Early in 2015, the Computer Science Teachers Association released the results of its 2014 high school survey. Among the findings, “participants applied the term ‘computer science’ to a vast array of topics and courses, many of which were submitted as ‘other’ courses in response to the topics that were provided in the survey. Participants classified studies in business management, yearbook layout, artificial intelligence, robotics, office applications, and automated design as computer science courses.” It’s not clear how, even with legislative support in some states for computer science counting towards high school graduation, schools will fit it into the curriculum. (And perhaps this explains why the “Hour of Code” is so popular. I mean, it’s just an hour.)

Recommended reading: “Why I Am Not a Maker” by Olin College of Engineering professor Debbie Chachra. Also recommended, this conversation between Garnet Hertz and Alexander Galloway on “Critique and Making”:

…To be a person in the modern world, one should know at least one foreign language and one computer language. So let’s learn how to code, but let’s also read Plato.

Meanwhile, education industry behemoth Pearson is doing the “maker” thing now too – which just might be all you need to know about that particular buzzword and how it’s being wielded and co-opted by education and education technology companies.

Diversity, Employability, and the Meritocracy Lie

The technology industry continues to struggle with diversity, despite claiming that it’s working to hire more people of color and women. If, indeed, education is forced to bend towards the “employability narrative,” the discrimination and bias in the tech sector are going to be even more of a problem, as we’re pushing for a certain sort of skill-set for work in an industry that remains incredibly exclusionary.

Part of the problem is still the pipeline (and making things pink or “girly” isn’t going to fix things, thank you very much, nor is highlighting athletes from the least diverse of the big four professional sports). Female students and students of color remain underrepresented in computer science majors and classes, and many simply don’t see technology as a good career choice for them.

Even as the “learn to code” movement congratulates itself for its expansion and normalization, a report from the Level Playing Field Institute released this year “found that public schools with a high number of students of color are half as likely to offer computer science classes as schools with a predominately white or Asian student body.” (Level Playing Field Institute founder Freada Kapor Klein and her husband Mitch Kapor announced $40 million in funding for diversity initiatives to “fix the leaky tech pipeline.”)

EdSurge asked in July “Will Teaching New Computer Science Principles Level the Playing Field?” No. It won’t. Not until the culture of the tech industry changes.

See, the pipeline is just part of the problem. As the LA Times observed in February, women are leaving the tech industry “in droves.” And here’s a gem from NY Magazine this summer:

A former Google diversity head decided he wanted to build a museum dedicated to women’s history in London’s East End, so he submitted a proposal outlining his goals last October. But after the proposal was approved and construction on the museum began, Mark Palmer-Edgecumbe promptly switched tactics, and instead decided to build a museum dedicated to the notorious serial killer Jack the Ripper, who exclusively targeted female sex workers. His excuse? Jack the Ripper is less boring than exploring women’s history and accomplishments.

You sorta get the feeling that when people say “everybody should learn to code” in order to close that so-called “skills gap,” there’s an asterisk there: certain restrictions may apply.

Such was the case at the University of Massachusetts at Amherst, which announced in February that it would “no longer admit Iranian national students to specific programs in the College of Engineering (i.e., Chemical Engineering, Electrical & Computer Engineering, Mechanical & Industrial Engineering) and in the College of Natural Sciences (i.e., Physics, Chemistry, Microbiology, and Polymer Science & Engineering).” (It later changed its mind, reversing the new policy.)

Also running afoul of preconceptions about who should code and who should “make” was 9th grader Ahmed Mohamed, who was handcuffed and interrogated by police after he brought a homemade clock to school. (Pro tip: “How to Make Your Own Homemade Clock That Isn’t a Bomb.”)

Support for Mohamed was overwhelming. Even the POTUS weighed in. (For what it’s worth, I think it’s important that we call out the injustice of the school-to-prison pipeline as it affects all students, particularly students of color, not just those students who are science whizzes.)

LinkedIn and the Employability Narrative

In April, on the heels of the news that it had acquired for $1.5 billion, MindWires Consulting’s Michael Feldstein wrote that “LinkedIn is the most interesting company in ed tech.”

LinkedIn is the only organization I know of, public or private, that has the data to study long-term career outcomes of education in a broad and meaningful way. Nobody else comes close. Not even the government. Their data set is enormous, fairly comprehensive, and probably reasonably accurate. Which also means that they are increasingly in a position to recommend colleges, majors, and individual courses and competencies. An acquisition like gives them an ability to sell an add-on service – “People who are in your career track advanced faster when they took a course like this one, which is available to you for only X dollars” – but it also feeds their data set. Right now, schools are not reporting individual courses to the company, and it’s really too much to expect individuals to fill out comprehensive lists of courses that they took. The more that LinkedIn can capture that information automatically, the more the company can start searching for evidence that enables them to reliably make more fine-grained recommendations to job seekers (like which skills or competencies they should acquire) as well as to employers (like what kinds of credentials to look for in a job candidate). Will the data actually provide credible evidence to make such recommendations? I don’t know. But if it does, LinkedIn is really the only organization that’s in a position to find that evidence right now. This is the enormous implication of the acquisition that the press has mostly missed, and it’s also one reason of many why Pando Daily’s angle on the acquisition – “Did LinkedIn’s acquisition of Lynda just kill the ed tech space?” – is a laughable piece of link bait garbage.

LinkedIn may well be at the very center of the “employability narrative” trend, and the acquisition of certainly underscores the importance of “skills training” as ongoing professional development and, as Feldstein notes, as signaling. As such, LinkedIn is helping to push for a changing notion of credentialing as well. (More on that in the next post in this series.)

Oh and some fun trivia for Marco Rubio: LinkedIn co-founder Reid Hoffman has a master’s degree in philosophy.


Audrey Watters



Hack Education

The History of the Future of Education Technology
