
An Explainer


Following a lengthy legal battle culminating in a New York State Court of Appeals ruling last week, the city of New York has released data on the performance of almost 18,000 individual math and English teachers. The Teacher Data Reports rank teachers based on their students' gains on the state's math and English tests over the course of five years (up until the 2009-2010 academic year).

This methodology, known as "value-added assessment," purports to demonstrate how much a teacher "adds value," if you will, to students' academic gains. By looking at students' previous scores, researchers have developed a model that predicts how much improvement is expected over the course of a school year. Whether students perform better or worse than expected is then attributed to the impact of a particular teacher.
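
To make the idea concrete, here is a minimal sketch of the value-added logic, assuming a simple linear prediction from each student's prior-year score. The function names and coefficients are hypothetical; the model the city actually used controls for many more factors.

```python
# Illustrative only: a toy value-added calculation, not the city's actual model.
# It assumes this year's expected score is a simple linear function of last
# year's score; the real models also control for many other factors.

def predicted_score(prior_score, slope=0.9, intercept=65.0):
    """Hypothetical prediction of this year's score from last year's score."""
    return slope * prior_score + intercept

def value_added(students):
    """Average of (actual - predicted) across a teacher's students."""
    residuals = [s["actual"] - predicted_score(s["prior"]) for s in students]
    return sum(residuals) / len(residuals)

# Example roster: two students beat their predictions, one fell short.
roster = [
    {"prior": 650, "actual": 660},
    {"prior": 700, "actual": 705},
    {"prior": 620, "actual": 610},
]
print(round(value_added(roster), 1))  # positive => students beat predictions on average
```

The score is built from residuals, the gap between what students were predicted to do and what they actually did. Whether that residual really reflects the teacher, rather than everything else going on in students' lives, is exactly what critics question.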

Proponents of value-added assessment -- among them Secretary of Education Arne Duncan and former NYC School Chancellor (and now head of News Corp's education division) Joel Klein -- argue that this model demonstrates teachers' effectiveness, and as such should be used to help determine how to compensate teachers, as well as whom to fire.

While the teachers' union and teachers themselves have been vocal in their opposition to the city's move to release this data, they aren't the only ones deeply skeptical of and deeply troubled by this particular measurement -- let alone by the appearance of individual teachers' names and rankings in local newspapers. Arguing that "shame is not the solution," even ed-reformer Bill Gates thinks the release of the Teacher Data Reports is a bad idea.

Problems with Value-Added Assessments

In no small part, that's because there are many problems with relying on value-added assessments to determine a teacher's effectiveness. First and foremost, of course, it reduces the impact that a teacher has on a student to a question of standardized test scores. How does a teacher help boost a student's confidence, critical thinking, inquisitiveness, creativity? What about subjects other than math and English? What about poverty and other socio-economic influences on students' lives?

Furthermore, there is a sizable margin of error in these value-added assessments. As The New York Times notes, the scores could be off by as much as 54 to 100 points, and the city is only "95 percent sure a ranking is accurate." As Gotham Schools reported several years ago, this margin of error has meant that "31 percent of English teachers who ranked in the bottom quintile of teachers in 2007 had jumped to one of the top two quintiles by 2008. About 23 percent of math teachers made the same jump." Did these teachers suddenly get better? We just don't know.
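
The quintile churn Gotham Schools describes is what you would expect when rankings rest on noisy estimates. The simulation below is purely illustrative and rests on an assumption: every teacher has the same true effectiveness, so all year-to-year movement comes from random measurement error. The noise level is arbitrary, not the city's published margin of error.

```python
# Illustration of how measurement noise alone produces large year-to-year
# quintile jumps. Assumption: every teacher has identical true effectiveness,
# and observed ratings in each year are pure random error.
import random

random.seed(0)
N = 10_000  # hypothetical number of teachers

year1 = [random.gauss(0, 1) for _ in range(N)]  # observed rating, year one
year2 = [random.gauss(0, 1) for _ in range(N)]  # observed rating, year two

def quintiles(scores):
    """Assign each score a quintile: 0 = bottom fifth, 4 = top fifth."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    labels = [0] * len(scores)
    for rank, i in enumerate(order):
        labels[i] = rank * 5 // len(scores)
    return labels

q1, q2 = quintiles(year1), quintiles(year2)
bottom = [i for i in range(N) if q1[i] == 0]
jumped = sum(1 for i in bottom if q2[i] >= 3)  # landed in top two quintiles
print(f"{100 * jumped / len(bottom):.0f}% of bottom-quintile teachers "
      "jumped to the top two quintiles the next year")
```

Under pure noise, roughly 40 percent of bottom-quintile teachers land in the top two quintiles the following year, so churn on the scale Gotham Schools reports is about what a very noisy measure would produce.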

Even the academic researchers who've helped develop value-added measurements caution against using them as the sole assessment of teachers. The NYT cites Douglas N. Harris, an economist at the University of Wisconsin who helped develop the methodology the city used in its Teacher Data Reports. Because value-added research is so new, Harris says, "we know very little about it." Releasing the data to the public at this point, he added, "strikes me as at best unwise, at worst absurd."

The Role of the Media

Most of the nation's major news organizations are busily combing through the data that the city has released. The NYT says that "SchoolBook is processing the data and plans to publish the ratings for individual teachers and schools as soon as possible." (It is also asking teachers to comment on their own reports, which seems to compound, rather than relieve, the nastiness of having your name printed like this in the country's most prestigious newspaper.)

It's a move that echoes one undertaken by the LA Times back in 2010, when it obtained test score data and crunched the numbers to determine the effectiveness of the city's schoolteachers. Just as the NYT plans to do, the LA Times opted to publish the names of individual teachers it deemed "good" or "bad." It also offered an easy way for anyone to search for a teacher's name and find their value-added ranking.

The local education blog Gotham Schools, however, is bucking what's bound to be a major pile-on on the profession. It is covering the news closely, but it says it will not publish individual teachers' names or scores.

"We determined that the data were flawed, that the public might easily be misled by the ratings, and that no amount of context could justify attaching teachers’ names to the statistics. When the city released the reports, we decided, we would write about them, and maybe even release Excel files with names wiped out. But we would not enable our readers to generate lists of the city’s 'best' and 'worst' teachers or to search for individual teachers at all."

Challenges for Open (Education) Data

It's also important to look at the controversy in New York City in light of other considerations surrounding open government data initiatives. After all, as many parents and publications point out, there are plenty of reasons why people should want to know (and even have the right to know) the quality of a particular public school, and even of a particular teacher.

But as LynNell Hancock wrote in an investigative piece last year about the city's plans to release the test scores, the issue of releasing value-added assessments isn't really a matter of open data or parental assessment. Rather, this is very much a matter of the administration (at the city and federal level) pursuing its own political agenda. Noting that some education reporters in the city found the administration uncharacteristically speedy in responding to FOIL (Freedom of Information Law) requests in the case of this data, Hancock observes that some felt "as if the city was using them." She concludes that "[E]ducation reporters have increasingly found themselves herded toward a narrow agenda that reflects the corporate-style views of the new reformers, pulling them farther and farther away from the rich and messy heart and soul of education."

There is a lot of richness and messiness when it comes to covering education. No doubt, there's also a lot of fascinating data that we can -- and should -- look at in order to crack open what can be the black box surrounding teaching and learning.

But how do we move forward in opening data and do so with integrity, honoring the privacy expectations of students and parents and teachers? How do we balance scientific research and methodologies with efforts that, in this case clearly, have larger political motives and ramifications?

How do we have a conversation about the quality of teaching, armed with as much data as possible? And how do we make sure it's actually a quality conversation? Because it's likely that what the city of New York has undertaken here, with media outlets in tow, will provide neither.

Image credits: Philippe Teuwen

Audrey Watters

