I recently saw a billboard proclaiming that charter schools graduate 80% of their students.
Fans of The Wire are already familiar with the awesome phrase "jukin' the stats." And, as The Wire itself pointed out in Season 4, schools are no strangers to this phenomenon. The regular public schools in my district graduate just under 60% of their students. Do charter schools really graduate 20 percentage points more? Of course not.
These days in education, as in many other realms, data is king. Everyone has an almost blind allegiance to rationalism. Numbers don't lie, right? But, the thing is, numbers can lie. Public education reform policy is replete with ideas involving common-sense, easy-to-understand "data" that, beneath the surface, has very little validity.
Take, for example, the biggest thing that our Education Secretary Arne Duncan has been pushing in his Race to the Top agenda: evaluating teacher "performance" and "effectiveness" using so-called "value-added" measures. The premise behind this program is that we can measure how "effective" a teacher has been by taking her students' scores on the state test at the beginning of the year and subtracting them from their scores on the test at the end of the year. Sounds simple, right? And of course, the next thing you know, newspapers like The L.A. Times and The New York Times are filing FOIA requests to publish individual teachers' scores. The thing is, most statisticians will tell you that this method produces numbers for individual teachers that swing widely from year to year, carry large margins of error, and are, basically, mostly bogus. In my own experience, the improvement a student would have to show to register a "gain" is smaller than the margin of error of the test itself. In other words, a student who scores X in September and needs to score Y in May might plausibly score anywhere from X to Y just by taking the same test in two consecutive weeks, because most of the tests being used to generate these numbers don't even claim that kind of accuracy for a single sitting. That means "value added" only tells you something significant when you measure large groups of students and large groups of teachers over time, like a school's improvement over several years.
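The margin-of-error point is easy to see in a quick simulation. Here is a minimal sketch in Python with entirely made-up numbers (a true growth of 10 scale points and a test measurement error of 15 points are both assumptions for illustration, not figures from any real test): individual fall-to-spring "gains" are mostly noise, a single classroom's average still bounces around from year to year, and only a school-sized group yields a stable estimate.

```python
import random
import statistics

random.seed(0)

TRUE_GAIN = 10   # hypothetical true growth over the year, in scale points (invented)
NOISE_SD = 15    # hypothetical measurement error of one test sitting (invented)

def observed_gain():
    """One student's fall-to-spring score difference, with test noise on both ends."""
    fall = random.gauss(0, NOISE_SD)                # fall score, centered at 0 for simplicity
    spring = TRUE_GAIN + random.gauss(0, NOISE_SD)  # spring score = true growth + noise
    return spring - fall

# One student: despite real growth, noise dominates the observed "gain".
single = [observed_gain() for _ in range(10_000)]
negative_share = sum(g < 0 for g in single) / len(single)

# One class of 25 students: the class-average gain still varies a lot between runs
# (i.e., a teacher's "value added" jumps around from year to year).
class_means = [statistics.mean(observed_gain() for _ in range(25)) for _ in range(10_000)]

# One school of 500 students: the average gain finally settles down.
school_means = [statistics.mean(observed_gain() for _ in range(500)) for _ in range(2_000)]

print(f"students showing a 'loss' despite real growth: {negative_share:.0%}")
print(f"spread (SD) of a class's average gain:  {statistics.stdev(class_means):.1f}")
print(f"spread (SD) of a school's average gain: {statistics.stdev(school_means):.1f}")
```

With these assumed numbers, roughly a third of students appear to lose ground even though every one of them truly gained, and a classroom's average gain swings by several points from run to run, which is exactly why year-to-year scores for individual teachers are so unstable while multi-year, school-level averages are not.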
And then let's look at the premise of value-added itself: this is a method that assumes children are vessels into which teachers can simply add more volume over the course of a year of teaching. Any real teacher or student of education knows that this is not what teaching and learning actually look like. The people who espouse this policy are essentially treating children as if they were so many widgets on an assembly line, a view of education that is now nearly a century old--as old as the assembly line itself. You can see this attitude in the language they use--talk of teacher "efficiency" and "productivity." These are simply code words for spending less money to educate more students. Those who say that teacher quality is "the most important factor" in a student's education really mean that it is the most important *school* factor. More than that, they also mean that it is the most cost-efficient factor, far more cost-efficient than, say, decreasing class size, which has been shown to be effective especially for urban minority and low-income students. In other words, the recent attacks on teachers and their unions are actually a movement to find the cheapest way to "save" public education.
And so this is where jukin' the stats leads us: to the belief that there is an easy, common-sense way to fix public education, just as on The Wire the higher-ups want an easy, common-sense way to fix all of the problems of urban poverty--crime, the drug trade, schools, everything. But, if anything, we should be intensely suspicious of simple solutions and straightforward-seeming numbers. Teachers are, after all, professionals. Do people honestly think that millions of smart people who have been working on this problem for decades haven't thought of these simple solutions before?