More Meaningless Education Research
There is no shortage of dubious education research. Reports “proving” that new teachers are better than old, charter schools are better than non-charter schools, and graduate schools of education are useless seem to pop up frequently. If you have a loose grasp of statistics and the willingness to tell someone what they want to hear, chances are there’s funding available for your study.
So it was no surprise to see exam schools finally make their way into the discourse. One such study appeared in the New York Times, grabbing headlines with its claim that “the impact of attending an exam school on college enrollment or graduation is, if anything, negative.”
Exam schools grant admission based on a standardized test. By achieving a minimum score on the test (the school’s “cutoff”), the student can choose to attend the school. These public schools typically offer advanced courses and more rigorous instruction, and one would think that students would get a lot out of them. Not according to the authors of this study, who conclude that, in these schools, students’ “actual human capital essentially remains unchanged”. In jargon common to these kinds of studies, exam schools don’t add any value to the educational experience of students.
A cursory review of the study suggests some obvious problems, many of which are pointed out in the comments section of the original Times article. However, a close review of the study reveals something so absurd, it makes the study seem not so much flawed as intentionally misleading by design.
The basic premise of the study is to compare students who just make the cutoff for an exam school with those who just miss that cutoff. In theory, since these students have similar test scores, they start with similar levels of ability. Some of them enter the exam school, and some of them don’t. By comparing their later achievement, we can get a sense of what, if anything, attendance at the exam school adds.
Let’s say that Student 1 just makes the cutoff for Exam School A, and Student 2 just misses that cutoff and thus attends a different school. The study claims that Students 1 and 2 will go on to have similar SAT scores and have roughly the same chance of graduating college. That is, attending the exam school does not add any value for Student 1.
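The logic of this comparison, a regression-discontinuity design, can be sketched with a toy simulation. Everything below is hypothetical: the cutoff, the 10-point “value added,” and the outcome scale are made up purely to show how the design is supposed to work when an exam school does add value.

```python
import random

random.seed(0)

# Hypothetical students just above and just below Exam School A's cutoff:
# by construction the two groups start with nearly identical ability.
just_made_it = [random.gauss(85.5, 0.3) for _ in range(5_000)]
just_missed = [random.gauss(84.5, 0.3) for _ in range(5_000)]

def later_outcome(ability, value_added):
    """Later achievement (say, an SAT-like score): ability, plus whatever
    the attended school adds, plus noise."""
    return ability + value_added + random.gauss(0, 2)

# If Exam School A adds value (assume 10 points here) and the near-misses
# attend a school that adds nothing, the admitted group pulls ahead.
treated = [later_outcome(a, 10) for a in just_made_it]
control = [later_outcome(a, 0) for a in just_missed]

gap = sum(treated) / len(treated) - sum(control) / len(control)
print(round(gap, 1))  # roughly 11: the 10-point boost plus the ~1-point ability gap
```

Under the study’s logic, if no such gap appears in the real data, the exam school must not be adding anything. The next section is about why that inference fails.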
What the study doesn’t take into account is that the school Student 2 ends up attending is also likely to be an exam school! Student 2, who just missed the cutoff for Exam School A, might very well attend Exam School B, which has a lower cutoff. In the eyes of this study, however, Student 2’s success at Exam School B counts as evidence that exam schools don’t add value!
In the New York City system, where this study was conducted, this situation arises frequently. A student might miss the cutoff for one exam school but attend another exam school. Indeed, the authors themselves note that for one particular pair of schools, 40% of the students who miss the first school’s cutoff end up attending the second exam school. And when those students succeed, they all count as evidence against exam schools.
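Extending the same toy simulation shows how this contaminates the comparison. Assume, hypothetically, that every exam school adds the same 10 points, and let 40% of the near-misses (the study’s own figure for one pair of schools) land in Exam School B and receive the same boost. The measured gap then shrinks well below the true value added, even though nothing about the schools has changed.

```python
import random

random.seed(1)

VALUE_ADDED = 10   # hypothetical: every exam school boosts outcomes by 10 points
SHARE_AT_B = 0.4   # ~40% of near-misses attend another exam school (study's figure)

just_made_it = [random.gauss(85.5, 0.3) for _ in range(5_000)]
just_missed = [random.gauss(84.5, 0.3) for _ in range(5_000)]

def later_outcome(ability, at_exam_school):
    boost = VALUE_ADDED if at_exam_school else 0
    return ability + boost + random.gauss(0, 2)

treated = [later_outcome(a, True) for a in just_made_it]
# The "control" group is contaminated: 40% attend Exam School B and
# receive exactly the same boost as the treated group.
control = [later_outcome(a, random.random() < SHARE_AT_B) for a in just_missed]

measured_gap = sum(treated) / len(treated) - sum(control) / len(control)
print(round(measured_gap, 1))  # roughly 7, well short of the true 10-point boost
```

The bias grows with the contamination: if every near-miss attended some exam school, the measured gap would collapse to the ability difference alone, and the design would report “no value added” no matter how much value the schools actually add.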
There are other serious issues regarding this study’s methodology, but to me this is the most significant. Moreover, the obvious gap between what was actually done and what was purported to be done is very disturbing.
I wonder how closely such studies are read, and I wonder what this has to say about the state of current education “research” in general.