More Meaningless Education Research

There is no shortage of dubious education research.  Reports “proving” that new teachers are better than veterans, that charter schools outperform traditional public schools, and that graduate schools of education are useless seem to pop up frequently.  If you have a loose grasp of statistics and the willingness to tell someone what they want to hear, chances are there’s funding available for your study.

So it was no surprise to see exam schools finally make their way into the discourse.  The following study appeared in the New York Times, grabbing headlines with its claim that “the impact of attending an exam school on college enrollment or graduation is, if anything, negative.”

http://artsbeat.blogs.nytimes.com/2011/08/16/thinking-cap-angst-before-high-school/

Exam schools grant admission based on a standardized test.  By achieving a minimum score on the test (the school’s “cutoff”), a student earns the right to attend the school.  These public schools typically offer advanced courses and more rigorous instruction, and one would think that students would get a lot out of them.  Not according to the authors of this study, who conclude that, in these schools, students’ “actual human capital essentially remains unchanged”.  In jargon common to these kinds of studies, exam schools don’t add any value to the educational experience of their students.

A cursory review of the study suggests some obvious problems, many of which are pointed out in the comments section of the original Times article.  However, a close review of the study revealed something so absurd, it makes the study seem not so much flawed as intentionally misleading by design.

The basic premise of the study is to compare students who just make the cutoff for an exam school with those who just miss that cutoff.  In theory, since these students have similar tests scores, they start with similar levels of ability.  Some of them enter the exam school, and some of them don’t.  By comparing their later achievement, we can get a sense of what, if anything, attendance at the exam school adds.

Let’s say that Student 1 just makes the cutoff for Exam School A, and Student 2 just misses that cutoff and thus attends a different school.  The study claims that Students 1 and 2 will go on to have similar SAT scores and have roughly the same chance of graduating college.  That is, attending the exam school does not add any value for Student 1.

What the study doesn’t take into account is that the school Student 2 ends up attending is also likely to be an exam school!  Student 2, who just missed the cutoff for Exam School A, might very well attend Exam School B, which has a lower cutoff.  In the eyes of this study, however, Student 2’s success at Exam School B counts as evidence that exam schools don’t add value!

In the New York City system, where this study was conducted, this situation arises frequently.  A student might miss the cutoff for one exam school but attend another exam school.  Indeed, the authors themselves note that, in the case of one particular school, 40% of the students who miss its cutoff end up attending a different exam school.  And when those students succeed, they all count as evidence against exam schools.
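The flaw is easy to see in a toy simulation.  All the numbers below are invented for illustration: suppose attending any exam school really does add a fixed boost to a student’s later outcomes, and suppose students who just miss School A’s cutoff still clear School B’s lower cutoff.  Comparing students just above and just below School A’s cutoff then measures almost nothing:

```python
import random

random.seed(0)

# Toy model (all numbers invented): a student's later outcome is their
# baseline ability plus a fixed boost if they attend ANY exam school.
CUTOFF_A = 85   # hypothetical cutoff for Exam School A
CUTOFF_B = 75   # hypothetical (lower) cutoff for Exam School B
BOOST = 10      # assumed value added by attending an exam school

def later_outcome(score):
    # A student who misses School A's cutoff but makes School B's
    # still attends an exam school -- and still gets the boost.
    attends_exam_school = score >= CUTOFF_B
    return score + (BOOST if attends_exam_school else 0) + random.gauss(0, 2)

# Simulate students just above and just below School A's cutoff.
just_above = [later_outcome(s) for s in (85, 86, 87) for _ in range(200)]
just_below = [later_outcome(s) for s in (82, 83, 84) for _ in range(200)]

gap = sum(just_above) / len(just_above) - sum(just_below) / len(just_below)
print(f"Apparent 'value added' by School A: {gap:.1f} points")
# The gap reflects only the small difference in baseline ability
# (about 3 points here), not the 10-point exam-school boost, because
# the comparison group received the boost too.
```

Even though every exam school in this toy world adds 10 points, the cutoff comparison finds a gap of only about 3 points, which is the baseline ability difference.  The study’s design would read that as “exam schools add no value.”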

There are other serious issues regarding this study’s methodology, but to me this is the most significant.  Moreover, the obvious gap between what was actually done and what was purported to be done is very disturbing.

I wonder how closely such studies are read, and I wonder what this has to say about the state of current education “research” in general.

What Costs More in 2011?

This is a nice representation of Consumer Price Index data from the outstanding FlowingData.com:

What costs more in 2011?

Charting the change in prices from March 2010 to March 2011, transportation and education prices went up the most, while communication and apparel dropped a bit.  A nice feature of this infographic is that it includes inflation as a benchmark; it’s easy to see here that even though food prices increased, their increase was consistent with inflation overall.
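The benchmark comparison described above is simple arithmetic.  Here is a sketch using invented index values (placeholders, not the actual BLS figures behind the infographic) and an assumed overall inflation rate:

```python
# Hypothetical CPI index levels (invented for illustration, not BLS data):
# each entry is (March 2010 index, March 2011 index).
cpi = {
    "Transportation": (190.0, 209.0),
    "Education":      (129.0, 135.5),
    "Food":           (219.0, 224.9),
    "Apparel":        (120.0, 119.0),
}
OVERALL_INFLATION = 0.027  # assumed overall CPI change for the same period

def pct_change(old, new):
    """Fractional change from old to new, e.g. 0.05 means +5%."""
    return (new - old) / old

for category, (then, now) in cpi.items():
    change = pct_change(then, now)
    verdict = "outpaced" if change > OVERALL_INFLATION else "did not outpace"
    print(f"{category:14s} {change:+6.1%}  ({verdict} inflation)")
```

With these made-up numbers, food rises but stays just under the overall inflation benchmark, which is exactly the kind of distinction the infographic makes visible at a glance.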

The creator of this infographic wonders why the government itself doesn’t do what FlowingData.com does:  namely, why doesn’t the Bureau of Labor Statistics create simple, easy-to-understand graphics like this with its data, rather than just publishing a text file full of numbers every month?

That’s a good question, and a good opportunity to get students involved!  Making data easier to understand means making data more useful, so take a look at the Bureau of Labor Statistics (http://www.bls.gov/), the Centers for Disease Control (http://www.cdc.gov/), or some other government agency.  Grab some public data, create some visual representations, and make the data understandable!  And use FlowingData’s great work here, and elsewhere, as a guide.
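Students don’t even need graphing software to start: a few lines of code can turn a column of numbers into something readable.  This sketch (with invented percentages that roughly echo the infographic’s story) prints a text bar chart, with inflation included as its own benchmark bar:

```python
# Invented year-over-year percent changes -- illustrative placeholders,
# not the actual CPI figures.  "Inflation" is included as a benchmark.
changes = {
    "Transportation": 9.9,
    "Education": 5.0,
    "Inflation": 2.7,
    "Food": 2.4,
    "Apparel": -0.6,
    "Communication": -1.2,
}

# Sort from largest increase to largest decrease, then draw a bar of
# '#' characters proportional to the size of each change.
for name, pct in sorted(changes.items(), key=lambda kv: -kv[1]):
    bar = "#" * round(abs(pct) * 3)   # 3 characters per percentage point
    sign = "+" if pct >= 0 else "-"
    print(f"{name:14s} {sign}{abs(pct):4.1f}%  {bar}")
```

It’s crude, but it already answers the question “what went up the most, and was it more than inflation?” at a glance, which is the whole point of the exercise.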

Math and Science Education: State-by-State Rankings

This report from the American Institute of Physics ranks U.S. states by their proficiency in Math and Science education:

https://www.aip.org/press_release/state_outcomes_math_science_education_reveal_big_disparities.html

The study uses student performance in physics and calculus courses (measured by various standardized exams) as well as teacher certification requirements to rate each state.

Massachusetts comes in first, with New York placing a respectable fifth.  Mississippi is dead last by a wide margin.

I originally came upon this story in the Huffington Post, and readers posted some interesting responses.  One comment compared and contrasted these rankings with the average math SAT scores for each state.  And another person remarked on how closely these state rankings in math and science education align with state voter preference!

Analysis of NBA Finances

This is a comprehensive and insightful look into the NBA’s claims of financial distress from Nate Silver:

https://fivethirtyeight.blogs.nytimes.com/2011/07/05/calling-foul-on-n-b-a-s-claims-of-financial-distress/

As the NBA prepares to battle the players’ union over revenue, the league has made several public claims about how it has been losing money for years.

Silver takes a deep look into those claims.  He crunches the numbers and compares player revenue as a share of league revenue across the four major sports leagues; he looks at salary growth relative to league growth; and he also discusses some of the dubious accounting tricks teams and leagues use to make profits disappear!
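The first of those comparisons, the players’ share of league revenue, is just a ratio.  A minimal sketch, with figures invented for illustration (these are not Silver’s numbers):

```python
# Hypothetical league financials, in millions of dollars -- invented
# placeholders, not the actual figures Silver analyzes.
leagues = {
    "NBA": {"revenue": 3800, "player_costs": 2100},
    "NFL": {"revenue": 8500, "player_costs": 4300},
    "MLB": {"revenue": 7000, "player_costs": 3200},
    "NHL": {"revenue": 2900, "player_costs": 1600},
}

for league, figures in leagues.items():
    share = figures["player_costs"] / figures["revenue"]
    print(f"{league}: players receive {share:.0%} of league revenue")
```

Lined up this way, the leagues become directly comparable, which is what lets Silver test whether the NBA’s player costs are really out of line with its peers.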

As usual from Nate Silver, this is a very interesting and readable application of mathematics and statistics.  His conclusion is summed up best by a recent message from @fivethirtyeight on Twitter:  “If David Stern really thinks the NBA lost $370 million last season, shouldn’t he have fired himself?”
