I have no idea, but two gentlemen from the Johnson School of Management at Cornell and one from the Economics department at Purdue seem to think so. They have written a paper, which has found interest at Slate, which touts it as an “exclusive.”
Waldman’s (and others’) paper, which is a couple of years old, is a prime example of how to get carried away with an idea, so it is worthwhile to review it. It’s best to download the paper so you can follow along (it is freely available).
The genesis of the idea was noticing that autism rates at birth in the state of California started to increase in the early 1970s, picking up pace until 2000, when their data stops (see their Figure 1). It is true to say that something caused this increase. But what?
There is no way to know, but we can posit causes and then test them. The best way to do that is by direct measurement: propose a cause, then design an experiment or collect data in which the cause was controlled and the effect happened. That is difficult to do in the case of autism, of course, since you won’t know a child is afflicted until some time after his birth. But, of course, it would not be ethical to let a cause stay in place if you suspected it would lead to autism. It is also not clear when the cause, or causes, whatever they might be, have to manifest themselves. That is, the same cause might be in place for two children, but miss its timing, so to speak, in the first case and get it right in the second. A plausible biological mechanism for the cause, one that fits in with other known medical science, must also be in place. In short, this kind of investigation is not impossible, but it is difficult and must proceed by, if I can use the pun, baby steps.
Another way to assign a cause is by guessing. This is the easiest method by far. It starts with some guys sitting around a table and saying something like, “How about watching TV? That can’t be good. Especially if kids watched old Three Stooges reruns.” Something like that happened here. Waldman said “I asked around and found that medical researchers were not working on this [possible connection], so accepted that I should research it myself.”
To demonstrate that watching TV is a possible cause involves nothing more than showing that watching TV is correlated (a word I use here in its non-technical, plain English sense) with autism. If watching TV had something to do with autism, then the two pieces of data would be correlated, it is true. But if TV had nothing to do with autism, the two pieces of data might still be correlated. The kicker is, there is no way to know, by just looking at the TV/autism data, whether this correlation was real or spurious. Of course, autism might be correlated with lots of things, none of which were its cause. This fact means we are on thin ice, and the slightest misstep will cause us to fall through.
Since autism rates have increased since the 1970s, anything that also increased, even at different rates, during that same period will be correlated with autism. As the Slate article cautions, “petroleum use also rose during the period but is unrelated to autism.” I agree that oil and autism are not related, but you must understand that this is not something which could be learned by examining the data. Oil use and autism are correlated. It is only an extra-statistical judgment that tells us the relationship is silly.
In medicine, there is something called dose response, a fancy way of saying that more of the drug leads to stronger effects (or increased side effects). What might this dose-response be with TV and autism? Well, watching more TV is an obvious culprit: the more hours spent in front of the tube, the greater the chance of developing autism—so speculated the researchers.
But how can you tell how much TV all these kids watched? You can’t. There is no way to go back to 1970 and count how many hours each baby watched TV. This is a dilemma, because we would really like to test the dose-response. Perhaps there is a proxy? A proxy is a stand-in variable that is so strongly associated with hours of TV watched that it’s almost as good as the real thing. Can you think of any?
How about precipitation? Sure, rain and snow. After all, when it rains, what else is there to do but watch TV? Actually, lots, and when it snows, there’s even more. But this is the proxy chosen by the researchers (their Figure 6 will hold some interest for readers following global warming).
They plotted maps by county for California, Oregon, and Washington, coloring in counties that had more than median precipitation (from 1990-2001) and then coloring those with higher than median autism rates. These colored counties tended to be in the same spots, and this overlap is what led them to the conclusion that watching TV causes autism. Case closed.
But can we think of alternate explanations? Take a look at their Figures 3-5 and you’ll see that the high precipitation counties are all in coastal areas, which are the same places that people choose to live. That is, there is, in these three states, a vastly greater population density on the coasts than in the interior, and it is in these regions where the autism rates are higher. Populations in these high-density areas are also more heterogeneous, with greater disparities in behavior, income, health care access, and on and on. Wouldn’t it be more likely that one or more of these disparities were the cause or causes of autism rather than the county’s precipitation rate?
That’s my bet, but again this is an extra-data judgment. The authors of the paper do go on to calculate “linear regression models”, which tie all these and some other variables together. These models carry the assumption that everything is related by a straight line, which is probably not true or even a good approximation. But even if the variables were straight-line related, the models cannot fix the problem that any other variable that increased since the 1970s—like oil use—would have also given significant results. These models only quantify the idea of correlation, after all.
What happens, then, is that these great, complicated tables, built with sophisticated software, have given the authors the illusion of certainty. A reporter picks up the story, and then “activists” get involved, and, well, you know the rest.
(Here’s a cute article on other scares, most brought about in the same way as the TV/autism research was done.)