The gullible media and the chocolate factory
Briefly stated, the Gell-Mann Amnesia effect works as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the ‘wet streets cause rain’ stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story—and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about far-off Palestine than it was about the story you just read. You turn the page, and forget what you know. — Michael Crichton (Why Speculate?)
John Bohannon says he fooled newspapers and magazines around the world into reporting on bad chocolate science. Given that he’s claiming to have lied to the world, it’s understandable if our first response is: why should we trust you now?
But his description of what they did makes perfect sense. Basically, they measured a whole bunch of things. If you measure a whole bunch of things, purely by random chance some of your measurements will look significant even though, taken as a whole, they aren’t.
Think about it this way: if you toss one marble at a cup twenty feet away, it’s pretty impressive if you get it in. If you toss a hundred marbles at the cup, it’s no longer so impressive if one of them makes it in.
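To put numbers on the marble toss, here is a minimal simulation sketch in Python. The specifics are my assumptions, not Bohannon’s protocol: 18 measured outcomes, 8 subjects per group, and a rough t cutoff. Every measurement is drawn from the same distribution, so any “significant” difference is pure noise, yet most simulated studies still find one:

```python
# Minimal sketch: many outcomes measured on two groups drawn from the
# SAME distribution, so every "significant" result is a false positive.
import random
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

random.seed(1)
n_studies, n_outcomes, n_per_group = 1000, 18, 8
studies_with_false_positive = 0
for _ in range(n_studies):
    for _ in range(n_outcomes):
        treated = [random.gauss(0, 1) for _ in range(n_per_group)]
        control = [random.gauss(0, 1) for _ in range(n_per_group)]
        # |t| > 2.14 is roughly p < 0.05 at ~14 degrees of freedom
        if abs(welch_t(treated, control)) > 2.14:
            studies_with_false_positive += 1
            break  # one lucky marble is enough to write the press release
print(studies_with_false_positive / n_studies)  # roughly 0.6
```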
Bohannon ran the study the way too many researchers do nowadays: just for the hell of it, then reporting afterward as if whatever they found was what they had been looking for. This is a big, big mistake wherever randomness plays a part. When I was a young psychology student at Cornell, I was taught never to run a study without knowing what, specifically, I was looking for and why. Not even to find things to run a study about later.
That is, they could have run this study, discovered that among all the things they were measuring their chocolate subjects lost weight faster, and then designed a new study around that finding to determine whether chocolate causes dieters to lose weight faster. That, at least, would have some significance to it.
But it still wouldn’t be science.
Science is: theory, prediction, test.
You can’t just go looking for things. You have to know why you are looking for that thing. You have to have a theory that predicts something. Then you test that prediction.
You might say, I don’t need a theory of gravity to know that letting go of rocks means they’ll fall.
You’re right, you don’t. But that’s not science, or at least, it isn’t the scientific method.
Galileo had a theory: things must fall at the same rate, because otherwise you could connect two objects with a string and both would fall faster just because they were connected. As theories go, that’s not Einstein-level, but it did allow a prediction: drop two different items from a tall height, and they should fall with the same acceleration. He tested that theory; the test matched his prediction.1
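For the curious, Galileo’s argument can be written out as a short proof by contradiction. This is my own sketch of the reductio in symbols, not his wording:

```latex
% Galileo’s reductio, sketched in symbols (my formalization, not his wording).
% Assume fall speed $v(w)$ strictly increases with weight $w$; take $w_1 < w_2$.
\[
  v(w_1) < v_{\text{tied}} < v(w_2)
  \quad\text{(the light body retards the heavy one)}
\]
\[
  v_{\text{tied}} = v(w_1 + w_2) > v(w_2)
  \quad\text{(the tied pair outweighs the heavy body alone)}
\]
% Both cannot hold, so $v$ cannot depend on $w$: everything falls at one rate.
```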
What John Bohannon describes is all too common in research today. Researchers want to know the “effects” of things. So they measure all sorts of “effects”. Sometimes they don’t even plan to measure an effect, but they gather the data as a side effect of some other measurement, and then that side effect turns out to be the only “significant” finding.
Significant here, however, is a misnomer. Statistical significance requires that you were looking specifically at that one effect and had already predicted what form it would take. Otherwise you’re relying, as Bohannon did, on chance to provide something you can publish.
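The arithmetic behind that reliance on chance is simple. If each measured outcome has a 5% chance of producing a spurious “significant” result, and a study measures, say, 18 outcomes (a number chosen for illustration; treating the tests as independent is another simplification), then:

```latex
P(\text{at least one false positive}) \;=\; 1 - (1 - \alpha)^m
\;=\; 1 - 0.95^{18} \;\approx\; 0.60
```

Better than even odds of something publishable, before a single subject eats a single square of chocolate.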
Further, because human testing is a pain in the ass—that’s a technical term—sample sizes are almost always smaller than we’d like anyway.
Without any idea of why you found what you did, there is no way for someone else to disprove your theory. Your theory was basically that if you run a whole bunch of tests, something interesting will pop up among that population. If someone else runs a similar test and comes up with different results, it doesn’t disprove your theory because you don’t have one.
The most important lesson, though, is that reporters are simply not intellectually equipped to sort out the good science from the bad. The news media is notoriously bad at getting even basic reporting right (the Gell-Mann amnesia effect is the most famous example), but the scientific method is an alien thing to most journalists. Good science is about numbers, replicability, and a systematic process. Journalists think in terms of anecdotes, black swans, and intentions.
Even Bohannon, whose writing is engaging, clear, and precise, gets it somewhat wrong at the end. After explaining why the shotgun approach makes a hash of experimental results, he then says:
If a study doesn’t even list how many people took part in it, or makes a bold diet claim that’s “statistically significant” but doesn’t say how big the effect size is, you should wonder why.
Those are important, especially the difference between statistically significant and practically significant. But he already identified the most important criterion for judging whether a study is “statistically significant”: how many effects they were looking for ahead of time vs. how many effects they reported on. He even went through the math to explain why most “significant” results are in fact nowhere near significant when taken in context of everything else the researchers measured.
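His statistically-vs-practically-significant point is also easy to sketch; the numbers below are entirely invented. Make the sample huge and even a trivial difference clears the p < 0.05 bar, while the effect size shows it doesn’t matter:

```python
# Minimal sketch: a tiny true difference becomes "statistically significant"
# once the sample is huge. All numbers here are hypothetical.
import random
import statistics

random.seed(7)
n = 20_000
diet    = [random.gauss(0.05, 1.0) for _ in range(n)]  # true effect: 0.05 units
control = [random.gauss(0.00, 1.0) for _ in range(n)]

mean_diff = statistics.mean(diet) - statistics.mean(control)
se = (statistics.variance(diet) / n + statistics.variance(control) / n) ** 0.5
t = mean_diff / se                                 # typically around 5 here
d = mean_diff / statistics.stdev(diet + control)   # Cohen's d, around 0.05

print(f"t = {t:.1f} (past the ~1.96 cutoff: 'significant')")
print(f"Cohen's d = {d:.3f} (well below even a 'small' effect of 0.2)")
```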
That ratio, how many effects were sought versus how many were reported, is the one thing that requires shoe-leather reporting, because it will almost always be left out of the study’s press release even if sample size and effect size are presented.
Now if you’ll excuse me, I need to go over to Trader Joe’s and get some dark chocolate. Because I like it, not because it’s healthy.
1. Assuming, for the moment, that our legend of the event has come down correctly.
- Chocolate cookies (for breakfast)
- We woke up this morning and then we stayed in bed for a few hours. And when we finally did get up, we desperately needed breakfast. And sugar. And chocolate. What we needed were chocolate cookies for breakfast. Made quickly with a minimum of fuss.
- Galileo’s Leaning Tower of Pisa experiment at Wikipedia
- “Imagine two objects, one light and one heavier than the other one, are connected to each other by a string. Drop this system of objects from the top of a tower. If we assume heavier objects do indeed fall faster than lighter ones (and conversely, lighter objects fall slower), the string will soon pull taut as the lighter object retards the fall of the heavier object. But the system considered as a whole is heavier than the heavy object alone, and therefore should fall faster. This contradiction leads one to conclude the assumption is false.”
- I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here's How.: John Bohannon
- “We ran an actual clinical trial, with subjects randomly assigned to different diet regimes. And the statistically significant benefits of chocolate that we reported are based on the actual data. It was, in fact, a fairly typical study for the field of diet research. Which is to say: It was terrible science. The results are meaningless, and the health claims that the media blasted out to millions of people around the world are utterly unfounded.” (MediaGazer thread)
More confirmation journalism
- False positives, the Internet, and the grievance media
- The ability of the modern mass media to search the entire population ensures that most stories are wrong.
- The Elements of Journalism
- Now that the Internet empowers readers to check the veracity of news reports, journalists need to come up with more and better justifications for their bias.
- Confirmation journalism and the death penalty
- Iterative journalism is like the Queen of Hearts in Alice in Wonderland: “Sentence first, verdict after.” The Elements of Journalism praises David Protess’s project that railroaded a mentally disabled man into prison for fourteen years, because it served their bias.
More media
- Spin Cycle: Inside the Clinton Propaganda Machine
- This is basically the story of how the Clinton administration gave the media an excuse for not following up.
- Amusing Ourselves to Death: Public Discourse in the Age of Show Business
- Amusing Ourselves to Death is a disjointed effort to prove that the speed of modern communications is killing us, but it ignores basic features of modern communications, such as the ability of both sides to respond. And to the extent that modern communications empowers the individual, Postman sees that as an evil, preferring the bundling of individuals by self-appointed elites, as in the age of Tammany Hall.
- The evolution of news to candy
- When you remove the news from the newsstand, what do you get?
- The Powers That Be
- David Halberstam’s tome about the growth of media power is repetitive and burdensome; it circles itself like an overweight prizefighter trying to gain the advantage of the mirror, but like the aging boxer it is filled with anecdotal glory.
More scientific method
- Our Cybernetic Future 1954: Entropy and Anti-Entropy
- In 1954, Norbert Wiener warned us about Twitter and other forms of social media, about the breakdown of the scientific method, and about government funding’s capture of scientific progress.
- The scientific creed
- If science is your religion, you have chosen the hardest religion of all. If science is your religion, you don’t prove yourself right. You prove yourself wrong.
- Climate priests cry wolf one more time?
- In science, if your theory’s predictions don’t happen, you need a new theory. In religion, if your beliefs predict something that doesn’t happen, you just keep moving that prediction further into the future.
- Global warming vs. oiled dolphins
- Critics of catastrophic anthropogenic global warming are more dangerous than oil execs who kill dolphins, and need to be buried deeper than two-million-year-old bones. But this attitude makes CAGW a non-science. Science requires criticism or it isn’t science. Science-oriented media outlets are doing CAGW scientists a disservice by protecting them from competing theories.