Best of The Low Carb Blogs
More and more people from the "low carb world" are taking their thoughts to the web in
the form of "blogs" (short for weblogs). And they're making a lot of sense. In fact,
blogging — an activity that's reaching phenomenon status — is probably the best
way to get a message "out there."
So each month, we'll be bringing you the "Best of the Low Carb Blogs." The topics
won't always be about low carb per se. We're simply choosing those entries that we, at
Low Carb Luxury, find to be buzz-worthy.
This month, we feature an entry from Jonny Bowden's Blog. Jonny
has a unique gift for making complex subjects clear and interesting, and has changed the
landscape of diet, fitness, and healthy living.
Jonny is a good friend of ours, and sits on Low Carb Luxury's Expert Panel. Visit
his blog often to read all that he has to offer!
Media Reporting 101
"Seatbelts Found to be Ineffective in Two Major Studies"
Copyright © August 2006 Jonny Bowden and Low Carb Luxury
Suppose you opened the newspaper tomorrow and read the above headline.
You read a little deeper and you find the following: "Seatbelts were ineffective in preventing some of the most important causes of death, researchers said today. A study, published today in the American Journal of Anti-Seatbelt Research, found that seatbelt users were no less likely to die of cancer, heart attacks or hepatitis than non-seatbelt wearers. In a second, related study, it was found that seatbelts did not offer any protection from divorce."
But wait... there's more. How about this summary: "Seatbelts found ineffective in preventing three major causes of death."
This is exactly what happens every day when the media reports vitamin research. Research which is often — though not always — funded by drug companies. And which is always published in a medical culture (Journal of the American Medical Association, New England Journal of Medicine) in which pharmaceuticals are king and contempt for "natural" cures is rampant. (I don't recall seeing any attention-grabbing headlines on the hundreds of pro-nutrient studies published weekly in journals like the American Journal of Clinical Nutrition.)
The recent spate of headlines proclaiming that some of our most important vitamins "don't work" has driven me mildly mad. Why? Well, for one thing, the information is dead wrong. For another — and this is the one that's more frustrating — explaining why it's dead wrong takes more than ten words and requires a modicum of sophistication about how science works and how studies are done.
Few people have time to read the small print, and fewer still — unless you've been trained in statistics — would understand it. Everyone, however, can — and does — form a strong impression from the headline "Omega 3's Found Useless Against Heart Arrhythmias" (which can easily be shortened to "Omega 3's Found Useless" and translated over coffee to "Hey Joe, did you hear? They found out that that fish oil stuff doesn't do anything!").
And while I know that you, dear reader, would never make such a leap, I'll bet you know someone who does. Only always.
When I used to teach personal trainers about how to interpret research results, I'd use the following example: "Suppose," I'd say to the class of muscle-heads, "I want to find out which of two particular exercise routines is better at building muscle. Let's divide the room in half. Half of you — everyone to my left — will try exercise routine number one for a few weeks, everyone on my right will try exercise routine number two. Sound OK?" (Actually it's very far from OK, but let's not get into that for now. My trainers thought it was just fine.)
Then I'd say: "Now suppose, right before we divide the room in half, a dozen guys, average height 6'3", average weight 260, average body fat 4%, all of whom are competing in the Mr. Olympia bodybuilding competition next door at the Beacon Theatre, happen to wander into our classroom and sit together on the left side of the room. What just happened to my experiment?"
That they got.
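The classroom example above can be sketched as a tiny simulation (a hypothetical illustration of the point, not anything from an actual study): even when two routines are equally effective, letting a group of outliers self-select into one side makes that side's average look dramatically better.

```python
import random

random.seed(42)

# Hypothetical numbers for illustration only.
# Typical trainees: average gain around 2 lbs over the study period.
trainees = [random.gauss(2.0, 0.5) for _ in range(50)]
# A dozen elite bodybuilders who wandered in: average gain around 8 lbs.
bodybuilders = [random.gauss(8.0, 1.0) for _ in range(12)]

# Non-random assignment: all the bodybuilders sit on the left side,
# so they all end up in the group testing routine number one.
group_one = trainees[:25] + bodybuilders
group_two = trainees[25:]

mean_one = sum(group_one) / len(group_one)
mean_two = sum(group_two) / len(group_two)

# Routine one looks far better -- yet the routines were identical.
print(round(mean_one, 2), round(mean_two, 2))
```

Nothing about the routines differs here; the entire gap between the two group averages is manufactured by who happened to sit where. That is why how subjects are assigned and matched matters as much as what is being tested.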
How a study is done — what criteria were used to measure success, how the subjects were matched, what variables were studied — makes all the difference in the world. And, in case you don't feel like reading further, let me cut to the proverbial chase: you should completely ignore such recent headline grabbers as "B Vitamins Don't Prevent Death," "Omega-3's Useless for Preventing Arrhythmias" and "Calcium Doesn't Help Build Strong Bones." They're pretty much cut from the same cloth as "Seatbelts Don't Work."
Which they don't. If you're measuring heart disease.
Let me set the stage for you:
Suppose you were reading the "car ratings" issue of Consumer Reports, where they tell you the "Top Ten Cars for 2006." Which you really want to know, because you're about to buy a car and would like to know what the "best" ones are. So far, so good. Now suppose I told you that the criteria used for determining the "best" car included what station the car's radio was tuned to. Would that influence your opinion of the results? I hope it would. Why? Because obviously what station the car's radio is on has absolutely nothing to do with anything. But most people do not know — or care — what the "fine print" of a study says, or, in our example, what criteria were used to determine the "best" car. They just read the ratings. Or, in the case of a study, the CNN headline that "summarizes" the results, written by a reporter who has no more science literacy than the average high school student.
Now, you might say, the people who do these studies are really smart. They wouldn't make such silly mistakes as using criteria analogous to "what station the car radio is turned to." Well, I can understand your feelings, but I respectfully submit that the reason you think that is because you don't hang around with enough researchers. They can and do make mistakes just like that, and just as unforgivable. And when economics gets into the mix — i.e., when the studies are funded by people who have a vested interest in the results — the "mistakes" or "omissions" are all the more maddening.
As Upton Sinclair once said, "It is difficult to get a man to understand something when his salary depends upon his not understanding it."
Take home point: There are an infinite number of "data points" when you're collecting statistics. What you pay attention to, what you underline, what you take home is a subjective act of interpretation. Statistics are simply numbers. They need to be organized and interpreted. Interpretation is never objective, and always dependent on many criteria, such as what you consider important and what you're looking at. What you hear on the news is almost always interpretation. Contrary to conventional belief, the "facts" do not speak for themselves — they require an interpreter to extract conclusions from them. (Facts: We've built some hospitals in Iraq and they just had elections. Interpretation: The war in Iraq is going well. Facts: We've lost 2500 people and 47% of the population thinks it's fine to kill Americans. Interpretation: The war in Iraq is going horribly.) Pick your numbers, draw your inferences.
Welcome to the world of statistics. Or as they like to say, "God is in the details."
Back to the vitamin studies.
Recently, a headline proclaimed that fish oil did not reduce the risk of serious abnormal heart rhythms, leading to the widely reported sound bite that omega-3's "didn't work." Well, by the criterion of reducing serious abnormal heart rhythms, they don't. And seat belts don't reduce the risk of divorce. Does that mean we shouldn't wear them?
No one ever said that omega-3's reduce the risk of serious abnormal heart rhythms. What omega-3's do is prevent strokes and heart attacks, reduce both systolic and diastolic blood pressure, reduce inflammation, and improve mood. To "evaluate" their use by looking at a criterion that's fairly irrelevant and then — even worse — proclaiming them "useless" based on their inability to affect that criterion, is like deciding Kobe Bryant is a terrible athlete because he can't play tennis. Or that a car doesn't deserve a high rating because its radio station is tuned to Heavy Metal 101.
But don't get me started.
Visit the Jonny Bowden Solutions website.