In 2010, thanks in large part to Christopher McDougall’s Born to Run, barefoot running became the newest running trend. Everywhere I went I heard about the health benefits of running barefoot. I heard it all, from “You’ll never get injured again!” to “You’ll be so fast!” And research seemed to back up the barefoot running phenomenon, yet I remained stubbornly unwilling to try. I was a bit relieved, then, when other research emerged that contradicted the earlier findings, but not before Vibram sold millions of its five-finger minimal running shoes. Those later findings resulted in a class-action lawsuit against the company.
This is just one example of companies and other interest groups using purported research to promote an agenda and get into our pocketbooks. We need to examine all the claims we see every day in running magazines and on blogs, especially those made by marketers who say we must do this or use that product if we want to be healthy, high-performing runners.
Why do we need to give a critical eye to research? The media report on science to gain eyeballs, so it’s not always in their interest to report good science (think: clickbait). Marketers report on, spin, or even manufacture science (I’m looking at you, Gatorade) to sell their products. Think about all the things the media and marketers have claimed, citing science, that we need in order to stay healthy and perform our best:
I’m not saying either way that any of these things are good or bad. In fact, I swear by a few of them as an experiment of one (foam rolling and sports massages), but not all of them (orthotics and compression socks). The research and subsequent articles espousing the benefits of these products are often contradicted by one or more studies. Naturally, the company that makes those products doesn’t show you that research.
For example, while we see many of our favorite athletes rocking knee-high compression socks, the research on the benefits of these garments is mixed. One study failed to find significant differences in performance between sessions with and without compression. Another group found no differences between groups during a fatiguing single-leg hopping task. Researchers have had a little more luck with cyclists, citing limited but likely physiological benefits.
Additionally, most of us runners have been inundated with comments from well-meaning friends citing “research” that running is bad for us, something we sense not to be true. Running isn’t shaking my uterus loose, my heart is quite healthy, and my knees are in fantastic shape. My point? We need to take so-called running research with a grain of salt.
Here’s how to investigate the validity of scientific claims about running products and practices.
Find the studies
Sometimes there is no study directly on point. For instance, when we were researching the claims some cryotherapy proponents make about how it can boost metabolism and contribute to weight loss, we could not find studies on cryotherapy and weight loss. However, we did find studies on the benefits of cold therapy in recovery from hard workouts that mention metabolic effects. Were these the studies the marketers were using to make their claims? Probably.

One way to find the research behind the claims is a simple Google search. Even with my access to university databases, I always go to Google first to see what is being said about the topic. Most of the time, googling the topic, say with the keywords “cryotherapy” and “athletic performance,” will pull up the different articles written based on the research. From there, you can see a range of articles beyond those promoting a single product or message. However, if your Google search isn’t pulling up any research, you might have to tap a friend in college, or one who works in higher education. They will likely have access to the databases where the journals publishing this research can be found, and they can guide you toward the research that will answer your questions.
Once you find a study that might say what the articles or marketers claim it says, analyze that study by answering some questions like:
Is correlation being stated as causation?
Carefully review whether the research is being used to state causation when, at best, the research only shows that two factors are related (correlation). For example, a study could find that 90% of runners eat ice cream and claim that the act of eating ice cream leads to running, which would be awesome, but is not, in fact, what the research found. Similarly, much of the research relating running to knee injuries is correlational (and has recently been contradicted). Those studies found that these injuries occurred frequently in runners, but they could not state that running (and not other factors, like prior injury and genetics) caused knee problems.
Who is included in the sample?
By sample, I mean the people who participated in the study. For example, if the researchers want to know how a certain supplement affects recovery time but they only talk to runners in their mid-20s, then the research really applies only to runners in their mid-20s, not to runners in their mid-50s or to teens.
How many people actually participated in the study?
If the researchers studied thousands of people, you probably won’t be as skeptical as you would be if the study included only 30 people. Based on the sample size, were enough people studied for you to feel it necessary to change your behavior? Most of the time, if relatively few people were studied, I’d say probably not.
Who funded the study?
Although valid research may be funded by a corporation, research funded by a company with a stake in the outcome should always warrant a deeper look and a healthy dose of skepticism.
What do the findings actually say?
A famous psychology researcher once said that research should make sense. What he meant is that most research isn’t shocking or life-changing, so we should be able to read it and think, “Yeah, I already knew that.” Use your logic. Do the findings make sense with your own experiences? If not, then take a deeper look and search for contradictory evidence before making any changes to your routine.
Is the research stated as incontrovertible fact?
Good research, no matter how large the study or how truly random the sample, will never state that something is fact. It will say things like “the data support the alternate hypothesis that …” or “research suggests that _____ happens,” but never things like “x causes y.” Why? Because no matter how many people the researchers talk to, unless they talk to every single person in a population (e.g., runners), they can’t know for a fact that something causes or predicts something else. It’s not good science to state a finding with absolute certainty.
What does the other research say?
Good research will include a list of other studies that have similar and contradictory findings. For similar research, it’ll summarize how the new findings fit into that research. And for the contradictory research, it’ll explain how the current study is different – and will often suggest that future research examine those contradictions. Because, as I stated above, good research is never fact. It will always state, usually explicitly, that more and better research should be done.
For the most part, good science won’t conclusively defy logic; it will reinforce what you already intuitively know to be true. If it doesn’t (and doesn’t fit into your lifestyle), keep on doing what you’ve been doing. But if it makes sense and maybe addresses a problem or concern you have, do a little legwork before you put your money where your mouth is.
What running research have you seen over the years that you questioned? How do you decide what research to believe?
*Thanks to Pesto, our PhD-to-be, who contributed to and reviewed this post for Type I and Type II errors.