
A study on weight training published by British researchers last month included questionable methodology. (Getty Images/iStockphoto)

Last month, researchers in Britain published a remarkable study showing that your genes may dictate whether you respond better to power- or endurance-style training. Doing the "right" training for your DNA would triple your gains in both strength and endurance, the results suggested.

As a science and health journalist, I live for such breakthroughs, which combine cutting-edge science with immediate practical impact. I planned to write about the study, but as I began canvassing other experts in the burgeoning field of exercise genetics, I encountered vigorous skepticism.

"The DNA test in question is one of many companies who make bold claims on what their products can do, but in reality there is no scientific evidence to support those claims," Tuomo Rankinen, a prominent exercise genetics researcher at the Pennington Biomedical Research Center's Human Genomics Laboratory in Louisiana, said in an e-mail.

"In other words, these products are complete nonsense, they have no predictive power regarding exercise responsiveness, and they are simply a waste of money."

Others expressed similar doubts, and raised questions about the study's design; the obscure journal in which it was published (Biology of Sport, the official journal of the Polish Institute of Sport); and the potential for conflicts of interest. Several of the study authors are employed by or affiliated with the company DNAFit, whose genetic test was used in the study and sells for £99 in Britain.

I figured the episode might serve as a good case study about the dangers of overhyped or biased science. I scrutinized the paper more closely, and asked the authors to send me some of the original data. But even under the magnifying glass, no obvious errors or misdirections leaped out. The study wasn't perfect, but it was no worse than many others that receive respectful media coverage.

The researchers, led by Nicholas Jones of the University of Central Lancashire, had assembled 123 young athletes to undertake an eight-week training program of one to two workouts per week in addition to their usual sports training. The athletes were split into two groups, both doing the same six strength-training exercises; the only difference was that one group did "power" workouts with 10 sets of two reps for each exercise, while the other group did "endurance" workouts with three sets of 10 to 20 reps for each exercise.

Before starting, each volunteer gave a spit sample for the DNAFit test, which assesses the presence of 15 gene variants that previous studies have linked to power or endurance performance, then combines them in a proprietary (and confidential) algorithm to classify each subject as better suited to power or endurance training. The DNA results were "double-blinded," hidden both from the athletes and from the researcher supervising the training.

Half of the power athletes were assigned to power workouts, while the other half did endurance workouts; the endurance athletes were similarly divided. After eight weeks, those whose training matched their genes had improved their countermovement jump (a measure of explosive power) and performance in a three-minute cycling test (a measure of endurance) by 6 to 7 per cent. In contrast, those whose training and genes were mismatched improved on both measures by 2 to 3 per cent.

So what's wrong with the study? There are certainly some methodological issues, such as a high dropout rate. Only 67 of the 123 volunteers completed the entire study, which could have skewed the results.

And the issue of potential bias has to be considered, even if you assume that the researchers were honest. "Lots of research has shown that the funding sources matter to results," notes Tim Caulfield, a professor of health law and science policy at the University of Alberta who has written extensively about direct-to-consumer genetic testing. "Conflicts of interest – operating largely unconsciously – can impact data collection, interpretation, and reporting."

But the biggest issue is that the results are so much more dramatic than previous studies of the 15 individual gene variants would predict. That's why, according to a position paper published by 24 leading scientists in the British Journal of Sports Medicine last year, "the general consensus among sport and exercise genetics researchers is that genetic tests have no role to play in talent identification or the individualized prescription of training to maximize performance."

In other words, Caulfield says, "we have a small study, that cuts against consensus, by a group with an interest in the results."

Still, in considering these arguments, I couldn't shake the feeling that I was prejudging the results based on my (and others') prior expectations, which isn't entirely fair. If DNAFit really has found a useful way of combining those 15 gene variants, how else is the company supposed to demonstrate it other than with a double-blinded, peer-reviewed study?

And the larger idea that different people respond best to different types of training is a powerful one, an inducement to experiment with different workouts even if you don't bother with a genetic test.

That's why, in the end, I've decided to write about the study, with caveats front and centre. It's the most interesting and provocative exercise genetics study I've seen in the past few years – even though the safest bet is still to assume that the prevailing (and skeptical) orthodoxy will turn out to be correct.

That's a mixed message that even Caulfield can endorse. "Taking off my 'cynical hat': intriguing and worth watching," he says, "but don't draw any big conclusions."