With new scientific studies about food and nutrition being reported daily, it's hard to know what to believe. The onslaught – much of it conflicting – can be frustrating even for the savviest consumers. Is saturated fat now good for us? (No.) Are calcium supplements bad for the heart? (Maybe.) Are low-carb diets best for losing weight? (No.) Should I stop eating gluten? (That depends on who you are.)

Scientists don't expect to find answers from individual studies and neither should we. Think of scientific research as a conversation between researchers that usually goes on for many years.

Before you change your diet – which could rob you of much-needed nutrients – it's important to judge which results are really important to you and which ones are simply interesting. And since most of us don't have the tools to dissect and critically analyze a research paper – let alone get our hands on a copy – here's a layman's guide to making sense of nutrition news stories.

Study design affects reliability

Different study designs have certain weaknesses, or biases, that can distract you from the truth.

Retrospective, or case control, studies look back at people's medical records and ask them questions about diet and lifestyle to find out if certain factors put them at more or less risk.

Retrospective studies provide clues but they're only as good as someone's memory. Can you accurately recall how often you ate oily fish – or took a multivitamin – over the past 10 years?

Prospective studies follow thousands of people forward for many years and gather diet, lifestyle and medical information at regular intervals. After a specified amount of time, researchers see if those who developed risk factors, or became ill, adhered more or less to a certain diet or consumed more or less of a food or nutrient than those who stayed healthy. Prospective studies uncover associations, and they're less prone to memory errors, but they don't prove cause and effect.

Randomized, controlled trials provide evidence of cause and effect. They are carefully planned experiments that use methods that reduce the potential for bias. Half of the participants are randomly assigned to get the prescribed diet or supplement; the others continue their usual diet or take a placebo (the control group). Researchers then wait several years to see if one of the two groups has a higher risk of disease (or greater rate of weight loss, for example).

A meta-analysis pools data from multiple prior studies using special statistical methods to report the findings as if it were one large study. Meta-analysis is often used to assess the clinical effectiveness of a nutritional treatment (or drug).

What 'risk' really means

To understand how diet impacts health (positively or negatively), researchers determine how it affects someone's risk. How risks are presented can influence how you feel about a finding and whether you'll change your diet. Sometimes they make dietary factors seem better – or worse – than they really are.

Say, for example, you read about a study that found women who had the highest intake of meat had a 40 per cent increased risk of colorectal cancer compared to women who consumed the least. This "relative" risk compares the likelihood of disease between two groups of similar people.

A 40-per-cent-higher risk of cancer sounds like a lot, but what does it really mean to you as an individual?

First, you need to know the absolute risk – your own risk – of developing colorectal cancer over time.

In Canada, a woman's lifetime risk of developing colorectal cancer is 1 in 15. So, if the relative risk of colorectal cancer is increased by 40 per cent in heavy meat eaters, the absolute increase is 40 per cent of 1 in 15 – that is, 0.4 in 15. The absolute risk for female heavy meat eaters, then, is 1.4 in 15, or about 9.3 per cent instead of 6.7 per cent. Not as scary as 40 per cent. (I am not implying it's healthy to eat a lot of meat; there are other health reasons to limit meat intake.)
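For readers who want to check the arithmetic themselves, the conversion from relative to absolute risk can be sketched in a few lines of Python. (This is my own illustration, not from any study; the function name `absolute_risk` is invented, and the numbers are the ones used above.)

```python
def absolute_risk(baseline_risk, relative_increase):
    """Scale a baseline (absolute) risk by a relative increase.

    baseline_risk: your own risk as a fraction (1 in 15 = 1/15).
    relative_increase: the reported relative change (40% = 0.40).
    """
    return baseline_risk * (1 + relative_increase)

baseline = 1 / 15                      # lifetime colorectal cancer risk, Canadian women
new_risk = absolute_risk(baseline, 0.40)

print(f"Baseline risk: {baseline:.1%}")            # about 6.7%
print(f"With a 40% higher relative risk: {new_risk:.1%}")  # about 9.3%
```

The point the arithmetic makes: a dramatic-sounding relative change translates into a modest absolute change when the baseline risk is small.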

How sure are researchers?

Undoubtedly you've seen the term "significant" in health reports. For example, "people who consumed the most cereal fibre had a significantly lower risk of heart disease." Scientists use statistical significance to evaluate how confident they are that the findings are real and not a fluke.

Typically a finding passes the significance test at the 5 per cent level, meaning there is less than a 1 in 20 probability that it occurred by chance. But even if a finding is statistically significant, that doesn't mean it's practically important or that the study is a good one. It's simply one way researchers evaluate their conclusions.
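To see what "1 in 20 by chance" means in practice, here is a small simulation (my own sketch, not from the article): it runs thousands of pretend studies comparing two groups where the treatment has no real effect, and counts how often random chance alone produces a difference large enough to be called "significant" at the 5 per cent level.

```python
import random

random.seed(1)

trials = 20_000          # number of simulated studies
n = 100                  # participants per group
# Cutoff for a "significant" difference between two proportions at the
# 5 per cent level (1.96 standard errors under the null hypothesis).
critical = 1.96 * (0.25 / n + 0.25 / n) ** 0.5

false_positives = 0
for _ in range(trials):
    # Both groups have the same true 50% "success" rate: no real effect.
    a = sum(random.random() < 0.5 for _ in range(n)) / n
    b = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(a - b) > critical:
        false_positives += 1

print(f"'Significant' by chance alone: {false_positives / trials:.1%}")
```

Roughly 1 study in 20 comes out "significant" even though nothing is going on – which is why a single significant finding is a clue, not a conclusion.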

How important are the headlines to your diet?

When you read or hear about a new study linking diet to health, ask yourself the following questions before you overhaul your diet:

Leslie Beck, a registered dietitian, is based at the Medisys clinic in Toronto. She is a regular contributor to CTV News Channel; lesliebeck.com.