the long view

The smartest thing many investors can do is to stop looking for really smart things to do.

Once you get past a few simple notions – things such as saving regularly, keeping your investment costs to a minimum, diversifying widely and taking advantage of tax shelters – the payoff from further market insights can be rather limited. Even big-name investors and hedge fund managers touting the latest financial research often don't know what they're talking about.

In fact, a surprising amount of the time, the findings of the market cognoscenti don't hold up under inspection. "We argue that most claimed research findings in financial economics are likely false," write Campbell Harvey of Duke University, Yan Liu of Texas A&M University and Heqing Zhu of the University of Oklahoma.

In a recent paper, the three finance professors examined a list of 296 factors touted by various researchers as having a statistically significant ability to predict returns from stocks. Some factors were old reliables such as price-to-earnings ratios or payout yields; others were more exotic, such as political campaign contributions by companies.

Sadly, that impressive catalogue of ways to beat the market crumbled when scrutinized more closely. After subjecting the researchers' list to a battery of statistical tests, Prof. Harvey and his colleagues concluded that half of the factors were probably false discoveries – although exactly which half is open to debate. While the academics didn't put it quite so bluntly, much of the research in the field appears to be garbage.

Many fields suffer from the same trash build-up. John Ioannidis, a professor of medicine at Stanford University, published a landmark paper in 2005 that argued most published scientific research is false. Small sample sizes and personal bias are to blame for some misleading conclusions. The most insidious problem, though, is researchers' habit of trying out multiple explanations for events until one hypothesis happens to more or less fit the data at hand.

This apparent fit can result from pure chance. Scientists usually won't label a factor as statistically significant until they're confident there is no more than a one-in-20 chance that the result arose by accident. However, if a researcher tests 20 or more possible factors, at least one of them is likely to qualify as statistically significant even if all of them are actually irrelevant.
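To see why, note that if each of 20 irrelevant factors has a 5-per-cent chance of passing the test by fluke, the odds that at least one slips through work out to about 64 per cent (one minus 0.95 raised to the 20th power). The short Python sketch below simulates the same trap with purely random data; the sample size, factor count and number of trials are illustrative assumptions, not figures from the research discussed here.

```python
# A minimal simulation of the multiple-testing trap described above.
# Both the "returns" and the candidate "factors" are pure random noise,
# so every factor is irrelevant by construction.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(seed=1)
n_months = 240    # 20 years of monthly observations (assumed)
n_factors = 20    # candidate predictors tested against the same data
n_trials = 2000   # how many times to repeat the whole exercise

false_alarms = 0
for _ in range(n_trials):
    returns = rng.standard_normal(n_months)
    factors = rng.standard_normal((n_factors, n_months))
    # A factor "qualifies" if its correlation with returns has p < 0.05.
    p_values = [pearsonr(factor, returns)[1] for factor in factors]
    if min(p_values) < 0.05:
        false_alarms += 1

print(f"Trials with at least one 'significant' factor: "
      f"{false_alarms / n_trials:.0%}")
# Prints roughly 64%, in line with 1 - 0.95**20.
```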

Prof. Ioannidis noted that the problem is particularly noticeable in research areas that are most popular, where many rival teams are racing each other to find statistically significant results.

Finance seems to be one of those fields. Prof. Harvey and his co-authors found that from 1980 to 1991 researchers discovered about one factor a year that could supposedly predict future stock returns. Then as interest in the subject grew, so did new findings. The number of new factors jumped to around five a year in the 1991-to-2003 period, then soared to around 18 a year more recently.

Perhaps there are actually that many novel insights into the stock market. It's odd, though, that this profusion of discoveries isn't resulting in better performance. Hedge funds, supposedly the smart guys in the room, have been lagging behind the S&P 500 for years, according to the HFRI Fund Weighted Composite Index, the most widely used benchmark for hedge fund performance.

The likeliest explanation for the recent outburst of supposedly significant factors is that researchers are trying out countless possibilities on a limited amount of data and reporting whichever ones edge past the minimum standard for statistical significance. Many of those findings are therefore likely to be nothing more than chance.

Prof. Harvey and his colleagues suggest that investors demand a much higher level of statistical significance before accepting future market discoveries. That's good advice if you're mathematically savvy. However, a simpler approach is just to be skeptical of strategies that go beyond a basic program of using low-cost index funds to track major markets.

If such an investing plan sounds embarrassingly primitive, consider an interesting experiment conducted earlier this month by Tim Edwards, senior director of index investment at S&P Dow Jones Indices. He compared the average performance of hedge funds over the past five years to that of a simple-minded strategy that plunked half its money into a global stock index and half into U.S. bonds.

He found the simple strategy did much better than the hedge fund geniuses. But he also discovered an interesting relationship: If you assumed that someone had to pay absurdly high fees to use the simple strategy – including an annual management charge of 1.5 per cent of assets as well as a 15-per-cent slice of returns, such as would be typical at many hedge funds – then the simple strategy and the average hedge fund produced nearly identical results, going up and down in tandem.
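To get a feel for how heavily such fees weigh on a simple portfolio, here is a rough Python sketch of the arithmetic. The yearly gross returns are made-up placeholders rather than Mr. Edwards's actual data, and the fee mechanics are simplified assumptions (fees charged annually, performance fee only on positive results, no hurdle rate or high-water mark).

```python
# A back-of-the-envelope sketch of hedge-fund-style fee drag on a
# simple strategy. All inputs below are illustrative assumptions.
MANAGEMENT_FEE = 0.015   # 1.5 per cent of assets per year
PERFORMANCE_FEE = 0.15   # 15 per cent of any positive return

def net_of_fees(gross_return: float) -> float:
    """One year's result after the assumed fees are deducted."""
    after_mgmt = gross_return - MANAGEMENT_FEE
    perf_fee = PERFORMANCE_FEE * max(after_mgmt, 0.0)
    return after_mgmt - perf_fee

# Hypothetical yearly gross returns for a 50/50 stock-bond mix.
gross_returns = [0.09, 0.04, -0.02, 0.12, 0.07]

gross_growth = net_growth = 1.0
for r in gross_returns:
    gross_growth *= 1 + r
    net_growth *= 1 + net_of_fees(r)

print(f"Five-year gross return: {gross_growth - 1:.1%}")
print(f"Five-year net return:   {net_growth - 1:.1%}")
```

On these made-up numbers, the fees consume a sizable slice of the gains each year, which is the sort of gap Mr. Edwards found between the cheap blend and the average hedge fund.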

"The average hedge fund looks like a fixed blend of cheap investments, at high cost," Mr. Edwards concluded. Yep, that sounds about right.