Pundit Predictions: How Accurate Are the 'Pros'' Bold Guesses?
Harry Dent was wrong ... but he wasn't alone.
Every day of the week, dozens of analysts, pundits and talking heads of all types take to the airwaves, op-ed pages and the web to make the most startling of predictions. In fact, the bolder the better. With so many "heads" doing so much talking, a prediction by Dent that the S&P would slip only 5 percent, or perhaps rise something like the historical average of 10 percent, wouldn't even register against all the background noise.
That's Just Crazy Talk
The crazier the prediction, the more memorable. And the more memorable the prediction, the more likely the talking head who made it will get asked back to make another extreme guess on national TV when a slow news day rolls around.
So it's understandable why Dent climbed so far out on a limb when he predicted the S&P's implosion. And why that same month, Seabreeze Partners president Doug Kass predicted that Sears Holdings (SHLD) would declare bankruptcy in 2012. And why, two months later, hedge fund manager James Altucher predicted Apple (AAPL) would hit a $1 trillion market cap by February 2013. And why political pundit John McLaughlin predicted the breakup of the United Kingdom by 2014, and ESPN Radio host Colin Cowherd pegged the Chicago Bears as winners of Super Bowl XLVII, for that matter.
But are any of these predictions worth even the virtual paper they're not printed on?
That's the question asked by a new website called "PunditTracker," which laments the fact that the system today rewards pundits for making brash predictions and rarely checks back on how these predictions play out. As a result, pundits can bury their mistakes, and "selectively tout [their occasional wins] for self-promotion."
PunditTracker's website says it was established with the aim of "cataloging and scoring the predictions of pundits," to hold them accountable. "Pundits who demonstrate a track record of making accurate, out-of-consensus calls will appropriately receive their due. Meanwhile, those who are bombastic solely to garner media attention will be exposed."
So How Are They Doing?
As a work in progress, PunditTracker does not yet have useful data on all pundits in all spheres. According to the site's proprietary scoring system, a pundit needs to make at least 25 predictions, weighted based on the "boldness" of the predictions, to earn an official PunditTracker grade. (Make a prediction like "The Harlem Globetrotters will win this week," and you get far less credit than you would have earned if, a few years ago, you'd boldly predicted that high-flying Netflix was about to implode.)
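PunditTracker's actual formula is proprietary, but the general idea of a boldness-weighted scoring system with a minimum-call threshold can be sketched in a few lines of Python. Everything below (the weights, the cutoffs, the function names) is an illustrative assumption, not the site's method:

```python
# Hypothetical sketch of a boldness-weighted scoring system.
# PunditTracker's real formula is proprietary; the weights and
# grade cutoffs here are illustrative assumptions only.

MIN_CALLS = 25  # minimum graded calls before an official grade is issued


def score_pundit(calls):
    """calls: list of (boldness, correct) pairs, boldness on a 1-5 scale.

    Returns a boldness-weighted hit rate in 0.0..1.0, or None if the
    pundit hasn't made enough graded calls yet.
    """
    if len(calls) < MIN_CALLS:
        return None
    earned = sum(boldness for boldness, correct in calls if correct)
    possible = sum(boldness for boldness, _ in calls)
    return earned / possible


def letter_grade(score):
    # Illustrative cutoffs, not PunditTracker's actual grading curve.
    for cutoff, grade in [(0.7, "A"), (0.6, "B"), (0.5, "C"), (0.4, "D")]:
        if score >= cutoff:
            return grade
    return "F"


# A bold, correct call (predicting Netflix's fall) counts for far more
# than a safe one ("the Globetrotters will win this week").
calls = [(1, True)] * 10 + [(5, True)] * 8 + [(3, False)] * 7
print(letter_grade(score_pundit(calls)))  # 8 bold wins lift the grade
```

Note how the same raw hit rate can produce different grades: ten safe wins earn less weighted credit than eight bold ones, which is the mechanism that lets a pundit with a middling accuracy percentage still score well.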
But data is starting to roll in on a few of the more prolific predictors.
For example, CNBC analyst Jim Cramer has been credited with making 456 "graded calls" since January 2011. About 47 percent of them turned out to be right, a "hit rate" that earns him an overall PunditTracker grade of D-.
At the other end of the scale, PunditTracker seems mildly optimistic about the trend in New York Times columnist David Brooks' predictions. He's seven for nine on his political predictions during the past 22 months, and is likely to earn a solid B once the rest of the data starts rolling in.
Swiss investor and "Gloom Boom & Doom Report" publisher Marc Faber is doing even better. Accurate on 73 percent of the 191 graded calls that PunditTracker has evaluated so far, Faber's earned himself a sterling A+ rating from the website.
Oh, and do you remember that guy who said Da Bears would win the Super Bowl? Don't laugh too soon, folks, because PunditTracker's got Colin Cowherd's number, too -- and with 151 incredibly "out there" predictions having paid off already (out of 260 bold predictions made), Cowherd has already earned himself a reputation as an A+ sports predictor.
Patriots fans, eat your hearts out.
Motley Fool contributor Rich Smith owns shares of Apple, as does The Motley Fool. Motley Fool newsletter services recommend Apple. See how a few of Rich's predictions worked out right here.