Is Big Data Really That Big of a Deal?


Big data is the next big thing, or so we're told. According to IBM, we create 2.5 quintillion bytes of data every single day. In fact, helping businesses get a handle on the information flowing into their operations is becoming one of the cornerstones of IBM's future growth plans. Big Blue's business analytics division saw revenue jump 28% last quarter, making it the company's fastest-growing segment.

Coca-Cola reportedly was able to cut overtime costs by 46% by analyzing its data, while MasterCard is launching a whole new venture that targets advertising to its customers because it knows so much about us and our spending habits.

Which end is up?
Yet for every Fortune 500 company supposedly using big data to increase revenues, cut costs, and expand profits, there are plenty of others where this magical elixir isn't catching on. According to a new study by OpTier, most businesses can't make heads or tails of their big data, and 85% "will fail to effectively exploit big data for competitive advantage."


The problem is the numbers are just so big. Too big, even. Pouring all that information into a funnel creates bottlenecks, and processing it takes an immense effort even before it gets to the analytical stage. IDC says big-data technology and services will grow from a sleepy $3 billion business in 2010 to almost $17 billion by 2015 -- a 40% compounded annual growth rate -- with some 7.9 zettabytes of data stored globally by then. I'm not sure how big that is, but it sure sounds like a ginormous number.

That's why companies spend as much as 60% of their time just organizing their data before shipping it out for analysis. It's there that much of the early money is being made. The opportunity to help transform bits and bytes into usable information is giving not only large established players like IBM, Oracle, and SAP the incentive to expand their expertise, but also smaller opportunists like TIBCO Software and Teradata the chance to make a name for themselves.

Does it compute?
To make that transition, though, means convincing more companies that the effort is worth it. For example, research by American Banker found that 71% of the 170 bankers it surveyed don't make use of customer analytics, and even the number planning to begin doing so isn't big: just 2% plan to buy customer analytics within the next six months, 4% within six to 12 months, and only 14% within the next year or so.

Doubts about big data's ability to extract useful information, or to provide a viable return on the significant investment required, might not be the biggest reason for failing to take up the mantle, but they still registered with nearly a quarter of all respondents (cost and other, higher priorities took precedence).

And this is from the green-eyeshade types, the ones supposedly best in the know about whether there's money to be made. It sounds like the purveyors need to do a better sales job if they want to convince bankers, let alone anyone else, that there's value here. It's an expensive exercise, one that takes a long time to complete and is complex to implement.

Big data on a small scale
To get around those high hurdles, the OpTier study suggests there may be another means of getting the job done: creating contextual inferences from data already residing in a company's systems. Upstart Splunk is one company moving in that direction, but it requires an even greater educational effort to convince companies that a leaner, lower-cost option can deliver the same results as expansive but clunkier solutions.

In certain respects, "big data" reminds me of the total quality management initiatives advocated by W. Edwards Deming that swept over Japanese and then American business in the 1980s, and then of the Six Sigma movement that became a big part of General Electric and unsurprisingly showed up at Home Depot when GE alum Robert Nardelli presided over the company. The Japanese concept of "kaizen," or continuous improvement, is yet another related idea that has hooked some corporations.

While a lot of this just sounds like a fad to keep business consultants employed, there is obviously value to be found in data. Applying best practices might help smaller shops without the budget of a Fortune 500 company mine the most from their data. Splunk is certainly one company that can help them do that. Teradata is another, and it has been described by at least one analyst as the "best of breed" among those providing the service.

Big data is a big deal. But only if companies -- and investors -- can make sense of it.

Bigger is better
To make sense of this trend and pick out a winner, The Motley Fool has compiled a new report called "The Only Stock You Need to Profit From the NEW Technology Revolution." The report highlights a company that has gained 300% since first recommended by Fool analysts but still has plenty of room left to run. Thousands have requested access to this special free report, and now you can access it today at no cost. To get instant access to the name of this company transforming the IT industry, click here -- it's free.

The article Is Big Data Really That Big of a Deal? originally appeared on Fool.com.

Rich Duprey owns shares of General Electric and Oracle. The Motley Fool owns shares of General Electric, International Business Machines, MasterCard, and Oracle. Motley Fool newsletter services recommend Home Depot, International Business Machines, Coca-Cola, Teradata, and TIBCO Software. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.

