Jargon is an inevitable part of any industry, but in data it’s been particularly bad because terms evolve so quickly and because some solution providers know their customers are as enthusiastically uninformed as their pockets are deep. This pattern started with the “Big Data” craze, when overnight everyone from nonprofits to marketing agencies to multinational organizations embraced the term as a catchall for any insight produced by some form of data analysis.

Despite all the hype, Big Data was a temporary technological problem: it had become easier to create and store data than to process it with off-the-shelf tools like SAS and SPSS. This was a direct result of the digital revolution, in which connected devices and machines let us quantify the previously unquantifiable, paired with slow software development cycles at companies like IBM.

And when I say temporary, I mean it: most Big Data problems have already been solved by open-source tools like R, Python, Hadoop, and Spark. As a result, data jargon pivoted to “predictive analytics,” and when that stopped being cool, everyone started focusing on “real-time analytics.” Right now it’s all about “artificial intelligence,” and I predict the next jargon terms will be “blockchain” and “quantum computing.”

Failure Is Everywhere

In each case the promise behind the jargon is real, but few organizations actually need Big Data or AI solutions. That mismatch has produced a disastrous rate of failure: organizations invested in expensive infrastructure, issued mandates diverting resources from their core competencies to breaking down the “data silos” in their organization, and began fighting a proxy war for expensive human talent. Despite all of this, a recent CIO study found that while “75 percent of business leaders from companies of all sizes, locations and sectors feel they’re ‘making the most of their information assets,’ in reality, only 4 percent are set up for success. Overall, 43 percent of companies surveyed ‘obtain little tangible benefit from their information,’ while 23 percent ‘derive no benefit whatsoever.’”

Cutting Through the Jargon Hype

One of my colleagues studied political science in college so he would, and I quote, “know when I was being screwed.” If you’re committed to using data in your decision-making, which you should be, then here are some tips that will help you cut through the hype to find a solution that doesn’t result in failure.

Ask for Analysis, Not Description – If you’ve been in a meeting where the results of a survey were walked through question by question with donut charts and bar charts, and you left feeling unimpressed, it was for good reason. That wasn’t analysis; it was description. The same thing typically happens with digital marketing metrics, which are copied and pasted from a dashboard into some monthly report template. If you’re going to use data as a competitive advantage, and again you should, then descriptives aren’t enough: description shows “what” happened, while analysis answers “why.”
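To make the distinction concrete, here’s a minimal sketch in Python; the column names and synthetic numbers are hypothetical stand-ins for a real marketing export. The first step merely describes what happened, which is where most dashboards stop, while the regression at least starts to answer why.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly marketing data: a stand-in for a real export.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "ad_spend": rng.uniform(1_000, 10_000, size=120),
    "email_sends": rng.integers(5, 50, size=120),
})
df["signups"] = (0.01 * df["ad_spend"]
                 + 2.0 * df["email_sends"]
                 + rng.normal(0, 10, size=120))

# Description: "what" happened. Most monthly reports stop here.
print(df["signups"].describe())

# Analysis: "why" it happened. The regression estimates how much each
# channel moves signups, with uncertainty attached to every estimate.
model = smf.ols("signups ~ ad_spend + email_sends", data=df).fit()
print(model.summary())
```

The specific model matters less than the shift in the question: from restating the numbers to explaining them.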

Beware of Blind Faith in Data – At the other extreme from descriptives are people with a near-religious belief in parametric statistical methods. They’re easy to identify because they love to point out things like “correlation doesn’t equal causation” and “anything with less than 30 observations isn’t statistically significant.” There are many other ways of analyzing data, and all of them have limitations, which means none has a monopoly on truth. Data, however it’s analyzed, provides varying degrees of evidence; ultimately it’s up to you to weigh that evidence and make a decision, while accepting the possibility that your decision (and the data it’s based on) is wrong.
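As one illustration that parametric methods have no monopoly, here’s a sketch of a simple permutation test; the two small samples are invented for the example. It assumes nothing about normality, works fine with fewer than 30 observations, and what it returns is evidence to weigh, not truth.

```python
import numpy as np

# Two small, hypothetical samples: say, checkout times under an old
# and a new flow. Both groups sit well under the mythical n = 30.
old = np.array([12.1, 9.8, 11.4, 13.0, 10.2, 12.7, 11.9])
new = np.array([9.1, 8.4, 10.0, 8.8, 9.5, 9.9])

observed = old.mean() - new.mean()

# Permutation test: if the group labels don't matter, reshuffling
# them should produce differences this large fairly often.
rng = np.random.default_rng(0)
pooled = np.concatenate([old, new])
n_old = len(old)
diffs = []
for _ in range(10_000):
    rng.shuffle(pooled)
    diffs.append(pooled[:n_old].mean() - pooled[n_old:].mean())

p_value = np.mean(np.abs(diffs) >= abs(observed))
print(f"observed difference: {observed:.2f}, p-value: {p_value:.4f}")
```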

Black Boxes Filled with Secret Sauce – If I never hear the term “secret sauce” again, I’ll die a happy person. It’s often used as a cute way of deflecting questions about how a proprietary canned solution actually works. Anytime there’s a lack of transparency with data, be very, very wary. Some providers hope that by automating a specific group of models (something that is easy to do with open-source code) they can pass off “proprietary methods” as meaningful differentiation. Google and others have shown that the data is the differentiator, not the models, so if you already have the data (or can easily get it), you’re just paying these providers for some simplistic automation.
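For a sense of how little may actually be inside the box, here’s a hedged sketch of what “automating a specific group of models” amounts to with open-source scikit-learn; the bundled toy dataset stands in for data you would already own.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# A bundled toy dataset stands in for whatever data you already have.
X, y = load_breast_cancer(return_X_y=True)

# "Automating a group of models" is a dictionary and a loop, not sauce.
models = {
    "logistic_regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

If a vendor’s differentiation reduces to a loop like this over models you could run yourself, the value has to come from the data, and you already have that.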

They Use Impersonal Sales Processes for a Silver Bullet Solution – There’s no single algorithm to rule them all. The approach you take will depend on the insights you need from the data, how fast you need them, how you’re going to get the data, what form that data will take, and how you’re going to use the insights once you have them. It’s extremely unlikely that any preexisting solution perfectly aligns all of those things for your organization. But customization is hard to sell at scale, especially through automated sales platforms, which has led to a bad habit in the data world: overpromising and underdelivering. While many providers include their growth rate in their sales pitch (look, we’ve grown 10,000 percent, so we must be awesome!), what they aren’t telling you is that their retention rates are shockingly poor.

If It’s Too Good To Be True…

One of my former students works with the Department of Commerce identifying and preventing fraud. She’s told me that the only consistent predictor of a con is the promise of high return with no risk, which perfectly describes how some data providers sell their products. It’s up to you as a consumer to fully vet anything before making a purchase. With a few critical questions, you should be able to cut through the hype and find something that helps you move into the data age and succeed where almost everyone else has failed.