You Don't Hear "Big Data" Much Anymore, And For That I'm Thankful


This is about Big Data.

You really can’t overestimate what suckers business people are.

I’m not talking about people who create things or improve things. Guys like Steve Jobs and Donald Trump built great businesses. But they weren’t business people. They were builders. Running a business was a necessary evil that allowed them the freedom to build what they wanted to build. They were great businessmen because they cared about the work their companies did. Just like every small business owner who has the courage to strike out and build a business.

Most business executives, though, are not builders. They’re managers. They don’t really care what their companies do or how their products improve (or hurt) people’s lives. They care about quarterly reports and resource utilization targets. Put another way, Jobs and Trump saw the numbers as the result of their great work; managers see the work as a necessary evil to achieve the numbers.

Big Data was a buzzword for about six years. At first, it was a way for Silicon Valley types to get funding for their startups. Then, it was a way for corporate IT geeks to get funding for internal projects. IT sold Big Data as a way for their companies to “differentiate.” Then it became a corporate strategic initiative at every company in America when IT convinced managers that “we are the ONLY company that DOESN’T have a Big Data strategy.”

Then Big Data predicted Hillary Clinton’s landslide win with 98% confidence. And Big Data went the way of the Iomega Zip Drive. (I know of a very senior executive at a very large beverage company who, in 1996, shifted his entire portfolio, 100%, to Iomega stock.)

In 2012, I attended an innovation symposium. One topic was Big Data. The speaker breathlessly warned that “Big Data Is Coming!” like the Redcoats. As if Big Data were a thing. (In case you’re wondering, “Big Data” means “lots of data,” usually about people and their behavior.) He said “Big Data” at least 100 times in a 25-minute presentation. Since I’m a former geek, a lot of people asked me afterwards, “That was such a great talk, but did you understand what we should do about it?” I wasn’t sure. Neither was the speaker. Except “invest” in it. Or invest to stop it. I can’t remember which.

Paul Revere raced through Massachusetts warning that the British were coming, just as our speaker warned “Big Data is coming!” But the people Revere warned had been prepped on what to do when the news came. Warnings about Big Data were useless because most people had no prior arrangements for dealing with the news.

Except for IT.

Corporate IT folks are masters at creating urgent needs for funding. They invented the Y2K bug. (I profited handsomely from that panic from 1997 to 1999.) No one knew exactly what the Y2K bug was, but gullible managers forked over billions and billions of dollars to fix it.

The Big Data invasion was another Y2K bug, only more mysterious. So it needed even more funding. When the election was called for Trump around 2:00 a.m. on November 9, it was like midnight January 1, 2000, all over again. The great IT emergency was dead.

Now, the IT folks have a new emergency that requires billions and billions of funding. IT professionals are A/B testing whether to call it AI (artificial intelligence) or “machine learning.” Both AI and machine learning are real things, but they’ve become buzzwords to seduce money from gullible business executives. That means IT folks are busy finding out what their executives’ greatest fears are, then creating pitch decks that “prove” AI/machine learning is exactly the thing to kill that bogeyman. (I’m watching this A/B testing happen, and it’s amazing to see. When the business execs reject AI for one reason or another, IT simply does a search and replace of “AI” with “machine learning,” pitches the exact same deck a month later, and gets the funding. You can’t make this up.)

What the executives don’t know is that AI and machine learning are both overhyped exaggerations, just like Iomega, Y2K, and Big Data. Says Andrew Orlowski of The Register (via ZeroHedge):

As with the most cynical (or deranged) internet hypesters, the current “AI” hype has a grain of truth underpinning it. Today neural nets can process more data, faster. Researchers no longer habitually tweak their models. Speech recognition is a good example: it has been quietly improving for three decades. But the gains nowhere match the hype: they’re specialised and very limited in use. So not entirely useless, just vastly overhyped. As such, it more closely resembles “IoT”, where boring things happen quietly for years, rather than “Digital Transformation”, which means nothing at all.

The more honest researchers acknowledge as much to me, at least off the record.

The bad news for most people: AI/machine learning will cost a lot of non-techies their jobs over the next few years. IT leaders have gotten really good at bilking gullible managers out of money for buzzwords like Y2K and Big Data. And business people ain’t getting any smarter. The AI bubble will burst after some catastrophe caused by a crappy algorithm—a catastrophe that a hydrocephalic 4-year-old could have anticipated and averted. By then, IT will have a new bogeyman.

But until then, I say do a shot every time you hear “AI” or “machine learning.” That’s what buzzwords are for.