Artificial Intelligence : Taking the Buzz Out of Buzzwords

Disclaimer: This is an opinion piece. The views expressed in this article are mine and do not represent my employer's.

Smart, sentient machines! The latest (well, not really) hype! Look back a week or two, and think about the number of days you went without hearing about how AI is going to change your career, health, medicine, food, travel, or whatever. Television, newspapers, and blogs remain constantly flooded with announcements about the imminent disruption that <insert field here> is going to witness thanks to AI.

Let me show you some, ahem, examples.

We have here (in order of increasing horror):

  • AI-powered Air Conditioners


  • AI-powered Washing Machines


Source – Gizmodo

  • AI-powered Suitcases


Source – Indiegogo

  • AI-powered Phones


  • AI-powered Toilet

Kohler’s smart toilet

Source – The Verge

  • AI-powered Underwear!

AI Boxer
Okay, I made that last one up. But for a second there, you guys did believe me, right? RIGHT?

That is the sad state of affairs. We are all techies here, and might think "wait, WHAT?". But the vast majority of the not-so-technical audience out there sees AI as magic. They see it as something beyond their cognitive ability to process, and they accept any BS branded as "AI-powered" without question. Thus, we have this article!

LG Everything AI

Source – Mashable

So what is the truth about AI? If you dig deep enough, or if you peel off enough layers (pun intended), what is actually happening?

Before we move on to taking the buzz out of buzzwords, let's look at some core concepts.

Related Read: Top Artificial Intelligence Trends to Watch Out for In 2019

What is AI?

Wikipedia defines artificial intelligence as intelligence demonstrated by machines. It is the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals.

But Really, What Is Artificial Intelligence?

IM[not so H]O, AI is just a buzzword. Really, it is just meaningless jargon. Okay, maybe not meaningless, but it’s still jargon. Don’t believe me? Let me give you some examples:

  • Computers playing checkers and beating the best human players was considered AI. Until it was not, when it was accomplished around 1994 by Chinook, the checkers-playing computer program.
  • Computers playing chess and beating the best human players was considered AI. Until it was not, when IBM’s Deep Blue defeated the then world champion, Garry Kasparov, in 1997.
  • Adaptive cruise control was considered AI. Until it was not, when it became available in production cars from the 1990s (partial) and the 2010s (full speed range).
  • Automatic parking was considered AI. Until it was not, when it became available in production cars around 2006.
  • Human speech recognition was considered AI. Until it was not, when it shipped as Google Assistant, Cortana, Siri, and the like. Now we even have real-time speech translation!

Obligatory XKCD.

I could go on; there are quite a few examples of this phenomenon, formally known as (yes, it is so well known that it has a name) the AI effect [wiki].

So a much better definition of AI was put forth by Douglas Hofstadter.

“AI is whatever hasn’t been done yet.”   

 – Douglas Hofstadter

Just Computation

“Every time we figure out a piece of it, it stops being magical; we say, ‘Oh, that’s just a computation’.”

– Rodney Brooks

So, if it’s all just computation, why was it not, well, “computed” earlier?

Yes, computation, or rather, the capacity for computation is the key. A lot of problems were characterized as AI because, at the time, algorithms for solving them were not known yet, or because the resources to compute them were not available yet.

  • Availability of Computation Power

E.g., chess and other games.

Moore’s law and the explosion in storage availability have played a major role in turning the tables. [It is important to note that the tables have not turned completely. Yet. There is so much more ground to cover.]

  • Availability of Unbiased Data

E.g., natural language processing (NLP).

Okay, now you may be thinking "Enough data was not available for speech recognition? This guy is full of BS", but hear me out. With the explosion of social networks, so much content is created and made freely available that huge swaths of unbiased (this is the key here) voice and video recordings of natural speech are now available, which in turn has helped the advances in NLP.

  • Availability of Infrastructure

I guess I don’t have to mention the improvement in internet speeds over the past decade. This has accelerated content creation, real-time processing, and more.

So, What is All the Current Hype About?

The hype is not new. There has been huge interest in AI since the field was first proposed in the 1950s; the sheer number of films about it tells you how much.

But the current wave of hype and buzz surrounding AI comes from the recent advances made in, drumroll please, Machine Learning.

What is Machine Learning?

Machine learning is

  • giving computers the ability to learn
  • to find patterns in data
  • from experience
  • without explicit programming.

ML is essentially about classifying and predicting stuff.

The typical operation is something like:

  1. Take some data
  2. Learn patterns in the data
  3. When presented with new data, make a best guess at what it probably is, based on the “learning” that happened in [2].
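The three steps above can be sketched with a toy nearest-centroid classifier. This is a minimal illustration in plain Python, not any particular library's API; the function names, labels, and data points are all made up for the example.

```python
# Toy illustration of the three ML steps: take data, learn patterns,
# then classify new data based on what was learned.

def train(samples):
    """Steps 1-2: take labeled data and 'learn' a pattern
    (here: the average point of each class)."""
    grouped = {}
    for features, label in samples:
        grouped.setdefault(label, []).append(features)
    return {
        label: tuple(sum(dim) / len(points) for dim in zip(*points))
        for label, points in grouped.items()
    }

def classify(model, features):
    """Step 3: best guess for new data = label of the closest centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Step 1: some data — (height_cm, weight_kg) pairs with labels
data = [((150, 45), "small"), ((160, 55), "small"),
        ((185, 90), "large"), ((190, 95), "large")]
model = train(data)                 # Step 2: learn the patterns
print(classify(model, (158, 50)))   # Step 3: prints "small"
```

Notice that nothing in `train` or `classify` mentions heights or weights; feed it different labeled data and the same code learns a different concept.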

Related Read: Machine Learning- Deciphering the most Disruptive Innovation

Meh! So what is the big deal?

Once trained for one purpose, the same ML system can be reused (with additional training) to learn new concepts. This can be done without rewriting the code. Now that is a big deal.

Let’s look at a simple example: Classifying emails.

Traditional programming:

if the email contains "it's never a job, its always a career"
    then send to trash;
if the email contains ...
    then ...
if the email contains ...
    then ...

ML programs:

try to classify some emails;

change self to reduce errors;
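To make the "try to classify, then change self" loop concrete, here is a minimal naive-Bayes-style spam filter sketched in plain Python. It is illustrative only: the training emails, labels, and function names are invented for this example, and real systems use far more data and features.

```python
from collections import Counter
import math

def train(emails):
    """Learn word frequencies per label from (text, label) pairs."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in emails:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(model, text):
    """Pick the label whose learned word frequencies best fit the text."""
    counts, totals = model
    vocab = len(set(counts["spam"]) | set(counts["ham"]))
    scores = {}
    for label in counts:
        n = sum(counts[label].values())
        # log prior + log likelihood with add-one smoothing
        score = math.log(totals[label] / sum(totals.values()))
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / (n + vocab))
        scores[label] = score
    return max(scores, key=scores.get)

model = train([
    ("win a free career opportunity now", "spam"),
    ("claim your free prize now", "spam"),
    ("meeting notes from the project review", "ham"),
    ("lunch plans for the project team", "ham"),
])
print(classify(model, "free prize now"))   # prints "spam"
```

No rule like "if the email contains X, then trash" is ever written; the rules fall out of the counted data, which is exactly the point of the ML approach above.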


That was a two-minute primer on Machine Learning. So next time someone starts talking about Artificial Intelligence, I hope you feel the pang and say, “Excuse me, I think you mean Machine Learning, not AI.”



    About the Author


    I have been programming since 2000, and professionally since 2007. I currently lead the Open Source team at Fingent as we work on different technology stacks, ranging from the "boring"(read tried and trusted) to the bleeding edge. I like building, tinkering with and breaking things, not necessarily in that order.
