Artificial Intelligence: Taking the Buzz Out of Buzzwords
Disclaimer: This is an opinion piece. The views expressed in this article are mine and do not represent my employer.
Smart, sentient machines! The latest (well, not really) hype! Look back a week or two, and think about the number of days you went without hearing about how AI is going to change your career, health, medicine, food, travel, or whatever. Television, newspapers, and blogs remain constantly flooded with announcements about the imminent disruption that <insert field here> is going to witness thanks to AI.
Let me show you some, ahem, examples.
We have here (in the order of increasing horror):
- AI-powered Air Conditioners
- AI-powered Washing Machines
Source – Gizmodo
Source – Indiegogo
Source – The Verge
Okay, I made that last one up. But for a second there, you guys did believe me, right? RIGHT?
That is the sad state of affairs. We are all techies here, and might think “wait, WHAT?”. But the vast majority of the not-so-technical audience out there sees AI as magic. They see it as something beyond their cognitive ability to process, and accept any BS branded as “AI-powered” without question. Thus, we have this article!
Source – Mashable
So what is the truth about AI? If you dig deep enough, or if you peel off enough layers (pun intended), what is actually happening?
Before we move on to taking the buzz off of buzzwords, let’s look at some core concepts.
Related Read: Top Artificial Intelligence Trends to Watch Out for In 2019
What is AI?
From Wikipedia: Artificial intelligence is intelligence demonstrated by machines. It is the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals.
But Really, What Is Artificial Intelligence?
IM[not so H]O, AI is just a buzzword. Really, it is just meaningless jargon. Okay, maybe not meaningless, but it’s still jargon. Don’t believe me? Let me give you some examples:
- Computers playing checkers and beating the best human players was considered AI. Until it was not, once it was accomplished around 1994 by Chinook, the checkers-playing computer program.
- Computers playing chess and beating the best human players was considered AI. Until it was not, once it was accomplished in 1997, when IBM’s Deep Blue defeated the then world champion, Garry Kasparov.
- Adaptive cruise control was considered AI. Until it was not, once it started being available in production cars in the 1990s (partial) and the 2010s (full speed range).
- Automatic parking was considered AI. Until it was not, once it started being available in production cars somewhere around 2006.
- Human speech recognition was considered AI. Until it was not, once it started shipping as Google Assistant, Cortana, Siri, etc. Now we even have real-time speech translation!
I could go on; there are quite a few examples of this phenomenon, formally known as (yes, it is so well known that it has a name) the AI effect [wiki].
So a much better definition of AI was popularized by Douglas Hofstadter, who attributed it to Larry Tesler:
“AI is whatever hasn’t been done yet.”
– Tesler’s Theorem, as quoted by Douglas Hofstadter
“Every time we figure out a piece of it, it stops being magical; we say, ‘Oh, that’s just a computation’.”
– Rodney Brooks
So, if it’s all just computation, why was it not, well, “computed” earlier?
Yes, computation, or rather the capacity for computation, is the key. A lot of problems were characterized as AI because, at the time, algorithms for solving them were not yet known, or the resources to compute them were not yet available.
Availability of Computation Power
E.g., chess and other games.
Moore’s law and the explosion in storage availability have played a major role in turning the tables. [It is important to note that the tables have not turned completely. Yet. There is so much more ground to cover.]
Availability of Unbiased Data
E.g., natural language processing (NLP).
Okay, now you may be thinking “Enough data was not available for speech recognition? This guy is full of BS”, but hear me out. With the explosion of social networks, so much content is created and made freely available that huge swaths of unbiased (this is the key here) voice and video of natural speech are now within reach, which in turn has helped the advances in NLP.
Availability of Infrastructure
I guess I don’t have to mention the improvement in internet speeds over the past decade. This has accelerated content creation, real-time processing, and more.
So, What is All the Current Hype About?
The hype is not current. There has been huge interest in AI from the time it was first proposed in the 1950s. The sheer number of films made about it tells us how much.
But the current wave of hype and buzz surrounding AI comes from the recent advances made in, drumroll please, Machine Learning.
What is Machine Learning?
Machine learning is
- giving computers the ability to learn
- to find patterns in data
- from experience
- without explicit programming.
ML is essentially about classifying and predicting stuff.
The typical operation is something like:
- Take some data
- Learn patterns in the data
- When presented with new data, classify it for the best guess of what it probably is, based on the “learning” that happened earlier.
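A minimal sketch of those three steps, assuming a toy nearest-centroid classifier (every name and number below is invented for illustration, not a real library API): “learning” is just computing the average point per label, and classifying new data means picking the nearest average.

```python
# Toy illustration of the three steps: take data, learn patterns, classify new data.
# "Learning" here = computing one centroid (average point) per label.

def learn(examples):
    """Steps 1 & 2: take labelled data and learn one centroid per label."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def classify(centroids, features):
    """Step 3: best guess for new data = label of the nearest centroid."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# "Experience": a few labelled points.
training = [([1.0, 1.0], "spam"), ([1.2, 0.8], "spam"),
            ([5.0, 5.0], "ham"), ([4.8, 5.2], "ham")]
model = learn(training)
print(classify(model, [1.1, 0.9]))  # → spam
print(classify(model, [5.1, 4.9]))  # → ham
```

Note that nothing in `learn` or `classify` knows anything about spam or ham; the patterns live entirely in the data.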
Related Read: Machine Learning- Deciphering the most Disruptive Innovation
Meh! So what is the big deal?
Once trained for one purpose, the same ML system can be reused (with additional training) to learn new concepts. This can be done without rewriting the code. Now that is a big deal.
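To make that reuse concrete, here is a hedged sketch using a toy word-count model (all names and examples are made up): the training and prediction code never changes between tasks; teaching the system a brand-new concept is purely a matter of feeding it more labelled data.

```python
# The same "learning" code, reused for a new concept purely by adding data.
# Toy frequency model: for each label, count which words appear in its examples.
from collections import Counter, defaultdict

def train(model, examples):
    """Update (or create) per-label word counts; no per-task code changes."""
    for text, label in examples:
        model[label].update(text.lower().split())
    return model

def predict(model, text):
    words = text.lower().split()
    # Score each label by how often it has seen the text's words before.
    return max(model, key=lambda label: sum(model[label][w] for w in words))

model = defaultdict(Counter)
train(model, [("win a free prize now", "spam"),
              ("meeting agenda attached", "work")])
print(predict(model, "free prize inside"))   # → spam

# Later: a brand-new concept, learned with the exact same code.
train(model, [("family dinner on sunday", "personal")])
print(predict(model, "dinner with family"))  # → personal
```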
Let’s look at a simple example: classifying emails.

The classic, hand-coded approach is an ever-growing pile of rules:

```
if the email contains "it's never a job, it's always a career" then send to trash;
if the email contains ... then ...
if the email contains ... then ...
```

The ML approach replaces all of those rules with a loop:

```
try to classify some emails;
change self to reduce errors;
repeat;
```
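That “change self to reduce errors” loop can be made concrete with a tiny perceptron-style sketch (everything here, from the function name to the example emails, is invented for illustration): word weights start at zero and get nudged only when the current guess is wrong.

```python
# A minimal "learn from mistakes" loop: no hand-written if/then rules,
# just weights over words that are adjusted whenever a guess is wrong.

def learn_spam_filter(examples, passes=10):
    weights = {}  # word -> score; positive pushes toward "trash"
    for _ in range(passes):                  # repeat;
        for text, is_spam in examples:
            words = text.lower().split()
            score = sum(weights.get(w, 0) for w in words)
            guess = score > 0                # try to classify;
            if guess != is_spam:             # change self to reduce errors;
                delta = 1 if is_spam else -1
                for w in words:
                    weights[w] = weights.get(w, 0) + delta
    return weights

examples = [("never a job always a career", True),
            ("quarterly report attached", False),
            ("a career opportunity awaits", True),
            ("lunch at noon", False)]
w = learn_spam_filter(examples)
score = sum(w.get(t, 0) for t in "an exciting career".lower().split())
print("trash" if score > 0 else "inbox")  # → trash
```

The filter ends up suspicious of the word “career” without anyone ever writing a rule about it; that came purely from the labelled examples.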
That was a two-minute primer on Machine Learning. So next time someone starts talking about Artificial Intelligence, I hope you feel the urge to say “Excuse me, I think you mean Machine Learning, not AI”.
Source – HubSpot