
Data Science

Machine learning & data science for beginners and experts alike.

Image credit: Shannon Liao / The Verge

The hype around Artificial Intelligence (AI) reached a fever pitch as the world entered 2019. Business leaders from nearly every industry are now clamoring to tell anyone who will listen how they have an AI strategy to transform the way everything is done. At the annual Consumer Electronics Show (CES) in Las Vegas, a plumbing company was demonstrating an AI-powered toilet. Yes, you read that right: an ‘intelligent toilet.’


Clearly, the marketers have taken over and are adding to the general confusion with their overuse of the ‘AI’ label. Combine that with the Hollywood-driven perception of AI in movies like The Terminator and Her, and it’s easy to see why many people find the term so perplexing.


What does AI really mean? And how is it related to similarly mystifying terms like ML (machine learning) or ‘deep learning,’ which are seemingly used interchangeably with AI? This article will cut through the buzz and help you understand these terms.


The easiest way to think about these terms is to go back to some elementary school biology that we’re all familiar with. Remember, we learned that there is a broad category called ‘animals’; within animals, there is a more specific class called ‘mammals’; and within mammals, there are more specific species, like dogs. So a dog is a mammal and an animal, but not all animals are mammals, and not all mammals are dogs.


Simple enough, right? Then onto AI…


Let’s start at the highest level: Artificial Intelligence. AI is the broadest categorization for incorporating what is perceived to be human intelligence into machines and computers. The term is quite old, first coined by John McCarthy in 1956, and it is also very broad. One could program a computer with a bunch of ‘IF-THEN’ statements as a way of evaluating data to make a decision and consider it AI! Situations like this might fall under the broadest ‘simulating human intelligence’ sense of the word (more on this later).
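To make the IF-THEN point concrete, here is a toy sketch of a rule-based system. The loan-approval scenario, the function name, and all of the thresholds are hypothetical, chosen only to illustrate that every decision path is explicitly hand-written by a programmer; nothing is learned from data.

```python
# A rule-based "AI" in the broadest sense: hand-written IF-THEN rules.
# Hypothetical example: deciding whether to approve a loan application.
# Every decision path is explicitly programmed by a human.

def approve_loan(income, debt, credit_score):
    if credit_score < 600:          # rule 1: reject poor credit
        return False
    if debt > income * 0.5:         # rule 2: reject high debt ratio
        return False
    if income >= 30000:             # rule 3: approve sufficient income
        return True
    return False                    # default: reject
```

Systems built this way can look intelligent, but they only ever do exactly what their rules say; change the problem slightly and a human must rewrite the rules.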


In the same way that a mammal is a particular type of animal, Machine Learning (ML) is a particular type of AI. The term ML is also quite old, first defined in 1959, and it means the ‘ability to learn without being explicitly programmed.’ Instead of programming a bunch of statements to tell the computer what to do, we instead ‘train’ a model by feeding it lots of data and letting it adjust itself to (hopefully) improve over time. There are many different approaches to ML, many of which are available out of the box with Alteryx, such as decision tree learning, clustering, Bayesian classifiers, and artificial neural networks.
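The contrast with the IF-THEN approach can be shown with a deliberately tiny learner. This is not one of the Alteryx tools mentioned above, just a minimal sketch: instead of a programmer hard-coding a cutoff value, the program searches the labeled training data for the threshold that classifies it best.

```python
# Minimal illustration of "learning without being explicitly programmed":
# the cutoff is not hard-coded; it is found from labeled training data.

def train_threshold(examples):
    """examples: list of (value, label) pairs with label in {0, 1}.
    Returns the threshold that maximizes training accuracy."""
    best_t, best_acc = None, -1.0
    for t in sorted(v for v, _ in examples):
        # accuracy of the rule "predict 1 when value >= t"
        acc = sum((v >= t) == bool(label) for v, label in examples) / len(examples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Toy training data: small values belong to class 0, large values to class 1
data = [(1, 0), (2, 0), (3, 0), (8, 1), (9, 1), (10, 1)]
t = train_threshold(data)                 # the model "learns" t = 8
predict = lambda v: int(v >= t)           # the trained model
```

Feed it different data and it learns a different threshold; no one rewrites the program. Real ML models are far richer, but the principle is the same.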


In the same way a dog is a kind of mammal, artificial neural networks (the last approach in that list) are the specific kind of ML referred to as ‘deep learning.’ Deep learning works by loosely simulating the way a biological brain works, passing information through different ‘layers’ to learn particular features of a data set and solve a problem.
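A hand-sized sketch can show what ‘passing information through layers’ means. In the toy network below, the weights are fixed by hand so that the network computes XOR; in real deep learning, those weights would be learned from data, and there would be many more layers and units.

```python
# A tiny two-layer neural network: inputs flow through a hidden layer,
# then an output layer, each layer transforming the signal.
# Weights are hand-picked here so the network computes XOR.

def step(x):
    """Simple threshold activation function."""
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # Hidden layer: two units extracting features from the inputs
    h1 = step(x1 + x2 - 0.5)   # fires when at least one input is 1 (OR-like)
    h2 = step(x1 + x2 - 1.5)   # fires only when both inputs are 1 (AND-like)
    # Output layer: combines the hidden features into the final answer
    return step(h1 - h2 - 0.5)  # OR but not AND = XOR
```

Neither hidden unit alone solves XOR; it is the composition of layers, each building on the previous layer’s features, that gives deep learning its power.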


In the same way a dog is a mammal which is an animal, deep learning is a kind of machine learning which is a kind of artificial intelligence.


It’s worth noting that our perception of what ‘is AI’ and what ‘is not AI’ seems to shift over time as computers are able to do more. In 1997, when IBM’s ‘Deep Blue’ computer beat world champion Garry Kasparov in a game of chess, everyone heralded the rise of ‘intelligent’ machines because the computer beat a person in a game that was thought to be the domain of human intelligence. The machine was basically playing chess through ‘brute force’ – trying every possible combination of moves until it found the one with the highest probability of success.
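The brute-force idea can be sketched on a game vastly simpler than chess. The following toy example (my own illustration, not Deep Blue’s actual algorithm) exhaustively searches every move sequence in the game of Nim: players alternately take 1–3 sticks from a pile, and whoever takes the last stick wins.

```python
# Toy brute-force game search: exhaustively explore every possible
# continuation of a Nim position and pick a move that guarantees a win.

from functools import lru_cache

@lru_cache(maxsize=None)
def can_win(sticks):
    """True if the player to move can force a win from this position."""
    # Try every legal move; we win if any move leaves the opponent
    # in a position from which they cannot force a win.
    return any(not can_win(sticks - take)
               for take in (1, 2, 3) if take <= sticks)

def best_move(sticks):
    """Return a winning move if one exists, else take a single stick."""
    for take in (1, 2, 3):
        if take <= sticks and not can_win(sticks - take):
            return take
    return 1
```

Nothing here resembles understanding; the program simply tries everything. For Nim that search is trivial, while for chess it required Deep Blue’s massive specialized hardware – but conceptually it is the same exhaustive approach.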


More than 20 years later, most people no longer consider this brute-force programming to be ‘AI,’ but rather just something computers can do. In 2016, when Google’s AlphaGo (a deep learning model) beat the Go world champion, everyone once again hailed the arrival of AI.


I suspect, however, that in another 20 years we’ll have moved the goalposts again in our minds and may no longer consider this kind of deep learning to be truly human-level intelligence.