The Differences between AI, Machine Learning, and Deep Learning

Summary: In the past two years, the growth of artificial intelligence and machine learning has been immense. Machine learning, as a type of artificial intelligence, enables software to make predictions based on large volumes of data.

Introduction

On November 9, 2015, Google released an open source Artificial Intelligence (AI) system known as TensorFlow. Ever since the launch of TensorFlow, the growth of AI and machine learning has been immense. Machine learning, as a type of AI, enables software to make inferences or predict future events based on large volumes of data. Today, leading technology giants are all making substantial investments in machine learning, including Facebook, Apple, Microsoft, and China's leading search engine, Baidu.

In 2016, Google DeepMind's AlphaGo defeated the South Korean player Lee Se-dol in a world-famous Go match. The media used the terms AI, machine learning, and deep learning interchangeably to explain DeepMind's victory, causing widespread confusion about these terms among the public.

Differences and Similarities

Although conceptually related, the terms AI, machine learning, and deep learning are not interchangeable. Drawing on the interpretations of Michael Copeland of NVIDIA, this article unpacks the concepts of AI, machine learning, and deep learning. To understand the relationship between the three, let us look at the figure below:

Figure 1: The relationship between AI, machine learning, and deep learning

As shown in the figure, machine learning and deep learning are subcategories of AI. The concept of AI appeared in the 1950s, while machine learning and deep learning are relatively newer fields.

AI: From Irrelevance to Global Adoption

Since 1956, when computer scientists coined the term AI at the Dartmouth Conference, there has been an endless stream of creative ideas about AI. AI became one of the hottest topics of research because many perceived it as the key to a bright future for human civilization. However, the idea of AI was soon dismissed as too grandiose and whimsical.

In the past few years, especially after 2015, AI has experienced a new surge. A major contributor to this growth is the widespread use of graphics processing units (GPUs), which make parallel processing faster, cheaper, and more powerful. The emergence of practically unlimited storage and of massive datasets (the big data movement) has also benefited the development of AI. These technologies allow unrestricted access to all kinds of data, including images, text, transaction records, and map data.

Next, we will look at AI, machine learning, and deep learning one by one, tracing the development of each.

AI and Its Applications

Figure 2

When the AI pioneers met at Dartmouth College, they dreamed of using the emerging computers of the time to build a complex machine with human-level intelligence. This is the concept we now call "general AI": a machine capable of reasoning and of perceiving the world through senses like our own. It is a recurring theme in films, from the human-friendly C-3PO to humanity's enemy, the Terminator. However, general AI machines remain fictional for a simple reason: we cannot build them, at least not yet.

One of the main challenges of general AI is its sheer scope. Instead of creating an all-purpose machine, we can narrow the requirements to achieve specific goals. This task-specific implementation of AI is known as narrow AI. There are many examples of narrow AI in the real world, but how are they created? Where does the intelligence come from? The answer lies in the next topic: machine learning.

Machine Learning and Its Applications

Figure 3

The concept of machine learning comes from AI, and it is one approach to achieving it. Early researchers developed algorithms that include decision tree learning, inductive logic programming, reinforcement learning, and Bayesian networks. In simple terms, machine learning uses algorithms to analyze data and then learns from the results to make inferences or predictions. Unlike traditional preprogrammed software, a machine learning system uses data and algorithms to "train" itself and improve its own behavior.
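
To make this concrete, here is a minimal sketch of the "analyze data, then predict" workflow using a decision tree, one of the early algorithms mentioned above. It assumes Python with scikit-learn installed; the tiny dataset is invented purely for illustration.

```python
# Minimal sketch of learning from data with a decision tree (scikit-learn).
# The toy dataset below is made up for illustration.
from sklearn.tree import DecisionTreeClassifier

# Features: [hours_studied, hours_slept]; labels: 1 = pass, 0 = fail.
X = [[8, 7], [6, 8], [2, 4], [1, 6], [7, 5], [3, 3]]
y = [1, 1, 0, 0, 1, 0]

clf = DecisionTreeClassifier(max_depth=2)
clf.fit(X, y)  # the algorithm analyzes the data ("training")

# The trained model can now make a prediction on unseen input.
print(clf.predict([[5, 6]]))  # e.g. [1]
```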

Computer vision is one of the best-known applications of machine learning, but it long required a great deal of manual coding. Researchers would hand-write classifiers, such as edge detection filters, to help programs identify the boundaries of objects. Based on these hand-written classifiers, researchers could then develop algorithms for machines to analyze, identify, and understand images.

Nevertheless, this process was prone to error because of the primitiveness of the available techniques.
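
For contrast, the sketch below shows what such a hand-written classifier might look like: a Sobel edge-detection filter coded by a human rather than learned from data. It assumes NumPy and SciPy; the tiny 5x5 "image" is made up for illustration.

```python
# A hand-coded edge detector in the pre-learning style: the Sobel filter
# is written by a human, not learned from data. Toy image for illustration.
import numpy as np
from scipy.signal import convolve2d

# A 5x5 grayscale "image" with a vertical boundary between dark and bright.
image = np.array([[0, 0, 10, 10, 10]] * 5, dtype=float)

# Sobel kernel that responds strongly to vertical edges.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

edges = convolve2d(image, sobel_x, mode="same")
print(np.abs(edges) > 0)  # True where the filter detects a boundary
```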

Deep Learning and Its Applications

Figure 4

Deep learning is a technique for implementing machine learning. It grew out of artificial neural networks, an early idea that remained obscure for decades after its invention. Neural networks were inspired by the interconnected neurons of the human brain, but their implementation differs significantly from biology. In the human brain, a neuron can connect to any other neuron within a certain range to perform a variety of tasks, whereas data in an artificial neural network must travel through discrete layers, each with its own connections and direction of propagation.

For example, you can slice an image into smaller pieces and feed them into the first layer of the neural network. The first layer performs preliminary calculations, and its neurons then pass the results to the second layer. The neurons in the second layer carry out their own computations, and every subsequent layer follows the same rule until the final layer produces the output.
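
The sketch below illustrates this layer-by-layer flow in plain NumPy. The layer sizes, random weights, and ReLU nonlinearity are illustrative choices, not a description of any particular network.

```python
# Minimal sketch of data flowing through the layers of a neural network.
import numpy as np

rng = np.random.default_rng(0)

# Pretend the image pieces have been flattened into a 16-dimensional input.
x = rng.random(16)

# Two layers, each defined by a weight matrix and a bias vector.
W1, b1 = rng.standard_normal((8, 16)), np.zeros(8)
W2, b2 = rng.standard_normal((3, 8)), np.zeros(3)

h = np.maximum(0, W1 @ x + b1)  # layer 1 computes, then passes data on...
out = W2 @ h + b2               # ...layer 2 computes on layer 1's result
print(out)                      # the last layer's result is the output
```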

Each neuron assigns a weight to its input. These weights determine the final output, and they are adjusted according to how correct or incorrect the neuron proves to be for the task at hand. Consider a system analyzing a stop sign. The neurons subdivide the image and "check" for the properties of a stop sign, such as its shape, color, characters, size, and movement. The neural network then produces a probability vector, which is in fact an estimate based on the weights. In this example, the system may be 86 percent sure that the image is a stop sign; it is then up to the network architecture to tell the network whether this judgment is correct.
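
As a simplified illustration of where such a probability vector can come from, the snippet below applies a softmax function to a network's raw output scores; the scores are invented to mirror the stop sign example.

```python
# Turning raw output scores into a probability vector with a softmax.
# The scores are made up for illustration.
import numpy as np

scores = np.array([2.0, 0.2, -0.5])            # stop sign, speed limit, other
probs = np.exp(scores) / np.exp(scores).sum()  # softmax
print(probs)  # roughly [0.80, 0.13, 0.07]: "80 percent sure it is a stop sign"
```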

The problem, however, was that even the most basic neural networks consumed enormous computing resources, which made the approach infeasible at the time. A small group of dedicated researchers, led by Professor Geoffrey Hinton of the University of Toronto, stuck with the method and eventually ran the algorithms on supercomputers in parallel, proving their viability.

Returning to the stop sign example, the accuracy of the prediction depends on the amount of training the neural network receives, which means constant training is necessary. Tens of thousands, or even millions, of images are needed to train the machine. With sufficient training, the input weights of the neurons can be tuned so precisely that the network gives a consistently accurate answer.
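
As a simplified illustration of what "adjusting the weights" means, the sketch below trains a single artificial neuron with gradient descent on made-up data. Real deep networks apply the same principle across millions of weights.

```python
# Minimal sketch of training: repeatedly nudging weights to reduce errors.
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((100, 4))               # 100 made-up examples, 4 features each
y = (X.sum(axis=1) > 2).astype(float)  # invented labels for the neuron to learn

w, b, lr = np.zeros(4), 0.0, 0.1
for _ in range(1000):
    p = 1 / (1 + np.exp(-(X @ w + b)))  # current predictions
    w -= lr * (X.T @ (p - y)) / len(y)  # adjust each weight toward fewer errors
    b -= lr * (p - y).mean()

p = 1 / (1 + np.exp(-(X @ w + b)))
print(((p > 0.5) == y).mean())  # accuracy after training, typically high
```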

Today, machines trained with deep learning outperform humans at some image recognition tasks, including challenging and critical ones such as identifying indicators of cancer in blood. Facebook uses a similar kind of neural network to recognize faces in pictures, and Google's AlphaGo is capable of beating the world's best Go players by training its algorithms intensively.

Conclusion

The foundation of AI lies in the idea of machine intelligence, while machine learning refers specifically to the computational methods that support AI. Simply put, AI is the science, while machine learning comprises the experimental methods that make AI possible; to some extent, machine learning makes AI work. We hope this article was helpful in explaining the differences and relationships among AI, machine learning, and deep learning.
