The Summers and Winters of Artificial Intelligence

Copyright: © 2018 |Pages: 10
DOI: 10.4018/978-1-5225-2255-3.ch021

Abstract

Artificial Intelligence (AI) is a branch of Computer Science whose aim is to make computers intelligent. These “intelligent” activities include thinking, reasoning, receiving stimuli from the environment and responding to them, solving puzzles, speaking and understanding language, and so on. It was John McCarthy who coined the term “Artificial Intelligence” for the 1956 Dartmouth conference on computers, indicating that its goal was to achieve a digital equivalent of human-level intelligence. In the 1970s, AI entered a low-productivity period known as the AI winter, during which scientific and, notably, commercial activity in AI dropped dramatically. The victory of IBM's Deep Blue program over the reigning world chess champion in 1997 is often hailed as one of AI's biggest achievements; another is the victory of IBM's Watson over the world Jeopardy! champions in 2011. This chapter is a brief outline of how, through numerous ups and downs, AI has come to be where it currently is, and where we might expect it to be heading in the next couple of decades.

Background

As happens with any new technology, the history of AI ran through a hype curve (Menzies, 2003). Early AI programs, such as those elegantly proving theorems and skillfully playing board games, aroused great interest and expectations. This was followed by the successful application of Expert Systems in business and academia. This early period in the development of AI is referred to as the “peak of inflated expectations” on the AI hype curve (Figure 1).

Figure 1.

The AI hype curve


Key Terms in this Chapter

Deep Blue: The IBM supercomputer that defeated world chess champion Garry Kasparov in 1997.

Turing Test: A test proposed by Alan Turing in 1950 to assess whether a machine can exhibit intelligent behavior indistinguishable from a human's.

Machine Learning: AI programs that extract useful patterns from vast amounts of data and make predictions based on these data, without being explicitly programmed.

Watson: The IBM supercomputer that beat human champions in the Jeopardy! TV contest in 2011.

Cloud Computing: Distributed and parallel computing technology that provides users with applications and data storage over the internet. “The cloud” is often used as a synonym for the internet.

Connectionism: Another term for neural networks.

Neural Networks: Networks of interconnected units, modeled after the human brain, capable of learning from data.

Artificial Intelligence (AI): The science of building intelligent machines.

Expert Systems: Computer programs styled on the knowledge and reasoning of human experts.
