Turing Test:
An Approach to Defining Machine Intelligence
Presented by:
Saifullah, Abu Sayeed
St. Id. 101876890
“I think that to say that machines can’t be creative is utter rubbish.” …… K. Warwick
Alan Mathison Turing
Founder of computer science, mathematician, philosopher
1912 (23 June): Birth, Paddington, London
1932-35: Quantum mechanics, probability, logic
1936: The Turing machine, computability, universal machine
1936-38: Princeton University. Ph.D. Logic, algebra, number theory
1938-39: Return to Cambridge. German Enigma cipher machine
1939-40: The Bombe, machine for Enigma decryption
1939-42: Breaking of U-boat Enigma, saving battle of the Atlantic
1943-45: Chief Anglo-American crypto consultant. Electronic work.
1945: National Physical Laboratory, London
1946: Computer and software design leading the world.
1947-48: Programming, neural nets, and artificial intelligence
1949: First serious mathematical use of a computer
1950: The Turing Test for machine intelligence
1952: Arrested and prosecuted for homosexuality, then a criminal offence in the UK
Alan Mathison Turing
- Einstein’s relativity theory
- Quantum mechanics
- Eddington’s The Nature of the Physical World
- Premonition of Morcom’s death
- What is death? Something beyond what science could explain.
“It is not difficult to explain these things away - but, I wonder!”
1954 (7 June): Death (suicide) by cyanide (KCN) poisoning at Wilmslow, Cheshire.

What’s Intelligence?
- How does a thief/criminal escape from the cops?
- What are you doing while playing chess?
- Think about the contingency problem.
- Human beings are the most intelligent creatures.
- Is there any other intelligent entity in the universe?
What is Intelligence?
Newell and Simon: the use and manipulation of various symbol systems, such as those featured in mathematics or logic.
Others:
- Feelings
- Creativity
- Personality
- Freedom
- Intuition
- Morality
What’s Intelligence?
- If behavior alone is not a test of intelligence, what exactly is intelligence?
- How can it be noticed or observed?
- This is a large debate in the AI, psychology, and philosophy communities.
- Henley argues that most AI applications under development today are “pragmatic” in their definition of intelligence.
Machine Intelligence?
- Acting humanly
- Thinking humanly
- Thinking rationally
  Aristotle’s syllogism: “Socrates is a man; all men are mortal; therefore, Socrates is mortal.”
- Acting rationally
  Think about the exploration problem: an agent is lost in the Amazon jungle and wants to reach the sea.
Acting Rationally in the Wumpus World
- Initial points: 1000
- 1-point penalty for each action
- Getting killed: 10,000-point penalty
- Only one arrow
Percepts = {stench, breeze, glitter, bump, scream}
R1: ~S11 => ~W11 ^ ~W12 ^ ~W21
R2: ~S21 => ~W11 ^ ~W21 ^ ~W22 ^ ~W31
R3: ~S12 => ~W11 ^ ~W12 ^ ~W22 ^ ~W13
R4: S12 => W13 V W12 V W22 V W11
Acting Rationally in the Wumpus World
- From ~S11 and R1 (Modus Ponens): ~W11 ^ ~W12 ^ ~W21
  (And-elimination): ~W11, ~W12, ~W21
- From ~S21 and R2 (Modus Ponens, And-elimination): ~W11, ~W21, ~W22, ~W31
- From S12 and R4 (Modus Ponens): W13 V W12 V W22 V W11
- Unit resolution with ~W11, ~W22, ~W12 (three steps): W13
A small code sketch of this derivation follows below.
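As a concrete illustration, here is a minimal Python sketch (not part of the original slides) of the derivation above: the percept values are assumed for this example, rules R1, R2, and R4 are applied by Modus Ponens and And-elimination, and unit resolution removes the refuted disjuncts.

# Minimal sketch of the Wumpus-world inference above.
# Symbols: Wxy = "wumpus in square (x, y)", Sxy = "stench perceived in (x, y)".

# Percepts assumed for this example: no stench in (1,1) or (2,1), stench in (1,2).
percepts = {"S11": False, "S21": False, "S12": True}

known_false = set()  # squares proven wumpus-free

# R1: ~S11 => ~W11 ^ ~W12 ^ ~W21   (Modus Ponens + And-elimination)
if not percepts["S11"]:
    known_false |= {"W11", "W12", "W21"}

# R2: ~S21 => ~W11 ^ ~W21 ^ ~W22 ^ ~W31
if not percepts["S21"]:
    known_false |= {"W11", "W21", "W22", "W31"}

# R4: S12 => W13 V W12 V W22 V W11   (Modus Ponens)
disjuncts = {"W13", "W12", "W22", "W11"} if percepts["S12"] else set()

# Unit resolution: drop every disjunct already known to be false.
remaining = disjuncts - known_false
print(remaining)  # {'W13'} -- the wumpus must be in square (1, 3)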
Turing Test (TT)
- “Can machines think?”
- “COMPUTING MACHINERY AND INTELLIGENCE” .. Alan Turing, 1950
- One of the most disputed topics in AI, philosophy of mind, and cognitive science
- Acting humanly
- The Imitation Game
The State of the Art
- Never performed by Turing himself
- First appeared in the mid-’70s, long after his suicide
- Discussed, attacked, and defended ever since
- For some, the “beginning” of AI
- For some, the ultimate goal of AI
- At the other extreme: useless, even harmful
Imitation Game (IG)
An abstract oral examination involving:
- a man (A)
- a woman (B)
- an interrogator (C), whose gender is unimportant
Imitation Game
The interrogator stays in a room apart from A and B.
Objectives:
- Interrogator: to determine which of the other two is the woman.
- Man and woman: to convince the interrogator that he/she is the woman and the other is not.
IG (contd.)
- Decision, convincing, and deception take place over a teletype connection.
- The interrogator asks questions in written natural language and receives answers in written natural language.
- Questions can be on any subject imaginable, from mathematics to poetry, from the weather to chess.
IG (contd.)
New agenda: what will happen when a machine takes the part of A in this game?
The machine needs:
- Natural language processing
- Knowledge representation
- Automated reasoning
- Machine learning
TT
- The woman disappears (the gender issue is ignored).
- The objectives of A, B, and C remain unaltered.
- The test tries to assess the machine’s ability to imitate a human being, rather than its ability to simulate a woman.
- C’s aim is to determine which one of the two entities is the human.
- Intelligent machine: one that fools the interrogator.
[Diagram: the interrogator converses over a teletype link with a hidden human and a hidden machine.]
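To make the setup concrete, here is a minimal Python sketch (purely illustrative, not part of the original slides) of a blind, teletype-style session: the judge exchanges text with two hidden parties labelled X and Y and then guesses which is the machine. The functions human_respond and machine_respond are hypothetical stand-ins.

import random

def human_respond(question: str) -> str:
    # In a real test a person would type this reply at another terminal.
    return input(f"(hidden human) {question}\n> ")

def machine_respond(question: str) -> str:
    # Placeholder chatbot; any candidate program would be plugged in here.
    return "That is an interesting question. What do you think?"

def run_session(questions):
    # Randomly hide the identities behind the labels X and Y, as the teletype link does.
    responders = [human_respond, machine_respond]
    random.shuffle(responders)
    parties = dict(zip(["X", "Y"], responders))
    for q in questions:
        print(f"JUDGE: {q}")
        for label in ("X", "Y"):
            print(f"{label}: {parties[label](q)}")
    guess = input("Which one is the machine, X or Y? ").strip().upper()
    if parties.get(guess) is machine_respond:
        print("Correct identification.")
    else:
        print("The machine passed this round.")

# Example use (interactive):
# run_session(["What is the weather like today?", "Compose a short poem about chess."])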
Hierarchy of TT
- t1: subtotal (“toy”) fragments of our functions: too underdetermined
- T2: total symbolic (pen-pal) function
- T3: total external sensorimotor (robotic) function
- T4: total internal microfunction
Even then, more unanswerable questions remain than when physics reaches its Grand Unified Theory of Everything (GUTE).
TT
Misconstrual: the outcome of a TT is a trick.
Passing the TT is neither evidence of trickery nor a guarantee of intelligence. What is it then?
1. Turing’s argument
2. Get rid of the vexed word “thinking”: “real” and “artificial” are equivocal here.
“THE TURING TEST IS NOT A TRICK: TURING INDISTINGUISHABILITY IS A SCIENTIFIC CRITERION” ………… Stevan Harnad, Department of Psychology, Princeton University
Loebner Prize
“I believe that in about fifty years’ time it will be possible to program computers ... to make them play the imitation game so well that an average interrogator will not have more than 70 percent chance of making the right identification after five minutes of questioning.” …. Turing
- November 1991: romantic conversation by Joseph Weintraub
- 1992 and 1993: Joseph Weintraub
- 1994: Thomas Whalen
  Anyone can talk to the winning program (using TELNET): http://debra.dgrc.crc.ca/chat/chat.html
Loebner Prize
- December 16, 1995: Joseph Weintraub
  http://www.loebner.net/Prizef/loebner-prize1995.html
- April 16, 1996: Jason Hutchens
- April 29, 1997: David Levy
The winning conversation got off to a hot start:
PROGRAM: Did you see that story on CNN last night about the couple who came out at a White House party on Sunday?
JUDGE: No. I just came in yesterday. I'm still kind of jet-lagged.
PROGRAM: Ellen Degeneres was one of them...
Loebner Prize
- January 11, 1998: Sydney, Australia. The winner was Robby Garner, who won again in 1999.
- 2000 (Golden Jubilee): Dartmouth College. Winner: A.L.I.C.E. by Richard Wallace. No program came close to deceiving the judges.
- 2001: Science Museum, London. Richard Wallace was the winner.
- 2002: Atlanta. EllaZ by Kevin L. Copple.
- 2003: Surrey University.
  BBC feature: http://news.bbc.co.uk/2/hi/technology/3206876.stm
  Talk to Jabberwock: http://www.abenteuermedien.de/jabberwock
- 2004: Richard Wallace.
Loebner Prize
- 2005: New York. Rollo Carpenter.
  BBC news story: http://news.bbc.co.uk/2/hi/technology/4263278.stm
  Talk to George: http://www.jabberwacky.com/chat-george
A sample chat:
George: Are you ready to talk?
Me: Yes.
George: Where are you from?
Me: Can you tell me?
George: Yes.
Me: Please tell me.
George: You have some problem.
Me: What's your name?
George: George.
Me: What does it mean?
George: I don't know.
Me: Can I take leave? I need to go.
George: Sure, me too.
He Failed in 2005
ENTRANT: Welcome judge 4
JUDGE: Hello. What's the weather outside?
ENTRANT: I haven't been outside much today. I am fond of asparagus myself.
JUDGE: How old are you?
ENTRANT: I'm about 30.
JUDGE: About?
ENTRANT: I can talk about Star Trek.
JUDGE: No, let's talk about you. Exactly how old are you?
ENTRANT: Right you are batman.
ELIZA: Imitation Psychiatrist
- Written by Joseph Weizenbaum
- Predates the Loebner Prize
- An ELIZA-style variant, written by a second-year undergraduate at University College Dublin, Ireland, is available for online chat
- WARNING: it utters vulgar words.
(A sketch of ELIZA-style pattern matching follows below.)
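Here is a minimal Python sketch of the ELIZA idea, keyword spotting plus canned reflection templates with no understanding. This is not Weizenbaum's actual program; the tiny rule set below is hypothetical, and the real DOCTOR script is far larger.

import re

# Hypothetical, tiny rule set: (pattern, reply template).
RULES = [
    (r"\bI need (.*)", "Why do you need {0}?"),
    (r"\bI am (.*)",   "How long have you been {0}?"),
    (r"\bmy mother\b", "Tell me more about your family."),
    (r"\byes\b",       "You seem quite sure."),
]

def eliza_reply(utterance: str) -> str:
    # Return the template of the first matching rule, filled with the captured text.
    for pattern, template in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    return "Please go on."  # default when no keyword matches

print(eliza_reply("I am feeling sad about my job"))
# -> "How long have you been feeling sad about my job?"
# (Note: this toy version does no pronoun swapping, unlike the original ELIZA.)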

Is Naive Psychology Required to Pass the TT?
- Without it, the system would show an inability to perceive, recognise, and respond to human mental states.
- A system with an alien intelligence could have no competence with human mental states.
- “There is no such thing as intelligence in general.” ... French
Is TT Complete?
Lynellen D. S. Perry:
- The TT does not seem to be very general.
- It is like the Ellis Island immigration test.
- It focuses too much on the behavior of conversation.
- “That a student speaks lousy Pig Latin does not mean the student is not intelligent.”
Contradiction and Turing’s Replies
Theological objection:
Substance dualists believe that thinking is a function of a non-material soul that somehow “combines” with the body to make a person. Making a body can never be sufficient to guarantee the presence of thought, and digital computers are no different from any other merely material bodies in being utterly unable to think. Human beings are “made in God’s image”.
- God can make things in God’s image.
Contradiction and Turing’s Replies
ESP objection:
If the human participant in the game were telepathic, the interrogator could exploit this fact to determine the identity of the machine.
- Turing proposes that the competitors be housed in a “telepathy-proof room”.
The ‘heads in the sand’ objection:
The idea of sharing a “human” ability with machines was not a pleasant thought, especially in Turing’s time.
- Turing: consolation would be more appropriately sought in the transmigration of souls.
Contradiction and Turing’s Replies
Mathematical objection:
Gödel’s theorem: “In consistent logical systems of sufficient power, we can formulate statements that cannot be proved or disproved within the system.”
- Although it is established that there are limitations to the powers of any particular machine, it has only been stated, without any sort of proof, that no such limitations apply to the human intellect.
Contradiction and Turing’s Replies
Lady Lovelace’s objection:
“Machines cannot originate anything, can never do anything new, can never surprise us.”
- Machines do surprise us quite often.
- The appreciation of something as surprising requires just as much of a creative mental act whether the surprising event originates from a man, a book, a machine, or anything else.
Contradiction and Turing’s Replies
Continuity in the nervous system:
“It is impossible to model the behavior of the nervous system on a discrete-state machine because the former is continuous.”
- Turing believes that the activity of a continuous machine can be “discretized” in a manner that the interrogator cannot notice during the IG.
Argumentation against the TT
Chinese Room
- Are there systems that would pass the Turing Test without being intelligent?
- The “Chinese Room” argument was presented by John R. Searle (1980).
Chinese Room (contd.)
The system consists of:
o A human who understands only English: the CPU
o A rule book written in English: the program
o Stacks of paper: the storage devices
The system is inside a room with a small opening to the outside. Through the opening appear questions written in Chinese. (A toy illustration follows below.)
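As a toy illustration only (not Searle's own formulation), the Python sketch below treats the rule book as a lookup table mapping Chinese questions to Chinese replies; the operator who applies it produces plausible-looking answers while understanding nothing. The entries are hypothetical.

# The "rule book": purely symbol-to-symbol correspondences, no meanings attached.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样？": "今天天气很好。",  # "How is the weather?" -> "The weather is fine."
}

def room_operator(question: str) -> str:
    # The operator only matches the shapes of the symbols against the book;
    # meaning plays no role at any point.
    return RULE_BOOK.get(question, "请再说一遍。")  # default: "please say that again"

print(room_operator("你好吗？"))  # looks like understanding from outside the room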
Replies to Searle’s Objection
1. Simulation versus implementation
2. The convergence argument: Searle fails to take underdetermination into account.
3. Brain modeling versus mind modeling: Searle also fails to appreciate that the brain itself can be understood only through theoretical modeling.
4. “Strong” versus “weak” AI
Replies to Searle’s Objection (contd.)
5. False modularity assumption: the assumption that certain functional parts of human cognitive performance capacity (such as language) can be successfully modeled independently of the rest.
6. The teletype Turing Test versus the robot Turing Test
7. The transducer/effector argument: transduction is necessarily nonsymbolic; drawing on analog and analog-to-digital functions, it can only be simulated, not implemented, symbolically.
On Nordic Seagulls
“Subcognition and the Limits of the Turing Test” ... Robert M. French
o On their isolated Nordic island, the only flying animals are seagulls.
o “Flying is to move through the air.” .... Philosopher 1
o “What about tossing a pebble from the beach out into the ocean?” ....... Philosopher 2
o “Well then, perhaps it means to remain aloft for a certain amount of time.”
o “But clouds and smoke and children’s balloons remain aloft for a very long time. And I can certainly keep a kite in the air as long as I want on a windy day. It seems to me that there must be more to flying than merely staying aloft.”
On Nordic Seagulls
o “Maybe it involves having wings and feathers.”
o “Penguins have both, and we all know how well they fly . . .”
o They do, however, agree that flight has something to do with being airborne and that physical features such as feathers, beaks, and hollow bones are probably superficial aspects of flight.
o Someone may say, “I have invented a machine that can fly.”
ROCKS THAT IMITATE
Keith Gunderson, in his 1964 Mind article, emphasizes two points:
- First, he believes that playing the IG successfully is an end that can be achieved through different means, in particular without possessing intelligence.
- Secondly, he holds that thinking is a general concept and playing the IG is but one example of the things that intelligent entities do.
ROCKS THAT IMITATE
Gunderson’s “toe-stepping game”:
- The game is played between a man (A), a woman (B), and an interrogator (C).
- The interrogator’s aim is to distinguish between the man and the woman by the way his/her toe is stepped on.
- C stays in a room apart from the other two and cannot see or hear the toe-stepping counterparts.
- There is a small opening in the wall through which C can place his/her foot; C has to determine which one of the other two is the woman by the way in which his/her toe is stepped on.
What will happen when a rock box is constructed with an electric eye operating across the opening in the wall, so that it releases a rock onto C’s toe whenever C puts a foot through A’s side of the opening, and thus comes to take the part of A in this game?
THE TT AS SCIENCE FICTION
Richard Purtill, in his 1971 Mind paper,
- criticizes some ideas in Turing’s paper, mainly as a philosopher;
- believes that the IG is interesting, but only as a piece of science fiction;
- finds it unimaginable that a computer capable of playing the IG will be built in the foreseeable future.
ANTHROPOMORPHISM AND THE TT
In a short paper that appeared in Mind in 1973, P. H. Millar argues that
- it is irrelevant whether or how the computers or the human beings involved in the game are “programmed”;
- it is questionable whether the IG is the right setting in which to measure the intelligence of machines;
- the game forces us to “anthropomorphize” machines by ascribing to them human aims and cultural backgrounds.
THE TT INTERPRETED INDUCTIVELY
James Moor (in “An Analysis of the Turing Test”)
- disagrees with the idea that the TT is an operational definition of intelligence; rather, he proposes, it should be regarded as a source of inductive evidence for the hypothesis that machines can think;
- does not agree with the claim that, even if the TT is not an operational definition, it should at least be a necessary condition for granting computers intelligence;
- argues that there could be other evidence, based on the computer’s behavior, that leads to inferences about the computer’s thinking abilities.
BEHAVIORISM AND NED BLOCK
- In “Psychologism and Behaviorism” (Block, 1981), Ned Block attacks the TT as a behaviorist approach to intelligence.
- Block believes that the judges in the TT can be fooled by mindless machines that rely on some simple tricks to operate.
- He proposes a hypothetical machine that would pass the TT but has a very simple information-processing component.
CONSCIOUSNESS AND THE TT
Donald Michie’s “Turing’s Test and Conscious Thought”:
- Turing did not specify whether consciousness is to be assumed if a machine passes the TT. Of course, Turing probably did not believe that consciousness and thought are unrelated.
TTT
Proposed by Harnad:
- The Turing Test is limited to communication via keyboard; this limitation is removed in the Total Turing Test (TTT).
- The computer is a robot that should look, act, and communicate like a human.
To pass the TTT, the computer also needs:
1. Computer vision: to perceive objects
2. Robotics: to move them about
Other Intelligence Tests
TRTTT by Paul Schweizer (1998):
- Robots, as a race, should be able to invent languages, build a society, and achieve results in science.
TTTT by Stevan Harnad (1998):
- A Total Turing Test with neuromolecular indistinguishability.
- Harnad himself thought that if we ever had a system passing the Total Turing Test, all problems would be solved and the TTTT would not be needed.
TT in the Social Sciences
- Genova regards the IG as part of Turing’s general philosophy of ‘transgressing boundaries’.
- Genova suggests that Turing might be marking the woman as an inferior thinker because he believes her to be unable to deceive.
- The rest of the paper considers Turing’s hypothetical hope to create a ‘perfect being’ and draws some analogies between him and Pygmalion.
ARTIFICIAL PARANOIA
- In the ’70s, Turing Tests were used to validate computer simulations of paranoid behavior.
- Colby et al. describe, in their 1971 Artificial Intelligence paper ‘Artificial Paranoia’, a computer program (called PARRY) that attempts to simulate paranoid behavior in computer-mediated dialogue.
- The program emits linguistic responses based on internal (affective) states. To create this effect, three measures are used: FEAR, ANGER, and MISTRUST. Depending on the flow of the conversation, these measures change their values. (A sketch of this idea follows below.)
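The Python sketch below illustrates the idea only; it is not Colby's actual model. Responses are chosen from internal FEAR, ANGER, and MISTRUST variables, which rise or fall depending on what the interlocutor says. The keyword lists, thresholds, and canned replies are hypothetical.

import re

THREATENING = {"police", "crazy", "hospital"}   # hypothetical trigger words
FRIENDLY = {"sorry", "help", "friend"}

class ParryLike:
    def __init__(self):
        self.fear = self.anger = self.mistrust = 0.1  # calm initial state

    def update(self, utterance: str) -> None:
        words = set(re.findall(r"[a-z]+", utterance.lower()))
        if words & THREATENING:            # perceived threat raises all three measures
            self.fear += 0.3; self.anger += 0.2; self.mistrust += 0.2
        elif words & FRIENDLY:             # reassurance lowers fear and mistrust a little
            self.fear = max(0.0, self.fear - 0.1)
            self.mistrust = max(0.0, self.mistrust - 0.1)

    def respond(self, utterance: str) -> str:
        self.update(utterance)
        if self.fear > 0.5 or self.mistrust > 0.5:
            return "Why are you asking me that? Who sent you?"
        if self.anger > 0.4:
            return "You people are all the same."
        return "I went to the track at Bay Meadows a while back."

bot = ParryLike()
print(bot.respond("Have you ever been in a hospital?"))  # still fairly calm
print(bot.respond("Do the police bother you?"))          # fear now dominates the reply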
Conclusion
Challenging the Turing Test is easy, but it does not necessarily move us forward in the right direction.
References
[1] Anderson, D. Is the Chinese Room the real thing? Philosophy 62, 389-393.
[2] Bedworth, J., and Norwood, J. The Turing Test is dead. Proceedings of the 3rd Conference on Creativity and Cognition (October 1999).
[3] Epstein, R. G. Noah, the ark and the Turing Test. ACM SIGCAS Computers and Society 26, 2 (May 1996).
[4] French, R. Subcognition and the limits of the Turing Test. Mind 99, 393 (1990), 53-65.
[5] Harnad, S. Minds, machines and Searle. Journal of Experimental and Theoretical Artificial Intelligence 1, 1 (1989), 5-25.
[6] Harnad, S. The Turing Test is not a trick: Turing indistinguishability is a scientific criterion. SIGART Bulletin 3, 4 (1992), 9-10.
[7] Gunderson, K. The imitation game. Mind 73 (1964), 234-245.
[8] Larsson, J. E. The Turing Test misunderstood. ACM SIGART Bulletin 4, 4 (October 1993).
[9] MacInnes, W. J. Believability in multi-agent computer games: revisiting the Turing Test. CHI 2004 Extended Abstracts on Human Factors in Computing Systems (April 2004).
[10] Moor, J. An analysis of the Turing Test. Philosophical Studies 30 (1976), 249-257.
[11] Perry, L. D. S. The Turing Test. Crossroads 4, 4 (May 1998).
[12] Purtill, R. L. Beating the imitation game. Mind 80 (1971), 290-294.
[13] Rui, Y., and Liu, Z. Artifacial: automated reverse Turing Test using facial features. Proceedings of the Eleventh ACM International Conference on Multimedia (November 2003).
[14] Searle, J. R. Minds, brains and programs. Behavioral and Brain Sciences 3 (1980), 417-424.
[15] Shieber, S. M. Lessons from a restricted Turing Test. Communications of the ACM 37, 6 (June 1994).
[16] Tinkham, N. L., and Provine, D. F. The stage one Turing Test as an artificial intelligence class exercise. ACM SIGCSE Bulletin 26, 2 (June 1994).
[17] Turing, A. M. Computing machinery and intelligence. Mind 59 (1950), 433-460.
Thank You