Section 2.3
I, Robot
Mind as Software
McGraw-Hill
© 2013 McGraw-Hill Companies. All Rights Reserved.
Functionalism
• According to functionalism, mental states are functional states.
• To perform a function is to take a certain input and produce a certain output.
• When two things perform the same function, they are said to have the same “causal role.” (A toy state table is sketched below.)
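The input-and-output talk above can be made concrete with a toy state-transition table. The following Python sketch is only an illustration and is not taken from the slides; the state names, stimuli, and responses are invented for the example.

    # A minimal, invented functional-state table: a state is identified entirely by
    # its causal role, i.e., by what output and next state each input yields.
    PAIN_TABLE = {
        # (current state, input):    (output, next state)
        ("calm", "tissue damage"):   ("wince", "pain"),
        ("pain", "tissue damage"):   ("groan", "pain"),
        ("pain", "aspirin"):         ("relax", "calm"),
        ("calm", "aspirin"):         ("no reaction", "calm"),
    }

    def step(state, stimulus, table=PAIN_TABLE):
        """Return (behavioral output, next state); unrecognized stimuli change nothing."""
        return table.get((state, stimulus), ("no reaction", state))

    print(step("calm", "tissue damage"))   # ('wince', 'pain')

On this picture, anything that realizes the same table, whatever it is made of, occupies the same causal roles.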
Functionalism vs. Behaviorism
• For behaviorism, mental states are neither causes nor effects.
• The only causes behaviorism recognizes are physical stimuli, and the only effects are physical responses.
• For functionalism, mental states are both causes and effects. (The contrast is sketched below.)
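To make the contrast concrete, here is a purely illustrative Python sketch (the stimuli and responses are invented): the behaviorist model maps stimuli straight to responses with no internal state, while the functionalist model posits an internal state that is an effect of the stimulus and a cause of the response.

    # Behaviorism: stimulus in, response out, nothing in between.
    def behaviorist_response(stimulus):
        return {"tissue damage": "wince", "aspirin": "no reaction"}.get(stimulus, "no reaction")

    # Functionalism: an internal state mediates between stimulus and response.
    class FunctionalistAgent:
        def __init__(self):
            self.state = "calm"              # internal state with a causal role

        def respond(self, stimulus):
            if stimulus == "tissue damage":
                self.state = "pain"          # the stimulus causes a mental state...
            elif stimulus == "aspirin":
                self.state = "calm"
            return "groan" if self.state == "pain" else "no reaction"   # ...which causes behavior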
Artificial Intelligence
• The goal of artificial intelligence is to create a machine that can think for itself, one that has a mind of its own.
• According to strong AI, there’s nothing more to having a mind than running the right kind of program.
• Strong AI claims that the mind is to the brain as the software of a computer is to its hardware.
Thought Experiment: Lewis’s Pained Madman
• “There might be a strange man who sometimes feels pain, just as we do, but whose pain differs greatly from ours in its causes and effects.”
• This possibility suggests that being in a particular functional state is not a necessary condition for being in a mental state.
Functionalism and Feeling
1. If functionalism were true, it would be impossible for someone to be in pain and function differently than we do when we are in pain.
2. But, as Lewis’s pained madman shows, that’s not impossible.
3. So, functionalism is false; being in a certain functional state is not a necessary condition for being in a mental state.
Thought Experiment: Block’s Chinese Nation
• “Suppose we convert the government of China to functionalism, and we convince its officials that it would enormously enhance their international prestige to realize a human mind for an hour.”
• Suppose the people of China run a mind program. Would there now be another mind on Earth?
• This is known as the “absent qualia objection” to functionalism.
Block’s Argument
1. If functionalism were true, then anything that had the right sort of functional organization would have a mind.
2. But, as Block’s Chinese nation shows, something can have the right sort of functional organization and not have a mind.
3. So functionalism is false; having the right sort of functional organization is not a sufficient condition for having a mind.
Thought Experiment: Putnam’s Inverted Spectrum
• “Imagine your spectrum becomes inverted at a particular time in your life and you remember what it was like before that.”
• Imagine further that you learn to function as before.
• This possibility suggests that being in a particular functional state is not sufficient for being in a particular mental state.
Putnam’s Argument
1. If functionalism were true, it would be impossible for people with the same functional organization to have different mental states.
2. But, as Putnam’s inverted spectrum shows, it is possible for people with the same functional organization to have different mental states.
3. So functionalism is false; having a certain functional organization is not a sufficient condition for being in a certain mental state.
Thought Probe: Pseudonormal Vision
• Pseudonormal vision may occur when the sensations of green and red are reversed.
• People with pseudonormal vision would be functionally indistinguishable from normal people.
• Does the possibility of pseudonormal vision support the claim that functionalism can’t account for conscious experiences?
Thought Experiment: The Turing Test
• Suppose an interrogator is allowed to conduct a conversation with both a human and a computer and, after a specified period of time, cannot tell which is which.
• Turing claims that any computer that passed such a test would have to be able to think. (A toy version of the protocol is sketched below.)
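The protocol can be pictured as a short loop. The sketch below is a toy rendering only; the respondent functions, the question list, and the guessing rule are assumptions for illustration, not anything Turing specifies.

    import random

    # Toy imitation game: the interrogator sees two anonymized transcripts and must
    # name which respondent is the machine.
    def run_turing_test(interrogate, human_reply, machine_reply, questions):
        respondents = {"A": human_reply, "B": machine_reply}
        if random.random() < 0.5:                       # hide which label is which
            respondents = {"A": machine_reply, "B": human_reply}
        transcripts = {label: [(q, reply(q)) for q in questions]
                       for label, reply in respondents.items()}
        guess = interrogate(transcripts)                # interrogator returns "A" or "B"
        machine_label = next(l for l, r in respondents.items() if r is machine_reply)
        return guess == machine_label                   # chance-level accuracy over many runs = a pass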
Thought Experiment: Searle’s Chinese Room
• Suppose that a person in a room with a rulebook and a set of Chinese symbols were able to use them to answer questions put to him in Chinese.
• The person could do this without understanding Chinese.
• This possibility shows that performing a particular function is not sufficient for understanding meaning. (A minimal rulebook lookup is sketched below.)
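As an illustration only (the rulebook entries below are invented), the occupant’s task can be modeled as pure symbol lookup; at no point does the program attach meaning to the strings it matches.

    # The "rulebook" as a lookup table of symbol strings. The English glosses are
    # for the reader; the occupant never sees or needs them.
    RULEBOOK = {
        "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I am fine, thank you."
        "你会说中文吗？": "会，一点点。",    # "Do you speak Chinese?" -> "Yes, a little."
    }

    def room_occupant(question):
        """Match the incoming symbols against the rulebook and hand back the listed symbols."""
        return RULEBOOK.get(question, "对不起，我不明白。")   # default reply, also just copied symbols

    print(room_occupant("你好吗？"))   # a sensible-looking answer, produced with zero understanding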
Syntax and Semantics
• How a symbol can be combined with other symbols to form a sentence is determined by its syntax.
• What a symbol means is determined by its semantics.
• Searle’s point: syntax does not equal semantics. One can put together syntactically correct strings of symbols without knowing what they mean.
Searle’s Argument
1. If a computer could understand a language solely in virtue of running a program, then the man in the room would understand Chinese.
2. But the man in the room doesn’t understand Chinese.
3. So computers can’t understand a language solely in virtue of running a program.
Replies to the Chinese Room
• Systems reply: the man in the room doesn’t understand Chinese, but the whole system does.
• Robot reply: the man in the room doesn’t understand Chinese, but if the room were put in a robot, the robot would.
• Brain simulator reply: the man in the room doesn’t understand Chinese, but if the program simulated nerve firings, the system would.
• Combination reply: even if each of the above replies is inadequate, taken together they would create a system that understands Chinese.
Searle’s Chinese Gym
• Connectionist machines attempt to mimic the architecture of the human brain by connecting processors in parallel.
• Because there is no central processor, some believe such machines don’t fall prey to the Chinese Room Argument.
• Searle counters by postulating a gym full of people carrying out the same operations as the nodes in a connectionist machine. (A toy version is sketched below.)
• In such a situation, the system as a whole would not understand what the symbols mean.
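The gym can be pictured as people standing in for the nodes of a tiny feed-forward network. The sketch below is invented for illustration; the weights and inputs stand for nothing in particular, and each “person” does only local arithmetic.

    # One person in the gym: multiply, add, threshold, and pass the number along.
    def node(inputs, weights, bias):
        return max(0.0, sum(i * w for i, w in zip(inputs, weights)) + bias)

    layer_inputs = [1.0, 0.0, 1.0]                          # encoded symbols, just numbers to the people
    hidden = [node(layer_inputs, [0.5, -0.2, 0.1], 0.0),    # person 1
              node(layer_inputs, [0.3, 0.8, -0.5], 0.1)]    # person 2
    output = node(hidden, [1.0, -1.0], 0.0)                 # person 3 produces the gym's collective "answer"
    # Searle's claim: no individual, and no collection of them, thereby attaches
    # meaning to the symbols being processed.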
Thought Probe: Total Turing Test
• To pass the Total Turing Test, the computer being tested would have to be able to do everything that a normal human being does, including walking, riding a bicycle, swimming, dancing, playing a musical instrument, and so on.
• Is passing the Total Turing Test either necessary or sufficient for being intelligent and thus having a mind?
Intentionality
• Intentionality is the property of being of or about something.
• Mental states can have intentionality because they can be of or about something.
  • For example: the belief that the Yankees will win the pennant is about the Yankees, the pennant, and the proposition that the Yankees will win the pennant.
Intentionality and the Chinese Room
• An adequate theory of the mind should explain how it is possible to think about things.
• Searle claims that the Chinese room shows that functionalism cannot account for intentionality.
Machines and the Chinese Room
• Searle does not take the Chinese room to show that machines can’t think, for we are machines and we think!
• What it shows is that there is more to thinking than running a computer program.
Thought Experiment: Block’s Conversational Jukebox
• Suppose that all of the intelligent conversations that could be had in an hour are stored as a list on a tape. Suppose further that a computer carries on a conversation by searching the list.
• The computer would seem to be intelligent, but that would be an illusion.
• This possibility shows that performing a certain function is not sufficient for being in a mental state. (A lookup-only version is sketched below.)
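As a purely illustrative sketch (the stored conversations below are invented stand-ins for Block’s astronomically long tape), the jukebox answers by prefix-matching the dialogue so far against prestored whole conversations.

    # Every reply comes from brute search over canned conversations, not from thought.
    STORED_CONVERSATIONS = [
        ["Hello.", "Hi there!", "How are you?", "Fine, thanks. And you?"],
        ["Hello.", "Hi there!", "What's the weather like?", "Sunny, I hear."],
    ]

    def jukebox_reply(dialogue_so_far):
        """Find a stored conversation that begins with the dialogue so far; return its next line."""
        for conv in STORED_CONVERSATIONS:
            if conv[:len(dialogue_so_far)] == dialogue_so_far and len(conv) > len(dialogue_so_far):
                return conv[len(dialogue_so_far)]
        return "I have nothing to say to that."

    print(jukebox_reply(["Hello."]))   # 'Hi there!'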
Thought Probe: Devout Robots
• Suppose a robot that passed the Turing test asked to be baptized.
• Should it be?
• Should it be given the same rights that we have?