Mind Is All That Matters:
Reasons to Focus on Cognitive Technologies
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org
(Image from Oak Ridge National Laboratory)
(Felleman, D.J. and Van Essen, D.C. 1991. Distributed hierarchical processing in primate visual cortex. Cerebral Cortex, 1: 1-47.)
"Book smarts" vs. cognition:
"Book smarts" evokes images of:
• Calculus
• Chess
• Memorizing facts
• Strict rules in well-understood situations
Other stuff that happens in the brain:
• Persuasiveness
• Enthusiasm
• Empathy
• Strategic thinking
• Musical talent
• Rationality
The scale of intelligent minds: a parochial view.
Village idiot → Einstein
The scale of intelligent minds: a parochial view.
Village idiot → Einstein
A more cosmopolitan view:
Mouse → Chimp → Village idiot → Einstein
"Supermen are superthinkers;
anything else is a side issue."
-- SF author Robert Heinlein
(Image © Marlo Steed)
One of these technologies
is not like the others...
Artificial Intelligence
Interplanetary travel
Cancer cure
Nanomanufacturing
"The influence of animal or vegetable life on matter is
infinitely beyond the range of any scientific inquiry
hitherto entered on. Its power of directing the motions of
moving particles, in the demonstrated daily miracle of
our human free-will, and in the growth of generation
after generation of plants from a single seed, are
infinitely different from any possible result of the
fortuitous concurrence of atoms... Modern biologists
were coming once more to the acceptance of something
and that was a vital principle."
-- Lord Kelvin
The cosmopolitan view, with AI on the scale:
Mouse → Chimp → Village idiot → Einstein → AI
In Every Known Human Culture:
• tool making
• weapons
• grammar
• tickling
• meal times
• mediation of conflicts
• dance, singing
• personal names
• promises
• mourning
(Donald E. Brown, 1991. Human Universals. New York: McGraw-Hill.)
A complex adaptation must be
universal within a species.
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture.
In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
A complex adaptation must be
universal within a species.
If: 6 necessary genes,
each at 10% frequency in the population,
then: only 0.1^6 = 1 in 1,000,000 have the complete adaptation.
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture.
In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
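The arithmetic on this slide can be checked directly. A minimal sketch, assuming (as the slide does) that the necessary genes occur independently; the helper name `complete_adaptation_freq` is hypothetical, not from the talk:

```python
def complete_adaptation_freq(n_genes: int, allele_freq: float) -> float:
    """Fraction of the population carrying every necessary gene,
    under the slide's simplification that genes assort independently."""
    return allele_freq ** n_genes

# 6 necessary genes, each at 10% frequency: about 1 in 1,000,000.
freq = complete_adaptation_freq(6, 0.10)
```

At such low frequencies, selection cannot act on the whole adaptation at once, which is why the next slide's incremental pathway is needed.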
Incremental evolution of complexity:
1. [A]      A is advantageous by itself.
2. [A←B]    B depends on A.
3. [A'↔B]   A' replaces A, depends on B.
4. [A'B←C]  C depends on A' and B.
...
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
The Psychic Unity of Humankind
Complex adaptations
must be universal in a species –
including cognitive machinery
in Homo sapiens!
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture.
In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
Must…
not…
emote…
The Great Failure
of Imagination:
Anthropomorphism
Mind Projection Fallacy:
If I am ignorant about a phenomenon,
this is a fact about my state of mind,
not a fact about the phenomenon.
(Jaynes, E.T. 2003. Probability Theory: The Logic of
Science. Cambridge: Cambridge University Press.)
(Diagram: the space of minds-in-general, with human minds as a tiny region inside it.)
(Diagram, next build: transhuman mindspace added around human minds.)
(Diagram, next build: posthuman mindspace added beyond that.)
(Diagram, next build: "Bipping AIs", "Freepy AIs", and "Gloopy AIs" scattered elsewhere in minds-in-general.)
(Diagram, final build: a "YOU ARE HERE" marker on the tiny region of human minds.)
Fallacy of the Giant Cheesecake
• Major premise: A superintelligence could create a mile-high cheesecake.
• Minor premise: Someone will create a recursively self-improving AI.
• Conclusion: The future will be full of giant cheesecakes.
Power does not imply motive.
(Diagram: human minds within minds-in-general, shown again.)
(Diagram: minds-in-general with the "YOU ARE HERE" marker.)
Intelligence → Technology
Cognitive Technology Closes the Loop:
Intelligence ↔ Technology
I. J. Good's "Intelligence Explosion":
(Diagram: AI in a self-improvement loop.)
(Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine.)
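The loop on this slide can be caricatured in a few lines. This is a toy illustration added here, not anything from Good's paper; the linear coupling and the `GAIN` constant are arbitrary assumptions chosen only to show the positive-feedback shape:

```python
GAIN = 0.5  # arbitrary coupling constant (assumption, not from Good 1965)

def run_explosion(initial_intelligence: float, generations: int) -> list[float]:
    """Toy model: each generation, intelligence builds technology,
    and cognitive technology feeds back to increase intelligence."""
    levels = [initial_intelligence]
    for _ in range(generations):
        technology = GAIN * levels[-1]          # technology scales with intelligence
        levels.append(levels[-1] + technology)  # tech improves the mind in turn
    return levels

trajectory = run_explosion(1.0, 5)  # grows geometrically by a factor of 1.5 per step
```

Because the improvement is proportional to the current level, the trajectory is geometric rather than linear: that compounding, not any particular constant, is the point of the slide.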
Mouse → Chimp → Village idiot → Einstein → AI
Friendly AI
Summary:
• Intelligence supports all human abilities.
• Technology comes from intelligence.
• Whatever impacts intelligence lifts the tree by its roots.
• >H equals >H mind.
• Cognitive technology opens vast new regions of mind design space.
• Technology improving intelligence closes the loop: positive feedback!
To grasp the future, focus on technologies that impact upon the mind.
Mind Is All That Matters
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org