Transcript
Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group
Overview
Goal
To build robots that can interact with novel objects and participate in novel activities
Challenge
Machine perception can be robust for a specific domain such as face detection, but unlike human perception it is not currently adaptable in the face of change (new objects, changed circumstances)
‘Toil’ Example – Active Segmentation
Object boundaries are not always easy to detect visually, so the robot Cog sweeps its arm through ambiguous areas. This can cause object motion, which makes boundaries much easier to find. The robot can then learn to recognize and segment the object without further contact.
(three-frame image sequence)
‘Theft’ Example – Search Activity
The robot observes a human searching for objects, and learns to make a connection between the named target of the search and the object successfully found. The robot has no predefined vocabulary or object set.
(two-frame image sequence)
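The arm-sweep idea above can be sketched with simple frame differencing: compare a frame from before contact with one during contact, and keep the largest patch of changed pixels as the object mask. This is a minimal stand-in for Cog's actual segmentation method, with hypothetical array inputs:

```python
# Sketch of motion-based segmentation (a stand-in, not Cog's implementation):
# pixels that change when the arm disturbs the scene likely belong to the
# object; the largest connected patch of change is taken as the object mask.
import numpy as np
from collections import deque

def motion_mask(before, during, thresh=25):
    """Binary mask of pixels that changed between two grayscale frames."""
    return np.abs(during.astype(int) - before.astype(int)) > thresh

def largest_region(mask):
    """Largest 4-connected region of True pixels (simple BFS labeling)."""
    seen = np.zeros_like(mask, dtype=bool)
    best = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                region, q = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    region.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(region) > best.sum():
                    best[:] = False
                    for y, x in region:
                        best[y, x] = True
    return best
```

Taking the largest region discards isolated flicker pixels, so only the coherently moving object survives.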
Approach
Integrate conventional machine perception and machine learning with strategies for opportunistic development:
• Active perception (sensorimotor ‘toil’)
• Interpersonal influences (‘theft’)
This work is implemented on a humanoid robot (Cog, see right). The robot uses the structure of familiar activities to learn about novel elements within those activities, and tracks known elements to learn about the unfamiliar activities in which they are used.
Example dialogue from the search activity:
Human: says “Find” “Toma”      Robot: says “Find” “Toma”
Human: (shows car) “No”        Robot: (sees car) “No”
Human: (shows cube) “No”       Robot: (sees cube) “No”
Human: (shows bottle) “Yes!”   Robot: (sees bottle) “Yes”
Human: (shows cube) “Say”      Robot: (sees cube) “Say” “Cube”
Human: (shows bottle) “Say”    Robot: (sees bottle) “Say” “Toma”
This is a good basis for adaptable object perception:
Perspective — familiar activities (tasks, games, …)
↓ use constraint of familiar activity to discover unfamiliar entity used within it
↑ reveal the structure of unfamiliar activities by tracking familiar entities into and through them
Perception — familiar entities (objects, actors, properties, …)
Diagram labels: active probing, segmentation, affordance exploitation (rolling), edge catalog, manipulator detection, object detection/recognition, (robot, human)
This work is funded by DARPA under contract number DABT 63-00-C-10102, and by the Nippon Telegraph and Telephone Corporation under the NTT/MIT collaboration agreement
Development
If a robot is engaged in a known activity, there may be sufficient constraint to identify novel elements within that activity. Similarly, if known elements take part in some unfamiliar activity, tracking those elements can help characterize that activity. Potentially, perceptual development is an open-ended loop of such discoveries.
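The open-ended loop above can be rendered as a toy fixed-point computation: familiar activities reveal novel entities, familiar entities reveal novel activities, and the process repeats until nothing new appears. The world model here (what reveals what) is a hypothetical toy, not the robot's actual mechanism:

```python
# Toy sketch of the open-ended development loop: alternate the two
# discovery directions (activity -> entity, entity -> activity) until
# no further discoveries are made.
def develop(activities, entities, activity_reveals, entity_reveals):
    """Grow the sets of familiar activities and entities to a fixed point."""
    grew = True
    while grew:
        grew = False
        # constraint of a familiar activity -> discover an unfamiliar entity
        for act in list(activities):
            for ent in activity_reveals.get(act, ()):
                if ent not in entities:
                    entities.add(ent)
                    grew = True
        # tracking a familiar entity -> reveal an unfamiliar activity
        for ent in list(entities):
            for act in entity_reveals.get(ent, ()):
                if act not in activities:
                    activities.add(act)
                    grew = True
    return activities, entities

# Hypothetical example: the search activity reveals "toma", which leads
# into a sorting activity, which in turn reveals a cube.
acts, ents = develop(
    {"search"}, {"bottle"},
    activity_reveals={"search": ["toma"], "sorting": ["cube"]},
    entity_reveals={"toma": ["sorting"]},
)
```

One discovery cascades into the next, which is exactly why the loop is open-ended rather than a single pass.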
Learning a sorting activity
Human shows robot where a collection of disparate objects should go, based on some common criterion (color). Robot demonstrates understanding through verbal descriptions and nods towards target locations.
Novel Perspective leads to Novel Perception
Learning a search activity
Human shows robot examples of the search activity by speaking. Robot demonstrates understanding by linking name and object.
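The name-object link can be sketched as cross-situational learning: across episodes of the search activity, count how often each spoken word co-occurs with the object finally found, and take the strongest association. The episode data below is hypothetical, in the spirit of the "toma" example:

```python
# Sketch of linking a spoken name to the found object by counting
# word/object co-occurrences across search episodes.
from collections import defaultdict

def learn_names(episodes):
    """episodes: (spoken_word, object_found) pairs -> word -> likeliest object."""
    co = defaultdict(lambda: defaultdict(int))
    for word, obj in episodes:
        co[word][obj] += 1
    return {w: max(objs, key=objs.get) for w, objs in co.items()}
```

Because the mapping is built from co-occurrence counts rather than a lookup table, no predefined vocabulary or object set is required; an occasional failed search just adds a minority count that is outvoted by successful episodes.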
Experimental Platform
Expressive active-vision head ‘Kismet’ and upper-torso humanoid robot ‘Cog’
Gives opportunity for much development…
Motivation
Training examples are currently a necessary condition for achieving robust machine perception, yet acquiring those examples is properly the role of perception itself. In practice, a human is typically needed to collect them.
Goal
To understand perception by trying to build it
Approach
Extend machine perception to include opportunistic development
The grist: active perception, interpersonal influences
The mill: opportunistic development
• Object boundaries are not always easy to detect visually (e.g. yellow car on yellow table)
• Solution: robot Cog sweeps through ambiguous area
• Resulting object motion helps segmentation
• Robot can learn to recognize and segment object without further contact
Examples
• Opportunities abound and cascade
• Robot can perform “find the toma” style tasks: observes search activity, then uses structure of search activity to learn new properties (object names)
• Searching and sorting
EgoMap: short-term memory of objects and their locations, so “out of sight” is not “out of mind”
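An EgoMap-style short-term memory can be sketched as a store of each object's last known egocentric location, with stale entries forgotten after a timeout. The class and its interface below are a guess at the idea, not Cog's implementation:

```python
# Minimal sketch of an EgoMap-style short-term memory: remember where each
# object was last seen so it can be revisited after it leaves the field of
# view ("out of sight" is not "out of mind").
import time

class EgoMap:
    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds   # forget entries older than this
        self._mem = {}           # name -> (location, timestamp)

    def saw(self, name, location, now=None):
        """Record an observation of an object at an egocentric location."""
        self._mem[name] = (location, time.time() if now is None else now)

    def recall(self, name, now=None):
        """Last known location, or None if never seen or too stale."""
        if name not in self._mem:
            return None
        loc, t = self._mem[name]
        now = time.time() if now is None else now
        return loc if now - t <= self.ttl else None
```

The time-to-live keeps the memory short-term: locations decay rather than accumulating stale beliefs about a world that may have changed.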
Opportunism
The standard approach to machine perception is to develop algorithms which, when provided with sufficient training data, can learn to perform some classification or regression task. We can move one step back and develop algorithms which, given physical opportunities, acquire that training data themselves. This requires designing system behavior side-by-side with the perceptual code.
Opportunistic Development
Suppose there is a property P which normally cannot be perceived, but there exists a situation S in which it can be. Then the robot can try to get into situation S, observe P, and relate it to other perceptual variables that remain observable outside S.
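The P/S idea above can be made concrete with a toy regression: the property P (say, an object's extent) is directly observable only in situation S (while the robot pokes the object and it moves), so the robot enters S for a few objects, records P, and fits a predictor from features that are always observable (here, appearance). All data below is synthetic, purely to illustrate the scheme:

```python
# Toy version of opportunistic development: gather (appearance, P) pairs
# inside situation S, then predict P from appearance alone outside S.
import numpy as np

rng = np.random.default_rng(0)

# Situation S: poking reveals the true property P for 20 objects.
appearance = rng.normal(size=(20, 3))   # always-observable features
true_w = np.array([2.0, -1.0, 0.5])     # hidden relation (synthetic)
P = appearance @ true_w                 # observable only inside S

# Learn to predict P from appearance using the examples gathered in S.
w, *_ = np.linalg.lstsq(appearance, P, rcond=None)

# Afterwards, P can be estimated for a novel object without entering S.
new_obj = rng.normal(size=3)
estimate = new_obj @ w
```

After the fit, entering situation S is no longer necessary: the learned relation carries the once-hidden property over to ordinary observation, which is exactly the cascade of opportunities the poster describes.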
[Diagram: robot degrees of freedom — Head (7 DOFs), Right arm (6 DOFs), Left arm (6 DOFs), Torso (3 DOFs), Stand (0 DOFs), Eyes (3 DOFs), Facial (15 DOFs), Neck (3 DOFs)]
[Diagram: task learning architecture — Instructor, Demonstrated Task, (Speech), Perceptual System, Perceptual Network, Task Learning Mechanism (Sequencing Model, Task Modeling, Task Grounding, State Grounding), Training Data]
[Diagram: perceptual cascade — poking, affordance exploitation (rolling), object segmentation, edge catalog, manipulator detection, object detection (recognition, localization, contact-free segmentation), (robot, human)]
Understanding perception by trying to build it
Machine perception is very fallible. Robots (and humans) need not just particular perceptual competences, but the tools to forge those competences out of raw physical experience. Three important tools for extending a robot’s perceptual abilities, whose importance has been recognized individually, are here related and brought together. The first is active perception, where the robot employs motor action to reliably perceive properties of the world that it otherwise could not. The second is development, where experience is used to improve perception. The third is interpersonal influences, where the robot’s percepts are guided by those of an external agent. Examples are given for object segmentation, object recognition, and orientation sensitivity; initial work on action understanding is also described.
[Figure: segmentation pipeline — camera image; implicated edges found and grouped; response for each object]