Learning
Habituation in Spirostomum ambiguum
• Habituation: A response decrement to repetitive
stimulation
• Spirostomum is a ciliated protozoan
Habituation in Spirostomum ambiguum
• Are multiple cells required for learning?
• Chemical bases of learning
Habituation in Spirostomum ambiguum
• Place individuals on a slide
• Tap the slide with a mechanical stimulus every 4 seconds
• The animal shows a response decrement (habituation)
after 12 to 15 stimuli
• Other studies show that Spirostomum can remember
for at least 10 minutes
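Below is a minimal, hypothetical Python sketch (not from the studies cited here) of how such a response decrement can be pictured: the probability of contracting to a repeated tap decays exponentially, so the decrement emerges after roughly 12 to 15 stimuli. The decay rate is an illustrative assumption.

```python
# Hypothetical habituation curve: response probability to a repeated tap
# decays exponentially. The decay rate 0.15 is an arbitrary illustrative
# value chosen so the decrement appears after roughly 12-15 stimuli.
import math

def response_probability(stimulus_number: int, decay: float = 0.15) -> float:
    """Probability that the animal contracts to the nth tap (1-indexed)."""
    return math.exp(-decay * (stimulus_number - 1))

# Print the simulated curve for 20 taps delivered every 4 seconds.
for n in range(1, 21):
    p = response_probability(n)
    marker = " <- habituated" if p < 0.15 else ""
    print(f"tap {n:2d} (t = {(n - 1) * 4:3d} s): p(contract) = {p:.2f}{marker}")
```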
Ivan Pavlov
• A Russian physiologist
• Discovered the conditioned reflex by chance
E. B. Twitmyer
• A Beauty Never to See Flower
• A discoverer of the conditioned reflex
Pavlov’s Basic Procedure
• Present the US without the CS-The UR is observed
• Pair the CS with the US for a number of trials
• The CR is observed just prior to the US onset
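The slides do not give a formal model, but one common way to picture how the CR grows over repeated CS-US pairings is an error-correction (Rescorla-Wagner style) update, sketched below. The learning rate and asymptote are illustrative assumptions, not values from the lecture.

```python
# Hypothetical acquisition curve: associative strength V grows toward the
# maximum supported by the US (the asymptote) on each CS-US pairing.
def acquisition(trials: int, rate: float = 0.3, asymptote: float = 1.0):
    v = 0.0                            # associative strength before any pairings
    history = []
    for _ in range(trials):
        v += rate * (asymptote - v)    # error-correction step toward the asymptote
        history.append(round(v, 3))
    return history

print(acquisition(10))  # CR strength after each of 10 CS-US pairings
```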
Stages Of Pavlovian Conditioning
Mowrer’s Bell and Pad
A Pavlovian procedure used to treat enuresis (bed-wetting)
Taste Aversion
A Pavlovian procedure that causes people to avoid certain
foods
E. L. Thorndike’s Puzzle Box
Law of Effect
B. F. Skinner and Operant Conditioning
1. My introduction to B.F. Skinner
2. Skinner and society
Operant Conditioning
Four Critical Definitions
Positive Reinforcement-The application of a stimulus that
increases the probability of the response it follows.
Negative Reinforcement-The withdrawal of a stimulus that
increases the probability of the response it follows.
Punishment-The application of a stimulus that decreases the
probability of the response it follows.
Extinction-The withdrawal of a stimulus that decreases the
probability of the response it follows.
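These four definitions form a 2 x 2 arrangement: whether a stimulus is applied or withdrawn, crossed with whether the following response becomes more or less probable. The short sketch below simply encodes that arrangement; it is an illustration, not part of the lecture.

```python
# Hypothetical lookup over the 2 x 2 structure behind the four definitions.
def classify_contingency(stimulus: str, response_probability: str) -> str:
    """stimulus: 'applied' or 'withdrawn'; response_probability: 'increases' or 'decreases'."""
    table = {
        ("applied", "increases"): "positive reinforcement",
        ("withdrawn", "increases"): "negative reinforcement",
        ("applied", "decreases"): "punishment",
        ("withdrawn", "decreases"): "extinction (as defined on this slide)",
    }
    return table[(stimulus, response_probability)]

print(classify_contingency("withdrawn", "increases"))  # negative reinforcement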
Some Negative Side Effects Of Punishment
1. Person who is punished avoids the punisher.
2. Punishment may model inappropriate aggressive
behavior.
3. Punishment can reduce self-esteem.
4. Punishment teaches you what not to do but not what to
do.
5. Punishment teaches people to do the minimum.
6. The use of aversive stimuli is hard to control.
Abu Ghraib
Schedules of Reinforcement
Caswell Center
Caswell Center
A New Orleans Story
Goal Of Behavior Therapy And A Few Definitions
Goal: To provide the individual with better control over
themselves or their environment.
Baseline: Behavior prior to intervention.
Shaping: Reinforcing successive approximations of the
desired behavior.
Prompt: A stimulus used to increase the probability of a
correct response.
Fading: Gradual removal of a prompt.
Chaining: Reinforcing the last behavior in the
sequence, the next to last behavior, etc.
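As an illustration of the chaining definition above, the hypothetical sketch below lists the sub-sequences trained at each stage when the last step is reinforced first. The step names are made up for the example.

```python
# Hypothetical backward-chaining plan: the last step is trained and
# reinforced first, then the last two steps, and so on.
def backward_chaining_plan(steps: list[str]) -> list[list[str]]:
    """Return the sub-sequences trained at each stage, last step first."""
    return [steps[-k:] for k in range(1, len(steps) + 1)]

# Illustrative task broken into steps (hypothetical).
for stage in backward_chaining_plan(["pick up brush", "apply toothpaste", "brush teeth"]):
    print(stage)
```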
Four Types Of Prompts
Physical Assistance: Moving an individual through the
desired responses.
Modeling: Imitating the desired behavior.
Pointing: Designating a location.
Verbal Instruction: Describing how to perform a
particular behavior.
Goal of Behavior Therapy
To give the individual increased control over their life
and/or environment.
Mary Cover Jones: Counterconditioning
Greg and the empty toilet paper roll
What Is Automation?
“Any sensing, detection, information-processing,
decision-making, or control action that could
be performed by humans but is actually
performed by a machine” (Moray, Inagaki, &
Itoh, 2000)
Automation is usually viewed as a continuum,
ranging from manual control to full
automation.
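A minimal sketch of that continuum follows; the intermediate level names are illustrative assumptions, not categories taken from Moray, Inagaki, and Itoh.

```python
# Hypothetical, ordered levels along the manual-to-full-automation continuum.
from enum import IntEnum

class LevelOfAutomation(IntEnum):
    MANUAL = 0            # the human performs the task entirely
    DECISION_SUPPORT = 1  # the machine suggests, the human decides and acts
    SUPERVISED = 2        # the machine acts, the human monitors and can veto
    FULL = 3              # the machine senses, decides, and acts on its own

print(LevelOfAutomation.MANUAL < LevelOfAutomation.FULL)  # True: an ordered continuum
```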
Some Quotes About Technology
But lo! Men have become the tools of their tools.
- Henry David Thoreau
Some Quotes About Technology
It has become appallingly obvious that our
technology has exceeded our humanity. - Albert
Einstein
Some Quotes About Technology
We live in a time when automation is ushering in
a second industrial revolution. - Adlai E.
Stevenson
Some Quotes About Technology
The first rule of any technology used in a business is that
automation applied to an efficient operation will magnify
the efficiency. The second is that automation applied to an
inefficient operation will magnify the inefficiency. - Bill Gates
Four Generations of Artificial
Environments (AEs)
Where we have been, where we are, and where
we are going
First Generation
Unidirectional Communication-Information moves
from the machine to the person but not from the
person to the machine.
Second Generation
Bidirectional Communication-Information moves
from the machine to the person and from the
person to the machine.
Third Generation
Virtual Reality-Information moves from the
machine to the person and from the person to the
machine. Ideally, the synthetic environment is
indistinguishable from the actual environment.
Fourth Generation
Life Simulation-The synthetic and actual
environments are indistinguishable and the person
does not know whether they are in an actual or
synthetic world.
Automation Usage Decisions (AUDs)
AUDs: Choices in which a human operator
has the option of using manual control or one
or more levels of automation (LOAs) to
perform a task.
Some AUDs Are Commonplace
Checkbooks may be balanced with a
calculator or by mental computation
Automobiles can be set to cruise
control or the driver may operate the
accelerator pedal
Stock purchases may be based on the
output of software programs or
investors may depend upon their
subjective assessment of the market
Some AUDs Have Historic Consequences
Casey Jones
Pearl Harbor
Three Mile Island
Some AUDs Have Historic Consequences
USS Greeneville
2000 Election
Types of Automation
• Static: Level of automation is set at
the design stage
• Adaptive: Level of automation varies
depending upon the situation
Optimal And Suboptimal AUDs
If it is assumed that the objective is to perform a
task, the optimal AUD is to employ the level of
control, manual through full automation, that
maximizes the likelihood of a successful outcome.
A suboptimal AUD is a choice to use a level of
control that does not maximize the likelihood of
successfully performing a task.
Types of Suboptimal AUDs
Misuse is overreliance: employing automation
when manual control or a relatively low LOA
has a greater likelihood of success.
Disuse is the underutilization of automation:
manually performing a task that could best be
done by a machine or a higher LOA.
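The following hypothetical sketch restates these definitions as a small decision rule: the optimal AUD is the option with the highest success likelihood, and a suboptimal choice counts as misuse when it is more automated than the optimal option and as disuse when it is less automated. The option names and probabilities are illustrative assumptions, not values from the presentation.

```python
# Hypothetical classifier for a single automation usage decision (AUD).
def classify_aud(chosen: str, p_success: dict[str, float]) -> str:
    optimal = max(p_success, key=p_success.get)   # the optimal AUD
    if chosen == optimal:
        return "optimal"
    # Misuse: chose a more automated option than the optimal one;
    # disuse: chose a less automated option than the optimal one.
    order = ["manual", "decision aid", "full automation"]  # low -> high LOA
    return "misuse" if order.index(chosen) > order.index(optimal) else "disuse"

p = {"manual": 0.80, "decision aid": 0.70, "full automation": 0.60}
print(classify_aud("decision aid", p))  # misuse: the aid was used although manual control was more likely to succeed
```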
Errors Resulting in Misuse and/or Disuse
Recognition Errors-Operator fails to recognize
that an alternative, either automated or manual,
is available.
Appraisal Errors-Operator inaccurately
estimates the utilities of the options.
Intent Errors (also called action errors)-Operator
knowingly selects the alternative that does not
maximize the likelihood of task success.
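A hypothetical sketch of how the three error types can be separated for a single choice between manual control and a decision aid follows; the probability values are illustrative assumptions, not data from the presentation.

```python
# Hypothetical diagnosis of recognition, appraisal, and intent errors.
def diagnose_error(considered: list[str], estimated: dict[str, float],
                   true: dict[str, float], chosen: str) -> str:
    best_true = max(true, key=true.get)
    if chosen == best_true:
        return "no error: the choice maximized the true likelihood of success"
    if best_true not in considered:
        return "recognition error: the better alternative was never considered"
    best_estimated = max(estimated, key=estimated.get)
    if best_estimated != best_true:
        return "appraisal error: the utilities were misjudged"
    return "intent error: the operator knew the better option and chose otherwise"

print(diagnose_error(
    considered=["manual", "aid"],
    estimated={"manual": 0.60, "aid": 0.90},  # operator believes the aid is better
    true={"manual": 0.60, "aid": 0.90},
    chosen="manual"))                         # ...but relies on manual control anyway
```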
Two Images of an Operator
An operator is a single-minded individual whose
sole objective is to maximize task performance.
An operator's decision to rely on automation is
based on a number of contingencies, only one of
which is to achieve successful performance.
Intent Errors and Decision Aids:
Doing It Your Way When Your
Way Is Obviously Wrong
Decision Aids And Intent Errors
Probably no area of automation has proved
more problematical than the introduction of
decision aids
Beck, Dzindolet and Pierce contended that much
of the disuse of decision aids is due to intent
errors
That is, operators refuse “advice” from a
decision aid that they know would improve their
performance
200 “Training” Trials
Participants viewed a series of slides on the
computer screen, half of which contained a
soldier in camouflage.
Machine Absent: Pressed a “button” to indicate
if the soldier was present or absent
Machine Present: 1) Pressed a “button” to
indicate if the soldier was present or absent and
2) Received the decision aid’s response
100 “Test” Trials
Participants viewed a series of slides on the
computer screen, half of which contained a
soldier in camouflage.
Machine Absent: Pressed a “button” to indicate
if the soldier was present or absent
Machine Present: 1) Received the decision aid’s
“recommendation” and 2) Pressed a “button” to
indicate if the soldier was present or absent
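The following is a rough, hypothetical sketch of one machine-present test trial as described above; it is not the authors' experimental software, and the accuracy values are assumptions used only to make the example run.

```python
# Hypothetical simulation of one machine-present "test" trial: the aid's
# recommendation is generated first, then the participant responds.
import random

def test_trial(soldier_present: bool, aid_accuracy: float = 0.9,
               operator_accuracy: float = 0.75) -> dict:
    aid_says_present = (soldier_present if random.random() < aid_accuracy
                        else not soldier_present)
    # The participant sees the recommendation before pressing the "button";
    # here the response simply reflects the operator's own accuracy.
    operator_says_present = (soldier_present if random.random() < operator_accuracy
                             else not soldier_present)
    return {"soldier_present": soldier_present,
            "aid_recommendation": aid_says_present,
            "operator_response": operator_says_present}

# Half of the 100 test slides contain a camouflaged soldier.
trials = [test_trial(soldier_present=(i % 2 == 0)) for i in range(100)]
print(sum(t["operator_response"] == t["soldier_present"] for t in trials), "correct")
```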
Results
[Bar graph: mean errors (0 to 12) in the Aid Absent and Aid Present conditions.]
Operators In Machine Present Condition
[Bar graph: number of operators (0 to 18) in the Disagreed and Agreed categories.]
Machine Present Condition: Estimated Accuracies
[Bar graph: estimated accuracy (0 to 100) for Own Accuracy and Aid Accuracy.]
To Shoot Or Not To Shoot
To Shoot Or Not To Shoot
Since 1900, 10% to 25% of US war fatalities have
resulted from fratricide
Targeting Decisions: Possible Outcomes
1) Soldier and CID detect a friend.
2) Soldier and CID fail to detect a friend.
3) Soldier detects a friend and CID fails to detect
a friend.
4) Soldier fails to detect a friend and CID detects
a friend.
Automation Usage Decisions (AUDs)
AUDs- Choices in which a human operator has the
option of relying upon manual control or one or
more levels of automation (LOAs) to perform a task.
Optimal AUD-Soldier relies upon the form of
control that is most likely to result in a correct
decision.
Types of Suboptimal AUDs
Misuse is overreliance: the soldier employs
automation when manual control or a relatively
low LOA has a greater likelihood of success.
Disuse is the underutilization of automation: the
soldier manually performs a task that could best
be done by a machine or a higher LOA.
Beck, Dzindolet, & Pierce (2002)
Appraisal Errors-Soldier misjudges the relative
utilities of the automated (CID) and non-automated
(e.g., view through the gun sight) options.
Intent Errors-Soldier disregards the utilities of
the alternatives when making AUDs.
Intent Errors: Two Images of an Operator
An operator is a single-minded individual whose
sole objective is to maximize task performance.
An operator's decision to rely on automation is
based on a number of contingencies, only one of
which is to achieve successful performance.
John Henry Effect
John Henry Effect: Operators respond to
automation as a challenger, competitor, or threat
Increasing the operator’s personal involvement
with the non-automated alternative augments the
likelihood of a John Henry Effect.
John Henry Effect
Variables that increase the strength of a John
Henry Effect augment operators' preference for
the non-automated over the automated
alternative.
Heightened preference for the non-automated
option should: 1) increase disuse and 2) decrease
misuse
Design
2 (Operator: Self-reliant, Other-reliant) x
2 (Machine Performance: Inferior,
Superior) x 14 (Trial Blocks) design
Dependent Variable: Suboptimal AUDs
(Superior Machine: Basing credit points
on the operator’s performance; Inferior
Machine: Basing credit points on the
machine’s performance)
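A hypothetical sketch (not the authors' scoring code) of how this dependent variable could be computed from the credit choices described above:

```python
# Hypothetical scoring of suboptimal AUDs: a credit choice is suboptimal
# when credit points are based on the worse performer in that condition.
def count_suboptimal_auds(machine: str, credit_choices: list[str]) -> int:
    """machine: 'superior' or 'inferior'; credit_choices: 'self' or 'machine' per trial."""
    # Superior machine: basing credit on one's own performance is suboptimal (disuse).
    # Inferior machine: basing credit on the machine's performance is suboptimal (misuse).
    suboptimal_choice = "self" if machine == "superior" else "machine"
    return sum(choice == suboptimal_choice for choice in credit_choices)

print(count_suboptimal_auds("superior", ["self", "machine", "self", "self"]))  # 3
```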
Credit Choice Screen
Sample Helicopter Photograph
Sample Helicopter Photograph
Operator Response Screen
CID Response Screen
Results Screen
Hypotheses
• Self-reliant operators will be less likely to base
credit points on the CID than other-reliant
operators
• Therefore
– Disuse will be greater in the self-superior
than in the other-superior condition
– Misuse will be higher among other-inferior
than self-inferior persons
Disuse
[Line graph: mean suboptimal AUDs (0 to 15) for the Self and Other operator groups across trial blocks 1 to 14 (20 trials per trial block).]
Figure 1. Mean suboptimal automation usage decisions (AUDs) as a function of
operator and trial block for persons working with the superior machine.
Misuse
[Line graph: mean suboptimal AUDs (0 to 9) for the Self and Other operator groups across trial blocks 1 to 14 (20 trials per trial block).]
Figure 2. Mean suboptimal automation usage decisions (AUDs) as a function of
operator and trial block for persons working with the inferior machine.
Conclusions
1) Self-reliant and other-reliant operators were yoked.
Each had the same information. It seems reasonable to
conclude that the difficulty in determining the optimal
AUD was approximately equal in both conditions. Thus,
the large differences in suboptimal AUDs were
probably due to intent rather than appraisal errors.
2) Results support the hypothesis that factors which
augment the degree of personal involvement or
challenge from automated devices will increase the
probability of disuse and decrease the likelihood of
misuse.
A Few Implications
1) Operator training programs should attempt to
attenuate intent as well as appraisal errors.
2) At least on this task, intent errors were a significant
source of suboptimal AUDs.
3) Both appraisal and intent errors are sufficient to
produce suboptimal AUDs, although neither is
necessary.
4) It will be a hollow achievement if advances in our
knowledge of hardware and software are not
matched by an equally sophisticated comprehension
of the causes and control of misuse and disuse.