Classical Conditioning
Please keep in mind that "learning theory" is associated with the psychological perspective of BEHAVIORISM.
Learning: a relatively permanent change in an organism's behavior due to experience.
Behaviorism: the view that psychology (1) should be an objective science that (2) studies behavior without
reference to mental processes. Most research psychologists today agree with (1) but not with (2).
Associative Learning: learning that two events (2 stimuli in the case of classical conditioning, or a response and
its consequence in operant conditioning) occur together.
CLASSICAL CONDITIONING
Classical Conditioning: a type of learning in which an organism comes to associate two stimuli. A neutral
stimulus that signals an unconditioned stimulus (UCS) begins to produce a response that anticipates and
prepares for the unconditioned stimulus. Also called Pavlovian Conditioning.
Unconditioned Stimulus (UCS): in classical conditioning, a stimulus that unconditionally--naturally and
automatically--triggers an unconditioned response (UCR).
Unconditioned Response (UCR): in classical conditioning, the unlearned, naturally occurring response to the
unconditioned stimulus (UCS), such as salivation when presented with food.
Conditioned Stimulus (CS): in classical conditioning, an originally irrelevant or Neutral Stimulus (NS) that, after
association with an unconditioned stimulus (UCS), comes to elicit a conditioned response (CR).
Conditioned Response (CR): in classical conditioning, the learned response to a previously neutral conditioned
stimulus (CS).
Before Conditioning
UCS (food)→UCR (salivation) & NS (bell)→no salivation
During Conditioning
NS (bell) + UCS (food)→UCR (salivation)
After Conditioning
CS (bell)→CR (salivation)
**Remember: During classical conditioning, the neutral stimulus (NS) must be presented immediately BEFORE
the UCS. After conditioning, the NS will become the conditioned stimulus (CS). Also, keep in mind that the
unconditioned response (UCR) and the conditioned response (CR) are often very similar, if not identical to one
another.
Acquisition: the initial stage in classical conditioning. The phase associating a neutral stimulus with an
unconditioned stimulus so that the neutral stimulus becomes a conditioned stimulus and elicits a conditioned
response.
Extinction: the diminishing of a conditioned response. It occurs in classical conditioning when the UCS stops
being paired with the CS (e.g., the bell is presented without being followed by the food).
Spontaneous Recovery: the reappearance, after a rest period, of an extinguished conditioned response.
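Acquisition and extinction can be pictured with a toy simulation. Nothing below comes from these notes: the linear update rule, the learning rate, and the trial counts are illustrative assumptions, chosen only to show the CR strengthening during CS-UCS pairings and fading when the CS appears alone:

```python
# Toy model of acquisition and extinction in classical conditioning.
# Associative strength v moves toward an asymptote (lam) on each trial:
# paired trials (CS + UCS) use lam = 1.0; CS-alone trials use lam = 0.0.
# The learning rate and the linear update rule are illustrative assumptions.

def run_trials(v, lam, rate, n):
    """Update associative strength v over n trials toward asymptote lam."""
    history = []
    for _ in range(n):
        v = v + rate * (lam - v)   # move a fraction of the way toward lam
        history.append(round(v, 3))
    return v, history

v = 0.0                                  # before conditioning: the NS elicits nothing
v, acq = run_trials(v, 1.0, 0.3, 10)     # acquisition: bell paired with food
v, ext = run_trials(v, 0.0, 0.3, 10)     # extinction: bell alone, no food

print("after acquisition:", acq[-1])     # near 1.0 -> strong CR
print("after extinction:", ext[-1])      # back near 0.0 -> CR diminished
```

Note that the curve never quite reaches zero during extinction, which fits the idea that the learned association is suppressed rather than erased (hence spontaneous recovery after a rest period).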
Generalization: the tendency, once a response has been conditioned, for stimuli similar to the conditioned
stimulus to elicit similar responses.
Discrimination: the learned ability to distinguish between a conditioned stimulus (e.g., bell) and other stimuli
that do not signal an unconditioned stimulus (e.g., telephone ringing).
Biological Predispositions: the understanding that an animal's capacity for conditioning is constrained by its
biology (e.g., it is much easier to condition a rat to avoid certain tastes than certain sounds, because rats
naturally use taste to determine whether food is "good").
Little Albert: a young child who was conditioned to fear rats after a rat was paired with a terribly loud noise.
John B. Watson carried out this study and is considered the "father of behaviorism."
OPERANT CONDITIONING
Associative Learning: learning that two events (a response and its consequence in operant conditioning, or 2
stimuli in classical conditioning) occur together.
Operant Conditioning: a type of learning in which behavior is strengthened if followed by a reinforcer
(positive or negative) and weakened if followed by a punisher.
Respondent Behavior: behavior that occurs as an automatic response to some stimulus; Skinner's term for
behavior learned through classical conditioning.
Operant Behavior: Skinner's term for behavior that operates on (affects) the environment, producing
consequences.
Law of Effect: Thorndike's principle that behaviors followed by favorable consequences become more likely,
and that behaviors followed by unfavorable consequences become less likely.
Operant Chamber (Skinner Box): a chamber containing a "bar" that an animal can manipulate to receive a food
or water reinforcer, with associated devices to record the animal's rate of bar pressing.
Shaping: an operant conditioning procedure in which reinforcers guide behavior toward closer and closer
approximations of the desired behavior.
Reinforcer: in operant conditioning, any event (consequence) that strengthens the behavior it follows.
Positive Reinforcer: a typically pleasurable stimulus that follows a response (e.g., getting a hug). It strengthens
and increases the response.
Negative Reinforcer: an aversive stimulus that is removed following a response (e.g., the buzzer stopping once
you fasten your seatbelt). It strengthens and increases the response. It is NOT the same thing as punishment.
Operant Conditioning: Contingencies of Reinforcement & Punishment
Appetitive (pleasant) stimulus
ADD → Positive (+) Reinforcement: the behavior preceding the consequence is strengthened; it is more likely to occur again.
REMOVE → Punishment (referred to as negative punishment, or response cost): the behavior preceding the consequence is weakened; it is less likely to occur again.
Aversive (unpleasant) stimulus
ADD → Punishment (referred to as positive punishment): the behavior preceding the consequence is weakened; it is less likely to occur again.
REMOVE → Negative (-) Reinforcement: the behavior preceding the consequence is strengthened; it is more likely to occur again.
Types of Reinforcement and Punishment
Primary Reinforcer: Reinforcer that is rewarding in and of itself (e.g., food, water, and sex).
Secondary Reinforcer: Reinforcer whose value is LEARNED through association with primary reinforcers (e.g.,
money, nice car, good grades, etc.).
Primary Punisher: Punishment that is unpleasant in and of itself (e.g., physical pain or discomfort).
Secondary Punisher: Punisher whose aversiveness is LEARNED (e.g., poor grades, having a bad hair day, etc.).
Primary Reinforcer: an innately reinforcing stimulus, such as one that satisfies a biological need (e.g., food or
water).
Secondary (or Conditioned) Reinforcer: a stimulus that gains its reinforcing power through its association with a
primary reinforcer (e.g., money).
**Remember: Immediate reinforcers (and punishers) are much more effective than delayed reinforcers (and
punishers).
Schedules of Reinforcement
Continuous Reinforcement: reinforcing the desired response every time it occurs.
Partial (intermittent) Reinforcement: reinforcing a response only part of the time. This results in slower
acquisition of a response but with much greater resistance to extinction than a continuous schedule of
reinforcement.
Fixed-ratio: reinforcement of a response only after a specific number of responses have occurred.
Variable-ratio: reinforcement of a response after an unpredictable number of responses have occurred.
Fixed-Interval: reinforcement of a response after a specific amount of time has elapsed.
Variable-Interval: reinforcement of a response after an unpredictable amount of time has elapsed.
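These four schedules differ only in the rule that decides when a response earns the reinforcer, which a short sketch can make concrete. Everything here (the closures, the one-response-per-time-step loop, the parameter value 5) is an illustrative assumption, not anything from the notes:

```python
import random

rng = random.Random(0)  # seeded so the illustration is reproducible

def fixed_ratio(n):
    """Reinforcer after every n-th response."""
    responses = 0
    def on_response(t):
        nonlocal responses
        responses += 1
        return responses % n == 0
    return on_response

def variable_ratio(mean_n):
    """Reinforcer after an unpredictable number of responses (avg. mean_n)."""
    target, responses = rng.randint(1, 2 * mean_n), 0
    def on_response(t):
        nonlocal target, responses
        responses += 1
        if responses >= target:
            responses, target = 0, rng.randint(1, 2 * mean_n)
            return True
        return False
    return on_response

def fixed_interval(dt):
    """First response after dt time units since the last reinforcer pays off."""
    last = 0
    def on_response(t):
        nonlocal last
        if t - last >= dt:
            last = t
            return True
        return False
    return on_response

def variable_interval(mean_dt):
    """Like fixed-interval, but the required wait is unpredictable."""
    last, wait = 0, rng.randint(1, 2 * mean_dt)
    def on_response(t):
        nonlocal last, wait
        if t - last >= wait:
            last, wait = t, rng.randint(1, 2 * mean_dt)
            return True
        return False
    return on_response

# One response per time step for 20 steps; count reinforcers per rule.
schedules = {"FR-5": fixed_ratio(5), "VR-5": variable_ratio(5),
             "FI-5": fixed_interval(5), "VI-5": variable_interval(5)}
counts = {name: sum(rule(t) for t in range(1, 21)) for name, rule in schedules.items()}
print(counts)  # FR-5 delivers exactly 4 reinforcers in 20 responses
```

Note the ratio rules count responses while the interval rules watch the clock, which is the whole distinction between the two families.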
Punishment
Punishment: an event that decreases the behavior it follows.
Positive Punishment: following a response with an aversive stimulus, thus weakening the response (e.g.,
spanking a child).
Negative Punishment: following a response with the removal of a pleasant stimulus, thus weakening the
response (e.g., taking away TV privileges).
Problems with Punishment: (1) it is only temporary; (2) it doesn't teach the correct behavior; (3) it can create
aggressive behavior in the organism being conditioned; and (4) the organism may become classically
conditioned to fear the punisher (through the association of pain (UCS) with the punisher (CS)).
The A-B-Cs of Behaviorism
Antecedent: Mom says, "Bobby, clean your room".
Behavior: Bobby cleans his room.
Consequence: Mom gives Bobby a hug.
Cognition & Operant Conditioning
Cognitive Map: a mental representation of the layout of one's environment.
Latent Learning: learning that occurs but is not apparent until there is an incentive to demonstrate it.
Overjustification Effect: the effect of promising a reward for doing what one already likes to do. The person
may now see the reward, rather than intrinsic interest, as the motivation for performing the task, and thus lose
interest.
Biological Predispositions
As with classical conditioning, an animal's natural predispositions constrain its capacity for operant
conditioning. For example: Pigeons easily learn to flap their wings to avoid a shock or to peck at a bar to obtain
food, because they naturally flap their wings to flee from danger and peck to obtain food. However, they have
a hard time learning to flap their wings to obtain food or to peck at a bar to avoid a shock.
Contrasting Operant & Classical Conditioning
Please review the CHART!
It is extremely important that you clearly understand the similarities and differences in these two conditioning
techniques.
Classical vs. Operant Conditioning
The Response
Classical (Pavlovian) Conditioning: involuntary; automatic.
Operant Conditioning: voluntary; the behavior "operates" on (affects) the environment.
Acquisition
Classical: associating events in the environment; the CS announces the UCS (i.e., the bell "announces" the food).
Operant: associating a behavioral response with a consequence (either a reinforcer or a punisher).
Extinction
Classical: the CR (salivation) decreases when the CS (bell) is repeatedly presented but not followed by the UCS (food).
Operant: the behavior decreases when reinforcement stops. (Don't forget, however, that the behavior will probably increase in frequency prior to extinction.)
Cognitive Processes
Classical: subjects develop an "expectation" that the CS (bell) signals the imminent arrival of the UCS (food).
Operant: subjects develop an "expectation" that a behavioral response will be reinforced or punished.
Biological Predispositions
Classical: natural predispositions constrain what stimuli and responses can easily be associated.
Operant: organisms best learn behaviors similar to their natural behaviors; unnatural behaviors instinctively drift back toward natural ones.
OBSERVATIONAL LEARNING
Observational Learning: learning by observing the behavior of others (e.g., Bandura's experiments with
children and the Bobo doll).
Modeling: the process of observing and imitating a specific behavior. (While children clearly learn to model
antisocial behavior they see in the media, they can also learn to model prosocial behavior).
Prosocial Behavior: positive, constructive, helpful behavior. The opposite of antisocial behavior.
People to know: Pavlov, Watson, Thorndike, Skinner, Tolman, Bandura
Terms shared by Classical Conditioning and Operant Conditioning
Extinction
Spontaneous Recovery
Generalization
Discrimination
Differences between Classical and Operant Conditioning
Classical Conditioning
Involuntary (The participant is passive.)
The Conditioned Stimulus (CS) must come before the Unconditioned Stimulus (UCS).
Operant Conditioning
Voluntary (The participant is active.)
Reinforcement comes after the behavior.
Positive reinforcement:
A child picks up some litter and throws it away; the parent says, "good job!"
A puppy sits when told; it receives a treat.
A rat receives a food pellet for pressing a bar when the green light is on.
Negative reinforcement:
A child is yelled at until he hangs up his coat in the closet.
A puppy is scolded with a newspaper until it jumps down off the couch.
A rat receives an electric shock to its feet until it presses the bar in its cage.
Punishment:
A child gets a "talking to" for teasing her baby brother.
A puppy gets slapped with a newspaper for jumping up on a neighbor.
A rat is blasted with bright lights and noise after choosing the wrong door.
Punishment by removal:
A child is not allowed to watch Nickelodeon for one week because she screamed at her dad.
A puppy is allowed to play in the house until it wets the floor; then it is put outside.
A teenager is not allowed to borrow the car for one month after arriving home late one evening.
Beware when using punishment: if used incorrectly, it only teaches how to avoid punishment. If you are
punished by a speeding ticket, do you stop speeding? Probably not; you just invest in a radar detector.
(Then, when radar detectors are illegal, law enforcement must invest in radar detector detectors; then, if you
do own one of those illegal radar detectors and you don't want to get caught, you must also buy a radar
detector detector detector. Just kidding.)
For punishment to be used effectively, it must:
be immediate;
be consistent;
be of sufficient magnitude (not easily ignored or shrugged off);
and there must be acceptable alternative behaviors made available -- if you punish one behavior,
you must also model or present the correct or acceptable behavior.
Operant Conditioning
Operant Conditioning – People or animals learn to do certain things (or not do certain things) because of the
results of what they do.
B.F. Skinner – Another famous behaviorist; designed the Skinner box. A laboratory rat was not given food.
There was a lever in the cage, and when the rat hit it, food was released into the cage. The rat learned to
hit the lever so that it could eat.
Reinforcement – a stimulus that increases the chances that a behavior will happen again.
In the Skinner box, the food was the reinforcement that made sure the rat would hit the lever again and again.
No lever pushing, no food.
It doesn't matter why the behavior happens. It might be an accident the first time the behavior leads to
the reinforcement. Behavior must come before the reinforcement!
TYPES OF REINFORCERS
Primary reinforcers – Reinforcers that appeal to biological needs, such as water, food and warmth. The food in
the Skinner example was a primary reinforcer.
Secondary reinforcers – Reinforcers that are learned by association. For example, money is a secondary
reinforcer because we have learned that money can buy us things. Others include attention from others,
social approval, good grades, etc.
Positive reinforcers – These increase the frequency of a behavior. People are rewarded with positive
reinforcers when they accomplish a certain behavior.
Negative reinforcers – These increase a behavior by removing or avoiding something unpleasant. If you fail a
test every time you do not study, you will begin to study more to avoid failing grades.
Reward – Basically the same thing as a positive reinforcer.
Punishment – NOT THE SAME AS A NEGATIVE REINFORCER!!!!!
With punishment, you quit doing something to avoid a negative result. If you bring home bad grades and get
grounded, you quit bringing home bad grades to avoid being grounded.
Negative reinforcement = increased behavior
Punishment = decreased behavior
Wrapping it all up – Conditioning/Learning
Reinforcement Schedules for Operant Conditioning
Continuous reinforcement – appearance of reinforcer each time the behavior occurs.
** Not practical, or even possible, to always provide a reinforcer
Partial reinforcement – behavior is not reinforced every time it occurs.
** Behaviors tend to last longer because the subject is unsure whether reinforcement will come, but is willing
to take the chance that it will.
(That's why it's best to be consistent with a child. If you give in to a child's whining some of the time and
not others, the child will continue whining in hopes that this is the time you cave in.)
Fixed-Ratio Schedules – Reinforcement comes after a set number of behaviors (Ex. You are paid for every 30 t-shirts you sell)
Variable-Ratio Schedules – Reinforcement comes after an unpredictable number of responses (Ex. Slot
machines – makes it hard to get an addicted gambler to stop; the next one might be the “big” one)
Fixed-Interval Schedules – Reinforcement becomes available only after a fixed amount of time has passed. (Ex. People are more likely to check the mail
around the time the mail carrier usually comes)
Variable-Interval Schedules – Reinforcing behavior after varying time intervals (Ex. Students always do their
reading the night before to avoid failing a POSSIBLE pop quiz the next day)
CLASSICAL CONDITIONING
Pavlov's Conditioning Experiments
Russian psychologist Ivan Pavlov hit upon classical (or Pavlovian) conditioning almost by accident when studying digestive processes. He trained a dog to salivate at the sound of a bell by presenting the sound just before food was brought into the room. Eventually the dog began to salivate at the sound of the bell alone.
Elements of Classical Conditioning
Classical conditioning involves pairing a response naturally caused by one stimulus with another, previously neutral stimulus. There are four basic elements to this transfer: The unconditioned stimulus (US), often food, invariably causes an organism to respond in a specific way. The unconditioned response (UR) is the reaction (such as salivation) that always results from the unconditioned stimulus. The conditioned stimulus (CS) is a stimulus (such as a bell) that does not initially bring about the desired response; over the course of conditioning, however, the CS comes to produce the desired response when presented alone. Finally, the conditioned response (CR) is the behavior that the organism learns to exhibit in the presence of a conditioned stimulus.
Classical Conditioning in Humans
Humans also learn to associate certain sights or sounds with other stimuli. John Watson and Rosalie Rayner
conditioned a little boy, Albert, to fear white rats by making a loud, frightening noise every time the boy was
shown a rat. Using much the same principle, Mary Cover Jones developed a method for unlearning fears: She
paired the sight of a caged rat, at gradually decreasing distances, with a child's pleasant experience of eating
candy. This method evolved into desensitization therapy, a conditioning technique designed to gradually
reduce anxiety about a particular object or situation. Recently, scientists have discovered that the immune
system may respond to classical conditioning techniques, thus allowing doctors to use fewer drugs in treating
certain disorders.
Classical Conditioning Is Selective
Some kinds of conditioning are accomplished very easily, whereas other kinds may never occur. Research
demonstrating that we develop phobias about snakes and spiders, for example, but almost never about flowers
or cooking utensils illustrates Seligman's principles of preparedness and contrapreparedness, respectively. The
ease with which we develop conditioned food (or taste) aversions also illustrates learning preparedness.
Conditioned food aversions are exceptions to the general rules about classical conditioning. Animals can learn
to avoid poisonous food even if there is a lengthy interval between eating the food and becoming ill. In many
cases, only one pairing of conditioned and unconditioned stimuli is necessary for learning to take place.
OPERANT CONDITIONING
Classical conditioning focuses on a behavior that invariably follows a particular event, whereas operant (or
instrumental) conditioning concerns the learning of behavior that operates on the environment: The person or
animal behaves in a particular way to gain something desired or avoid something unpleasant. This behavior is
initially emitted rather than elicited—you wave your hand to flag down a taxi, dogs beg at the dinner table to
get food.
Thorndike's Conditioning Experiments
Psychologist Edward Lee Thorndike was the first researcher to study operant behavior systematically. He used
a "puzzle box" to determine how cats learn.
Elements of Operant Conditioning
Thorndike's work still stands as a landmark in our understanding of the effects of both reinforcers and punishers. In operant conditioning, reinforcement (such as food) is used to increase the probability that a particular response will occur in the future. To decrease the probability that a particular response will recur, punishers (such as scolding) are used. Thorndike proposed the law of effect, which states that behavior that is consistently rewarded will become "stamped in" as learned behavior and behavior that is consistently punished will be "stamped out."
Types of Reinforcement
There are several kinds of reinforcers; all of them strengthen behavior just as steel rods reinforce or
strengthen concrete. The presence of positive reinforcers (such as food) adds to or increases the likelihood that
a behavior will recur. Negative reinforcers (such as terminating electric shocks) also increase the likelihood
that a behavior will recur, but they do so by reducing or eliminating something unpleasant from the
environment.
Punishment
Although all reinforcers (both positive and negative) increase the likelihood that a behavior will occur again,
punishment is any event whose presence decreases the likelihood that ongoing behavior will recur.
Reinforcement always strengthens behavior; punishment weakens it. Avoidance training involves learning a
desirable behavior that prevents an unpleasant condition, such as punishment, from occurring.
Operant Conditioning Is Selective
Studies have revealed that in operant conditioning the behaviors that are easiest to condition are those that
animals typically would perform in the training situation. These behaviors vary from species to species, and
put significant constraints on both classical and operant conditioning.
Superstitious Behavior
When something we do is followed closely by a reinforcer, we tend to repeat that behavior, even if it was not
actually responsible for producing the reinforcement. Such behaviors are called superstitious. Nonhumans as
well as humans exhibit superstitious behaviors.
Learned Helplessness
The failure to avoid or escape from an unpleasant or aversive stimulus that occurs as a result of previous exposure to unavoidable painful stimuli is referred to as learned helplessness. Learned helplessness, which has been demonstrated in both animals and humans, is associated with many of the symptoms characteristic of depression.
COMPARING CLASSICAL AND OPERANT CONDITIONING
A number of phenomena characterize both classical conditioning and operant conditioning, and there are
several terms and concepts common to both kinds of learning.
Response Acquisition
In classical conditioning, responses occur naturally and automatically in the presence of the unconditioned stimulus. During the phase of the learning process called response acquisition, these naturally occurring responses are attached to the conditioned stimulus by pairing that stimulus with the unconditioned stimulus. Intermittent pairing reduces both the rate of learning and the final level of learning achieved.
In operant conditioning, response acquisition refers to the phase of the learning process in which desired
responses are followed by reinforcers. A Skinner box is often used to limit the range of available responses and
thus increase the likelihood that the desired response will occur. To speed up this process and make the
occurrence of a desired response more likely, motivation may be increased by letting the animal become
hungry; the number of potential responses may also be reduced by restricting the animal's environment.
For behaviors outside the laboratory, which cannot be controlled so conveniently, the process of shaping is often useful: Reinforcement is given for successive approximations to the desired behavior. However, there are differences among species in what behaviors can be learned and the circumstances under which learning will take hold.
Extinction and Spontaneous Recovery
If the unconditioned stimulus and the conditioned stimulus are no longer paired, extinction occurs, meaning
the strength and/or frequency of the learned response diminishes. When Pavlov's dogs received no food after
repeatedly hearing the bell, they ceased to salivate at the sound of the bell. However, after a while, this
extinguished response may reappear without retraining in a process called spontaneous recovery. Extinction is
complete when the subject no longer produces the conditioned response.
Extinction occurs in operant conditioning when reinforcement is withheld. However, the ease with which a
behavior is extinguished varies according to several factors: the strength of the original learning, the variety
of settings in which learning takes place, and the schedule of reinforcement used during conditioning.
Especially hard to extinguish is behavior learned through punishment rather than reinforcement.
Generalization and Discrimination
In classical conditioning, situations or stimuli may resemble each other enough that the learners will react to
one the way they have learned to react to the other through a process called stimulus generalization. On the
other hand, the process of stimulus discrimination enables learners to perceive differences among stimuli so
that not all loud sounds, for example, provoke fear. Just as in classical conditioning, responses learned
through operant conditioning can generalize from one stimulus to other, similar stimuli. Response
generalization occurs when the same stimulus leads to different but similar responses. Discrimination in
operant conditioning is taught by reinforcing a response only in the presence of certain stimuli.
NEW LEARNING BASED ON ORIGINAL LEARNING
In both classical and operant conditioning, original learning serves as a building block for new learning.
Higher-Order Conditioning in Classical Conditioning
Higher-order conditioning in classical conditioning uses an earlier conditioned stimulus as an unconditioned
stimulus for further training. For example, Pavlov used the bell to condition his dogs to salivate at the sight of
a black square. This sort of conditioning is difficult to achieve because of extinction: Unless the first
unconditioned stimulus is presented occasionally, the initial conditioned response will be extinguished.
Secondary Reinforcers in Operant Conditioning
In operant conditioning, neutral stimuli can become reinforcers by being paired or associated with other
reinforcers. A primary reinforcer is one that, like food and water, is rewarding in and of itself. A secondary
reinforcer is one whose value is learned through its association with primary reinforcers or with other
secondary reinforcers. Money is an example of a secondary reinforcer—in and of itself, it is not rewarding; it
is valuable only for what it can buy.
CONTINGENCIES
The "if-then" relationship between conditioned stimuli and unconditioned stimuli in classical conditioning or
between responses and reinforcers (or punishers) in operant conditioning is called a contingency.
Contingencies in Classical Conditioning
Robert Rescorla has demonstrated that classical conditioning requires more than merely presenting an
unconditioned stimulus and a conditioned stimulus together in time. His work shows that for conditioning to
occur, a conditioned stimulus must provide information about the unconditioned stimulus—that is, there
must be a CS—US contingency. Blocking can occur when prior conditioning prevents conditioning to a
second stimulus, even when the two stimuli are presented simultaneously.
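As background, Rescorla's contingency finding is often formalized with the Rescorla-Wagner model, in which every CS present on a trial updates its associative strength in proportion to the prediction error (the outcome minus the summed prediction of all cues present). The notes only cite the finding, so the model, the learning rate, and the trial counts below are assumptions for illustration; the sketch shows how a shared prediction error reproduces blocking:

```python
# Toy Rescorla-Wagner update: dV = alpha * (lam - sum of V for the CSs present).
# alpha (learning rate) and the trial counts are illustrative choices.

def rw_trials(strengths, present, lam, alpha, n):
    """Run n trials with the cues in `present`; lam is UCS magnitude (1 = food, 0 = none)."""
    for _ in range(n):
        error = lam - sum(strengths[cs] for cs in present)  # shared prediction error
        for cs in present:
            strengths[cs] += alpha * error
    return strengths

V = {"light": 0.0, "tone": 0.0}

# Phase 1: the light alone is paired with food until it predicts food well.
rw_trials(V, ["light"], lam=1.0, alpha=0.2, n=30)

# Phase 2: light + tone together are paired with food.
rw_trials(V, ["light", "tone"], lam=1.0, alpha=0.2, n=30)

# The light "blocks" the tone: food is already predicted, so almost no
# prediction error is left for the tone to absorb.
print(V)
```

The same error term explains why an uncorrelated CS gains little strength: without a CS-US contingency, the CS carries no information, so on average there is nothing for it to predict.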
Contingencies in Operant Conditioning
In operant conditioning, response contingencies are usually referred to as schedules of reinforcement. We
rarely receive reinforcement every time we do something. Interestingly, it turns out that partial reinforcement,
in which rewards are given for some correct responses but not for every one, results in behavior that persists
longer than that learned by continuous reinforcement.
The schedule of reinforcement specifies when a reinforcer will be delivered. Reinforcers may be provided on the basis of time since the last reinforcement (the interval between reinforcements). Or reinforcement may depend on the number of correct responses since the last reinforcement (the ratio of reinforcement per correct response).
A fixed-interval schedule provides reinforcement of
the first correct response after a fixed, unchanging
period of time. A variable-interval schedule reinforces
the learner for the first correct response that occurs after various periods of time, so the subject never knows
exactly when a reward is going to be delivered. In a fixed-ratio schedule, behavior is rewarded each time a
fixed number of correct responses is given; in a variable-ratio schedule, reinforcement follows a varying
number of correct responses.
A REVIEW OF CLASSICAL CONDITIONING AND OPERANT CONDITIONING
Despite their differences, classical and operant conditioning share many similarities; both involve associations
between stimuli and responses; both are subject to extinction and spontaneous recovery as well as
generalization and discrimination. In fact, many psychologists now question whether classical and operant
conditioning are simply two ways of bringing about the same kind of learning. Biofeedback is an operant
conditioning technique in which instruments are used to give learners information about the strength of a
biological response over which they seek to gain control.
COGNITIVE LEARNING
Both human and nonhuman animals also demonstrate cognitive learning, learning that is not tied to
immediate experience by stimuli and reinforcers.
Latent Learning and Cognitive Maps
Early experiments by Tolman and other psychologists demonstrated that learning takes place even before the subject reaches the goal and occurs whether or not the learner is reinforced. Tolman proposed the concept of latent learning, which maintains that subjects store up knowledge even if this knowledge is not reflected in their current behavior because it is not elicited by reinforcers. Later research suggested that latent learning is stored as a mental image, or cognitive map. When the proper time comes, the learner calls up this map and puts it to use.
Insight and Learning Sets
One phenomenon that highlights the importance of cognitive processing in learning is insight, in which
learning seems to occur in a "flash." Through insight learning, human and some nonhuman animals suddenly
discover whole patterns of behavior or solutions to problems. Learning sets refer to the increasing
effectiveness at problem solving that comes about as more problems are solved.
Learning by Observing
Social learning theory argues that we learn not just from firsthand experience, but also from watching others or by hearing about something. Albert Bandura contends that observational (or vicarious) learning accounts for many aspects of human learning. His highly influential theory of learning holds that although reinforcement is unrelated to learning itself, reinforcement may influence whether learned behavior is actually displayed. Such
observational learning stresses the importance of models in our lives. To imitate a model's behavior, we must
(1) pay attention to what the model does; (2) remember what the model did; and (3) convert what we learned
from the model into action. The extent to which we display behaviors that have been learned through
observation can be affected by vicarious reinforcement and vicarious punishment. Social cognitive theory
emphasizes that learning a behavior from observing others does not necessarily lead to performing that
behavior. We are more likely to imitate behaviors we have seen rewarded.
Cognitive Learning in Nonhumans
Research has demonstrated that nonhuman animals can be classically conditioned, that they can be taught to
perform whole patterns of operant behaviors, and that they are capable of latent learning. All this evidence
lends support to the argument that nonhuman animals use cognitive processing in learning.
Learning RG Questions and Answers
Study Guide Questions and Answers
___ 1. At its beginning, psychology focused on the study of:
A) observable behavior.
B) consciousness.
C) abnormal behavior.
D) all of the above.
___ 2. As defined by the text, consciousness includes which of the following?
A) focused attention
B) sleeping
C) hypnosis
D) all of the above
___ 3. Consciousness is defined in the text as:
A) mental life.
B) selective attention to ongoing perceptions, thoughts, and feelings.
C) information processing.
D) our awareness of ourselves and our environment.
___ 4. Concluding his presentation on levels of information processing, Miguel states that:
A) humans process both conscious and unconscious information in parallel.
B) conscious processing occurs in parallel, while unconscious processing is serial.
C) conscious processing is serial, while unconscious processing is parallel.
D) all information processing is serial in nature.
___ 5. Which of the following is not an example of a biological rhythm?
A) feeling depressed during the winter months
B) the female menstrual cycle
C) the five sleep stages
D) sudden sleep attacks during the day
___ 6. Circadian rhythms are the:
A) brain waves that occur during Stage 4 sleep.
B) muscular tremors that occur during opiate withdrawal.
C) regular body cycles that occur on a 24-hour schedule.
D) brain waves that are indicative of Stage 2 sleep.
___ 7. When our ________ is disrupted, we experience jet lag.
A) Stage 1 sleep
B) REM sleep
C) circadian rhythm
D) Stage 4 sleep
___ 8. The cluster of brain cells that control the circadian rhythm is the:
A) amygdala.
B) suprachiasmatic nucleus.
C) adenosine.
D) pineal.
___ 9. The sleep-waking cycles of young people who stay up too late typically are ________ hours
in duration.
A) 23
B) 24
C) 25
D) 26
___ 10. A person whose EEG shows a high proportion of alpha waves is most likely:
A) dreaming.
B) in Stage 2 sleep.
C) in Stage 3 or 4 sleep.
D) awake and relaxed.
___ 11. Sleep spindles predominate during which stage of sleep?
A) Stage 2
B) Stage 3
C) Stage 4
D) REM sleep
___ 12. During which stage of sleep does the body experience increased heart rate, rapid
breathing, and genital arousal?
A) Stage 2
B) Stage 3
C) Stage 4
D) REM sleep
___ 13. Which of the following is characteristic of REM sleep?
A) genital arousal
B) increased muscular tension
C) night terrors
D) alpha waves
___ 14. Although her eyes are closed, Adele's brain is generating bursts of electrical activity. It is
likely that Adele is:
A) under the influence of a depressant.
B) under the influence of an opiate.
C) in REM sleep.
D) having a near-death experience.
___ 15. REM sleep is referred to as paradoxical sleep because:
A) studies of people deprived of REM sleep indicate that REM sleep is unnecessary.
B) the body's muscles remain relaxed while the brain and eyes are active.
C) it is very easy to awaken a person from REM sleep.
D) the body's muscles are very tense while the brain is in a nearly meditative state.
___ 16. A PET scan of a sleeping person's brain reveals increased activity in the visual and
auditory areas. This most likely indicates that the sleeper:
A) has a neurological disorder.
B) is not truly asleep.
C) is in REM sleep.
D) suffers from narcolepsy.
___ 17. The sleep cycle is approximately ________ minutes.
A) 30
B) 50
C) 75
D) 90
___ 18. The effects of chronic sleep deprivation include:
A) suppression of the immune system.
B) altered metabolic and hormonal functioning.
C) impaired creativity.
D) all of the above.
___ 19. Concluding her presentation on contemporary theories of why sleep is necessary,
Marilynn makes all of the following points except:
A) Sleep may have evolved because it kept our ancestors safe during potentially dangerous periods.
B) Sleep gives the brain time to heal, as it restores and repairs damaged neurons.
C) Sleep encourages growth through a hormone secreted during Stage 4.
D) Slow-wave sleep provides a “psychic safety valve” for stressful waking experiences.
___ 20. One effect of sleeping pills is to:
A) decrease REM sleep.
B) increase REM sleep.
C) decrease Stage 2 sleep.
D) increase Stage 2 sleep.
___ 21. A person who falls asleep in the midst of a heated argument probably suffers from:
A) sleep apnea.
B) narcolepsy.
C) night terrors.
D) insomnia.
___ 22. According to Freud, dreams are:
A) a symbolic fulfillment of erotic wishes.
B) the result of random neural activity in the brainstem.
C) the brain's mechanism for self-stimulation.
D) the disguised expressions of inner conflicts.
___ 23. Jill dreams that she trips and falls as she walks up the steps to the stage to receive her
college diploma. Her psychoanalyst suggests that the dream might symbolize her fear of
moving on to the next stage of her life—a career. The analyst is evidently attempting to
interpret the ________ content of Jill's dream.
A) manifest
B) latent
C) dissociated
D) overt
___ 24. People who heard unusual phrases prior to sleep were awakened each time they began
REM sleep. The fact that they remembered less the next morning provides support for the
________ theory of dreaming.
A) manifest content
B) physiological
C) information-processing
D) activation-synthesis
___ 25. Which of the following is not a theory of dreaming mentioned in the text?
A) Dreams facilitate information processing.
B) Dreaming stimulates the developing brain.
C) Dreams result from random neural activity originating in the brainstem.
D) Dreaming is an attempt to escape from social stimulation.
___ 26. According to the activation-synthesis theory, dreaming represents:
A) the brain's efforts to integrate unrelated bursts of activity in visual brain areas with
the emotional tone provided by limbic system activity.
B) a mechanism for coping with the stresses of daily life.
C) a symbolic depiction of a person's unfulfilled wishes.
D) an information-processing mechanism for converting the day's experiences into long-term memory.
___ 27. Barry has participated in a sleep study for the last four nights. He was awakened each
time he entered REM sleep. Now that the experiment is over, which of the following can
be expected to occur?
A) Barry will be too tired to sleep, so he'll continue to stay awake.
B) Barry will sleep so deeply for several nights that dreaming will be minimal.
C) There will be an increase in sleep Stages 1–4.
D) There will be an increase in Barry's REM sleep.
___ 28. Which of the following statements regarding REM sleep is true?
A) Adults spend more time than infants in REM sleep.
B) REM sleep deprivation results in a REM rebound.
C) People deprived of REM sleep adapt easily.
D) Sleeping medications tend to increase REM sleep.
___ 29. The modern discovery of hypnosis is generally attributed to:
A) Freud.
B) Mesmer.
C) Spanos.
D) Hilgard.
___ 30. Of the following individuals, who is likely to be the most hypnotically suggestible?
A) Bill, a reality-oriented stockbroker
B) Janice, an actress with a rich imagination
C) Megan, a sixth-grader who has trouble focusing her attention on a task
D) Darren, who has never been able to really “get involved” in movies or novels
___ 31. Hypnotic responsiveness is:
A) the same in all people.
B) generally greater in women than men.
C) generally greater in men than women.
D) greater when people are led to expect it.
___ 32. An attorney wants to know if the details and accuracy of an eyewitness's memory for a
crime would be improved under hypnosis. Given the results of relevant research, what
should you tell the attorney?
A) Most hypnotically retrieved memories are either false or contaminated.
B) Hypnotically retrieved memories are usually more accurate than conscious memories.
C) Hypnotically retrieved memories are purely the product of the subject's imagination.
D) Hypnosis only improves memory of anxiety-provoking childhood events.
___ 33. Research studies of the effectiveness of hypnosis as a form of therapy have demonstrated
that:
A) for problems of self-control, such as smoking, hypnosis is equally effective with
subjects who can be deeply hypnotized and those who cannot.
B) posthypnotic suggestions have helped alleviate headaches, asthma, and stress-related
skin disorders.
C) as a form of therapy, hypnosis is no more effective than positive suggestions given
without hypnosis.
D) all of the above are true.
___ 34. As a form of therapy for relieving problems such as warts, hypnosis is:
A) ineffective.
B) no more effective than positive suggestions given without hypnosis.
C) highly effective.
D) more effective with adults than children.
___ 35. Those who consider hypnosis a social phenomenon contend that:
A) hypnosis is an altered state of consciousness.
B) hypnotic phenomena are unique to hypnosis.
C) hypnotized subjects become unresponsive when they are no longer motivated to act as instructed.
D) all of the above are true.
___ 36. Those who believe that hypnosis is a social phenomenon argue that “hypnotized”
individuals are:
A) consciously faking their behavior.
B) merely acting out a role.
C) underachievers striving to please the hypnotist.
D) all of the above.
___ 37. According to Hilgard, hypnosis is:
A) no different from a state of heightened motivation.
B) the same as dreaming.
C) a dissociation between different levels of consciousness.
D) a type of “animal magnetism.”
___ 38. Which of the following statements concerning hypnosis is true?
A) People will do anything under hypnosis.
B) Hypnosis is the same as sleeping.
C) Hypnosis is in part an extension of the division between conscious awareness and
automatic behavior.
D) Hypnosis improves memory recall.
___ 39. Psychoactive drugs affect behavior and perception through:
A) the power of suggestion.
B) the placebo effect.
C) alteration of neural activity in the brain.
D) psychological, not physiological, influences.
___ 40. A person who requires increasing amounts of a drug in order to feel its effect is said to
have developed:
A) tolerance.
B) physical dependency.
C) psychological dependency.
D) resistance.
___ 41. Dan has recently begun using an addictive, euphoria-producing drug. Which of the
following will probably occur if he repeatedly uses this drug?
A) As tolerance to the drug develops, Dan will experience increasingly pleasurable “highs.”
B) The dosage needed to produce the desired effect will increase.
C) After each use, he will become more and more elated.
D) Dependence will become less of a problem.
___ 42. All of the following are common misconceptions about addiction, except the statement
that:
A) to overcome an addiction a person almost always needs professional therapy.
B) psychoactive and medicinal drugs very quickly lead to addiction.
C) biological factors place some individuals at increased risk for addiction.
D) many other repetitive, pleasure-seeking behaviors fit the drug-addiction-as-disease-needing-treatment model.
___ 43. Which of the following is classified as a depressant?
A) methamphetamine
B) LSD
C) marijuana
D) alcohol
___ 44. Which of the following is not a stimulant?
A) amphetamines
B) caffeine
C) nicotine
D) alcohol
___ 45. Roberto is moderately intoxicated by alcohol. Which of the following changes in his
behavior is likely to occur?
A) If angered, he is more likely to become aggressive than when he is sober.
B) He will be less self-conscious about his behavior.
C) If sexually aroused, he will be less inhibited about engaging in sexual activity.
D) All of the above are likely.
___ 46. Alcohol has the most profound effect on:
A) the transfer of experiences to long-term memory.
B) immediate memory.
C) previously established long-term memories.
D) all of the above.
___ 47. How a particular psychoactive drug affects a person depends on:
A) the dosage and form in which the drug is taken.
B) the user's expectations and personality.
C) the situation in which the drug is taken.
D) all of the above.
___ 48. Cocaine and crack produce a euphoric rush by:
A) blocking the actions of serotonin.
B) depressing neural activity in the brain.
C) blocking the reuptake of dopamine in brain cells.
D) stimulating the brain's production of endorphins.
___ 49. I am a synthetic stimulant and mild hallucinogen that produces euphoria and social
intimacy by triggering the release of dopamine and serotonin. What am I?
A) LSD
B) MDMA
C) THC
D) cocaine
___ 50. THC is the major active ingredient in:
A) nicotine.
B) MDMA.
C) marijuana.
D) cocaine.
___ 51. Which of the following statements concerning marijuana is true?
A) The by-products of marijuana are cleared from the body more slowly than are the by-products of alcohol.
B) Regular users may need a higher dose of the drug to achieve a high than occasional
users would need to get the same effect.
C) Marijuana is not as addictive as nicotine or cocaine.
D) Even small doses of marijuana hasten the loss of brain cells.
___ 52. Which of the following was not cited in the text as evidence that heredity influences
alcohol use?
A) Children whose parents abuse alcohol have a lower tolerance for multiple alcoholic
drinks taken over a short period of time.
B) Boys who are impulsive and fearless at age 6 are more likely to drink as teenagers.
C) Laboratory mice have been selectively bred to prefer alcohol to water.
D) Adopted children are more susceptible if one or both of their biological parents has a
history of alcoholism.
___ 53. Which of the following statements concerning alcoholism is not true?
A) Adopted individuals are more susceptible to alcoholism if they had an adoptive
parent with alcoholism.
B) Having an identical twin with alcoholism puts a person at increased risk for alcohol
problems.
C) Geneticists have identified genes that are more common among people predisposed to
alcoholism.
D) Researchers have bred rats that prefer alcohol to water.
___ 54. The lowest rates of drug use among high school seniors are reported by:
A) white males.
B) white females.
C) black males.
D) Latinos.
___ 55. Which of the following is usually the most powerful determinant of whether teenagers
begin using drugs?
A) family strength
B) religiosity
C) school adjustment
D) peer influence
___ 56. Which of the following statements concerning the roots of drug use is true?
A) Heavy users of alcohol, marijuana, and cocaine are always on a high.
B) If an adolescent's friends use drugs, odds are that he or she will, too.
C) Teenagers who are academically average students seldom use drugs.
D) It is nearly impossible to predict whether or not a particular adolescent will
experiment with drugs.
___ 57. Which of the following was not suggested by the text as an important aspect of drug
prevention and treatment programs?
A) education about the long-term costs of a drug's temporary pleasures
B) efforts to boost people's self-esteem and purpose in life
C) attempts to modify peer associations
D) “scare tactics” that frighten prepubescent children into avoiding drug experimentation
___ 58. Which of the following statements concerning near-death experiences is true?
A) Fewer than 1 percent of patients who come close to dying report having them.
B) They typically consist of fantastic, mystical imagery.
C) They are more commonly experienced by females than by males.
D) They are more commonly experienced by males than by females.
___ 59. Which theorists believe that the mind and the body are separate entities?
A) the behaviorists
B) the monists
C) the dualists
D) the Freudians
___ 60. Levar believes that once the body has died, the mind also ceases to exist. Evidently, Levar
is a(n):
A) behaviorist.
B) monist.
C) dualist.
D) atheist.
Answer Key
1. B     2. D     3. D     4. C     5. D     6. C     7. C     8. B     9. C     10. D
11. A    12. D    13. A    14. C    15. B    16. C    17. D    18. D    19. D    20. A
21. B    22. A    23. B    24. C    25. D    26. A    27. D    28. B    29. B    30. B
31. D    32. A    33. D    34. B    35. C    36. B    37. C    38. C    39. C    40. A
41. B    42. C    43. D    44. D    45. D    46. A    47. D    48. C    49. B    50. C
51. A    52. A    53. A    54. C    55. D    56. D    57. D    58. B    59. C    60. B