
W. Scott Wood's Study Objectives
For B.F. Skinner's
Science and Human Behavior
Wood, W.S.
A Study Guide to Accompany B.F. Skinner's Science and Human Behavior
SECTION I: THE POSSIBILITY OF A SCIENCE OF HUMAN BEHAVIOR
I. CAN SCIENCE HELP?
The Misuse of Science. pp. 3-5
Q1:
To what does Skinner refer when he states that "science has developed unevenly"? Why does
he believe this to be so?
A1:
Technologies of control over inanimate nature are available, but effective knowledge
about the social consequences of the implementation of such technologies is absent.
This has resulted in misapplication and sometimes the creation of more problems
than solutions.
COMMENT: This introductory section is illuminating for two reasons. First, it clearly
reflects Skinner's social concerns, and second, it contains the first of many literary references that
will appear in this and many of Skinner's other books and articles.
Science as a Corrective. p. 5
Q2:
What does Skinner see as a solution to misapplied scientific techniques?
A2:
The development of an objective science of behavior that will permit society to be
able to direct scientific development wisely.
The Threat to Freedom. pp. 6-8
COMMENT: Several important issues are raised in this section, ideas that Skinner
returns to time and again both in this book and elsewhere. The first involves Skinner's view of
the overall conceptual nature of a science of behavior.
Q3:
What else is involved in science other than the description of facts?
A3:
Science not only describes, it predicts and controls.
COMMENT: At this point, Skinner points out that to arrive at a science that provides
both prediction and control, you have to begin by assuming that such a science is possible; in
Skinner's words, "...we must assume that behavior is lawful and determined." This assumption,
of course, leads Skinner into a direct conflict with certain traditional assumptions concerning
human nature.
Q4:
What tradition stands in opposition to the assumption that behavior is lawful and
determined?
A4:
The traditional Western philosophic position that man is a free agent; that he has a
"free will" and his behavior results from inner and unknowable processes of
decision making.
Q5:
What are some ways of defending a free will doctrine without exactly saying so?
A5:
a) Behavior is essentially unknowable.
b) Value judgments are more important determiners of human affairs, and science
cannot help in this realm.
c) Some "sciences" are, in fact, compatible with free will concepts.
The Practical Issue. pp. 8-10
COMMENT: Again we see Skinner as the social critic. He appears to be demanding a
science of behavioral prediction and control because of its social necessity. Obviously, this is in
contrast to what many other psychological theorists view as the goal of a science.
Q6:
What is the practical issue?
A6:
That the concept of free will stands in opposition to the development of any
practically effective science of behavior.
II. A SCIENCE OF BEHAVIOR
Q7:
Skinner states that science is not to be identified with certain common features of a
science. Be able to list three.
A7:
a) organized accumulations of information, which are the products of a science
b) instruments of research
c) precise measurement or mathematical calculations
COMMENT: An interesting feature of this section is Skinner's comment upon the
noncumulative and thus "unscientific" aspects of writing, art, and philosophy. Few contemporary
practitioners of any of these fields would be likely to agree that there has been no progress in
2500 years.
Some Important Characteristics of Science. pp. 12-14
Q8:
What are the essential characteristics of science?
A8:
A set of attitudes (objectivity, honesty, and patience) and a search for lawful
relations which can be systematized in such a way as to permit effective prediction
and control of the subject matter.
COMMENT: Skinner understands science to be a set of behaviors that ultimately result
in effective control of the environment. He points out that the development of scientific laws
parallels our own behavior during our early years, when we learn through experience to
appreciate the orderliness of our environment. For Skinner, science is nothing more than a set of
practices which increase the effectiveness of a natural behavioral process: seeking uniformity or
orderliness in order to better predict and control our environment is something that we all do
automatically, and science consists merely of procedures which permit us to do this better. This
view stresses environmental control as the ultimate objective, and thus stands somewhat opposed
to the more traditional interpretation of science as "seeking knowledge for its own sake." The
contrast is brought out by his treatment of science as an active process, not a passive or
contemplative one.
Behavior as a Scientific Subject Matter. pp. 14-16
Q9:
What features of behavior make it a difficult subject matter for science?
A9:
Its complexity, and the fact that behavior is a fluid process rather than a physical
object.
COMMENT: Notice again on p. 16 how Skinner refuses to equate science with the
abstract process represented by logical or mathematical statements of relationships.
Techniques of observation, experimental manipulation, and mathematical systematizing are seen
as the tools of science, not the hallmarks. The goal for Skinner remains effective prediction and
control. Remember that the rationale for this position results from Skinner's belief that science is
nothing more than an amplification of a natural behavioral process.
Some Objections to a Science of Behavior. pp. 17-22
Q10: Be able to summarize briefly the typical objections to a science of behavior that Skinner
cites, and be able to give Skinner's counterarguments.
A10: a) Objection: the Principle of Indeterminacy may apply to a science of behavior.
Reply: the principle is itself a scientific relationship and does not imply a
philosophic freedom from control inherent in the subject matter.
b) Objection: reason cannot, for philosophical and/or logical reasons, comprehend
itself.
Reply: there is no evidence for this.
c) Objection: each individual is unique.
Reply: science ultimately can explain the individual case more effectively
through general principles than by simple descriptions of individual
diversity.
d) Objection: a statistical description of the average case is of little help in
predicting and controlling a given individual's behavior, which
should be the goal of an effective science of behavior.
Reply: Skinner agrees, but refuses to identify statistical techniques as
essential to a science of behavior.
Comment: This point, of course, is one that Skinner returns to many
times in his writing.
e) Objection: behavior is too complex to be successfully systematized even though
it may ultimately be understood to be lawful.
Reply: first, this is scarcely an argument for individual self-determination;
second, whether it is even true is an experimental question.
f) Objection: predictions alter the resultant behavior, and thus effective predictions
can never be made.
Reply: this is merely a practical problem which, in fact, reflects the
orderliness rather than the capriciousness of behavior.
g) Objection: an effective utilization of laboratory-based behavioral laws requires
impractical conditions in society.
Reply: this is not necessarily true, since laboratory control and
systematization may point out unsuspected lawful relationships in
the surrounding "real" world. Besides, considerable environmental
control does exist in several different social institutions.
III. WHY ORGANISMS BEHAVE
COMMENT: Notice Skinner's discussion of "cause and effect" versus functional
relationship between the independent and dependent variables. He dismisses the entire
definitional issue by asserting that both refer to the same factual core.
Q11: Why are there so many spurious causal accounts of behavior available?
A11: Men have anticipated scientific inquiry in their desire to explain behavior.
Some Popular "Causes" of Behavior. pp. 24-27
Q12: What principle does Skinner use to account for the emergence of astrology and body type
as causal factors for behavior?
A12: Conspicuous events correlated with behavior are interpreted by many as causal, and
these beliefs are supported by occasional chance predictions.
COMMENT: After Skinner similarly disposes of the issue of "heredity" as the layman
uses it, he discusses the possibility of genetically determined behavior. He makes two quite
different but related points. The first involves the reason why the issue of genetic predisposition
is so volatile, and the second is a rationale for his own lack of interest in the topic.
Q13: What role does genetics play in a practical science of behavior?
A13: It only permits us to make better use of other causes to control behavior. As an
independent variable, it figures importantly only in long-range eugenic programs.
Inner "Causes." pp. 29-31
Q14: Why does Skinner discount any particular interest in a search for physiological causes of
behavior?
A14: Because even highly accurate knowledge about physiological antecedents can be
traced back outside the organism to the environmental factors that produced the
physiological antecedents, and these environmental causes will be more practically
useful in a science of behavioral control.
COMMENT: Notice how this argument against physiological psychology is a practical
one involving a technology of control rather than any philosophic belief in the "empty" organism.
Much confusion has resulted from other theorists who don't understand Skinner's objectives in
establishing a science which not only "understands" but also is effective in the prediction and
control of behavior. We will return to this point later in this chapter.
Q15: Skinner's argument against psychic inner causes seems to take two forms. What are
they?
A15: a) Since they are unobservable, they can too conveniently account for anything.
b) Their supposed non-physical status removes them from everything we know
about cause and effect in the physical universe.
COMMENT: This material should be read carefully since most lay and/or philosophic
interpretations of behavior are of this kind. Essentially, the groundwork for an argument against
dualism is established in this section. Notice also that in all cases of discarding alternative
interpretations of behavior, Skinner points out how these interpretations might have arisen in the
first place.
Q16: How does Skinner describe a conceptual inner cause?
A16: As a description of behavior which has been converted into a cause of that behavior.
The Variables of Which Behavior is a Function. pp. 31-35
Q17: Where do the independent variables that provide for a scientific analysis of behavior lie?
A17: Outside the organism, in its immediate environment and in its environmental
history.
COMMENT: Several other key issues are discussed by Skinner in this section. One,
which he almost breezes over, involves the question of what the dependent variable of a science
of behavior ought to be. In the drinking example, he points out that it is the probability of a
single act of drinking that should be determined. Keep this objective in mind when Skinner
begins talking about a rate measure.
COMMENT: The next few pages return to the issue of psychic and physiological "inner"
causes, where Skinner refines his earlier arguments by discussing the causal chain and arguing
against a science oriented toward a "second link" analysis.
Q18: What are the links in the causal chain of behavior?
A18: Environmental operations, inner states, behavioral outcomes.
Q19: What are the essential features of Skinner's objection to inner states?
A19: They are not relevant to a scientific analysis. Valid knowledge concerning the inner
state may help illuminate the causal relation between the first and third links, but
would be of little practical utility.
COMMENT: Again, notice that the force of Skinner's argument is against inner states
(and we can correctly assume that he discounts the psychic and is referring here only to the
physiological) because of their current limited utility in a science of behavioral control. How do
you imagine he would discuss the topic of drug effects where currently a relatively powerful
control technology exists?
A Functional Analysis. pp. 35-39
Q20: Be able to give Skinner's position on what constitutes the laws of a science of behavior.
A20: The relationships between external conditions and the behaviors of the individual
organism.
Q21: What does Skinner mean by an analysis within the bounds of a natural science?
A21: That the independent and dependent variables must be observable and describable
in physical terms.
Q22: Briefly enumerate the potential sources of data for a behavioral analysis.
A22: a) casual observation
b) controlled field observation
c) clinical observation
d) human laboratory studies
e) laboratory studies of lower organisms
Q23: What is Skinner's view on the position that there is an essential discontinuity between
humans and animals?
A23: That to assert this distinction prior to an evaluation of the facts is to beg the
question.
Q24: What are some of the advantages of studying lower organisms?
A24: a) simpler
b) behavior can be recorded for longer periods of research time
c) no complicating social relations between the subject and the experimenter
d) conditions can be better controlled, e.g., genetic histories and deprivation states
Analysis of the Data. pp. 39-42
COMMENT: The key to an understanding of the book lies in this section. As with the
text Verbal Behavior, Skinner refers to his objective as an "extrapolation" of certain known
relationships for the purpose of gaining an understanding of complex human events. Many have
accused Skinner of really providing a "hypothesis" about human behavior, but talking about it
rather than rigorously testing it, and even being guilty of denying that he is in fact hypothesizing.
The issue isn't that simple: For Skinner, science is a behavioral chain which proceeds from
observation to prediction and control. His extrapolations represent, for him, examples of stimulus
and response generalizations which are themselves recognized processes. But they are behavioral
processes, not logical or philosophical ones. If one does not understand Skinner's concept of
science, or more accurately, of scientific behavior, then he seems to be violating the "scientific"
principles of the hypothetico-deductive system by his extension of behavioral principles. On the
other hand, if you understand science as a behavioral process, not a rational or logical one (which
Skinner views only as helpful tools to sharpen scientific behavior), then his extrapolations are
completely in line with what is currently known about behavioral relationships.
SECTION II: THE ANALYSIS OF BEHAVIOR
IV. REFLEXES AND CONDITIONED REFLEXES
Man a Machine. pp. 45-47
COMMENT: This, obviously, is a very brief treatment of mechanistic philosophy of the
17th century. Descartes hardly deserves all the credit (or blame). However, Skinner subtly heads
off scholarly criticism and acknowledges a philosophic debt by his selection of the section
heading, which is also the title of Julien de la Mettrie's famed treatise on human behavior.
Reflex Action. pp. 47-49
Q1: Define the following terms: stimulus, response, reflex, and reflex arc.
A1:
a) stimulus: an environmental event which produces (elicits) an almost inevitable
behavioral reaction.
b) response: the behavioral reaction to a stimulus.
c) reflex: a stimulus-response combination
d) reflex arc: a hypothetical neurological connection between sensory input and
behavioral reaction.
COMMENT: Skinner's treatment of "spontaneity", i.e., free will, is particularly
interesting since he obviously sees the contemporary situation as analogous. That is, free will is
a "null hypothesis" that can be eliminated only by bringing more and more behavior under
demonstrable environmental control. It is interesting to speculate whether or not any advocate of
a free will position would (a) be persuaded by such evidence or (b) even be willing to admit that
such an attempt should be made.
The Range of Reflex Action. pp. 49-50
Q2:
What is the range of reflex action?
A2:
Only a small fraction of the total behavior of the (human) organism can be
described by the principle of the simple reflex.
COMMENT: Notice that Skinner is delineating a reflex response as one which is almost
an invariant reaction to environmental stimuli. This distinction will be critical to his later
analysis of operant stimulus control.
Conditioned Reflexes. pp. 50-54
Q3:
Describe the process that Pavlov referred to as "stimulus substitution."
A3: A previously neutral stimulus acquires the power to elicit a response originally
elicited by another stimulus when paired with (reinforced by) that originally effective
stimulus.
COMMENT: There are a number of terms which usually appear in a treatment of the
conditioned reflex which don't appear here. Among them are conditioned stimulus (CS) and
conditioned response (CR). A conditioned stimulus is that previously neutral stimulus now
capable of eliciting a response similar to that elicitable by the originally effective stimulus. The
response to the conditioned stimulus is called the conditioned response. It is interesting to note
that the term "conditioned" is a mistranslation from Russian, and is more correctly read as
"conditional," referring to the fact that the effectiveness of a new stimulus in eliciting a response
is conditional upon continued pairing with the originally effective stimulus.
COMMENT: There are two points that Skinner makes in this section in addition to
briefly discussing the basic reflexive (Pavlovian, classical, or "respondent") conditioning process.
The first involves the distinction between "what a child knows" and "what a scientist knows"
about a given subject area. You might keep this in mind when next someone tells you that
operant principles are just common sense. Unfortunately, in an attempt to appear nonthreatening
to the layman, behavior modifiers have tended to emphasize the common sense approach, e.g.,
referring to the reinforcement principle as "Grandma's Law," etc.
Q4:
When can a scientist effectively dispense with explanatory fictions as causal accounts for
various observations?
A4:
When one is able to give a complete quantitative account of the process under
observation; in other words, when one is able to predict and control the phenomena
in question.
Q5:
What, according to Skinner, was not Pavlov's major contribution and why?
A5:
Pavlov's original efforts were devoted to an inferential account of the physiological
processes underlying reflexive behavior. Skinner discounts this (a) because it was
demonstrably incorrect and (b) because even when such an account is available, we
are still required to relate these processes to prior environmental events and to
achieve practical prediction and control over the resultant behavior.
The Survival Value of Reflexes. pp. 54-56
Q6:
How can evolutionary theory provide an account for the existence of reflexes and the
process of reflex conditioning?
A6:
There is considerable survival value in being able to respond automatically to
certain environmental stimuli, as well as in coming to be able to make similar
behavioral adjustments to new features of the environment as they occur within the
life of the organism.
COMMENT: Notice how Skinner describes the evolutionary process as a selection
mechanism which gradually determines a reflexive behavioral repertoire for a given species, both
in terms of certain inherited classes of behavior as well as in susceptibility to the reflexive
conditioning process. In a sense, it is the evolutionary consequences, i.e., survivability, that
determine the behavioral process... a situation which Skinner sees as analogous to the process of
operant reinforcement. An expanded treatment of this perspective is available in his paper, "The
Phylogeny and Ontogeny of Behavior" (1966).
Q7:
How might our inherited susceptibility to reflexive conditioning go awry?
A7:
When accidental or temporary pairings of neutral and eliciting stimuli produce
inappropriate responding later.
The Range of Conditioned Reflexes. pp. 56-58
Q8:
What does Skinner see as a measure of the "range" of the conditioned reflex?
A8:
Its use in the practical control of behavior. (Again, a typical Skinnerian position
reflecting his concern with utility.)
Q9:
What are several ways that one uses (a) the reflex and (b) the reflex conditioning process
in the practical control of behavior?
A9:
a) to immediately evoke certain reactions such as laughing, crying, and blushing.
b) to arrange for later control by establishing conditioned reactions to desired
stimuli, e.g., to teach a soldier to react aggressively to the enemy, or to
predispose a customer to react favorably to you or to your product in the future.
COMMENT: Notice how Skinner has begun to use reflex behavior to describe what are
commonly called emotional responses, such as fear, anxiety, aggression, favorable attitudes, etc.
Q10: How does one eliminate inappropriate conditioned reflexes? Describe two techniques.
A10: By presenting the conditioned stimulus and eliminating the original eliciting
stimulus. This may be done in two ways:
a) immediately, as in the case of the stutterer who is encouraged to talk to everyone
he meets, or
b) gradually, as in the case of buying a boy who fears dogs a puppy which
eventually "grows into" the fear-producing stimulus, allowing extinction of the
fear response to occur gradually.
COMMENT: Skinner refers to several standard behavioral therapy techniques in this
section. The first one he mentions is a technique for eliminating alcoholism and smoking (p. 56)
referred to today as aversion therapy. This consists of the pairing of some noxious stimulus, such
as might evoke vomiting, with the undesirable behavior. The technique presumably results in the
behavior coming to elicit incompatible emotional responses, and thus produces a decrease in the
undesired behavior. (Much more of this will be said later.)
The other process Skinner describes involves using reflexive extinction to eliminate
undesirable conditioned reflexes. As above, this can be accomplished either immediately or "in
gradual doses." One version of this latter method has come to be called systematic
desensitization, a technique popularized by Wolpe. It involves teaching a patient certain
"relaxation techniques," then gradually exposing him to an imaginary or real hierarchy of
increasingly fearful stimuli. As he can successfully relax in the presence of each, the next is
presented until finally the inappropriate anxieties or fears are eliminated.
V. OPERANT BEHAVIOR
The Consequences of Behavior. p. 59
COMMENT: This brief introduction merely sets the stage for the analysis of operant
behavior by distinguishing between the reflex as primarily concerned with internal physiology
and another kind of behavior "which has some effect upon the surrounding world." Skinner
points out that our concern with the latter is both practical and theoretical.
Q11: What are some special characteristics of behavior which has an effect upon the
environment?
A11: The consequences may "feed back" into the organism and change the probability
that the behavior which produced them will recur.
Learning Curves. pp. 59-62
Q12: What was Thorndike's discovery, as Skinner describes it?
A12: The Law of Effect: that behavior was "stamped in" when followed by certain
consequences. The discovery resulted from his measurement of the successive
amounts of time that it took a cat to escape from a puzzle box. Thorndike argued
that the process could be described without relying upon the inference of mental
processes within the cat.
Q13: How did Thorndike present his data?
A13: In the form of a "learning curve," a graphic representation of the decreasing
amount of time required for the cat to escape across trials.
Q14: What is Skinner's argument with this kind of data?
A14: That it doesn't reveal the process that it claims to. Instead it reveals the gradual
elimination of competing responses, which is more a function of the apparatus than
of a basic learning mechanism.
COMMENT: Some contemporary learning theorists believe that the successive
elimination of alternatives is the basic learning process (e.g., Staddon & Simmelhag,
Psychological Review, 1971, 78, 3-43).
Q15: How does Skinner interpret the consistency ("smoothness") and similarity of general
properties revealed by learning curves?
A15: a) The consistency or smoothness can result from averaging many individual cases,
and thereby not reflect an individual process.
b) The general properties (of negative acceleration, for example) may result from
ongoing basic processes, but the learning curve is not the most direct record of these
processes. (This argument, of course, is made by analogy.)
Operant Conditioning. pp. 62-66
COMMENT: This may be the most important section in the book in terms of Skinner's
definition of the basic operant conditioning process. It certainly is the one most frequently cited
by others who wish to describe his views on the topic. The section begins with a discussion of
probability, describing how he believes probability can best be measured, and concludes with a
basic set of definitions.
Q16: What form of data best provides a basis for the concept of probability of response?
A16: The concept is best estimated by observed frequencies of certain behaviors. The
observation should occur under standard conditions where the behavior can
repeatedly occur and not be appreciably interfered with by competing responses.
COMMENT: For the mathematicians among you, this may hardly be adequate. It is, of
course, the means whereby Skinner justified his use of a rate measure as an index or estimate of
the probability of a single response (which, as you recall, he earlier stated should be the
dependent variable in a science of behavior). The real rationale for this argument is extremely
sophisticated, and hinges upon Skinner's interpretation of logic and mathematics. He begins with
the assumption that there is a behavioral origin for logical and mathematical concepts. In the
area of "probability," observed frequency must then be the basis for the concept. A detailed
explication of this point of view is available in Verbal Behavior (pp. 418-431), but unfortunately
it doesn't contain any direct references to this particular problem of probability as a concept.
Q17: How does Skinner solve the technical problem of providing for the experimental
observation and interpretation of frequency of behavior?
A17: The experimental organism is placed in a quiet box which tends to eliminate or at
least hold constant the competing behavior. The behavior of interest is directly
observed or mechanically recorded.
Q18: Within this environment, how does one study the "stamping in" process?
A18: a) a consequence is provided, e.g., food to a food-deprived organism.
b) the food is given immediately following a particular behavior which can be freely
and rapidly repeated by the organism, and can be easily observed and recorded by
the experimenter.
Q19: How is the resultant data described?
A19: "...we make a given consequence contingent upon certain physical properties of
behavior..., and the behavior is then observed to increase in frequency."
COMMENT: We now proceed to a discussion of several basic operant concepts and
term definitions. This material is critical, and these concepts and definitions should be mastered
before going any further in your reading.
Q20: What is an operant?
A20: A class of behaviors, defined by an effect which can be physically specified and upon
which reinforcement is contingent. It is so named because of its characteristic
feature of operating upon the environment.
COMMENT: It is essential that the reader recognize that this is an "effect" or functional
definition, not one which is based upon any specific topography of the response. Thus the
operant is defined in terms of what it accomplishes, and not what it looks like.
Q21: Distinguish between an operant response and a reflexive response.
A21: An operant response is any specific member of an operant class of behaviors. It
bears no simple invariant relation to any prior stimulus.
Q22: What is operant conditioning?
A22: An increase in the frequency of the operant resulting from reinforcement.
Q23: Distinguish between operant and Pavlovian conditioning.
A23: In Pavlovian conditioning, the reinforcer is contingent upon the occurrence of a
stimulus. In operant conditioning, the reinforcer is contingent upon the occurrence
of a response.
Q24: Distinguish between reinforcer and reinforcement.
A24: The reinforcer is a specific stimulus (e.g. food) which has the effect of strengthening
an operant. Reinforcement is the operation of presenting that stimulus contingent
upon the occurrence of a designated operant response.
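The contingency just defined (a reinforcer made contingent upon a response, with frequency of responding as the measure) can be sketched as a toy simulation. To be clear, everything below is an invented illustration for intuition only: the probability-increment rule and all parameter values are assumptions of the sketch, not a model Skinner himself proposed.

```python
import random

def simulate_operant(trials=1000, p_initial=0.05, increment=0.01,
                     p_max=0.9, reinforce=True, seed=0):
    """Toy model: when the operant is emitted and reinforced, its
    probability of recurring rises slightly (an invented rule,
    not Skinner's own formulation)."""
    rng = random.Random(seed)
    p = p_initial
    responses = 0
    for _ in range(trials):
        if rng.random() < p:       # the operant response is emitted
            responses += 1
            if reinforce:
                # reinforcement is contingent upon the response
                p = min(p_max, p + increment)
    return responses, p

with_r, p_with = simulate_operant(reinforce=True)
without_r, p_without = simulate_operant(reinforce=False)
# the reinforced condition yields a much higher response frequency
```

Note that the simulation tracks only frequency, never topography, mirroring the functional ("effect") definition of the operant above: the rising frequency under the contingency is the toy analogue of behavior being "observed to increase in frequency."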
Quantitative Properties. pp. 66-68
COMMENT: Much of the preliminary material here is a rationale for the selection of an
easily recorded response, the key peck, as an operant to be studied.
Q25: What are some of the factors which determine the quantitative effect of reinforcement?
A25: a) feedback of some sort is essential
b) kind, amount, and immediacy of reinforcement are all important.
COMMENT: This latter material is relevant to the all-or-none versus incremental theory
of learning. Does learning occur gradually as a result of repeated trials (as a "learning curve"
might indicate), or is a single reinforcement sufficient to maximally increase the probability of a
response with any lesser effect due to competing factors? Skinner puts an odd twist on the old
controversy by his observation that if learning in the rat and/or pigeon is an instantaneous shift
from low to maximum probability, then the vaunted human superiority in learning must reflect
something other than the speed of conditioning.
The Control of Operant Behavior. pp. 68-69
Q26: Once conditioned by food reinforcement, what additional source of control is obtained
over the subject?
A26: The frequency of the operant can be controlled by level of food deprivation.
COMMENT: Notice how Skinner makes a clear distinction between deprivation and
reinforcement. Many theorists discuss deprivation as a factor which determines the effectiveness
of a reinforcer. However, from the standpoint of a manipulable independent variable which can
be used to control behavior, deprivation is a completely separate experimental procedure from
reinforcement. Again we see the distinction between a "control-oriented" and an "explanationoriented) approach to a science of behavior. The same distinction obviously will hold true for
stimulus control, as Skinner also implies.
Operant Extinction. pp. 69-71
Q27: Define operant extinction.
A27: When reinforcement is no longer forthcoming, a previously reinforced response
becomes less frequent.
Q28: How can one observe the properties of extinction?
A28: By studying the data (smooth curves) resulting from recording the frequency of
responding following the removal of reinforcement.
COMMENT: For those of you who haven't read Skinner before, you are probably
wondering about his use of the term "smooth curve." Remember that for Skinner, science is a set
of behaviors that begin with observation and result in prediction and control. In order to derive
correct conclusions about causal relationships, orderliness must be observed. Thus obtaining data
that is "smooth," or orderly, is essential.
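The shape of such a curve can be illustrated with a second toy sketch, in which each unreinforced response weakens the operant until responding settles back to a low operant level. Again, the decrement rule and every parameter value are invented for illustration; this is not a model from the text.

```python
import random

def extinction_curve(trials=500, p_conditioned=0.8, decrement=0.005,
                     p_operant_level=0.02, seed=1):
    """Toy extinction model: responses now go unreinforced, and the
    response probability declines toward a low operant level
    (an invented rule, for illustration only)."""
    rng = random.Random(seed)
    p = p_conditioned
    total = 0
    cumulative = []   # cumulative responses, as on a cumulative record
    for _ in range(trials):
        if rng.random() < p:
            total += 1
            # non-reinforcement weakens the operant
            p = max(p_operant_level, p - decrement)
        cumulative.append(total)
    return cumulative

curve = extinction_curve()
early = curve[99]                # responses in the first 100 trials
late = curve[-1] - curve[-101]   # responses in the last 100 trials
```

Plotted, the cumulative record rises steeply at first and then flattens; a single run is somewhat ragged, and averaging many such runs would produce exactly the kind of smooth, orderly curve the comment describes.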
Q29: What process can interfere with orderly extinction data?
A29: The emotional effect that is a reaction to non-reinforcement, sometimes referred to
as frustration or rage.
Q30: How can this effect be eliminated?
A30: By frequent exposure to extinction.
Q31: What is the general relation between reinforcement and the number of extinction
responses?
A31: The more reinforcers, the more extinction responses.
Q32: What is more important than the simple number of reinforcers given in determining the
number of responses in extinction?
A32: The schedule of reinforcement: intermittent reinforcement (reinforcement for
responding other than on a 1:1 ratio) results in more protracted extinction
responding than would the same number of reinforcers given on a reinforcer per
response basis.
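The relation in A31 and A32 (more reinforcers, and especially intermittent reinforcement, yield more protracted extinction responding) can be illustrated with a toy simulation. This is not Skinner's own analysis; it is a rough sketch of one common "discrimination" account, under which the organism persists until extinction becomes clearly distinguishable from training. The tolerance rule and all numbers are illustrative assumptions.

```python
def extinction_run(training_schedule, tolerance_factor=3, max_responses=10_000):
    """Responses emitted in extinction, under a toy 'discrimination'
    account: the organism keeps responding until the run of unreinforced
    responses clearly exceeds anything it experienced during training.

    training_schedule: one 0/1 flag per training response, marking
    whether that response was reinforced.  The tolerance rule and factor
    are illustrative assumptions, not measured values.
    """
    longest_dry_run = current = 0
    for reinforced in training_schedule:
        if reinforced:
            current = 0
        else:
            current += 1
            longest_dry_run = max(longest_dry_run, current)
    tolerance = tolerance_factor * (longest_dry_run + 1)
    return min(tolerance, max_responses)

# The same 60 reinforcers, delivered continuously (1:1) vs. on a fixed
# ratio of 5 (one reinforcer per 5 responses):
crf = [1] * 60
fr5 = [0, 0, 0, 0, 1] * 60
print(extinction_run(crf))   # → 3   (little persistence)
print(extinction_run(fr5))   # → 15  (more protracted extinction)
```

Even with the same number of reinforcers, the intermittent history produces more responding in extinction, which is the point of A32.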
Q33: Distinguish between extinction and forgetting.
A33: In forgetting, the conditioning effect disappears as a function of time alone. In
extinction, the conditioned response must be emitted and go without reinforcement.
COMMENT: This latter section, "The Effects of Extinction," is a good preview of what
is to come. Skinner takes a relatively common form of human behavior (or non-behavior in this
particular case) and shows how it can be analyzed solely in terms of basic operant processes. It
also is a typical example of Skinner's style: constant paraphrase, frequent literary allusions, and
the general attitude that it is all very simple if you just know the right behavioral relationships.
What Events are Reinforcing? pp. 72-75
Q34: What is the defining characteristic of a reinforcer?
A34: Whether or not it reinforces, i.e., whether or not the behavior upon which it is
contingent increases in frequency.
Q35: Is the above a "circular" definition?
A35: No, it is simply an effect definition.
COMMENT: Actually, the question of circularity is an issue that has been discussed for
quite a while. (Some might say ad nauseam.) Given the earlier definition of an operant as a
class of behaviors that can be specified by a physical effect upon which reinforcement is
contingent, it is clear that Skinner is defining operant in terms of reinforcement, and
reinforcement in terms of its effect upon operants. Is this necessarily circular? Or if it is, is it
necessarily bad? Schick (Journal of the Experimental Analysis of Behavior, 1971, 15, 413-423)
believes that what is involved is the empirical identification of a two-term relationship, a
perfectly "acceptable" practice. Although neither operants nor reinforcers can be identified
independently of one another, their interaction clearly can be identified and represents an
environmental-behavioral relationship that is readily distinguishable from the earlier discussed
stimulus-response reflex. Whether these two potential effects of environmental events, to elicit
or to reinforce, exhaust the possibilities of environmental-behavioral interactions (as Skinner
seems to imply on p. 65) is another issue.
Q36: What two kinds of events are reinforcing?
A36: a) stimuli that are presented to the organism and increase responding (e.g., food) are
called positive reinforcers.
b) stimuli that increase responding when removed (e.g., shock) are negative
reinforcers.
COMMENT: Remember that positive and negative refer to the direction that the
stimulus is going when it produces an increase in responding. Both positive and negative
reinforcement strengthen behavior. It is also worth noting at this point that some recent theorists,
e.g., Premack (1965), include the opportunity to engage in high probability behaviors as
reinforcing events. That is, one can empirically determine, by counting, high and low probability
responses in any organism's behavioral repertoire. A high probability behavior will reinforce a
low probability behavior when access to the high probability behavior depends upon the
occurrence of the designated low probability response. This effect is referred to as the "Premack
Principle." Thus, some say that not only environmental stimuli but also various behaviors can
function as reinforcing events.
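The Premack Principle described in the comment above amounts to a simple empirical recipe: observe baseline frequencies of behaviors under free access, rank them, and predict that any higher-probability behavior can reinforce any lower-probability one. The sketch below is a hypothetical illustration; the behavior names and counts are invented.

```python
def premack_ranking(baseline_counts):
    """Order behaviors from most to least probable, using observed
    frequencies from a free-access baseline."""
    return sorted(baseline_counts, key=baseline_counts.get, reverse=True)

def can_reinforce(baseline_counts, contingent, instrumental):
    """Premack's prediction: access to a higher-probability behavior
    (`contingent`) will reinforce a lower-probability one
    (`instrumental`) when made contingent upon it."""
    return baseline_counts[contingent] > baseline_counts[instrumental]

# Hypothetical baseline observation of a child during free play
# (behavior names and counts are invented for illustration):
baseline = {"watching TV": 45, "doing homework": 3, "playing outside": 30}
print(premack_ranking(baseline))
# → ['watching TV', 'playing outside', 'doing homework']
print(can_reinforce(baseline, "watching TV", "doing homework"))   # → True
print(can_reinforce(baseline, "doing homework", "watching TV"))   # → False
```

Note that the ranking is purely empirical: nothing about the behaviors themselves, only their observed relative probabilities, determines the predicted reinforcing relation.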
Q37: How do we identify and evaluate the strength of reinforcing events in other people's
lives?
A37: By observation: by seeing what events capture another's interest, by seeing what
activities take up a significant portion of someone's time, by watching behavior
"come and go" as certain consequences are presented and withheld.
Q38: Why can't you simply ask a person what reinforces him?
A38: A given reinforcing relationship may not be obvious to the individual being
reinforced. (Much more of this will be said later. It involves the whole issue of
"awareness" in conditioning.)
COMMENT: The last paragraph touches upon a number of points, all centered around
the topic of an inherited capacity to be operantly reinforced. First, large differences between
species in the nature of effective reinforcers can reasonably be expected. Second, within-species
variations are more likely to be due to individual histories than genetic differences. Finally, and
in any event, the identification of reinforcers can only be made in terms of their effect.
Conditioned Reinforcers pp. 76-81
Q39: What is a conditioned reinforcer and how is it established?
A39: a) a conditioned reinforcer is a previously neutral environmental stimulus which has
acquired reinforcing effectiveness...
b) by being paired with an already effective reinforcer (respondent conditioning)
Q40: What are some factors affecting the establishment of conditioned reinforcers?
A40: a) the more pairings with an effective reinforcer, the more reinforcing the
conditioned reinforcer becomes.
b) there shouldn't be too long a delay between the occurrence of the conditioned
reinforcer and the presentation of the already effective reinforcer.
c) the conditioned reinforcing effectiveness is lost if the pairing is discontinued.
COMMENT: It is an unresolved experimental issue whether or not simple pairing is
sufficient to establish a conditioned reinforcer. Some say the neutral stimulus must first be
established as a "discriminative stimulus" (Chapter VII). Another more recent point of view is
that the conditioned reinforcer must "contain information": its presence indicates the availability
of reinforcement and its absence indicates no (or differential) reinforcement. In any case, the process described by Skinner
for establishing the light as a conditioned reinforcer for the pigeon (p. 76) works, no matter how
it is analyzed.
Q41: How do natural contingencies produce conditioned reinforcers?
A41: Biological reinforcers, e.g. food, water, and sex, often are obtained only following
"precurrent" behavioral chains. The stimuli generated by this precurrent behavior
can become reinforcing, and, thus, maintain the necessary precurrent behavior.
Q42: What accounts for the apparent long delays between human activity and resultant
reinforcers?
A42: Intervening events become conditioned reinforcers.
Q43: How are these intervening conditioned reinforcers important in the practical control of
human behavior?
A43: It is sometimes necessary to establish and maintain behaviors in situations, e.g.,
industry, school, and hospitals, where the terminal reinforcer may be relatively long
delayed.
Generalized Reinforcers. pp. 77-81
Q44: What is a generalized conditioned reinforcer?
A44: A conditioned reinforcer that has been paired with many different kinds of
reinforcers.
Q45: Why is the generalized conditioned reinforcer especially useful in the control of
behavior?
A45: Because it is not dependent upon any single deprivation condition for its
effectiveness.
Q46: On the next few pages, Skinner describes six generalized conditioned reinforcers that are
especially important in the control of human behavior. Identify and describe the way that
each is characteristically established.
A46: a) Manipulation of the physical environment: many primary reinforcers are obtained
only following successful control of the physical world. (Skinner suggests that the
sensory feedback from manipulating the environment may be an unconditioned
reinforcer because of its biological importance.)
b) Attention: reinforcement from the social environment depends upon obtaining
the attention of relevant reinforcing agents, e.g., your parents.
c) Approval: socially mediated reinforcement characteristically is obtained only
when the reinforcing agent approves of the behavior of the individual.
d) Affection: derives its primary effect from subsequent sexual contact, but can be
related to other events as well.
e) Submissiveness: reinforcement can be obtained, or taken, from others who
submit to the demands of the individual.
f) Token: characterized by physical specifications, the token is usually deliberately
established as an economic mediator, but may also be other physical signs such as
grades, prizes, etc.
Q47: What are some difficulties involved in the effective use of attention, affection and
approval?
A47: They are difficult to physically specify because they are aspects of someone else's
behavior and not physical events. Thus, precise use is difficult to obtain.
Q48: What are some advantages of tokens?
A48: a) They can be related to many events, as money is.
b) Their precise physical specification permits establishing detailed and tightly
controlled contingencies.
c) Their reinforcing effect via conditioning can be successfully established and
maintained.
Q49: Why does Skinner doubt that these generalized reinforcers (except, possibly, the first one)
are reinforcing in and of themselves without being established by a conditioning process?
A49: The necessary conditions (primarily social) probably haven't existed long enough
for a capacity to be reinforced by such events to become biologically established
(evolutionarily selected).
COMMENT: The last paragraph in this section is somewhat of an enigma. Given the
preceding analysis on the nonbiological origin of the effectiveness of generalized reinforcers,
Skinner apparently can only mean one or both of the following. First, given a sufficiently long
history of pairings with enough different reinforcers, a generalized reinforcer becomes
autonomous and no longer need be "backed up" by anything else. A second possibility might be
that a sufficiently long history of pairings establishes a conditioned reinforcing effectiveness so
strongly that it is unlikely to extinguish during the lifetime of the individual.
Why is a Reinforcer Reinforcing? pp. 81-84
COMMENT: This section provides arguments against certain "theories" about why
reinforcement works, and a discussion concerning the possible biological mechanisms underlying
the capacity to be reinforced by certain kinds of stimuli.
Q50: What is Skinner's argument against defining reinforcers as pleasant or satisfying
events?
A50: Measures of pleasantness or satisfaction refer to behavioral events, not physical
stimulus properties, and as such are probably collateral effects of reinforcement, not
the definitional properties thereof. It is a redundant description.
Q51: Why can't you ask someone whether he finds an event satisfying or pleasant?
A51: The report is likely to be unreliable. (Comment: As Skinner indicates, a detailed
explication of this problem will be provided in Chapter XVII, "Private Events in a
Natural Science.")
Q52: Can reinforcers be reliably defined as events which reduce deprivation?
A52: No, reinforcement can be effective prior to any substantial effect upon deprivation
measured in other ways. Also, some events are reinforcers for which deprivation is
an irrelevant state, e.g., the tinkle of a bell for a baby. (We might add at this point,
some events for which deprivation is a relevant state do not function as reinforcers,
vitamins for example.)
Q53: What is the relationship between deprivation, satiation, and reinforcement?
A53: The relationship is an evolutionary one. The capacity to be reinforced, either
positively or negatively, by certain environmental events is due to their biological
significance for the organism. The individual that is "reinforced" by food acquires
effective food-getting behavior. When deprivation prevails, the individual's
frequency of food-getting behavior increases. Both of these behavioral effects are
biological advantages that would be evolutionarily selected.
Q54: What are some disadvantages in biologically determined reinforcers?
A54: Social evolution has provided situations where the capacity to be strongly reinforced
in certain ways may be to the disadvantage of the organism, e.g., being strongly
reinforced by sugar.
Accidental Contingencies and "Superstitious" Behavior. pp. 84-87
Q55: What is the only property of a contingency required to produce operant conditioning?
A55: The temporal property; the reinforcer must follow the response.
COMMENT: There are a couple of points here. First is Skinner's reiteration of the
irrelevance of an individual's ability to "talk about" reinforcement in order for it to be effective.
The second involves Skinner's use of the term "contingency" without any explicit definition. In
his recent article, "The Role of the Environment" (1970), Skinner states it this way, "An adequate
formulation of the interaction between an organism and its environment must always specify
three things: (1) The occasion upon which a response occurs, (2) the response itself, and (3) the
reinforcing consequences. The interrelationships among them are the 'contingencies of
reinforcement.'" In other words, a contingency is the specific description of the complete
environmental and behavioral requirements for reinforcement, the details of when and where and
how much you reinforce for what specific response.
Q56: Define and give an example of superstitious behavior.
A56: a) superstitious behavior is behavior that has been operantly conditioned when the
relationship between the response and the reinforcer was accidental.
b) pigeons can be shaped to emit superstitious behavior by providing noncontingent
reinforcement at fixed intervals.
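The procedure in A56 b), with reinforcement delivered on a fixed-time clock regardless of behavior, can be sketched as a toy simulation in which whatever response happens to precede each delivery is strengthened. The response names, weights, and the rich-get-richer strengthening rule are illustrative assumptions, not the details of Skinner's procedure.

```python
import random

def superstition_demo(n_deliveries=200, seed=1):
    """Toy sketch of the 'superstition' procedure: the reinforcer arrives
    on a fixed-time clock, independent of behavior.  Whatever response the
    simulated pigeon happens to be emitting at delivery is strengthened,
    so an accidental contingency lets one arbitrary response come to
    dominate the repertoire."""
    responses = ["turn", "peck floor", "head toss", "wing flap"]
    weights = {r: 1.0 for r in responses}
    rng = random.Random(seed)
    for _ in range(n_deliveries):
        # response in progress when the clock delivers food, sampled in
        # proportion to current response strength
        emitted = rng.choices(responses, [weights[r] for r in responses])[0]
        weights[emitted] += 1.0  # adventitious reinforcement
    dominant = max(weights, key=weights.get)
    return dominant, weights

dominant, weights = superstition_demo()
print(dominant, round(weights[dominant] / sum(weights.values()), 2))
```

The only property of the contingency at work here is the temporal one, as A55 says: the reinforcer merely follows some response, and that is enough for conditioning to occur.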
Q57: In what sense does superstitious operant behavior represent a miscarriage of the
conditioning process?
A57: Accidental reinforcement can establish a useless repertoire of "superstitious"
behavior by virtue of the organism's susceptibility to operant conditioning.
COMMENT: Notice how Skinner carefully distinguishes between the origin of
superstitious beliefs, as resulting from accidental reinforcement, and their current practice, which
is the result of deliberate transmission primarily through verbal behavior.
Goals, Purposes, and Other Final Causes. pp. 87-90
COMMENT: There seem to be two important points in this section. The first is an
operant analysis of what is meant by goals, purposes, etc. The second is the way in which this
analysis is accomplished. Notice again how Skinner does not simply say "there are no final
causes" but rather goes to considerable length to point out how such ideas might have
erroneously arisen. It is one thing to have a theory, it's another to show how yours can account
for someone else's.
Q58: How do you rephrase a statement which implies that behavior is occurring because of
what will follow?
A58: That the behavior is occurring because of what has characteristically followed that
kind of behavior in the past.
Q59: Is purpose a property of behavior?
A59: No, purpose is a way of referring to controlling variables.
Q60: If asked why he is in school, a student might reply, "to get a degree," or "to obtain
knowledge." How could this be reanalyzed more operantly?
A60: A student studies, attends classes, and participates in campus activities because of
immediate contingencies in the form of grades, peer group pressure and approval,
and, occasionally, signs of increasing skills in areas that are important to him.
School enrollment per se is likely to be a function of a history of reinforcement
(both positive and negative), for being scholarly and complying with parental and
teacher demands and wishes. Verbal behavior about why one is in school is also
likely to have been directly reinforced by parental and peer approval. Thus, stating
high goals and noble purposes is usually a function of a history of reinforcement
for having stated such.
MISSING: QUESTIONS 61, 62, 63, 64 and ANSWERS 61, 62, 63, 64.
VI. SHAPING AND MAINTAINING OPERANT BEHAVIOR
The continuity of behavior. pp. 92-95
Q65: What is meant by "reinforcing a series of successive approximations"?
A65: A process (called "shaping") which involves establishing complex or low likelihood
behaviors in an experimental subject by reinforcing initially any response which
resembles or "approximates" the final behavioral objective, then withholding
reinforcement until another more accurate response appears (which is then
reinforced, etc.).
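The shaping process in A65 can be caricatured numerically: responses vary around a current typical form, any response meeting the criterion is "reinforced" (pulling the typical form toward it), and the criterion is then raised slightly toward the target. All parameters (variability, learning rate, criterion step) are illustrative assumptions, not quantitative claims about shaping.

```python
import random

def shape(target=100.0, start=10.0, step=1.05, seed=7, max_trials=100_000):
    """Caricature of shaping by successive approximation.  Responses vary
    around a current mean; any response meeting the criterion is
    'reinforced', which pulls the mean toward that response, and the
    criterion is then raised slightly toward the target."""
    rng = random.Random(seed)
    mean = start          # current typical response magnitude
    criterion = start     # current approximation being reinforced
    for trial in range(1, max_trials + 1):
        response = rng.gauss(mean, 0.2 * mean)       # behavioral variability
        if response >= criterion:                    # good enough for now
            mean = 0.7 * mean + 0.3 * response       # reinforced form drifts up
            criterion = min(criterion * step, target)
        if mean >= target:
            return trial                             # target behavior reached
    return None

print(shape())  # number of responses needed to shape the target behavior
```

The sketch also illustrates the comment below: the "reinforce successive approximations" rule is easy to state, but the choice of step size and criterion is done intuitively, and a criterion raised too fast simply extinguishes the behavior.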
Q66: How is the pigeon's key peck response different from the usual operant behavioral
objectives?
A66: It appears as a rather fixed behavioral unit, probably the result of genetic selection.
COMMENT: This material concerning what constitutes the basic elements of behavior is
rather difficult. There seem to be several points: First, that operants are functional units that
can be counted, but we shouldn't forget the underlying continuity of behavior. If we do, pseudo-problems involving transfer and response induction can arise. Skinner believes that the "true
behavioral element" that is strengthened by reinforcement is carried best by the concept of a
behavioral "atom." Presumably the point is that operants or "acts" are analogous to molecules,
which can provide a valid level of understanding, but that there is also an underlying, but yet
incompletely understood, level of behavioral analysis involving the elements of which these acts
are constructed. The rationale for this perspective lies in the difficulty operant conditioners have
in talking about the shaping process. As yet there are no quantitative statements that can
effectively be made concerning how one goes about shaping responses other than to "reinforce
successive approximations" (and then practice a lot). We can push the "molecules" all over the
place, but we construct them intuitively.
Differential Reinforcement. pp. 95-98
Q67: What kind of contingency improves "skill?"
A67: The differential reinforcement of responses possessing special properties.
COMMENT: As Skinner points out, this is a section discussing how one improves
topography, the "shape of a response", not how one establishes an operant class in the first place.
The operant set of responses is already available; differential reinforcement simply refines this
class toward a more effective form by variations in reinforcement (differential reinforcement)
depending upon specific response characteristics.
Q68: How could you condition a child to be "annoying?"
A68: Through differential reinforcement; by requiring him to become increasingly
offensive before you pay any attention to him.
The Maintenance of Behavior. p. 98
Q69: Classic learning theories fail to treat what fundamental issues?
A69: How reinforcement maintains behavior after the behavior is already acquired.
Intermittent Reinforcement. pp. 99-106
Q70: How might one distinguish between responses which act upon the physical environment
from responses which have social effects?
A70: Responses toward the physical environment are consistently reinforced; responses
which are reinforced through the mediation of other people are reinforced only
intermittently.
Q71: What characteristics of operants that are intermittently reinforced make them useful in
social control systems?
A71: a) They are resistant to extinction, i.e., will persist following the termination of
reinforcement.
b) They are relatively stable in occurrence.
c) You can increase the ratio of responses required for each reinforcer, thus
obtaining more behavior from the organism for each payoff.
Q72: What are two primary ways of establishing various schedules of reinforcement?
A72: a) contingencies arranged by a system outside the organism, e.g., a clock.
b) contingencies which depend upon some feature of the behavior itself, e.g., the
number of responses.
COMMENT: You should remember that those schedules arranged primarily by
conditions prevailing outside the organism, such as time, are not independent of the organism's
behavior. That is, reinforcement isn't delivered just because time has passed --- that would be
called free reinforcement. Instead, the organism has to respond after a fixed (or variable) amount
of time has elapsed before reinforcement is forthcoming.
Q73: In the following several pages, a number of standard schedules of reinforcement are
described. Be able to describe how each of the following schedules is programmed, and
the resulting behavior; fixed interval, variable interval, fixed ratio, and variable ratio.
A73: a) fixed interval; reinforcement is programmed for a response occurring after a
fixed interval of time has passed. The behavioral rate is low immediately following
reinforcement and then gradually accelerates until reinforcement is again provided.
b) variable interval; reinforcement is programmed for a response following a
period of time which is not constant but rather randomized to provide an average
rather than fixed interval of time. The response rate is steady showing little or no
fluctuation in rate and is exceptionally resistant to extinction.
c) fixed ratio; reinforcement is programmed solely as a function of the number of
responses which have occurred, e.g., every fifth response is reinforced. The
behavior is characterized by virtual nonresponding immediately following
reinforcement with a rather abrupt switch to a high and stable response rate until
the necessary number of responses has occurred for reinforcement. Extreme ratios
produce "ratio strain," which is a breakdown in performance that is not a simple
result of fatigue.
d) variable ratio; reinforcement is contingent upon the emission of a number of
responses that is not constant, but which varies around an average number. The
schedule generates a high steady response rate with little or no pausing and is
extremely resistant to extinction.
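The four schedules in A73 reduce to simple decision rules about when a response produces reinforcement. Here is a minimal sketch of the two fixed schedules (the class names and parameters are my own, not standard laboratory software; the variable versions simply redraw the ratio or interval requirement from a distribution after each reinforcement):

```python
class FixedRatio:
    """FR n: reinforce every nth response."""
    def __init__(self, n):
        self.n, self.count = n, 0

    def record_response(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True   # this response is reinforced
        return False

class FixedInterval:
    """FI t: reinforce the first response emitted after t seconds have
    elapsed since the last reinforcement.  A response is still required;
    mere passage of time never delivers the reinforcer."""
    def __init__(self, interval):
        self.interval, self.last = interval, 0.0

    def record_response(self, now):
        if now - self.last >= self.interval:
            self.last = now
            return True
        return False

fr5 = FixedRatio(5)
print([fr5.record_response() for _ in range(10)])
# → reinforcement on every 5th response

fi10 = FixedInterval(10.0)
print([fi10.record_response(t) for t in (3.0, 9.0, 11.0, 12.0, 25.0)])
# → [False, False, True, False, True]
```

Notice that the FI rule makes concrete the point in the comment following Q72: reinforcement under an interval schedule is still contingent on a response occurring after the time has elapsed, not on time alone.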
COMMENT: A number of other issues contained in this section are worth mentioning,
some involving social considerations. First, notice how Skinner points out that the weekly wage
is a more complex situation than a simple fixed interval schedule (p. 101). This particular
situation has been analyzed in considerable detail by Michael (1969). Second, the analysis of
the fixed ratio schedule as analogous to human piecework provides a perspective on why labor
unions were formed to prevent industry from constantly "upping the ratio."
There are some theoretical issues as well. The analysis of post-reinforcement pausing in
both fixed interval and fixed ratio schedules as resulting from discriminated non-reinforcement
intervals is important. One misstatement involves the pattern of fixed ratio responding. The
shift from nonresponding to responding is not a smooth gradient of acceleration as Skinner states
on p. 103, but rather a relatively abrupt shift from nonresponding to a rather high rate.
Q74 : MISSING
A74: One on which reinforcement may be determined by either an interval or a ratio
schedule, or both.
Q75: What is a combined schedule?
A75: MISSING
COMMENT: As Skinner states, these are not simple schedules. Some recent research
has elaborated a number of distinguishing features of several possible combinations. Some of
these schedules are quite well understood. Differentially reinforcing high rates (DRH) and
differentially reinforcing low rates (DRL) have quite simple effects: they produce what they
reinforce. Recent efforts have also produced schedules which tightly control the interresponse
times. One important area of schedule analysis which followed the publication of this text
involves the multiple response-reinforcement relationships called concurrent schedules of
reinforcement. For a detailed background in the role of reinforcement schedules in behavioral
control, two important sources should be consulted: Ferster and Skinner, Schedules of
Reinforcement (1957), and The Journal of the Experimental Analysis of Behavior.
VII. OPERANT DISCRIMINATION
Discriminative Stimuli. pp. 107-110
Q76: Define and describe the three-term contingency that produces discrimination.
A76: Discrimination is the increased probability of an operant under certain stimulus
conditions, and is produced by utilizing those conditions as the occasion upon which
an operant response will be reinforced.
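The three-term contingency in A76 can be written as a one-line predicate: reinforcement is delivered only when the specified response is emitted on the specified occasion. The stimulus and response names below are illustrative, echoing Skinner's pigeon examples.

```python
def reinforce(occasion, response, sd="light on", operant="peck"):
    """Three-term contingency (S^D : R -> S^R): reinforcement is
    delivered only when the specified response is emitted on the
    specified occasion."""
    return occasion == sd and response == operant

print(reinforce("light on", "peck"))    # → True   (reinforced)
print(reinforce("light off", "peck"))   # → False  (response goes unreinforced)
print(reinforce("light on", "turn"))    # → False
```

Training under this rule is what produces discrimination: responding in the presence of the discriminative stimulus is strengthened, while responding in its absence undergoes extinction.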
COMMENT: In differentiating the discriminative operant from the conditioned or
unconditioned reflex, which shows a similar stimulus-response pattern, Skinner uses the term
emitted to describe the production of the operant and elicited to describe the occurrence of the
reflex.
Q77: What form of behavior control does discrimination provide?
A77: When a discrimination has been established, we may alter the probability of a
response instantly by presenting or removing the discriminative stimulus.
Q78: Give examples of operants under the control of the physical environment, the social
environment, or verbal stimuli.
A78: a) picking and eating apples as a function of their color
b) greeting a smiling person
c) academic behavior.
Q79: How do we use the discrimination process to control behavior, in addition to directly
altering probabilities (as in Q77)?
A79: By establishing discriminations in order to control future responding.
Voluntary and Involuntary Behavior. pp. 110-116
COMMENT: This section involves analyses of two traditional ways of distinguishing
between operant and respondent behavior; voluntary versus involuntary, and striped muscles
versus smooth muscles and glands.
Q80: What were three historic criteria used to distinguish the reflex from other behavior?
A80: The reflex was said to be innate, unconscious, and involuntary, in that it could
not be controlled once elicited.
Q81: Why can the use of "who is in control" no longer be considered a distinguishing feature
between operants and respondents?
A81: In a science of behavior, there are no inner agents to whom such powers can be
attributed.
Q82: Can the distinction between operant and respondent be based upon the issue of control
versus lack of control?
A82: No, since no behavior is "free," control only refers to degree of probability.
COMMENT: In the preceding section, Skinner elaborated the distinction between the
conditioning procedures that establish conditioned reflexes and discriminative operants. In this
section, he refers to another distinguishing feature, quantitative relations. A respondent shows a
correlation between increased intensity of eliciting stimuli and increased magnitude and
decreased latency of the resulting respondent. These relations do not hold with the
discriminative operant.
Q83: MISSING
A83: Yes, different conditioning histories and different quantitative relationships permit
a distinction between the conditioned reflex and the discriminative operant.
Q84: Why has the concept of will survived longer in the interpretation of operant behavior than
in the interpretation of respondent behavior?
A84: Because the discriminative stimulus shares its control of the operant with other
variables, like deprivation and history of reinforcement, and thus its effect is less
obvious (and possibly even reversible given certain conditions).
COMMENT: Skinner now has included "will" as an inner agent. He argues against such
a construct on the same two grounds, impractical as a source of control and simply raising
another agent to be accounted for.
Q85: Can a distinction between respondent and operant behavior be based upon musculature?
A85: Somewhat, since many reflexes are concerned with internal economy where smooth
muscles and glands play a major role.
Q86: Can reflexes be operantly conditioned?
A86: No, but a similar effect can be achieved by conditioning an operant to control the
respondent, for example, raising the pulse by exercising, or to imitate the reflex, as
in operant sneezing.
COMMENT: This position is not quite so clear-cut as Skinner stated it. The research
area of biofeedback seems to support a contrary notion, at least in some cases. Nevertheless, it
still is important to distinguish between "true" reflexes and discriminated operants which
resemble them, e.g. operant v. respondent crying in children.
Q87: What is Skinner's analysis of the social roles of the concept of personal responsibility?
A87: The concept characteristically is used to condition "a sense of responsibility" or "an
obligation to society."
Q88: How might such objectives be better obtained?
A88: By eliminating the concepts of free will and personal responsibility and designing
alternative practices which recognize the importance of the variables which control
behavior, e.g., reinforcement.
COMMENT: Obviously, the final paragraph in this section directly anticipates Beyond
Freedom and Dignity (1971).
Discriminative Repertoires. pp. 117-122
COMMENT: Skinner is suggesting that any realm of a discriminative stimulus (field)
can be broken down into smaller units to compose a continuous field, just as any operant can be
reanalyzed into behavioral elements or "atoms" (p. 94), thus emphasizing the underlying
continuity of the environment and behavior. Any potential discriminative operant can be
represented by a "point-to-point" correspondence between the discriminative stimulus field and
the units of the operant. In such a case, the "functional units" would be each field element
functioning as a discriminative stimulus for each behavioral element. Ordinarily, however, the
functional units are not so small, since point-to-point correspondence is rarely if ever established.
Q89: How is a skilled copyist distinguished from an unskilled drawer of lines?
A89: A history of differential reinforcement has provided a much larger repertoire of
responses to lines in the skilled copyist than are available to the unskilled or novice
copyist.
COMMENT: Skinner uses the example to represent many of the possible degrees of
correspondence between the stimulus field and the discriminative operant. A point-to-point
correspondence is reached when an exact copy can be reproduced. Near, but imperfect,
correspondence is represented by "individual style." The electrical engineer can emit only
discrete units to certain stimuli, for example, stereotyped symbols of resistors, batteries, etc.
Q90: What two sources of differential reinforcement are available to train the skilled copyist?
A90: a) a teacher may provide many differential contingencies, reinforcing each
improved approximation to accurate copying.
b) the artist himself may differentially reinforce himself automatically after he has
become "discriminating."
COMMENT: This issue of automatic self-reinforcement is a little complex. Skinner's
point is that if one has been taught to discriminate good copying from bad, one can react
appropriately to his own efforts as well, thus "reinforcing" himself when his copying is good.
Many of us learn to recognize "good" behavior long before we can emit it consistently, for
example, golf, and thus when we do hit a good shot, we are likely to tell ourselves so. The next
issue is in what sense can praising ourselves function as a reinforcer? Skinner's position is that
once praise has been established as a conditioned reinforcer from its prior history of being paired
with other reinforcers, it still functions as a conditioned reinforcer when self-administered. The
effect of reinforcement is independent of who administers it. An extended analysis of this
process is available in Verbal Behavior (1957, pp. 438-446).
Q91: Describe the repertoire of the singer with "poor pitch."
A91: A poor singer is one whose vocal repertoire is badly matched with the tones of the
physical stimulus field.
Q92: Give an operant definition of imitation.
A92: When the stimuli generated by a pattern of responses in one organism function as a
discriminative stimulus to produce behavior of the same pattern in another, the
second is said to be imitating the first.
Q93: What is an inverse imitative repertoire? Give an example.
A93: When the imitator is shaped to do the opposite of the model, the resultant repertoire
is an inverse imitative repertoire. The "follower" of two dance partners must have
this type of behavior.
COMMENT: The establishment of an imitative repertoire has become a standard
practice with current behavior modification psychologists. It is easier to shape a subject (for
example, a retarded child) to imitate, and then teach them new behaviors imitatively, than it is to
try to "shape" each new behavior separately.
Attention. pp. 122-125.
Q94: What is attention?
A94: The controlling relation between a response and a discriminative stimulus.
Q95: How can we tell whether or not someone is "paying attention?"
A95: By observing whether or not he responds appropriately to a particular stimulus.
COMMENT: This is an important section, inasmuch as it makes clear use of Skinner's
functional approach to behavioral analysis. Notice how Skinner rules out "receptor orientation",
a topographical description, as a definition of attention, and relies, instead, upon the
discriminative relationships between the stimulus and a response. Consider this approach the
next time you read an article involving the conditioning of "study" behavior or "attentive"
behavior in the classroom.
Temporal Relations Between Stimulus, Response, and Reinforcement. pp. 125-128
Q96: Which characteristics of the natural environment are responsible for the occurrence of a)
respondent conditioning, b) operant conditioning, and c) operant discrimination?
A96: a) the tendency for certain stimuli to occur together in the natural environment
b) the tendency for certain behaviors to effect changes in the environment
c) the fact that certain situations are occasions for certain behaviors to have
characteristic effects.
Q97: What is the role of temporal relations in each of the above?
A97: Each of the above relationships may have its maximum effect when a specific
temporal interval is obtained.
COMMENT: Notice that Skinner is not discussing what is commonly called "delayed"
reinforcement, either in the respondent or operant case. Instead, he is talking about the temporal
specification as part of the contingency, such that the maximum probability of response occurs
within a given stimulus situation after a specific period of time has elapsed.
Q98: What are the behavioral effects of waiting for a delayed consequence following a
discriminative stimulus (often called "expectancy" or "anticipation")?
A98: a) conditioned reflexes involving pulse, respiration, etc., (often referred to as the
"activation syndrome")
b) attention
c) emotional changes, either reflecting "joy" if the anticipated event is to be
positively reinforcing or "anxiety" if the anticipated outcome is to be aversive.
d) preparatory set; postural adjustments which attempt to maximize the
effectiveness of subsequent responding.
COMMENT: Actually, only two behavioral processes are involved, respondent
conditioning being responsible for (a) and (c) and operant conditioning for (b) and (d). Skinner
extends the analysis to account for different kinds of conditioned behavior so far as they have
commonsense labels, for example, attention, anxiety, etc.
VIII. THE CONTROLLING ENVIRONMENT
The Importance Of The Environment. pp. 129-130
COMMENT: This section makes two rather important points: First, it suggests the
potential complexity of behaviors under various combinations of conditioning histories and
stimulus events, as indicated by the "X seeing Y" example. Second, Skinner implies that current
practices in clinical psychology are inadequate in this regard, a theme which will be greatly
expanded later in the book.
The Analysis of Stimuli. pp. 130-131
Q99: How do we begin to study the effects of the physical environment?
A99: By describing them in physical terms.
Q100: What about physical events that are undetectable by the organism?
A100: They are simply ignored as irrelevant.
Q101: Are all stimulus detection problems simply functions of receptive physiology?
A101: No, variables related to conditioning history, motivation, and emotion can affect
sensory reaction.
COMMENT: The last paragraph suggests the generality of processes across stimulus
dimensions that Skinner sees in several fundamental areas of behavioral control. Earlier, in his
rationale for the study of lower organisms, he stated that important behavioral relationships are
generalizable across species. Essentially, he is implying that if we study the effects of visual
stimuli on pigeons, we will begin to understand discriminative stimulus control of human
behavior.
Induction. pp. 132-134
Q102: What is stimulus generalization (or induction)?
A102: Physical events similar to conditioned discriminative stimuli will have similar,
though lesser, behavioral effects.
Q103: How does one account for this effect?
A103: Skinner appeals to the concept of identical elements within the stimuli as factors
responsible for generalization or induction.
Q104: What are some human situations which demonstrate this process?
A104: a) Freud's observation about early conditions determining later personal
adjustment.
b) Freud's analysis of "symbols."
c) The use of the metaphor in literature.
COMMENT: For those of you who view Skinner and Freud to be irreconcilable, prepare
yourself for a shock. Skinner cites Freud more often than he cites any other psychologist, and
frequently quite favorably. It is not Freud's observations but rather his "inner agents" that
Skinner disagrees with.
Q105: How does one empirically assess stimulus generalization?
A105: By conditioning a strong response to a single discriminative stimulus and then
recording behavioral decrements as a function of presenting different stimuli. This
type of research results in generalization gradients which show the degree of control
that various different dimensions have over the subject's tendency to respond.
COMMENT: Obviously, this is a very brief treatment of an extensive research area.
Examples of recent operant research and stimulus generalization can be found in a number of
sources, including the Journal of the Experimental Analysis of Behavior or the appropriate
section of Catania, Contemporary Research in Operant Behavior (1968) or Verhave, The
Experimental Analysis of Behavior (1966).
Discrimination. p. 134
Q106: How does Skinner explain the process of generalization, both in terms of what it is and
what it isn't?
A106: By referring to the elements that make up a discriminative stimulus as "sharing
control" over behavior, not as an activity on the part of the organism.
Q107: How does one establish a discrimination? Give an example.
A107: a) by sharpening a natural generalization gradient through differential
reinforcement.
b) reinforce a pigeon's responding in the presence of a single stimulus, for example,
a red spot, and extinguish responding in the presence of other similar stimuli, for
example, orange spots.
Abstraction. pp. 134-136
Q108: What is an abstraction, and how is it established?
A108: a) Abstraction is behavior under the control of a single property or a specific
combination of properties of a stimulus.
b) By always reinforcing in the presence of stimuli which have a single common
property, for example, red, and never in the presence of stimuli which lack that
dimension, although they might share other properties, for example, shape.
Q109: In what sense is abstraction not a natural process?
A109: It appears to require the participation of a verbal community to set up the necessary
contingencies to develop an abstraction.
COMMENT: This is an especially important section since it suggests one of the
important processes underlying Skinner's approach to science as behavior. Verbal abstractions
have isolated many important and subtle properties of nature (for example, "change"), which
probably could not have been identified without the mediation of a verbal community. It is this
refining of concepts and relationships that describe our physical and behavioral world that is the
foundation of empirical science. (Hint: Etymology; the study of historical linguistic change,
especially as applied to single words).
Some Traditional Problems in Stimulus Control. pp. 136-140
Q110: What is cross modal induction?
A110: Responding under the control of two or more stimuli which share no physical
properties.
Q111: What are several possible behavioral accounts for this phenomenon?
A111: a) The responses have been separately conditioned to each stimulus.
b) Intermediate connections exist between the two stimuli.
c) Common mediating behavior exists.
COMMENT: The example here is of little help if you don't know Butler, Handel, or what
the Wetterhorn is. The process, however, is a relatively important one, as it involves covert
operant mediating behavior. Notice that Butler never said aloud the word "shoulder". It
mediated a visual impression and resultant humming behavior. In accounting for somewhat
similar phenomena, others, for example Staats (1963), have relied upon classical conditioning as
the underlying process. However, the word, "shoulder" is clearly an operant. The analysis of
such covert operant responding will be more extensively considered in Section III.
Q112: Why might a pigeon conditioned to peck a 5" disk instead of a 3" disk when the two are
presented together peck the 7" disk when presented with a 7" disk and a 5" disk?
A112: The pigeon is responding to a relationship, not the actual physical dimensions of the
5" disk.
COMMENT: Two points: First, notice how Skinner has extended the concept of
stimulus. With the earlier account of stimulus elements sharing control, he has freed the concept
of the stimulus from a particular physical event or a single dimension. It now has become a
combination of events which differentially control probability of responding. Critics of Skinner's
extension of basic principles to human behavior often misunderstand his use of the
discriminative stimulus concept (for example, Chomsky, 1957). Second, Skinner suggests that
discriminated operants, such as the pigeon's response to a relationship, are learned
discriminations, not natural ones. That is, pigeons may be trained to respond either to size or
relationship between sizes. In the natural environment, relationships may be more important, and
thus pigeons may initially respond on this basis, but this history of conditioning can be
experimentally reversed.
Q113: What is an "interpreted" stimulus?
A113: A stimulus that is responded to as if it had certain properties when in fact it does not.
Q114: Distinguish functionally between "sensing" and seeing, perceiving and knowing.
A114: Sensing refers to stimulus reception; the other terms refer to resultant behaviors.
COMMENT: This treatment of seeing as behavior has a number of ramifications which
are more fully explicated in Chapter XVII.
IX. DEPRIVATION AND SATIATION
Q115: What unwarranted extension of the concept of a stimulus followed discovery of stimulus
controlled behavior?
A115: Writers inferred stimuli where none could be observed and included various
internal conditions as part of the stimulating situation.
Deprivation. pp. 141-143
Q116: What are the effects of deprivation of water on the probability of drinking?
A116: The probability moves from a low probability under conditions of satiation to a high
probability under conditions of extreme deprivation.
COMMENT: Skinner reiterates an evolutionary "explanation" for the effects of various
kinds of deprivation upon the probability of responses which alleviate the deprived conditions.
Some critics have accused Skinner of appealing to "conceptual evolutionary principles" in the
same way that he has accused others of appealing to a "conceptual nervous system."
Q117: What are some disadvantages with the concept of homeostasis?
A117: a) it only predicts a change in direction of probability.
b) it is hard to define, and harder to observe and measure.
Q118: Must "deprivation" concern itself with ingestion exclusively?
A118: No, the concept applies where the occurrence of the restricted behavior is itself
satiating.
COMMENT: The logic here clearly anticipates Premack's research.
Q119: Does deprivation affect only a single response?
A119: No, many kinds of behavior are affected simultaneously.
Needs and Drives. pp. 143-146
Q120: Under what conditions do inner events such as needs or wants add nothing to the
functional account?
A120: When they are inferred from either the independent operations (e.g., water
deprivation) or from the dependent behavior (e.g., drinking water).
Q121: How is "drive" legitimately used as a term in scientific discourse?
A121: As a hypothetical intervening state referring to the effects of deprivation and
satiation and other operations which alter probabilities of responding in similar
ways.
COMMENT: Skinner here is providing a rationale for the use of "drive" as what is
commonly called an intervening variable. Notice he precludes the necessity of treating it as a
"real" mental or physiological state (i.e., a hypothetical construct). A detailed account of hunger as a
drive is available in Behavior of Organisms (1938, pp. 341-378).
Q122: Why is a drive not a stimulus?
A122: Because the stimulation resulting from deprivation does not vary in any precise way
with the probability of eating. Often eating begins prior to experiencing any
"hunger pangs" or will continue long after they have eased, if they did in fact
precede eating.
Q123: Why is a drive not a physiological state?
A123: Not enough effective physiological information is available to permit prediction and
control. Drive, as Skinner is using the term, is only a mathematical relationship
between certain independent environmental operations and observed behavioral
consequences.
Q124: Is a drive a psychic state?
A124: No, the same argument for prediction and control applies.
COMMENT: Skinner isn't ruling out of consideration the problem of what you "feel",
but simply delays his discussion of the problem until Chapter XVII.
Q125: Is a drive a state of strength?
A125: No, "strong behavior" may reflect other variables, such as the schedule of
reinforcement.
The Practical Use of Drives. pp. 146-148
Q126: Be able to give examples involving the use of deprivation and satiation in the practical
control of human behavior.
A126: a) deprivation: delaying the service of food to make the guests more hungry,
solitary confinement to induce talking, etc.
b) satiation: serving bread and hors d'oeuvres prior to dinner, legalized
prostitution, etc.
Q127: How are the effects produced by operant reinforcement different from those described
above?
A127: They are brought under the control of a different set of deprivations. For example,
population size may be manipulated by bonus incentives or taxation, resulting in more
offspring being born (deprivation) or reduced by making more money or goods
available in other ways (satiation).
COMMENT: This section and the following one are a little unusual in the sense that
after all the trouble Skinner went to in elaborating the appropriate use of drives as intervening
states, he is now pointing out their relative uselessness as constructs.
Some Questions Concerning Drive. pp. 148-154
Q128: What two questions are implied by the question "how many drives are there?"
A128: a) How many ways can an organism be deprived?
b) How many kinds of behavior vary in strength independently of each other?
COMMENT: Skinner is pointing out that a question involving "drive" may be asked
either in terms of dependent or independent variables, and that a question involving the
intervening state is inappropriate.
Q129: What is the relationship between the effect of operant reinforcement and deprivation?
A129: Reinforcement is not effective unless the organism has been appropriately
deprived. Thus reinforcement selectively strengthens behavior at a given state of
deprivation and any variation in the level of deprivation directly alters the
probability of the conditioned response.
COMMENT: The effects of deprivation level on extinction responding referred to on pp.
149-150 are reported in Behavior of Organisms (1938, pp. 379-405). There is, obviously, a direct
effect of deprivation on responding: for the same schedule of reinforcement, higher deprivation
levels produce higher response rates (e.g., Clark, 1958).
Q130: What is the relationship between deprivation and conditioned reinforcers?
A130: "Behavior which has been strengthened by a conditioned reinforcer varies with the
deprivation appropriate to the primary reinforcer."
Q131: What is necessary to demonstrate an autonomous drive associated with a generalized
conditioned reinforcer such as attention, affection, etc.?
A131: It would be necessary to deprive or satiate an organism with one of the above
generalized reinforcers but make sure that no deprivation or satiation is taking
place concerning one of the associated primary reinforcers.
COMMENT: Notice that a strong reinforcing effect is not sufficient to justify a separate
drive. That would require the associated operation of deprivation or satiation. Presumably this is
because a conditioned reinforcer is effective even when the back-up reinforcer does not follow
each occurrence of the conditioned reinforcer. Effectiveness of the conditioned reinforcer will
still be maintained by only occasional pairings.
Q132: In what sense can chemical agents such as alcohol, morphine, etc., be called "acquired
drives"?
A132: Obvious effects of deprivation and satiation accompany (or define) the condition
called addiction.
COMMENT: The issue raised by the example of "sublimation" involves the concept of
response and/or stimulus induction. Certain behavioral features of raising children are shared by
the behavior of raising and caring for pets. Operations which strengthen the probability of child
raising (e.g., T.V. campaigns, magazine articles, etc.) may also strengthen the behavior of raising
pets through this process of induction or commonality of certain features. If the strengthening
operation is one of deprivation, the induced response is strengthened, but probably won't produce
a reduction in deprivation. Why you should observe an increase in strength of the induced
response instead of the behavior appropriate to the deprivation condition (e.g., when the
appropriate behavior is "sublimated") is discussed further in Chapter XXIV.
Q133: How can questions involving "interrelated drives" be experimentally tested?
A133: a) by examining the topography of behavior (the dependent variable) for similarity.
b) by assessing the effects of deprivation and satiation (the independent variable).
Q134: Is either the sex drive or the drive to dominate seen as primary in light of the above?
A134: No, too many specific effects of other variables are observable.
Q135: How can you experimentally assess the relative strength of drives?
A135: By simultaneous deprivation operations and then providing simultaneous
opportunities to respond. The behavior which first emerges presumably reflects the
stronger drive.
Q136: What are some of the experimental difficulties involved in such questions?
A136: Deprivation can affect the probability of behavior other than that which directly
reduces deprivation. For example, water deprivation reduces the organism's ability
to eat dry food, food deprivation weakens sexual behavior, etc.
Time as a Variable. pp. 154-156
Q137: In what sense can time be used as an independent variable?
A137: By allowing behavior to occur at all times, you observe a periodicity in some
responding. This periodicity may be used as an independent variable in the
prediction of behavior.
Q138: How is control obtained over behavior which displays this type of periodicity?
A138: By restricting access to the response, you produce a state of deprivation.
Q139: Is time alone the relevant independent variable in certain cases of annual cycles, such as
migration?
A139: No, environmental factors involving the stimuli associated with seasonal changes are
key factors. These cues may be artificially utilized to produce migratory behavior.
COMMENT: You should note that the "mere passage of time" can never itself be the
exclusive independent variable. At some point, "time passing" must contact the organism.
Q140: How can predictions be made about behavior which reflects maturational processes?
A140: It must come from group data, i.e., by observing other members of the species,
since this behavior is not cyclical within the life of the organism.
Q141: What practical problem does this produce?
A141: Since individual differences are frequently great, chronological age is of lesser value
than directly observing the particular individual for evidences of the relevant
behavior.
The Individual and the Species. pp. 156-157
Q142: As an account for behavior, why is "instinct" described as an explanatory fiction?
A142: If it only refers to a behavioral observation, e.g., the tendency of certain birds to
build nests, it is a description of behavior, not an explanation.
COMMENT: Skinner's observation that behavior is as much a part of the organism as
are its anatomical features, and is describable with respect to species status, would probably
startle many ethologists who doubt contemporary behaviorists' capacity to deal effectively with
species specific behavior. Their arguments usually center around the species specificity of
certain food-getting behaviors in rodents and the lack of justification for generalizing information
gleaned from observing this particular behavior. Skinner's position is that a shared behavioral
process justifies such extensions, such as operant behavior's capacity "to be reinforced." Thus
arbitrary operants (like lever pressing) reveal underlying common properties (like
reinforceability). Of course, other processes may be common to behavior as well, such as
discrimination, deprivation and satiation, etc.
Summary. pp. 158-159
COMMENT: This summary section is interesting in a couple of ways. First, it is the
only chapter summarization in this book. Second, of the list of seven potential questions
concerning factors which can determine the probability of responding, only the last has received
much attention as an operant research area, although (2) has resulted in some research, e.g.,
young versus old rats, etc. By and large, however, operant research has been devoted almost
exclusively to the factors Skinner describes in other chapters and refers to here in the last
paragraph; reinforcement, emotion, aversive stimulation, and punishment.
Q143: Why have so many of these variables apparently been ignored?
A143: a) From a research perspective, they can be relatively easily stabilized or "held
constant" and then more powerful variables can be explored.
b) From a perspective of deriving practical behavioral control techniques, they are
relatively trivial in comparison with other more manipulable independent variables.
X. EMOTION
What Is An Emotion? pp. 160-162
Q144: Why is an emotion, as it is characteristically used, an example of an explanatory fiction?
A144: Emotions are attributed to events in our history or immediate environment and are
said to cause us to act in certain ways; they are a "middle link," rather than a cause,
in a causal chain.
Q145: What are some of the difficulties in identifying emotions with (a) internal responses of
the smooth muscles and glands, (b) common expressions of facial and postural muscles?
A145: a) There is no adequate way of separating certain emotions by characteristic
patterns of internal responses, and similar responses may be produced by "nonemotional" stimuli, such as exercise.
b) Expressions of joy and grief are culturally determined, i.e., shaped by society
and are easily operantly imitated.
COMMENT: You might observe that particularly in the case of internal measures, some
psychologists give an emotion an "operational" definition simply by calling it a certain pattern of
responding, e.g., a GSR of such and such value equals "anxiety."
Emotion as a Predisposition. pp. 162-163
Q146: How does Skinner define emotion?
A146: As a conceptual state, neither psychic nor physiological, which classifies behavior
with respect to the various circumstances that control its probability.
The Responses Which Vary Together in Emotion. pp. 163-164
COMMENT: First, observe the caveat in the first paragraph. It is an example of how
Skinner uses common sense terminology in a way that is often confusing to readers. This is a
chapter on "emotions" containing words like "joy," "fear," and "anger," and yet it is a curious
blend of behavioral analysis and lay terminology. Skinner does define the term emotion to
eliminate it as an explanatory fiction (as he did with the word "drive"), but he then goes on to
identify what most people mean when they use these words.
Second, Skinner is unclear about the factors which "cluster" certain behaviors together
when the organism is said to be behaving emotionally. Presumably they do so in part because of
common consequences. These consequences may have produced the behavior evolutionarily, in
which case the behavior today is primarily respondent; they may be functioning as a
contemporary reinforcer shaping current behavior; or both may apply, as in the case of
"angry" behavior.
Emotional Operations. pp. 164-166
Q147: Does "frustration" produce "rage?"
A147: No, too many different "frustrating conditions" produce too many different
behavioral consequences to group all causes and all effects together in such a
manner. It is a misleading simplification.
Q148: In what sense do drives and emotions overlap?
A148: Some cases of deprivation produce effects that extend beyond a strengthening of
behavior reinforced by the deprived object or situation. The example of nostalgia
shows both strengthened behavior (by deprivation) and weakened behavior (as an
emotional effect).
COMMENT: The last sentence in this section may well have been the title of this book.
What Skinner is obviously talking about is an attempt to "force" what is commonly observed
about human behavior into a conceptual framework involving certain well established
environmental-behavioral relationships. The objective? To be able to understand better (i.e.,
talk effectively about) and alter (i.e., predict and control) human behavior.
The Total Emotion. pp. 166-167
Q149: How would one go about defining an emotion?
A149: By describing the complete behavioral consequences of certain environmental
operations. There may be both respondent and operant consequences, and the
operant consequences often share some common characteristics.
COMMENT: Notice in the last three sections, Skinner has provided behavioral
interpretations of loneliness, nostalgia, an employee angered by criticism, and several phobias.
Emotions are not Causes. pp. 167-169
Q150: What is the proper subject matter of emotion?
A150: The emotional behavior and the manipulable events of which that behavior is a
function.
COMMENT: Much of this material simply restates and extends the earlier comments
concerning emotion as a conceptual second link, not necessarily a psychic or physiological one,
which is best interpreted as a "predisposition" to act in certain ways. When one starts talking
about "predispositions to predispositions," e.g., "moods" or "dispositions," one is talking
probabilistically about probabilities. You might have observed by now that Skinner provides a
behavioral interpretation for more words than most people know.
The Practical Use of Emotion. pp. 169-170
Q151: How can emotional responses that are primarily respondent in nature be controlled?
A151: a) They can be elicited by either unconditioned or conditioned eliciting stimuli.
b) They can be eliminated by withholding the eliciting stimulus, eliciting an
incompatible response, or by drugs.
Q152: How can larger categories of behavior predispositions be altered?
A152: By such emotional operations as pep talks, tales of atrocities, campaign speeches, etc.
COMMENT: The last couple of paragraphs are obviously more "forced" than the earlier
material. Obviously, individual behavioral histories play a major role since prior conditioning is
the key factor determining reactions to certain words.
XI. AVERSION, AVOIDANCE, ANXIETY
Aversive Behavior. pp. 171-174
Q153: What is an aversive stimulus?
A153: A negative reinforcer; a stimulus whose removal strengthens responding.
Q154: What are the physical characteristics of aversive stimuli?
A154: Aversive stimuli are not identifiable in terms of physical properties.
Q155: What is escape behavior?
A155: Behavior followed by the withdrawal of an aversive stimulus.
Q156: How do you study behavior under the control of aversive stimuli?
A156: "...by presenting an aversive stimulus, we create the possibility of reinforcing a
response by withdrawing the stimulus. When conditioning has already taken place,
the aversive stimulus provides an immediate mode of control."
Q157: What are some of the practical advantages and disadvantages of the use of aversive
stimuli in the above manner?
A157: a) advantages; immediacy of results
b) disadvantages; aversive stimuli elicit reflexes and generate emotional
predispositions which often interfere with the operant behavior to be strengthened.
COMMENT: The disadvantage Skinner cites here, that an aversive event produces
emotional behavior (emotional respondents elicited by the aversive event, or some unanalyzed
blend of operants and respondents called "predispositions" in Chapter X), is a rather
controversial position in contemporary behavior modification practices. These "side effects" of
using aversive stimuli to control appropriate
behavior are used by many to argue against utilizing aversive control. (Skinner will give many of
these arguments later in this book.) Others feel that these side effects, if they exist at all, don't
necessarily rule out aversive control as a legitimate behavior modification technique (e.g., Baer,
Psychology Today, October, 1971).
Q158: What is aversive behavior?
A158: Escape responding, i.e., behavior strengthened by the withdrawal of an aversive event.
Q159: Is deprivation an operation equivalent to presenting an aversive event?
A159: No, behaviors which reduce the aversive stimulation that may occur under
deprivation conditions are not necessarily the same behaviors whose probability
normally varies with deprivation and satiation.
COMMENT: Notice again that Skinner cites the evolutionary advantages of being
reinforceable by the withdrawal of certain conditions.
Q160: What is a conditioned aversive stimulus?
A160: An event which has become aversive by accompanying or preceding already
aversive events.
COMMENT: This analysis of tobacco and alcohol cures isn't too clear. What is intended
is that some substances produce nausea which is an aversive state whose removal is reinforcing.
This nausea inducing capacity can be transferred to other substances, like tobacco and alcohol, by
classical conditioning. Then tobacco and alcohol can also produce nausea, and behavior which
will reduce the stimuli is strengthened. Vomiting may be one of these responses but, obviously,
stopping smoking and drinking are others.
The Practical Use of Aversive Stimuli. pp. 174-175
COMMENT: Presumably Skinner means that we use negative reinforcers in several
different ways. Negative reinforcement would refer only to operation of removing an aversive
stimulus contingent upon a response. He refers herein to other procedures as well.
Q161: List several ways that aversive stimuli can be used to control behavior, and give an
example of each.
A161: a) Presenting conditioned and unconditioned aversive stimuli to generate desired
escape behavior, e.g., using force or social pressure to obtain a desired response.
b) Conditioning neutral stimuli to control future responding as aversive stimuli,
e.g., labeling responses as bad or sinful results in their generating stimuli from
which escape is automatically reinforced.
COMMENT: Notice that Skinner is using what is called a "two-process" explanation for
the reduction or weakening of certain behaviors. Earlier he described the reduction of smoking
and drinking as a means of escape from conditioned nausea and now he is talking about escaping
the stimulation that results from engaging in a behavior that has been paired with social
disapproval, which is also a case of response reduction. Much more of this will be discussed in
Chapter XII.
Avoidance. pp. 176-178
Q162: What is avoidance behavior?
A162: Behavior which (a) escapes a conditioned aversive stimulus and thereby (b) avoids
the subsequent occurrence of a previously effective (primary) aversive stimulus.
Q163: What is the "reinforcer" in maintaining avoidance behavior?
A163: The negative reinforcement of escaping the conditioned aversive stimulus which
precedes the unconditioned aversive event.
Q164: What are the practical consequences of successful avoidance?
A164: When avoidance is successful, the conditioned aversive stimulus gradually becomes
ineffective as a negative reinforcer because it no longer is followed by the
unconditioned aversive stimulus. It thus loses its effectiveness in maintaining the
escape response and the escape response may cease to occur. When this happens,
the unconditioned aversive event is not avoided, and the conditioned aversive
stimulus is reconditioned.
COMMENT: Both in this section and in the preceding one, Skinner refers to the removal
of a positive reinforcer as definitionally equivalent to the presentation of a negative one. Strictly
speaking, that is not precisely the case. Skinner demonstrates that presenting some events
(positive reinforcers) and removing others (negative reinforcer or aversive stimuli) will
strengthen behavior. These events, however, are defined in terms of their effects on behavior.
That the opposite operations of removing positive reinforcers and presenting negative reinforcers
will have the same or similar effects is nowhere stated. True, Skinner calls these operations
"punishment," but this is not a definition in terms of behavioral effect as is the definition of
reinforcement. Of course other authors have defined punishment in terms of behavioral effect,
both historically (Thorndike) and more recently (Holz & Azrin, 1965). But for Skinner, any
statement that removing a positive reinforcer will have the same effect as presenting a negative
one should have to be demonstrated, since it will not necessarily be true by his definitions alone.
Anxiety. pp. 178-180
Q165: What is anxiety?
A165: An emotional state produced by conditioned aversive stimuli.
Q166: Is escape from anxiety equivalent to avoiding the event responsible for the conditioned
aversive stimuli which produce the anxiety?
A166: No, apparently anxiety can itself be sufficiently aversive that one will escape the
anxiety even when the unconditioned event cannot be avoided.
COMMENT: Remember that for Skinner an emotion is nothing more than a conceptual
reference for a relationship between certain environmental operations and the resultant
respondent and operant behavior.
Anxiety and Anticipation. p. 180
Q167: What is anticipation?
A167: The emotional response which precedes positive reinforcement and is occasioned by
stimuli which are discriminative for positive reinforcement.
Q168: How is anticipation behaviorally contrasted with anxiety?
A168: Anxiety produces behavioral depression; anticipation produces heightened levels of
activity.
Anxiety Not a Cause. pp. 180-181
Q169: How can one reduce the effects of anxiety?
A169: By eliminating the environmental events which produce the behavioral
predispositions. Anxiety, itself, is only a name for a relationship, not a functional
factor.
COMMENT: In 1941, Estes and Skinner wrote a paper entitled "Some Quantitative
Properties of Anxiety" (Journal of Experimental Psychology, 1941, 29, 390-400.) in which they
describe the effects of a certain experimental procedure designed to produce "anxiety." Their
method first established operant performance on an FI 4 minute schedule, then superimposed a
tone followed 3 minutes later by an unavoidable shock several times each session. The result
was a gradual but ultimately almost complete cessation of responding during the tone. This
effect, later called the "conditioned emotional response" (CER), or sometimes "conditioned
suppression," has been frequently utilized as an experimental procedure, and many parameters
have been studied.
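The arithmetic of conditioned suppression is easy to sketch. The following toy computation (the response rates are my own illustrative assumptions, not data from the 1941 paper) shows how the standard suppression ratio indexes the gradual cessation of responding during the tone:

```python
# Toy illustration of the Estes-Skinner conditioned-suppression (CER) index.
# All rate values are illustrative assumptions, not data from the 1941 paper.

def suppression_ratio(rate_during_cs, rate_pre_cs):
    """Standard CER index: B / (A + B), where B is the response rate
    during the conditioned stimulus and A is the rate just before it.
    0.5 means no suppression; values near 0.0 mean complete suppression."""
    return rate_during_cs / (rate_pre_cs + rate_during_cs)

# Baseline FI responding: roughly 20 responses per minute before the tone.
pre_cs_rate = 20.0

# Hypothetical course of conditioning: responding during the tone
# declines across tone-shock pairings until it nearly ceases.
during_cs_rates = [20.0, 12.0, 5.0, 1.0, 0.2]

ratios = [suppression_ratio(r, pre_cs_rate) for r in during_cs_rates]
# The ratios fall from 0.5 (no suppression) toward 0.0 (nearly complete).
```

The falling ratio is simply a quantitative restatement of the "gradual but ultimately almost complete cessation of responding during the tone."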
Using an opposite procedure to investigate anticipation has only recently received much
attention. An operant base line is established, then a neutral stimulus is followed by a free positive
reinforcer, in the same way that a tone is followed by a shock in the CER procedure. Results
have been equivocal: some authors find suppression (Azrin & Hake, 1969), while other
research has produced heightened responding (Henton & Brady, 1970). Several factors,
including particular base line schedule, nature and degree of conditioned and unconditioned
stimuli, and temporal parameters, are all implicated as factors determining the overall outcome of
these kinds of procedures.
XII. PUNISHMENT
A Questionable Technique. pp. 182-183
Q170: What is the objective of punishment?
A170: To reduce the tendency to behave in certain ways.
Q171: Why is this a questionable technique?
A171: Punishment works to the long-term disadvantage of both the punished organism
and the punishing agency.
Q172: What aspect of punishment causes this disadvantage?
A172: The aversive stimuli used to punish generate emotions, including predispositions to
escape or retaliate, and disabling anxieties.
Does Punishment Work? pp. 183-184
Q173: Why did Thorndike conclude that punishment didn't "stamp out" behavior?
A173: The use of the word "wrong" in human learning tasks did not eliminate behavior in
the opposite manner of the way the word "right" strengthened it.
COMMENT: This revision of the original Law of Effect is referred to as the "Truncated
Law of Effect" and is a popular piece of psychological trivia, frequently found in comprehensive
examinations.
Q174: What is the effect of punishment on extinction responding in animal research?
A174: Punishment only temporarily suppresses behavior. Removal of the punishing event
results in a recovery of responding with the resultant total number of responses
equivalent to what would have occurred had no punishment been applied.
COMMENT: This is, possibly, a weak position. The theoretical formulation hinges upon
Skinner's original concept of the "Reflex Reserve," a hypothetical model whereby reinforcement
"stores up" responses and extinction exhausts them somewhat in the same manner water is stored
and drained from a water tower (Behavior of Organisms, 1938, pp. 26-28). He almost
immediately retracted the concept (formally in a paper delivered at APA) and later observed that
as a theory it was "utterly worthless in suggesting further experiments" (Cumulative Record,
1961, p.88). Operationally, however, it can be viewed simply as a predicted extinction curve and
it is in this latter sense that Skinner is discussing the effects of punishment.
The experimental evidence cited is based on Estes' (1944) "bar slap" study of the effects
of punishment, wherein the first few extinction responses were "punished." As described, the
response rate recovered fully following the cessation of punishment and the predicted total
number of extinction responses ultimately occurred. However, more recent research (e.g.,
Rachlin, 1966) clearly demonstrates that extinction curves are considerably smaller under
continued punishment. The temporary effect of punishment is seen as no different than a similar
temporary effect of reinforcement. Both must continue to be effective.
The Effects of Punishment. pp. 184-185
Q175: How are positive and negative reinforcers defined?
A175: As events whose presentation and withdrawal strengthen behavior upon which they
are contingent.
Q176: How is punishment defined?
A176: By the operations of withdrawing a positive reinforcer and presenting a negative
one.
COMMENT: Recently a more generally accepted operant definition of punishment has
been in terms of its behavioral effects "... a consequence of behavior that reduced the future
probability of that behavior" (Azrin & Holtz, 1966). However, that is not to say that Skinner
himself necessarily buys that approach. In 1965, Skinner stated "punishment does not merely
cancel reinforcement; it leads to a struggle for self-control which is often violent and time
consuming." (The Environmental Solution, in Contingencies of Reinforcement, 1970, p. 52). A
similar position is stated in Evans, R.I., B. F. Skinner, The Man and His Ideas (1968, pp. 33-34).
A First Effect of Punishment. p. 186
Q177: What is the first effect of punishment?
A177: The elicitation of incompatible responses including respondents, conditioned
respondents, or conditioned or unconditioned emotional predispositions.
Q178: In what sense is this a temporary effect?
A178: Presumably by virtue of the fact that these stimuli would have no permanent effect
in eliminating behavior. Once they were removed, the incompatible responding
would cease to occur and the replaced behavior would occur once more.
A Second Effect of Punishment. pp. 186-188
Q179: What is the second effect of punishment?
A179: Punished behavior becomes the source of conditioned stimuli which evoke
incompatible behavior. Again, this behavior may be either conditioned respondents
or emotional predispositions.
Q180: Can this incompatible behavior be evoked by other events?
A180: Yes, external conditions in which punishment has occurred can also become
conditioned stimuli capable of eliciting these reactions.
A Third Effect of Punishment. pp. 188-190
Q181: What is the third effect of punishment?
A181: Behaviors, or external circumstances, that are paired with punishment become
effective as conditioned aversive stimuli. Behavior which is effective in reducing this
conditioned aversive stimulation will be negatively reinforced.
COMMENT: This concludes Skinner's analysis of the effects of punishment. In essence,
the key effect of punishment is the establishment via conditioned negative reinforcement of
behavior incompatible with the previously punished responding, e.g., the avoidance behavior of
"doing something else." Punishment, then, doesn't eliminate behavior, instead it establishes the
conditions for the acquisition of replacement responding. As in the comment above, frequently
this behavior can be described as learning "not to respond" or "self-control." It is active
behavior, however, not simply the vacuum that would be left if punishment had the effect of
successfully removing a response from the organism's repertoire of behavior.
Q182: What happens when you punish someone for not doing what he is supposed to do?
A182: Conditioned aversive stimuli occur whenever he is doing something else. He only
escapes this conditioned aversive stimulation ("guilt") by doing what he is supposed to.
COMMENT: It is in this paradigm that this account is weakest. The experimental
parallel, of course, is Sidman (or nondiscriminated) avoidance, where the organism is shocked
for not pressing the bar. Only by responding is punishment avoided. From Skinner's point of
view, all behavior other than bar pressing must become (or be capable of eliciting) a
conditioned aversive stimulus. The subject learns to bar press because all other possible
responses generate aversive stimulation. That means that the rat could not learn to press the bar
until everything else he could do had been paired with the shock. Well, they learn to bar press
far too rapidly for that to be a completely plausible explanation. (See Herrnstein, or Anger, for a
more detailed explanation of this particular issue.)
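The contingency itself, at least, can be stated precisely. This sketch (parameter values are illustrative assumptions, not taken from Sidman's experiments) implements the two timers, shock-shock and response-shock, and shows why steady responding postpones every shock:

```python
# Minimal sketch of the free-operant (Sidman) avoidance contingency.
# Times are in seconds; all values here are illustrative assumptions.

def shocks_received(response_times, session_len, ss_interval, rs_interval):
    """Count shocks in a session. A shock is due whenever the current
    timer expires: the shock-shock (S-S) timer runs between shocks,
    and each response resets the timer to the response-shock (R-S)
    interval, postponing the next shock."""
    shocks = 0
    deadline = ss_interval              # first shock due after one S-S interval
    for rt in sorted(response_times):
        while deadline <= rt:           # shocks delivered before this response
            shocks += 1
            deadline += ss_interval     # S-S timer restarts after each shock
        deadline = rt + rs_interval     # the response resets the R-S timer
    while deadline <= session_len:      # shocks after the last response
        shocks += 1
        deadline += ss_interval
    return shocks

# A passive subject is shocked every S-S interval; one that responds
# every 5 s (with a 20 s R-S interval) postpones every shock.
passive = shocks_received([], 100, ss_interval=10, rs_interval=20)
active = shocks_received(list(range(5, 100, 5)), 100, ss_interval=10, rs_interval=20)
```

Note that nothing in the contingency specifies a warning stimulus; that is exactly what makes the "conditioned aversive stimulus" explanation of the behavior strained.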
Some Unfortunate By-Products of Punishment. pp. 190-191
COMMENT: Notice how Skinner avoids the issue of elicited aggression. The problem,
of course, is that some species clearly demonstrate behavior that appears to be a reflexive
reaction to pain, while others don't seem to have any similar fixed responses.
Q183: What are the by-products of the use of punishment?
A183: a) The avoidance behavior and the punished behavior are incompatible, but both
have a strong tendency to occur simultaneously. The resultant conflict is represented
either by oscillation or some unusual combination of both behaviors.
b) The incipient stages of punished behavior generate reflexes such as the emotions
of fear and anxiety. Successful avoidance, too, may generate frustration. These
emotional reactions cannot be easily escaped since they are generated by the
organism's own behavior.
c) If the punished behavior is reflexive rather than operant, it cannot easily be
replaced by a negatively reinforced operant avoidance response. Instead, some
operant must be acquired which can successfully control the reflex.
COMMENT: This is an important section in that it clearly reflects Skinner's special
concern about the use of punishment. In the first paragraph, he suggests several reasons for not
using punishment: (1) It is only temporarily effective. (2) It reduces the group's overall
effectiveness. (3) It makes people unhappy. Even if, as seems true today, punishment can be
used effectively to permanently eliminate behavior, have these other concerns been similarly eliminated?
Alternatives to Punishment. pp. 191-193
Q184: Be able to list the several alternative ways of eliminating behavior other than punishment.
A184: a) If the behavior is emotional in nature, e.g., elicited, removing the eliciting
stimulus will eliminate the behavior.
b) Simply satiating the undesired response will cause it to stop. (Of course, once
deprivation again occurs, the behavior returns to strength.)
c) If the behavior reflects a developmental stage, maturation will eliminate it. (It is
hard to imagine what Skinner is thinking about here, possibly crying while
teething.)
d) Let time pass in the hopes that the response will be forgotten.
e) Extinction.
f) Condition incompatible behavior.
COMMENT: Contrary to Skinner's hopes in the last paragraph, the recent research on
punishment has probably led more toward effective utilization of aversive stimuli than to the use
of these alternatives.
XIII. FUNCTION VERSUS ASPECT
Q185: Are traits descriptive of specific responses?
A185: No, they only describe behavior generally.
What Are Traits? pp. 195-199
Q186: What are the equivalents of traits in a functional analysis?
A186: The behavioral manner in which a person differs from others or from himself from
time to time.
Q187: List, with an original example, each of the several behavioral differences resulting from
independent variables that can give rise to trait names.
A187: a) differences in histories of reinforcement; ingenuous
b) different schedules of reinforcement; stubborn
c) contingencies involving punishment; fearful
d) contingencies involving discrimination between stimuli; tasteful, cultivated,
artistic
e) differences in deprivation; bored
f) differences in hereditary endowment; sexy (?)
g) differences in age; grown-up
h) differences in development; mature
i) differences in emotional reactions; blinded with rage
COMMENT: It seems to me that Skinner omitted a rather critical set of differences,
those with respect to the independent variables of conditioned reinforcement. This could be with
respect to the degree to which certain generalized reinforcers control behavior, e.g., sociable vs
retiring, boastful vs modest, avaricious vs indifferent to wealth. Or, the difference could be with
respect to whether or not certain conditioned reinforcers have even been established; e.g.,
scholarly, sports loving, "gay" as in homosexual, etc.
Q188: What do these kinds of traits represent in a functional analysis?
A188: The behavioral repertoire of an organism, with some indication of the relative
strength of its parts and with certain inferences regarding relevant variables.
Q189: How are such repertoires assessed?
A189: By tests which are, in fact, inventories of responses, indicating both class and/or
relative strength of certain responses.
Q190: What is a "process" difference?
A190: Differences in behavior arising from a difference in the rate at which changes in
behavior occur.
Q191: Can these process differences be inventoried?
A191: No, but possibly they can be expressed in quantitative form as constants in
equations which describe the processes.
COMMENT: It is important to note that Skinner includes, as an explanation for rate of
conditioning, the possibility that nothing more is involved than a particular history of
reinforcement.
Q192: How are traits usually quantified?
A192: With respect to relative position in a population of scores.
Q193: What alternative method is more appropriate to a functional analysis?
A193: Inventories, relative strength, and speeds of processes can be better quantified with
respect to frequency of occurrence.
Q194: Summarize the basic categories of behavioral differences that Skinner believes give rise
to traits.
A194: Differences in repertoires, in relative strength of parts of the repertoire (resulting
from exposure to variables), relative speed of behavioral processes (or rates of
change).
Prediction in Terms of Traits. pp. 199-202
Q195: What is the sense in which a test permits prediction?
A195: It allows prediction from one effect to another. This occurs because the variables
responsible for test performance will also cause certain behaviors to occur
predictably in other circumstances.
Q196: How does this differ from prediction based upon a functional analysis?
A196: In a functional analysis, independent variables are identified, and thus prediction is
from cause to effect.
Q197: Under what practical conditions are test results useful?
A197: When the independent variables are impossible or difficult to ascertain.
Q198: What is the disadvantage of prediction based upon trait description?
A198: No knowledge of independent variables is sought or discovered, and no additional
control over the behavior of interest is gained.
COMMENT: Notice how Skinner qualifies the observation that traits imply only
descriptions. Many have attempted to identify the causes of traits (e.g., Freud). What Skinner
means in these cases is that they haven't correctly identified the causes (as contrasted with his
own functional analysis).
Q199: Why haven't trait descriptions been especially useful in a functional analysis?
A199: A trait doesn't refer to a functional unit, but merely to a descriptive one. A response
is the functional unit.
COMMENT: This is a rather important point, but it is made somewhat more difficult by
Skinner's earlier presentation of traits resulting from different independent variables and
processes. What is intended is the fact that a trait analysis only identifies behavior in terms of
what it looks like. A functional analysis identifies behavior in terms of its controlling relations.
This distinction is expressed by the chapter title. It wouldn't be a problem if different
appearances were directly correlated with separate functions, but they are not. Identically
appearing behaviors may appear in two individuals or in the same individual at different times
for completely different reasons. That is, similarly appearing behavior may represent different
functional relationships with the environment. This problem will be more fully discussed in the
next chapter.
Traits are not Causes. pp. 202-203
Q200: Why are traits not causes?
A200: Because they are derived solely from the dependent variable, behavior.
Q201: Give an example of a trait beginning as an adjective and becoming utilized as a cause.
A201: The field of mental retardation and clinical diagnosis provide many examples of the
problem of descriptions becoming causes.
XIV. THE ANALYSIS OF COMPLEX CASES
Oversimplification. pp. 204-205
Q202: What is a frequent criticism of behavioral principles that are based upon laboratory
research with lower organisms?
A202: They are oversimplified and are unable to deal with the complexities of human
behavior.
Q203: What is a common misunderstanding concerning basic behavioral principles?
A203: A failure to understand what happens when the variables of different
environmental-behavioral relations interact.
COMMENT: This section, written in the early 1950's, well anticipates Chomsky's harsh
review of Verbal Behavior (1957).
Multiple Effects of a Single Variable. pp. 207-209
Q204: In what sense can a single variable have multiple effects? Give an example from the field
of punishment.
A204: a) A single environmental event, such as an aversive stimulus, can simultaneously
affect more than one class of an organism's behavior.
b) An aversive stimulus can (1) elicit reflexes, (2) produce emotional predispositions,
(3) establish behaviors and events which precede it as conditioned aversive stimuli
through respondent conditioning, and (4) maintain escape and avoidance behavior
as a negative reinforcer.
COMMENT: Skinner again restates his view of the dynamics of punishment. This
section is somewhat different from that in Chapter XII in that here he separates reflex elicitation
from an emotional operation. These were combined in his "First Effect of Punishment" section
(p. 186), since he was speaking there of the production of incompatible behavior, of which each
of these effects is an example. Notice also that he uses the traditional terminology in referring to
an unconditioned stimulus in the classical conditioning paradigm as a "reinforcing" stimulus. Its
effect as such is obviously not what is meant by the same term when referring to operant
conditioning.
Q205: When can multiple effects be easily observed? Give an example.
A205: a) When the effects are felt at different times.
b) A large amount of a particular reinforcer can initially satiate the responding
which produced it. Its effectiveness as a reinforcer is observed when the organism is again
deprived.
Q206: In what sense does giving attention to a child who "needs" it weaken his demands?
A206: Only by satiation; unfortunately its reinforcing effects will be observed later when
the child again "needs" attention.
Q207: Separate satiation effects from discriminative effects in the example of giving a child a
piece of candy when he hasn't asked for it.
A207: We appear to strengthen a drive rather than satiate one by giving him candy, since
the mere presence of candy serves as a discriminative stimulus for asking for more.
The candy-seeking behavior becomes strong because of the discriminative
properties of the candy rather than any substantial deprivation effects. If more
candy isn't forthcoming, the emotional effect of "frustration" may be observed.
Q208: What multiple effects are involved in the wavelike oscillation frequently observed in
extinction curves?
A208: Removing reinforcement both weakens behavior directly and produces an emotional
state of depressed behavior. As the second effect subsides, the behavior may
increase somewhat in rate only to fall again as more behavior goes unreinforced.
Q209: What is the effect of repeated exposures to extinction?
A209: The emotional effect drops out and the extinction responding becomes consistent in
form, e.g., a smoothly decreasing response rate without oscillation.
Q210: Is the emotional effect of frustration restricted to the response being extinguished? How
can your answer be demonstrated?
A210: a) No, more than one response is affected.
b) By recording multiple extinction curves, offset in time, and observing that the
frustration-produced behavioral depressions occur simultaneously.
COMMENT: If this experiment has actually been accomplished, I haven't been able to
find it. Of course, it could have been conducted and not published, and if so, probably during
Skinner's research at Indiana University.
Multiple Causes. pp. 209-213
Q211: What is a second way in which important behavioral variables may interact? Give an
example.
A211: a) They may combine to produce a similar effect.
b) A similar response may be reinforced by more than one consequence.
Q212: How might emotional operations act in conjunction with reinforcement?
A212: When one is emotionally predisposed to engage in behavior that will also be
reinforced. An example might be a soldier whose best friend has been killed. He is
especially eager to go into battle. The emotional predisposition to attack those
responsible for the loss of his companion interacts with and strengthens his operantly
conditioned military skills.
Q213: Give some examples of multiple strengthening that involve interacting discriminative
stimuli.
A213: Most verbal behavior is multiply determined. Puns represent notorious examples.
Prompts, cues, or suggestions are all examples of discriminative stimuli that serve to
supplement behavior which already exists at some strength. A projective test serves
the same function; to add strength to an already existing, but unknown, repertoire.
COMMENT: As the footnote on p. 210 suggests, the analysis of multiply determined
behavior is conducted extensively in Verbal Behavior (1957). It is especially important to
remember that words "may have many meanings." A single word (as in Skinner's example of
"house." p. 210) may be under the control of many variables. This is an example of the problem
of form versus function (Chapter XIII). Many cognitive interpretations of verbal behavior result
from a lack of understanding of the role of multiple causation in language.
The Practical Use of Multiple Causation. pp. 213-216
Q214: What is "suggestion?"
A214: "...the use of a stimulus to raise the probability of a response already assumed to
exist at some low value."
Q215: Define and give an example of the classes of suggestions that Skinner describes in this
section.
A215: a) Formal Prompt: the supplementary stimulation is of the same form as the sought
response (an imitative or echoic stimulus), and the exact response is known in
advance. A cue in the theatre is one example.
b) Thematic Prompt: the supplementary stimulus is of a different form, and the
response is known. The example would be a hint, not a cue, as in the hint "Stars &
Stripes" when the sought response is "flag." (Notice in this case how my own
selection of an example receives supplementary stimulation from the text example
of George Washington, both being related to a common history of patriotic love, etc.)
Obviously, the game "Password" exemplifies this process.
c) Formal Probe: a source of stimulation which serves to evoke unknown behavior
that is strengthened by its topographic similarity to the stimulus. The Verbal
Summator (invented by Skinner and used rarely and exclusively by him) provides
barely audible sound rhythms which can be interpreted as speech, thus revealing
strong behavior, in the same way that the sounds of wheels on rails frequently evoke
words and songs.
d) Thematic Probe: providing stimulation to which the organism responds in a
different way (e.g., "with the first word that comes to mind"). An example is the
word association test, where the thematic probe is a verbal one.
Projection and Identification. pp. 216-217
Q216: How would projective tests be categorized in terms of the above forms of suggestion?
A216: As formal and thematic probes.
Q217: What is the Freudian and the behavioral interpretation of projection and identification?
A217: a) Freud saw projection and identification as two methods whereby repressed
wishes worked themselves out.
b) Behaviorally, certain current stimuli function as occasions for verbal or
nonverbal behavior to join forces with behavior already at some strength. If the
summation of these two tendencies to respond is sufficient, the behavior occurs.
Q218: What is the difference between projection and identification in terms of the relationship
between the supplementary stimulus and the response?
A218: a) in identification, the behavior is imitative, either covert or overt, verbal or
nonverbal.
b) the behavior in projection is less specifically controlled by the supplementary
stimulus.
COMMENT: It is important to understand the dynamics of Skinner's concept of multiple
causation. When Skinner says a behavior already exists in some strength but is not presently
occurring, what he intends is that certain variables that control the behavior are present. This
may be one or several of the factors already discussed; history of reinforcement, emotional
operations, discriminative stimuli, etc. But the behavior doesn't occur. Possibly some
counteracting variable is also present, a preaversive stimulus, for example. However, the
addition of one more behavioral variable, another discriminative stimulus perhaps, summates
those already present controlling variables to the point where their strength is sufficient to
produce the behavior. Confusion can exist if this behavior is then erroneously attributed to the
single variable, since it may well not cause the response at some later time when the other
variables are not present.
Multiple Variables in Perception. p. 218
Q219: What is the role of multiple causation in the field of perception?
A219: Variables such as emotion, motivation, and reinforcement interact with physical
stimuli in determining what is called perception. Parents of missing children
frequently mistake strangers for their own offspring.
COMMENT: You might review the section, The Analysis of Stimuli, pp. 130-131.
Variables With Incompatible Effects. pp. 218-223
Q220: What is conflict?
A220: A situation where two or more incompatible response tendencies exist. That is,
variables which control incompatible behavior are concurrently present.
Q221: What is algebraic summation?
A221: The behavior resulting from the simultaneous presence of two variables that control
diametrically opposed responses, e.g., in the "approach-avoidance conflict." The
resultant behavior shows the effects of both counteracting tendencies.
Q222: What kinds of behavior can result from algebraic summation?
A222: a) The behavior may be primarily under the control of one factor, but the effect of
the opposing variable is seen in hesitancy or slowness of responding.
b) The behavior may be awkward.
c) The behavior may be easily distracted.
d) The behaviors may oscillate, first in one direction, then another. This results
from the first response sufficiently changing the stimulus conditions that the
counteracting variable "takes over." The resultant opposite behavior then occurs
until it, too, sufficiently alters the stimulus conditions that a reversal again occurs.
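The notion of algebraic summation can be made concrete with a toy model. The strength numbers below are arbitrary illustrations of the idea, not quantities Skinner measures; the point is only that opposed controlling variables subtract, and that when a response changes its own controlling conditions the sign of the net tendency can flip back and forth (item d):

```python
# Toy model of algebraic summation in an approach-avoidance conflict.
# All strength values are arbitrary illustrations, not measured data.

def net_tendency(approach, avoidance):
    """Resultant strength when two variables control diametrically
    opposed responses: the tendencies combine algebraically."""
    return approach - avoidance

# Unopposed, the approach response is at full strength.
unopposed = net_tendency(approach=10.0, avoidance=0.0)

# A concurrent aversive variable subtracts from that strength: the
# response still occurs (net > 0) but weakly, seen behaviorally as
# hesitancy, awkwardness, or distractibility.
conflicted = net_tendency(approach=10.0, avoidance=7.0)

# Oscillation: if each response changes the controlling variables
# (e.g., approaching brings the feared situation closer), the sign
# of the net tendency flips and the opposite behavior takes over.
history = []
approach, avoidance = 10.0, 8.0
for _ in range(6):
    if net_tendency(approach, avoidance) > 0:
        history.append("approach")
        avoidance += 4.0   # approaching strengthens the aversive control
    else:
        history.append("retreat")
        avoidance -= 4.0   # retreating weakens the aversive control
```

Run to completion, `history` alternates between "approach" and "retreat," the wavelike oscillation the text describes.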
COMMENT: Notice how Skinner stresses the fact that stimuli, e.g., physical events, do
not exhaust the world of behavioral control variables. Much of the weakness of a traditional S-R
approach to the interpretation of human behavior is due to an over-reliance on stimulus variables
(p. 141). Discriminative stimuli are, of course, important, but so are reinforcement contingencies
and drive and emotional operations.
Q223: What is prepotency?
A223: An outcome of behavioral conflict whereby a single response occurs to the
momentary suppression of the incompatible behavior. Oscillation may occur if the
emission of the first response changes the situation to lessen its probability (by
satiation, for example) so that the other response then becomes prepotent.
COMMENT: The following section, "To Do or Not To Do," is somewhat difficult. The
essential point is that the avoidance behavior acquired to replace punished responding is not
"doing nothing," but rather is a specific behavior. Punishment does not create a negative
response tendency, but rather strengthens incompatible avoidance behaviors. Thus, the
punishment paradigm creates two types of competing response situations: The first, obviously, is
between the strength of the punished behavior and the avoidance behavior that the punishment
has established --- to go to the dentist or to do something else. The second is between the various
possible avoidance responses, i.e., whether one avoids the dentist by one response or another. It is
the second point that is involved in the Barchester Towers example.
Chaining. p. 224
Q224: What is a chain of behavior?
A224: A sequence of responses in which the emission of earlier responses affects the
probability of later ones, either by changing stimuli, by changing other variables
(e.g., deprivation or satiation), or by being directly reinforced for altering the
probability of the later behavior.
SECTION III: THE INDIVIDUAL AS A WHOLE
XV. "SELF-CONTROL"
The "Self-Determination" of Control. pp. 227-230
Q1: Why is the notion of control implicit in a functional analysis of behavior?
A1: "When we discover an independent variable which can be controlled, we discover a
means of controlling the behavior which is a function of it."
Q2: What are the theoretical and practical implications of this possibility?
A2: a) Theoretical: by demonstrating control, we can prove the validity of a functional
relationship.
b) Practical: techniques of control result in the development of a technology, and
also point up the degree of control already in social use.
COMMENT: Skinner's aside at the use of statistics is not gratuitous. There is a
fundamental difference between his approach to a functional analysis of behavior based upon
demonstrable control and the usual hypothetico-deductive approach which frequently utilizes
inferential statistics, both to frame hypotheses and to test them.
Q3: What is a typical objection to the behaving organism as it has been described so far?
A3: By emphasizing external controlling variables, the organism seems to be left without
any control of its own. It has become simply a repertoire of responses, whose
particular pattern is a reflection of the contemporary array of controlling variables.
Q4: What typical observations about behavior lead to a concept of "self-control?"
A4: Self-determination seems evident in the creative behavior of the artist and the
scientist, the self-exploratory behavior of the writer, the self-discipline of the ascetic, etc.
Q5: What is the proper response of a functional analysis of behavior to these apparent
contradictions?
A5: To recognize that the techniques of self-control are, in fact, examples of behavior
whose occurrence must ultimately be accounted for by variables lying outside the
organism.
COMMENT: The remainder of this section primarily describes the objectives of the next
few chapters. However, a number of important distinctions are made.
Q6: How can one behaviorally distinguish between self-control and creative thinking?
A6: "...in self-control, the individual can identify the behavior to be controlled while in
creative thinking he cannot."
COMMENT: This distinction between self-control and creative thinking will be
considerably elaborated upon in this and the next chapter.
Q7: Why is the concept of a private event relevant to a discussion of self-control and creative
thinking?
A7: Behavioral events functioning as middle links in an example of self-control are often
accessible only to the behaving organism itself. As such, they must be accounted for
in order to provide a complete functional account of self-control.
COMMENT: Notice how Skinner's position on middle links that are "private events" is
different from those on psychic or physiological ones. He wants to provide an account for the
private events but he disavows completely the notion of psychic middle links and he is
uninterested in physiological ones. The reason for this lies in his analysis of private events as
behavior.
"Self-Control." p. 231
Q8: What is self-control?
A8: Behavior that is automatically reinforced by successfully altering variables to make
a subsequent punished response less probable.
Q9: What two responses are involved in the self-control relationship, and how are they
related?
A9: "One response, the controlling response, affects variables in such a way as to change
the probability of the other, the controlled response."
COMMENT: It is important to be able to distinguish between self-control and the other
classes of avoidance behavior that may occur and replace punished responding. In the latter case,
responses which escape the conditioned aversive stimuli generated by the incipient punished
responding, and thereby avoid the punishing stimulus, are automatically reinforced. Their only
effect on the punished behavior is to replace it. In self-control, the controlling response directly
acts upon the variable of which the previously punished response is a function.
Techniques of Control. pp. 231-240
COMMENT: This is a rather detailed section. However, to understand much that
Skinner has written since, it should be mastered in equal detail. The goal is to understand the
acquisition and maintenance of the controlling response. (Remember that in each case, the
controlled response is one which has been previously punished.)
Q10: Give examples of the five techniques of self-control through physical restraint and
physical aid, explaining how the controlling response is functionally acquired and
maintained.
A10: a) The controlling response imposes physical restraint upon the controlled
response, as in clapping one's hand over one's mouth to keep from laughing. The
response avoids the conditioned aversive stimulation which occurs when the
previously punished behavior begins.
b) The controlling response consists of moving out of a situation. An angry man
leaves an argument rather than engage in further aggressive behavior which is
likely to lead to punishment.
c) Suicide; probably only the last example given, to prevent divulging secrets,
really fits the paradigm. Much suicidal behavior seems to be directly socially
reinforced.
d) Removing the situation; a Las Vegas tourist who only takes $50.00 with him to a
gambling casino reduces the probability of extensive gambling and thus losing more
than he can afford.
e) Supplying physical aid; obtaining the necessary tools before beginning a job in
order to reduce the aversive properties of unnecessary effort and delay.
Q11: Describe each of the techniques of self-control involved in the manipulation of stimuli,
and in each case explain how the consequences will maintain the behavior.
A11: a) Removing eliciting stimuli; swallowing a pill with water to eliminate the reflexive
reaction of regurgitation, which is presumably aversive.
b) Removing discriminative stimuli; placing tempting foods out of sight. The
aversive consequences here are obviously socially conditioned, since eating the food
would itself probably be reinforcing.
c) Presenting eliciting stimuli; by taking an emetic, or a laxative, to induce certain
reflexive responses which will reduce later aversive reactions.
d) Presenting discriminative stimuli; arranging certain stimulus conditions, such
as indicating an appointment on a desk calendar, to control behavior which has
aversive consequences, for example, missing the appointment.
e) Conditioning stimuli; a new employee may look at staff photographs and names
so that he will recognize the individuals when he sees them and reduce the
aversiveness of not knowing their names.
f) Extinguishing stimuli; eliminating reactions, such as blushing, by systematically
exposing oneself to increasingly embarrassing situations. (Skinner uses the word
"reinforce" to refer here to the unconditioned stimulus in the classical conditioning
sense, not the operant concept.)
Q12: Give examples of self-control techniques utilizing deprivation and satiation effects.
A12: a) Deprivation; going without lunch to assure hearty eating at dinner. In order to
qualify for Skinner's definition of self-control, we would have to demonstrate that
not indulging in heavy eating would be aversive in some way.
b) Satiation; "giving in" to a competing response in order to be able to return to
work more effectively. Presumably this is the mechanism whereby having a beer
before mowing the lawn also works toward getting the lawn mowed. In many
instances of competing behaviors, simply doing one immediately then the other
shortly thereafter is less aversive than oscillating indecisively between which to do
first.
Q13: Give examples of the various techniques of self-control involving the manipulation of
emotional conditions.
A13: a) Removing stimuli which control emotional behaviors; the homesick soldier may
place his girlfriend's picture out of sight to reduce its effect in controlling his
loneliness.
b) Presenting emotional stimuli; biting your tongue to elicit behavior incompatible
with current inappropriate behavior such as laughing in church.
c) Controlling emotional predispositions; suppressing your behavior by reviewing a
history of punishment for inappropriate behaviors in similar circumstances.
d) Delaying emotional reactions to weaken them; one counts to ten before answering.
e) Extinguishing emotional reactions; a form of self-managed systematic desensitization.
Q14: Describe various ways to use aversive stimulation in self-control.
A14: a) Presenting aversive stimuli; setting an alarm provides an aversive stimulus which
can be escaped only by waking and stopping it. The escape behavior replaces the
controlled response of oversleeping.
b) Conditioning aversive stimuli to control later escape and avoidance behavior;
making a boastful claim in the presence of others will create a situation where
failure to accomplish the boast will be especially aversive because of the social
disapproval. We can avoid this aversive situation only by making good our boast.
Q15: Give some examples of the use of drugs as self-control techniques.
A15: Drugs are used to synthesize or approximate other environmental operations, and
are effective to the extent that they do so. Deprivation, satiation, and emotional
effects can all be artificially induced to some extent. For example, taking a sleeping
pill controls the later probability of sleep and reduces the aversiveness of insomnia.
COMMENT: Obviously, this field has grown since Skinner wrote this section, and many
more examples could be given. Self-control through drugs is much simpler than self-control by
acquiring self-controlling responses. The key difference lies in the fact that self-controlling
behaviors "stay" with you in a way that drug effects don't. In other words, you will always
require the drug to produce the desired self-control if drugs are what you rely upon. Secondly,
contemporary pharmacology being what it is, you often get more than you bargain for with drugs;
physiological side effects and addiction are two well-known examples.
Q16: What is the role of self-reinforcement and self-extinction as self-control techniques?
A16: It is not clear; presumably one must delay the self-delivery of a reinforcer until a
given response has occurred and then the reinforcer must strengthen that response which
immediately precedes it. Whether or not this effect can be adequately demonstrated
is unknown, since other explanations for such results (should they occur) are also
possible. The same holds true for self-extinction.
COMMENT: This is an extremely significant section, since it clearly distinguishes
Skinner from many other contemporary behavioral psychologists who are trying to utilize self-reinforcement as a technique to establish self-controlling behavior, e.g., Homme (1967) and
Kanfer (1970). Reinforcing and not-reinforcing yourself for specific behavior is a subtle issue.
On the face of it, it seems little different from reinforcing or extinguishing someone else's
behavior. That is, you respond, then obtain reinforcers (e.g., mow the lawn, then drink a beer).
But is it really that simple a question? Its effect on lawn mowing seems clearly different than if
you pay a neighborhood youngster to do the job. He comes under the control of deprivation and
satiation, and will ask to mow the lawn again when he needs money. Similar effects don't seem
to be relevant to the self-reinforcement paradigm. In any event, Skinner seems much less certain
of the effectiveness of self-reinforcement and self-extinction (and, in the next section, self-punishment) than do many others.
Q17: What is the role of self-punishment as a self-control technique?
A17: It, too, is unclear. Engaging in behavior which will be punished or placing yourself
in a situation where punishment will occur are not examples of self-punishment.
Whether or not an individual can arrange a situation where he delivers aversive
stimulation to himself contingent upon his own behavior, and the behavior is
thereby suppressed is unclear.
COMMENT: The issue of self-punishment, as well as those of self-reinforcement and
self-extinction are probably best summarized for Skinner in his statement, "The ultimate question
... is whether a practice of this sort shows the effect which would be generated by the same
stimulation arranged by others." Obviously, he is not sure they will.
Q18: How is the principle of prepotency utilized as a technique of self-control?
A18: One can engage in a response of "doing something else" in order to avoid the
aversive consequences of a particular behavior. (Obviously, this can be facilitated if
other stimuli can also be arranged to supplement the behavior.)
The Ultimate Source of Control. pp. 240-241
Q19: Who arranges for the behavior of self-control?
A19: Either society or the physical environment. By punishing certain responses, a
situation is established whereby behavior that lessens the probability of the
punished behavior will be automatically reinforced. The variables which provide
the ultimate control are in the environment and in the behavioral history of the
individual.
Q20: What are the practical advantages of a functional analysis of self-control in contrast to
the traditional conception of self-determination?
A20: a) It can provide a technology for the effective teaching of self-control.
b) It can improve social maintenance of self-controlled behavior.
XVI. THINKING
The Behavior Of Making A Decision. pp. 242-244
Q21: How does making a decision differ from self-control as a form of self-determination?
A21: In decision-making, the outcome cannot be predicted, while in self-control, an
attempt is made to reach a particular known state of affairs.
Q22: What is the role of private events in decision-making?
A22: As with self-control, the variables that are manipulated to help make a decision are
often private events.
Q23: What are the similarities and differences in the techniques used to accomplish self-control
and decision-making?
A23: They are essentially the same, but with less emphasis upon motivation and
conditioning and more upon the direct manipulation of stimuli.
Q24: What is meant by the term "deciding?"
A24: The preliminary behavior necessary for the execution of the decided upon act.
COMMENT: This distinction between deciding and the act decided upon parallels the
distinction between the controlling response and the controlled response in the self control
paradigm of Chapter XV.
Q25: Give some examples of behavior that terminate decision-making prior to the execution of
the decided upon behavior.
A25: a) Committing an irrevocable act to support a single outcome, such as the making
of a down payment.
b) Announcing the decision (to insure aversive social consequences if one doesn't
act in accordance with it).
c) Terminating all efforts to strengthen any but one course of action.
Origin and Maintenance of the Behavior of Deciding. p. 244
Q26: What are some of the consequences that reinforce decision-making?
A26: a) The negative reinforcement of escaping from indecision (Comment: Skinner's
use of "positive reinforcement" is incorrect in this instance).
b) The positive reinforcement resulting from more effective behavior arrived at
after proper consideration of alternatives. (maximizing the "net gain.")
Q27: What are some of the disadvantages of these consequences and how are these revealed?
A27: a) They are long delayed and frequently are obscurely related to any particular
response.
b) This is demonstrated by the typical difficulties seen in making decisions. Many
individuals and most lower organisms seem not to have especially effective decision-making behavior.
Q28: Why do we see as much decision-making as we do?
A28: As the result of the community teaching relevant decision-making techniques
through special contingencies of reinforcement that supplement the natural
consequences for the individual of escaping indecision or maximizing the net gain.
The Behavior of Recall. pp. 245-246
Q29: What special circumstances in decision-making occasion the use of a self-probe?
A29: An instance where the outcome is not currently known, but will be recognized when
it occurs, i.e., the response has been "forgotten" and the decision-making process
becomes one of recalling.
Q30: Describe some of the techniques that are available to aid in recall.
A30: Thematic probes, formal probes, and the establishment of an aversive situation
from which escape will be reinforced.
Problems and Solutions. pp. 246-252
Q31: What is a problem?
A31: An occasion when a response exists in some strength which cannot be emitted.
Q32: How is the strength of this response usually revealed?
A32: By demonstrating that it occurs as soon as the situation permits.
COMMENT: At one point in this section, Skinner defines the solution to a problem as
simply a response which alters the situation so that a strong response can be emitted (p. 247).
However, throughout the remainder of the chapter, he refers to this activity as "problem-solving"
and refers to the emitted strong response as the solution. The questions in this guide will adopt
the latter formulation.
Q33: What are the consequences of the emission of the behavioral solution?
A33: a) The problem vanishes, since the condition has been eliminated.
b) The problem, per se, is unlikely to recur since the solution has already been
emitted and reinforced. Future occurrences of a similar situation will thus be
discriminative for the behavioral solution, rather than constitute a problem again.
Q34: What are the similarities and differences between problem-solving and self-control?
A34: a) Similarities: Both represent two-response chains, with the earlier member
defined in terms of its effect in altering the probability of the later through the
manipulation of the variables of which it is a function. The earlier responses are
reinforced to the extent that they are successful in so doing.
b) Differences: In self-control, the objective is to make a punished response less
probable, and the behavior to be so controlled is known by the behaving individual
in advance. In problem-solving, an unknown behavior, the problem's solution, is to
be increased in probability.
Q35: Why is the appearance of a solution no guarantee that problem-solving behavior has
occurred? Give an example.
A35: a) The solution may have occurred by virtue of the accidental arrangement of
environmental variables, for example, serendipity in science.
b) Trial-and-error learning, where increased activity is made more probable by
high deprivation states resulting in an increased likelihood that one particular
response might be successful. Similarly the active behavior of an organism said to be
exploring may also represent a situation where an accidental solution might occur.
Q36: Under what circumstances does trial-and-error learning display some minimum problem-solving?
A36: When the organism has acquired a repertoire of increased responding, and
responds to certain features of the problem through a history of reinforcement.
COMMENT: The behavior Skinner described in this section as minimum problem-solving
of the trial-and-error type is often observed in young children and the retarded when you
begin to teach simple discriminations, such as color names, for example.
Q37: What are two problem-solving techniques that involve the manipulation of stimuli? Give
examples.
A37: a) Improving or amplifying stimuli; looking a problem over carefully, getting all the
facts, stating the problem in its clearest terms, etc.
b) Arranging or rearranging stimuli; phrasing an argument in the form of a
syllogism, placing factors into an equation, etc.
Q38: What other techniques, in addition to the manipulation of stimuli, are commonly used
for problem-solving?
A38: a) The self-probe; reviewing tentative solutions.
b) Manipulating deprivation levels.
c) Arranging aversive schedules to keep behavior at a certain pace.
d) Eliminating erroneous responses through self-control. (Comment: I can't see
more separate techniques in this paragraph on p. 251 than these last three. I'm not
sure what is intended by "generating relevant interests," and following a rigid
schedule seems only to be a special case of self-control.)
Q39: What constitutes a problem's difficulty?
A39: The availability of the behavior which is the solution.
Q40: What makes a problem insoluble?
A40: A case when there is no behavior existing at any level of strength.
Having an Idea. pp. 252-254
Q41: What are some of the sources of "ideas" that do not occur as the result of deliberate
problem-solving? Cite some examples.
A41: a) Delayed solutions (said to have been worked out "unconsciously") which have
resulted from changed variables, i.e., competing behaviors may have been
eliminated and supporting variables added, etc.
b) Seeing new relationships, as in obtaining a new way of looking at your own life
following reading a book.
c) "Creative thinking" in the sense of generating new activities in the absence of
any clear problem. This may result either by haphazard or highly systematic
problem-solving activities.
Originality in Ideas. pp. 254-256
Q42: What is the problem for a functional analysis that is posed by the concept of originality
and creativity in thought or behavior?
A42: These are usually interpreted as the result of spontaneity or a demonstration of the
absence of lawful determinism.
Q43: How does Skinner suggest we account for an original or novel idea?
A43: Through behavioral processes such as stimulus generalization, or idiosyncratic
personal histories, or such factors as novel circumstances, or reacting to more subtle
variables in the environment, etc.
Q44: What are the practical advantages of a functional analysis?
A44: We can develop improved educational practices in several ways.
COMMENT: The paragraph on p. 255, describing the fact that the environment is now
in better control of man, contains some extremely important ideas of Skinner's on the nature of
cultural evolution. He further expands these concepts toward the goal of accelerated cultural
development in his book Beyond Freedom and Dignity (1971).
Suggested Further Readings:
"Teaching Thinking" in Skinner's Technology of Teaching (1968)
"The Creative Student" also in the above text.
"An Operant Analysis of Problem Solving" in Skinner's Contingencies of Reinforcement:
A Theoretical Analysis (1969)
XVII. PRIVATE EVENTS IN A NATURAL SCIENCE
The World Within One's Skin. pp. 257-258
Q45: What is the "world within one's skin?"
A45: That part of the physical universe capable of affecting behavior as a source of
independent variables, but which is solely accessible to the behaving organism.
COMMENT: Notice that Skinner doesn't suggest that a private event cannot be observed
by another organism, only that when it is, it is responded to differently. That is, a cavity
constitutes a completely different stimulus event for the dentist's behavior than it does for the
patient's. Behaviorally it is by no means the "same thing."
Q46: How is this realm of events distinguished from the external world?
A46: Only by its limited accessibility.
Q47: What is the task of a functional analysis?
A47: To suggest a view of such events that is an alternative to the traditional scientific
dualism on the issue.
COMMENT: Obviously, many behavioral psychologists, both experimental and applied
(e.g., behavior modifiers) might wish at this point that Skinner had let sleeping dogs lie. He
seems to be opening the Pandora's box of inner events that a more "tough-minded" approach to
human behavior has been trying to shut for the last several decades. However, it has long been
Skinner's position that to ignore the phenomenological world --- usually called "mental" --- is to
weaken rather than strengthen the position of the behaviorist. Thus, he has taken several
opportunities to extend the concept of a functional analysis based on the prediction and control of
observable behavior into the private world of the individual. An elaboration of his own
description of the goals and rationale of this effort, "radical behaviorism" as it is frequently
called, is available in his article "An Operational Analysis of Psychological Terms" (1945,
reprinted in Cumulative Record, 1961).
Verbal Responses to Private Events. pp. 258-261
Q48: What is the problem in establishing private events as discriminative stimuli for verbal
responses?
A48: The social community, which teaches descriptive verbal behavior, has no access to
the private discriminative stimulus. Thus, it cannot differentially reinforce
appropriate responding.
Q49: What are the three methods Skinner describes for the establishment of private events as
discriminative stimuli for verbal behavior? That is, what does the community respond to
when it reinforces discriminative behavior to private events? Give examples.
A49: a) Public Accompaniments: A child is hit by a thrown ball. You ask, "Does that
hurt?" He nods, then imitatively responds, "Yes, that hurts."
b) Collateral responses: You find a youngster holding his head and rocking back
and forth. You ask, "Do you have a headache?", etc.
c) Transfer of Common Properties: A descriptive response, acquired to a public
event, may be generalized to a private event. The internal event becomes
metaphorically described.
Q50: What is the problem for the listener of someone's description of subjective events?
A50: The teaching methods for such a vocabulary are imprecise compared to those
involving public events, and thus the subjective repertoire of discriminative
responses is likely to be unreliable. Both the scientist and the layman "mistrust"
descriptions of private events.
Q51: What is the result of this imprecision for the individual who acquires such a repertoire of
responses to private events?
A51: Since a repertoire of self-observation requires social shaping, we don't know ourselves
to any greater extent than we are taught to do so. We are ignorant to the extent that
we haven't been shaped to respond well to our own behavior.
COMMENT: This is a rather subtle and much debated point. However, it is a
straightforward extension of what is known about responding to external stimuli. Animals not
shaped to respond differentially to certain stimulus features simply don't respond to them. Since
most verbal behavior called knowledge hinges upon verbal abstractions, failure to be taught them
results in failure to observe them. This may well apply to the private event as well as to the public
one.
Varieties of Private Stimulation. pp. 261-264
Q52: What are three kinds of private stimulation and their sources?
A52: a) Interoceptive: arise from the digestive, respiratory and circulatory systems
(smooth muscles and glands).
b) Proprioceptive: arise from the position and movement of the body in space and
with respect to the relative position of the parts (joints and tendons).
c) Exteroceptive: stimulation from the external environment.
Q53: To which of these does an individual respond when describing his own behavior?
A53: To a possible combination of all three.
Q54: How can the verbal community establish self-descriptive behavior that includes responses
to private events?
A54: a) The private stimuli have characteristically preceded or accompanied publicly
observable responding.
b) The speaker is referring to variables of which the behavior is characteristically a
function, rather than the behavior itself.
c) The described behavior is occurring at a reduced level, observable only to the
speaker, e.g., "covert behavior".
Q55: What are the special advantages of covert verbal behavior?
A55: a) It doesn't actually require the presence of the external environment for its
occurrence.
b) It can be reinforced and maintained at the covert level by its effect on the
individual's behavior.
COMMENT: Skinner's analysis of the role of covert verbal behavior is greatly extended
in Verbal Behavior (1957). One distinction made therein that is relevant to this section of Science
and Human Behavior is that between the roles of the speaker and listener when each occurs
simultaneously in the same individual. That is, we "talk to ourselves" and to the extent that this is
effective (in the sense of self-determination) the behavior is maintained. We both speak and
listen to ourselves for the same reason that we speak and listen to others. It represents a case of
publicly reinforced behavior occurring at a covert level and being reinforced in that form.
Responses to One's Own Discriminative Behavior. pp. 264-266
COMMENT: This section must be read carefully, since Skinner presents two difficult
and quite different points, although both involve discriminative repertoires. The first centers
around the question of how one learns to describe such repertoires, and the second involves the
problem raised by discriminative responding in the absence of relevant external stimuli.
Q56: Give an example of a response to a discriminative behavior.
A56: "I hear a siren," "I see a rainbow," etc.
Q57: How does the verbal community teach the individual to respond to his own discriminative
behavior?
A57: a) It may rely upon the presence of a conspicuous external stimulus.
b) It may rely upon the proper orientation of the relevant receptors.
c) It may depend upon the production of additional collateral information.
Q58: To what does the individual ultimately respond in these cases?
A58: To the private occurrence of a discriminative response.
Q59: Under what conditions does this prove to be a problem for the analysis of behavior?
A59: When such discriminative responding occurs and the appropriate external stimulus
is not present, e.g., "I see a rainbow," when there is no rainbow physically present.
Q60: What are the major sources of such responding?
A60: As the result of classical or operant conditioning.
Conditioned Seeing. pp. 266-270
Q61: In what manner does classical conditioning account for seeing something when it is not
there? Give examples.
A61: Discriminative responses to objects can be conditioned to stimuli which
characteristically precede or accompany the external stimulus which originally
controlled the discriminative response by the process of stimulus pairing. Hearing a
dinner bell produces a conditioned discriminative "seeing" response of food
although the food is not yet present.
COMMENT: Skinner is using the Pavlovian formula strictly to refer to the operational
procedures of stimulus pairing to produce a stimulus substitution effect. The distinctions
discussed earlier between types of behavior ("autonomic vs. external musculature") and stimulus
function (elicits vs. occasions) are not involved. If one effect of a stimulus is somehow to produce a discriminative
response, perhaps through contingencies of reinforcement, then presumably that effect can be
transferred to another stimulus by classical conditioning procedures.
The next several paragraphs discuss how certain fragmentary or partial stimuli can
produce a conditioned discriminative response appropriate to the original complete stimulus, thus
causing one to "see" the completed stimulus rather than the fragmentary one. The context of this
discussion is, of course, the Gestalt phenomena and optical illusions.
Q62: What is the range of control exerted by external stimuli over discriminative responding?
A62: From the situation where momentary stimulation is optimal, e.g., fully in the
presence of the external stimulus, to the situation where the stimulus is absent and
the discriminative response is wholly a conditioned response.
Q63: What are the sources of individual differences with respect to conditioned discriminative
responding?
A63: a) Different sensory receptor functions, perhaps of genetic origin.
b) Different personal conditioning histories.
c) Different histories of reinforcement for producing a descriptive repertoire of
discriminative responses.
Q64: What is a hallucination?
A64: A discriminative response to an absent stimulus, where the individual cannot report
the fact that the stimulus is, in fact, missing.
Q65: What is the practical importance of conditioned discriminative responding?
A65: Discriminative responding is often reinforcing, and conditioning can extend the
range of this reinforcing effect.
Operant Seeing. pp. 270-275
COMMENT: This section is quite complex, both because of the subtleties of the
behavior(s) involved and the particular style of organization (or lack thereof).
Q66: What are some evidences of strength for a discriminative operant?
A66: a) The existence of strong precurrent behavior, behavior which produces and
occurs in the presence of the relevant stimulus, i.e., looking for, and at, "X"
extensively.
b) Reports of discriminative behavior in the presence of ambiguous or nonexistent
stimuli, e.g., "private" seeing.
Q67: How is operant discriminative responding distinguished from classically conditioned
sensory processes?
A67: By virtue of the controlling variables. Removal of deprivation will eliminate the
operant form.
Q68: What are the advantages and disadvantages of discriminative responding in the absence
of the external stimulus?
A68: a) Advantages: It doesn't require any precurrent behavior to produce the stimulus,
it can occur when the external stimulus is completely out of reach, and it cannot be
punished.
b) Disadvantages: It doesn't reduce the deprivation level through satiation.
Q69: What are some sources of reinforcement for discriminative responding?
A69: a) It may be automatically reinforcing, as with the hobbyist who looks at model
planes all day.
b) It may be reinforced via its success in problem-solving, both when relevant
stimuli are provided but also when they are not, and the discriminative responding
is wholly "private."
Q70: Is private problem-solving wholly discriminative behavior?
A70: No, private manipulative as well as discriminative behaviors can occur.
Q71: What are some sources of individual differences in private behaviors utilized in problem-solving?
A71: a) Differences in histories of reinforcement that have produced the behavior.
b) Differences in ability to describe such behavior or use it as a basis for further
behavior.
c) Differences in preference for the kind of private event utilized; overt or covert
behavior, verbal or nonverbal solutions, and discriminative imagery are frequent
forms of private problem-solving.
Q72: Why is it difficult to distinguish among particular private problem-solving skills?
A72: Many different forms may be simultaneously occurring. Since the community can
only reinforce the solution, many forms of private problem-solving can occur and be
reinforced when the solution is correct.
COMMENT: Although Skinner relied on problem-solving as a source of reinforcement
for private responding and thus as a vehicle to give many examples, he includes at the end of this
section the fact that the self-control relationship would similarly be a source of reinforcement for
private events.
Traditional Treatment of the Problem. pp. 275-280
Q73: How does one account for the verbal behavior which describes discriminative behavior?
A73: The organism must not only be in the presence of the relevant stimulus but also be
making a discriminative response to it, i.e., "seeing X".
COMMENT: Be sure to remember that both the descriptive verbal response and the
discriminative response are behaviors.
Q74: What accounts for the descriptive response in the absence of the relevant stimulus?
A74: The organism is describing a discriminative response that has resulted from operant
or respondent conditioning. The external event itself is only one of the variables
responsible for the production of the discriminative response. The discriminative
response can thereby occur by virtue of the strength provided it by other
independent variables even in the absence of the external stimulus.
COMMENT: Note that although the descriptive response provides evidence for the
discriminative response, it is not an inevitable concomitant. Thus, one can assume that the
private discriminative responding also occurs in nonverbal organisms, such as the case of
animals having "dreams."
Q75: What is the traditional solution to private seeing?
A75: a) If the relevant stimulus is present, what is seen is called a sensation, a sense
datum, or a percept.
b) If the stimulus is absent, what is seen is called an image, thought, or idea, and is
said to be mental or psychic in nature.
COMMENT: The remainder of this section provides an analysis of the possible causes
for the traditional perspective, and the more appropriate behavioral account for the same
phenomena.
Q76: Briefly describe the problems which have resulted in the traditional sensation-perception
distinction.
A76: a) Physical events provide multiple stimuli to an organism. Hence no single
discriminative response was said to represent "reality."
b) Multiple responses can be made to physical events. Active responding is ruled
out as "knowledge" in preference for more passive descriptive responses.
c) Behavioral abstraction resulted in two different solutions: (1) immediate contact
was called raw experience, and the later abstraction was assumed to construct a physical
world never directly experienced; or (2) the single contact was an unanalyzed
contact, while systematic knowledge resulted from more extensive "experience."
d) The historical inadequacy of the physical sciences in providing accounts of
stimulation at a distance. The solution was to put "experience" inside the skin and
the "real" world at a distance, and thereby never really "knowable." Both
"sensations" and "images" occur in the experiential world depending upon the
presence or absence of the distant object. Physical science closed the gap between
the event and the organism by providing a description of the stimulating
phenomenon, e.g., light waves, and a behavioral account distinguishes between
discriminative responding in the presence and absence of the relevant stimulus.
COMMENT: The final part of this section, "Objections to the traditional view," is the
predictable discussion of the irrelevance of conceptualizing "sensations" and "images" as "mental
events." As such they may occupy the middle link position, but are characteristically seen as
causal. The behavioral account must not only provide a description of their occurrence, but also
trace their causes to the external world.
Q77: Are there "wholly private events," events which are knowable only to the organism?
A77: No, private knowledge is identified with self-descriptive responding, which must be
acquired from a verbal community. An event which was completely private, one
that lacked public accompaniments, collateral behavior, or about which additional
information could not be gleaned, could never become a discriminative stimulus
since the community would have no basis for differentially reinforcing the necessary
descriptive behavior.
Other Proposed Solutions. pp. 280-281
Q78: What other proposed solutions to the problem of private events are available, and are
they compatible or incompatible with the present functional analysis?
A78: a) Studying one's own private world, i.e., introspection, or relying exclusively on
the verbal reports of others concerning their subjective experiences. Both solutions
are inadequate since self-descriptive behavior, whether one's own or the reports of
others, is only one kind of response to a private event.
b) Studying the physiology of sensation. This approach never solves the behavioral
problem of "seeing" as a response. It merely takes a physical stimulus and traces it
(or something like it) further into the organism. A functional analysis doesn't
identify a source of stimulation as the only variable controlling the seeing response,
whether it is a set of light waves or a neural complex. From a behavioral view, it
is merely the most common variable.
c) To rule out private events because of a scientific methodology which requires
public agreement. This perspective actually encourages a dualistic world of public,
"scientific" events and a non-physical or mental world beyond science.
d) An approach which is compatible with functional analysis is an attempt to
instrumentally amplify covert responding. However, an account still must be
provided for such behavior when it affects the organism without amplification.
COMMENT: Two quick observations. Additional support to Skinner's position on the
nature of discriminative responding to private events is provided by those studies which
demonstrate that additional control can be gained when covert behavior is instrumentally
amplified, "fed back" to the organism, and reinforcement is made upon certain properties thereof,
e.g., Hefferline, Keenan, & Harford, Science, 1959. Second, although the sensation and
perception psychologists are characteristically unaware of Skinner's functional analysis, a rather
simple switch in their viewpoint from stimulus "tracing" to trying to identify neuro-physiological
"responding" would certainly enhance the importance of their efforts from a functional
perspective.
Additional Readings:
Skinner, B.F. The operational analysis of psychological terms. Psychological Review,
1945. Reprinted in Skinner's Cumulative Record.
Skinner, B.F. Verbal Behavior, 1957.
Skinner, B.F. Behaviorism at fifty, Science, 1963. Reprinted in Skinner's Contingencies
of Reinforcement, 1969.
Day, W.F. Radical behaviorism in reconciliation with phenomenology. Journal of the
Experimental Analysis of Behavior, 1969.
XVIII. THE SELF
Q79: What is the common interpretation of the role of the "self?"
A79: As a hypothetical cause of action.
Q80: Why are selves and/or personalities sometimes said to be multiple?
A80: Because there sometimes appear to be two selves acting simultaneously and in
different ways, as with self-control or self-awareness. (Comment: Notice that
"different ways" doesn't necessarily imply opposition, just different
kinds of behavior, as in the speaker-listener relationship.)
Q81: Give a brief description of the behaviors involved in the Freudian personality structures.
A81: a) Id; behavior primarily controlled by basic deprivations and not under any social
control.
b) Superego; a set of self-control responses acquired from the social group.
c) Ego; behaviors shaped by the practical exigencies of the environment.
Q82: Why does Skinner feel that such phenomena are worth considering?
A82: Although an account in terms of inner determination is invalid, the behaviors so
described represent important ones in society, and as such require a behavioral
analysis.
The Self as an Organized System of Responses. pp. 285-288
Q83: What is a self?
A83: A functionally unified system of responses.
COMMENT: Notice how Skinner deals with the "explanatory fiction" of the self: by
examining the facts upon which it is based. This is a completely different strategy from ignoring
it, or from claiming that since it is supposedly non-physical, it has no place in science.
Q84: Identify and give examples of the several ways in which different response systems
(selves) can be organized or unified.
A84: a) Topographical subdivisions: personalities may refer to certain characteristic
ways of acting, such as a man of action or an impatient man.
b) Sets of behavior may be organized around certain discriminative stimuli; an
effective worker or a religious man, when in the appropriate settings.
c) Behaviors may be organized around a deprivation variable; the "dirty old man"
personality.
d) Behaviors may be organized around emotional variables; a person may become
"occupied with rage" when made jealous.
e) Drug effects; a person becomes "spaced out" under the effects of certain drugs.
COMMENT: Notice that any or all of these factors can come into play in the same
individual at different times or sometimes simultaneously.
Q85: What difficulties can arise utilizing an analysis of the individual in terms of selves or
personalities?
A85: The unity of the response systems may be overestimated by the technique of
personification, and thus prediction and control are difficult to obtain using such a
classificatory system.
Q86: In the next section, "Relations among selves," three basic relations between response
systems are identified and discussed. Be able to define and give a detailed example of
each of these.
A86: a) Incompatible response systems: A man may acquire two different patterns of
interaction with people, one appropriate to one setting e.g., a scientist, and another
appropriate to, say, a husband and father. Difficulty would only arise if the
situation occasioned both behaviors simultaneously. For example, the man may be
asked to use his own children for research purposes.
b) Self-determination (self-control, problem-solving, decision-making and
creativity): Because an individual's behavior under primary deprivation conditions
may become disadvantageous to the group, a set of self-control responses may be
shaped in the individual which lessen the aversive behavior if they are successful.
Freud has identified these two behavioral systems as the id and the superego.
Skinner describes the same situation as a system of self-controlling relationships.
c) Self-knowledge: Social contingencies quickly establish a set of responses that are
occasioned by other behaviors and which are descriptive of them. Sometimes this
self-knowledge is deliberately established for certain purposes, as in Eastern
philosophy or psychotherapy.
COMMENT: The last paragraph is a little difficult, but rather important. Essentially,
Skinner has already described self-knowledge as a set of self-descriptive responses under the
control of contingencies deliberately imposed to produce that effect. He now is asking whether
or not other organized systems of behavior (selves), occasioned by other variables and controlled
by different contingencies, can come to display this kind of self-descriptive responding with
respect to another system of responses, e.g., does the id "know" about the superego?
The Absence of Self-Knowledge. pp. 288-292
Q87: What are several examples of situations where self-knowledge is missing?
A87: a) One may not know that he has done something, as in "unconscious" problem-solving.
b) One may not know that he is doing something, as when a skilled and practiced
driver "automatically" downshifts when approaching the corner.
c) One may not be able to predict his own behavior, e.g., someone about to fly into a
rage may not be able to describe the probability of his doing so, but others may
see it coming.
d) One may not recognize the variables that control his behavior. For example,
someone may display hostility toward authoritative figures because of their
similarity to his father, whom he hates(?).
COMMENT: The example of automatic writing is treated extensively by Skinner in his
article, "Has Gertrude Stein a Secret?", Atlantic, 1934, reprinted in Cumulative Record.
Q88: The next few paragraphs describe some simple situations where self-knowledge or
awareness may be lacking. Be able to describe these briefly.
A88: a) The stimuli supplied by the behavior may be weak.
b) Prepotency; responses other than self-descriptive ones are necessitated by the
situation.
c) Satiation and sleep.
d) Drug effects.
e) Responses occurred prior to learning self-descriptive behavior.
COMMENT: Notice Skinner's description for how an adult may be able to describe a
childhood situation if the visual scene can be evoked later, presumably as in conditioned seeing.
This book is full of two-sentence explanations for behavioral phenomena that some spend a good
portion of their careers trying to explain.
Q89: What is Skinner's account of repressed self-knowledge?
A89: One kind of behavior that can provide conditioned aversive stimuli is self-descriptive responding of punished behavior. Behavior that successfully replaces
the self-description of punished behavior will be automatically reinforced by the
termination of the conditioned aversiveness of such self-awareness. Thus one not
only "represses" punished behavior by learning to do something else, one also
"represses" self-description or awareness of such behavior for the same reasons, i.e.,
to terminate conditioned aversive stimuli.
Q90: Is the form of the punished response always important with respect to repressing self-description?
A90: No, punishment itself doesn't always depend upon form. Thus one will be able to
describe certain acts if they are not themselves currently punishable, i.e., they may
be under imitative control, or occurring in a situation where such behavior is
appropriate.
Q91: What is "rationalization," according to Skinner?
A91: We often report fictional controls for punishable behavior rather than accurate ones
in order to avoid punishment. It is the self-descriptive responding that is repressed
rather than the overt behavior that could be punished. The reporting behavior
that replaces accurate self-description is called rationalization. As such, it may or
may not completely replace accurate self-awareness. If it does not, we can describe
the fact that we are "lying".
Symbols. pp. 292-294
COMMENT: Skinner is using the word "symbol" in this section somewhat differently
than in the commonsense meaning of "representing something else," as when flags symbolize
countries, etc. Instead, he is talking about the "Freudian" symbol, where whatever is being
symbolized would be punishable in its appropriate or accurate form.
Q92: What is a Freudian symbol?
A92: A spatial or temporal pattern that is reinforcing because of its similarity to something
else, but which escapes punishment because of the differences.
Q93: Why might such symbols occur in art, literature, or dreams?
A93: Because they are reinforcing but they do escape punishment.
COMMENT: I'll bet you never thought you'd read a Skinnerian interpretation of dreams,
did you?
SECTION IV: THE BEHAVIOR OF PEOPLE IN GROUPS
XIX. SOCIAL BEHAVIOR
Q1: What is Skinner's definition of social behavior?
A1: "...the behavior of two or more people with respect to one another or in concert with
respect to a common environment."
Q2: Can the laws of a social science be based upon a science of the behavior of individuals?
A2: Yes, since it is always individuals behaving, and according to the same processes that
would occur in nonsocial situations. According to Skinner, "The individual
behavior explains the group phenomena."
Q3: Should the social sciences state their laws in the terms of a functional analysis of
individual behavior?
A3: Not necessarily, since another descriptive level may be more convenient and equally
valid.
COMMENT: Note how Skinner describes his objectives for this section. He is going to
use the principles established from a functional analysis of individual behavior "to account for"
social behavior. The objective is to see whether or not such an analysis can, in fact, be made
without introducing new terms or presupposing new principles or processes. If it can, the
adequacy of the original analysis is further substantiated, and the social sciences will be shown a
new and simpler perspective for their subject matter. This also was his rationale for dealing with
private events, as well as with human behavior as a whole.
The Social Environment. pp. 298-304
Q4: What kind of roles do other individuals play in social reinforcement? Give examples.
A4: a) They may participate as objects, as in sexual reinforcement.
b) They mediate the delivery of reinforcers, as with a mother feeding a baby.
Q5: What are some characteristic features of behavior that have been reinforced through the
mediation of others?
A5: a) It is more extensive.
b) It is more flexible.
COMMENT: The next few paragraphs describe the effects of various schedules of social
reinforcement of behavior. It is interesting in that all of the examples are of misapplication, or,
in any event, of unhappy outcomes. It is these kinds of situations, where positive reinforcement
is badly applied or applied to the disadvantage of some individual that led to Beyond Freedom
and Dignity (1971).
Q6: What is a feature of social reinforcement contingencies that rarely, if ever, prevails in
inanimate nature?
A6: Reinforcement directly contingent upon the rate of behavior.
Q7: Why is it difficult to identify the physical features of such social stimuli as a smile?
A7: Because the effect of such stimuli is derived from the way they participate in certain
social contingencies and not because of any inherent physical properties. Thus they
may vary from time to time within a culture or across cultures.
COMMENT: An example of the point Skinner is making would be to consider the
redness of a ripe apple as a discriminative stimulus for picking and eating it, compared with the
smile of the person you are with as discriminative for asking a favor (or something like that).
The "behavior" of the apple is naturally determined, e.g., redness always equals ripeness, whereas
the smile may merely reflect a history of being polite rather than any particular inclination to
further reinforce.
Q8: What is it about social stimuli that sets them apart from a simple analysis in terms of
physical features? Give an example.
A8: Slight changes can produce large effects. The behavior of someone "being watched"
can be quite different from that which occurs when unobserved.
Q9: What class of social stimulus seems not to vary to any great extent between cultures?
A9: Imitative stimuli: Although any particular form is culturally determined, the
stimulus-response relationship remains constant because of its importance in
cultural control.
The Social Episode. pp. 304-309
Q10: How does one account for the behavior of two individuals in a social episode?
A10: By considering the first as under control of variables generated by the second, and
the second under the control of variables generated by the first, and then putting the
two analyses together to reconstruct the episode.
COMMENT: The next several paragraphs describe a number of common social
episodes, both animal and human. Several situations are considered. An important point to
remember in all of these is the role of the external environment in determining the overall
contingency, i.e., one form of social interaction (cooperation) may provide more reinforcement
than others within a given situation.
Q11: What is the behavioral nature of the leader and follower relationship?
A11: Leaders are primarily controlled by the exigencies of the situation; followers are
primarily controlled by the behavior of others. However, if the task requires cooperative
effort, then the leader depends upon the behavior of the followers, and to the extent
that this behavior reinforces him, the followers, in fact, can be said to be in control.
Q12: To what extent do verbal interactions defy a functional analysis of social interactions?
A12: They don't.
COMMENT: Skinner again refers the reader to Verbal Behavior (1957) for a more
detailed explication of language. It would be difficult to exaggerate the importance of
understanding a behavioral account of language, since nonbehavioral ones are so prevalent and
misleading, as in the nature of scientific or logical thinking.
Q13: Provide a functional analysis of the following: You (A) are in the library reading.
Another student (B) comes up, and asks you for the time. You tell him, he thanks you
and goes on.
A13: Student B is currently under conditions that make knowledge of the time
reinforcing, perhaps he is going to meet a friend or is trying to avoid being late for a
class. He asks you, since you are a discriminative stimulus for providing the correct
time. Several factors may enter in. You may function as such simply by virtue of
being the only other person there or perhaps subtle cues are involved, e.g., you are
wearing a conspicuous watch, or just checked the time yourself, etc. The request for
time may set up a mild aversive situation from which you can only escape by telling
him what he wants, so that you can go back to reading. Or perhaps pleasing others
is reinforcing to you, and thus the situation represents an opportunity to be
positively, rather than negatively, reinforced. His gratitude is emitted probably
more under social control than in any deliberate attempt to strengthen your "time-telling"
behavior in the future, but does serve as a discriminative stimulus for you to
return to your previous activities.
Q14: What is an autocatalytic social episode?
A14: One in which the interaction is an unstable one which generates progressive change,
such as a conversation which leads to a quarrel.
COMMENT: Note the last sentence in this section. Skinner's social concerns enter time
and again into the book, and are clearly articulated in the last section.
Supporting Variables in the Social Episode. pp. 309-311
Q15: Under what condition are social interactions not self-sustaining? Give an example.
A15: a) When there is a necessity for prior reinforcement from the group to establish
suitable behavior for the individual within the interlocking system.
b) Consider the student-teacher relationship: College academic behavior is
probably insufficiently maintained by either grades or the acquisition of knowledge,
and therefore requires considerable prior conditioning for the student to behave
appropriately. Similarly, faculty members probably are never completely
reinforced by improved performance in students, and obviously are additionally
reinforced by status, salaries, etc.
COMMENT: As Skinner notes, this is an extremely important concept when considering
the social practices of the culture. Most contemporary agencies of social services depend greatly
upon external social support for the behavior of those involved, e.g., fund drives, charities, etc.
Sometimes, of course, socially established altruistic behavior is missing, or is quickly
suppressed, as in a special education classroom with "culturally disadvantaged" youngsters, and
the interlocking system becomes unstable and nonproductive. Virtually all contemporary "social
values" represent descriptions of behavior that require social support to be maintained and their
"value" is the extent to which society will, in fact, support these behaviors. Why systems are not
self-sustaining to the individuals involved, but are sufficiently important to the group as a whole
to be supported (either by prior conditioning or supplementary variables), is considered in
Chapter XXVIII and elsewhere (e.g., Beyond Freedom and Dignity, 1971).
The Group as a Behaving Unit. pp. 311-312
Q16: Provide two reasons why people may join group activities.
A16: a) They simply may be imitating others, since imitation is frequently reinforced.
b) The individual enhances his power to be reinforced by being in a group. More
can be obtained by acting in concert than by acting individually in many cases.
XX. PERSONAL CONTROL
Q17: What is the asymmetrical social relationship referred to when we say that someone is
"deliberately" controlling someone else?
A17: When someone behaves in a way to affect another's behavior because of the
consequences which that behavior change has for the one exerting the control.
Control of Variables. pp. 314-315
Q18: What special advantage does personal control have that is not usually available to
organized agencies of social control?
A18: The controller is often in a position to base his techniques of control upon the
idiosyncrasies of the person he wishes to control, whereas agencies of control rarely
have access to this kind of information and base their practices upon variables that
are common to groups.
Q19: What is usually the first objective of the exercise of personal control?
A19: To establish and maintain contact with the controllee.
Q20: How is this typically accomplished? Give an example.
A20: By reinforcement; network T.V. shows with live audiences always start with a
"warm-up" session for the audience before the actual taping begins.
Techniques of Control. pp. 315-320
COMMENT: This is an absolutely critical section. Being able to successfully analyze
instances of control, as well as to apply these techniques when necessary, is the basis of
contemporary applied behavior theory.
Q21: Be able to enumerate each of the nine basic techniques of control listed in this section,
and give an example of each.
A21: a) physical force; the incarceration of criminals.
b) manipulating stimuli; advertising campaigns.
c) reinforcement; academic grade systems.
d) aversive stimulation; the use of threats.
e) punishment; spanking children for misbehavior.
f) pointing up contingencies of reinforcement; the use of instructions.
g) deprivation and satiation; restricting access to certain reinforcers to strengthen
behavior previously reinforced by such events.
h) emotions; making the classroom a "happy" place.
i) drugs; the businessman's lunch.
COMMENT: Several of these categories require further analysis.
Q22: What are some of the disadvantages of physical force as a technique of control?
A22: a) It requires the sustained attention of the controller.
b) It has little effect upon increasing the probability of behavior, but rather is
concerned with prevention.
c) It frequently generates counterattack.
d) It can only control certain behaviors at a time.
e) It has little or no effect upon private events.
Q23: What are some of the ways the stimuli can be manipulated to control behavior?
A23: a) Using conditioned or unconditioned stimuli to elicit reflexes.
b) Providing discriminative stimuli to occasion desired responding or evoke
incompatible behavior if the objective is to reduce responding in some way.
c) Using supplementary stimuli to combine with other variables already present;
both imitative stimuli and most verbal behavior consist of supplementary
stimulation of this kind.
COMMENT: Notice that Skinner distinguishes between using discriminative stimuli in
isolation or as a source of supplementary stimulation as techniques of control.
Q24: What is one risk in the social utilization of conditioned reinforcers?
A24: That the deferred relationship between the behavior and the primary reinforcer,
which is mediated by the conditioned reinforcer, may break down if the primary
reinforcer isn't ultimately reliably forthcoming.
Q25: What is the distinction between aversive stimulation and punishment?
A25: a) Aversive stimulation refers to unconditioned and conditioned negative
reinforcement to strengthen escape and avoidance behavior (respectively).
b) Punishment refers to the presentation of negative reinforcers or the removal of
positive reinforcers contingent upon behavior (in an attempt to weaken the
responding).
Q26: What is required if "pointing up contingencies of reinforcement" is going to be effective
as a control technique?
A26: The controllee must first have a history of reinforcement for following such
instructions.
Q27: When is the technique most likely to be used?
A27: When the controller cannot control the relevant events necessary for the ultimate
maintenance of the behavior.
COMMENT: Notice how Skinner describes the use of deprivation. It is a control
technique that permits you to strengthen behavior by deprivation alone, given that the behavior
has previously been reinforced by the event you are currently depriving the organism of. This is
a completely different utilization of deprivation than when you use it to enhance the future
reinforcing effectiveness of something you wish to use to strengthen behavior.
Q28: What behaviors are described as emotional?
A28: Reflexes and emotional predispositions.
COMMENT: It is extremely important to remember Chapter XIV while reading this
section. Probably no single technique of personal control represents an isolated instance of one
of these described categories. Many of the environmental variables have multiple effects, and
some of these may come into play when you attempt to control behavior, e.g., using positive
reinforcers may both strengthen behavior and generate a favorable predisposition toward you, as
well as permit you to exercise control via deprivation over the particular response at some later
point in time.
Objections to Personal Control. pp. 320-322
Q29: Why is deliberate control not a popular topic?
A29: Because deliberate control is aversive to the controllee in many cases.
Q30: What is the effect of this?
A30: It causes the controllee, either as an individual or as a group, to exercise
countercontrol.
COMMENT: Notice how one form of group countercontrol is to make the exercise of
control a conditioned aversive event, and thus lessen the likelihood that individuals so
conditioned will attempt to use personal control techniques.
Q31: What are the cultural consequences of such a history of the aversive use of control?
A31: To prevent an effective analysis and utilization of the use of control techniques.
COMMENT: It scarcely needs to be said that this last paragraph anticipates the book
Beyond Freedom and Dignity (1971).
XXI. GROUP CONTROL
Q32: Why do groups act to control individuals?
A32: Because the individual's behavior with respect to the members of the group invites
countercontrol, e.g., when resources are limited, one man's gain is everyone else's
loss.
Q33: What is the principal technique whereby the group exerts control over the individual?
A33: To classify behavior as "right" or "wrong" and reinforce or punish accordingly.
Q34: Is this classification system foolproof? Why or why not?
A34: No, since behavior can be labeled aversive because of accidental correlations (the
messenger who announced that Troy had fallen) or because of past contingencies
which are no longer relevant.
Q35: How is aversive behavior controlled by group practices?
A35: It is labeled "wrong" and punished. It then becomes a source of conditioned
aversive stimuli, and behavior which replaces it is automatically reinforced. Or, the
group may directly reinforce self-control.
Why the Group Exerts Control. pp. 325-326
Q36: What is required to explain group control?
A36: To show how the behavior of the individual and the group members are interlocked
in a controllee-controller social system, and give an account of the variables
responsible for the behavior of each.
COMMENT: The remainder of this section provides a description of two general
formulae that account for the nature of group control; the first involves group behavior as an
emotional reaction to individual behavior, and the second suggests that group consequences are
deliberately provided to increase or decrease future instances of the same behavior.
Q37: Give an example of group countercontrol as an emotional reaction and as an example of
deliberate punishment.
A37: The difference, presumably, is that between a lynch mob and a court of law.
COMMENT: Notice that emotional counter-aggression may be effective in suppressing
behavior, and thus also may be maintained because of its consequences.
The Effect of Group Control. pp. 327-328
Q38: What are the disadvantages and advantages of group control to the individual so
controlled?
A38: a) Disadvantages: access to reinforcers is restricted, since selfish behavior is
suppressed and altruism is reinforced.
b) Advantages: the group members profit by virtue of being in a group which
similarly controls all its members' behavior.
Q39: What counterbalances the power of group control?
A39: The system of behavioral classification is rarely consistent; all members of the group
rarely act in complete accord.
Justification of Group Control. pp. 328-329
Q40: How does a functional analysis of behavior deal with the ethical problem of right and
wrong, good and bad, etc.?
A40: Simply by observing how groups and individuals use such terms, and what
behavioral processes account for their use.
Q41: What sources of justification for such distinctions are frequently used?
A41: a) Appeals to authority; "good" and "bad" presumably are defined by
supernatural sources, as in religion.
b) Ethical distinctions based upon non-behavioral rationale, e.g., the greatest good
for the greatest number, etc.
COMMENT: Notice how a behavioral account frequently can be provided for the
behavior of those who make such distinctions along other lines.
Q42: Does an analysis of controlling practices provide a rationale for determining what the
group controlling practices should or should not be?
A42: No.
SECTION V: CONTROLLING AGENCIES
XXII. GOVERNMENT AND LAW
Controlling Agencies. pp. 333-335
COMMENT: The introductory paragraph in this section briefly presents Skinner's views
regarding the nature and source of social control.
Q1: What is Skinner's objection to the various conceptions of the behaving individual as
encountered in law, economics, education, etc.?
Q2: What is his alternative strategy?
COMMENT: Notice again how Skinner regards his own analysis as an effort to achieve
"a plausible account."
Q3: What must one describe in the analysis of a social system? What must be known in
order to accomplish this?
The Governmental Agency. pp. 335-336
Q4: What defines a government?
Q5: Is the power of the ultimate agency within government always aversive?
COMMENT: Skinner is distinguishing here between the leaders and their agents in
terms of controlling techniques. The general nature of control exercised by a government may be
based upon its power to punish, but the within-agency control may well be of a different source.
Q6: What is necessary for a government to control "with the consent of the governed"?
Q7: Need that relationship continue once that government has established its control over the
society?
Techniques in Governmental Control. pp. 336-338
Q8: What determines legal and illegal acts in a dictatorship? In a democracy?
Q9: What is the net effect of governmental control? Be able to describe the behavioral
processes which account for this outcome.
COMMENT: Notice how Skinner's two-process analysis of the effect of punishment is
important to the above account.
Q10: What is obedience? Why is obedience to a verbal command based upon a history of
aversive control?
Q11: What is the advantage of obedient citizens to the controlling agency?
Law. pp. 338-341
Q12: What is a law?
Q13: How does it specify behavior?
Q14: How does the average citizen become law-abiding?
Q15: What makes the effect of punishing others as a deterrent a relatively weak technique of
control?
COMMENT: Skinner's analysis of the role of verbal processes in mediating the
effectiveness of laws and other forms of rules which govern behavior can be found in Chapter 6,
"An Operant Analysis of Problem Solving," in Contingencies of Reinforcement.
Traditional Interpretations. pp. 341-344
COMMENT: This is an important albeit unorganized section. Skinner is attempting to
provide behavioral interpretations of society's usual ways of dealing with illegal behavior, while
simultaneously attacking the traditional justifications for such practices.
Q16: Three reasons for punishment are revenge, rehabilitation, and deterrence. Which are
behaviorally justifiable and how?
Q17: What is the relationship between the legal concept of "responsibility" and behavioral
controllability?
Other Types of Governmental Control. pp. 345-346
Q18: Provide examples of governmental control which are not based exclusively on the power
to punish.
COMMENT: Notice Skinner's somewhat cautious optimism regarding the evolution of
social control practices toward more positively reinforcing techniques. However, many basic and
applied psychologists do not share his view of punishment's relative ineffectiveness, and
ineffective or not, societies are quick to revert to aversive control when immediate solutions are
important.
Countercontrol of Governmental Agencies. pp. 346-348
Q19: Why is the social system of the government and the governed inherently unstable?
Q20: What are some indicants of the limits of stability?
Q21: What is the effect of government "by law" regarding the stability of the system?
Justification of Governmental Practices. pp. 348-349
COMMENT: This last section is a mini tour de force for Skinner with respect to his
interpretations of some classic traditional values. The important point is that even though he sees
each of these terms as behaviorally explicable, they are still collectively inadequate as the bases
for evaluating a given society.
XXIII. RELIGION
COMMENT: Would ethical philosophers agree with the second sentence? In fact, do
you? Can you account for a negative response in behavioral terms?
Q22: What is a superstitious response?
Q23: Who composes a religious controlling agency, i.e., who are the leaders?
Q24: What is their claimed source of power?
Techniques of Religious Control. pp. 352-355
Q25: Compare religious techniques of control with legal and ethical ones.
Q26: What process is necessary to establish the power of religious threats and promises? Give
examples of both positive and negative consequences.
Q27: List some of the other techniques of behavioral control used by religious agencies,
including those of agencies which make no especial claims regarding their abilities to
intervene in supernatural outcomes.
The Behavior Controlled by the Religious Agency. pp. 355-357
Q28: How are the behavioral goals of the religious agency distinguished from the ethical
objectives of the larger group?
Q29: What process underlies self-control?
Explaining the Agency. p. 357
Q30: Why do politicians emphasize their religious affiliations?
Q31: Why do opponents of pornography often have large collections of pornographic materials
themselves?
Countercontrol. p. 358
Q32: Why do you suppose that the Mormon church is growing today? What are some reasons
why the Catholic church might be losing members and priests today?
Justification of Religious Control. p. 358
Q33: Does the justification of religious control depend upon supernatural relationships?
XXIV. PSYCHOTHERAPY
Certain By-Products of Control. pp. 359-361
Q34: Why does the group control the individual?
COMMENT: Surprisingly, Skinner omits the constructive objective of shaping
individuals to behave in ways which benefit the group. Groups not only seek to weaken
selfishness, they also seek to instill altruism.
Q35: Be able to describe some of the behavioral by-products of excessive or inconsistent
control.
Q36: Why are traditional agency reactions to such outcomes of control usually ineffective?
Emotional By-Products of Control. pp. 361-363
COMMENT: Remember that emotions or emotional responses are not inner states, but
rather are complexes of operant and respondent behaviors that vary together as a function of the
operation of certain independent variables.
Q37: What causes fear, anxiety, anger, and depression?
Q38: How are they to be eliminated?
Some Effects of Control Upon Operant Behavior. pp. 363-367
Q39: How can "self-control" miscarry? Briefly describe each of the inappropriate avoidance
responses described in this section.
Psychotherapy as a Controlling Agency. pp. 367-371
Q40: What is the social role of psychotherapy as an "agency" of control?
Q41: Why should therapy follow directly from diagnosis?
Q42: What source of control is initially available to the therapist?
Q43: How can this control be expanded?
Q44: Describe the two behavioral processes involved in successful psychoanalysis.
Psychotherapy Versus Religious and Governmental Control. pp. 371-372
Q45: Does psychotherapy ordinarily support or contradict other social agencies of control? Be
sure to consider both goals and methods in your answer.
Traditional Interpretations. pp. 372-379
Q46: What are the negative effects of considering the behaviors that necessitate therapy only as
symptoms?
COMMENT: The remainder of this section is a demonstration of Skinner's skill at
interpreting a case of psychodynamic "wish fulfillment" in operant terminology.
Other Therapeutic Techniques. pp. 379-382
Q47: What are other behavioral conditions that require therapy in addition to, or instead of,
histories of excessive or aversive control?
Q48: Why does the non-analytic or client-centered therapist wait for the client to suggest a
solution rather than provide it for him? (Behaviorally, of course, not in terms of the
therapist's traditional rationale.)
Explaining the Psychotherapeutic Agency. pp. 382-383
Q49: Why did Bandura claim that behavior modification is only effective when the client
consents? (American Psychological Association Presidential Address, 1974)
XXV. ECONOMIC CONTROL
Q50: Distinguish between goods and wealth.
Reinforcing Behavior with Money. pp. 384-385
Q51: What is required if a contract is going to be effective in controlling behavior?
Wage Schedules. pp. 385-391
Q52: What features of fixed ratio reinforcement lead to high rates of responding?
Q53: Why do human workers perform "throughout the interval" even when paid only at the
end?
Q54: What are the schedule parameters and effects of a salesperson receiving a salary plus
commissions?
Q55: How and why should bonuses be scheduled?
COMMENT: Skinner's review of extra-economic factors and their effects on quality of
workmanship and job attitude is often overlooked by industrial "behaviorists," who tend to
emphasize monetary contingencies exclusively, or at least are often criticized as such.
The Economic Value of Labor. pp. 391-393
Q56: What determines the economic value of labor to both the employer and the employee?
Q57: What is the advantage of money in this regard?
Q58: When is money a bribe?
COMMENT: Notice that this is not the sense in which teachers and parents use the term
"bribe" when objecting to the use of explicit reinforcers in the classroom or home. Their point is
usually that the youngsters should perform appropriately for other reasons.
Buying and Selling. pp. 393-398
Q59: What factors are relevant in determining the value of goods?
Q60: What can a professional gambler do that a slot machine cannot?
COMMENT: The conditions necessary to establish economic behavior are often
overlooked by behavior modifiers eager to set up a token economy. It is easy to establish an
effective token system only when the clients have a relatively lengthy history of reinforcement
regarding the use of money. Grade school children and institutionalized populations often do
not.
"Economics." pp. 398-400
Q61: What is the data base of the concept of the Economic Man? What are the limits of the
concept?
The Economic Agency. p. 400
Q62: What determines an economic agency as such?
COMMENT: It isn't clear exactly what Skinner intends to include as those economic
agencies representing "capital". A relatively limited definition would include banks and
bank-type agencies, brokerages and various investment associations, and possibly money
exchanges. A broader definition would also include corporate agencies which do not produce
goods or services, but are directed by and for stockholders whose primary and possibly exclusive
interest is in profits.
Countercontrol. pp. 400-401
Q63: What is the major source of countercontrol against the misuse of economic power?
COMMENT: For better or worse, money (like political power) grows with success. In
countries where capital investment is broadly encouraged you see an increasing concentration of
wealth, primarily because the major sources of potential countercontrol, including organized
labor, are themselves susceptible to economic control. In other words, resistance can be bought
out. Further, the current growth of multinational corporations permits the controllers to more or
less escape any single government's effort to countercontrol them. History, unfortunately, tends
to indicate that great wealth is often countercontrolled only by revolution. Modern capitalists
have apparently learned this lesson, and seek to avoid such an outcome by avoiding excessive
exploitation, at least at home.
XXVI. EDUCATION
COMMENT: If there is a more succinct two-paragraph summary of the goals and
methods of education, I have yet to read it.
Educational Agencies and Their Techniques of Control. pp. 403-404
Q64: What distinguishes an agency as educational: goals or methods?
Q65: What maintains the professional educator?
Educational Reinforcement. pp. 405-407
Q66: What were the major sources of control available to educators?
Q67: Why does Skinner say these controls are now weakened or unavailable?
COMMENT: Skinner later broadened his objection to progressive education by pointing
out that natural consequences are rarely programmed well for effective education.
The Behavior Resulting from Educational Control. pp. 407-411
Q68: What is necessary for the acquisition of skill?
COMMENT: The section on knowledge is almost too condensed to be especially
meaningful upon a first reading. Skinner, in fact, refers the reader to Verbal Behavior, a
reference which is possible only in later editions of Science and Human Behavior. Several key
concepts are presented and briefly discussed, however, and the reader should try to follow them.
First, according to Skinner, knowledge is not something which has some mentalistic
independent existence and is somehow "stored" away for use. Instead, knowledge is the use, or
more appropriately, is behavior. It may be primarily verbal, or it may be with respect to the
physical environment, but in either case knowledge is behavior.
Second, to understand something, in the fullest sense of the word, is to be able to do or
say the same thing for the same reasons. To merely repeat a poem is not the same thing as to
understand it. When a listener comes to understand the speaker (or writer) he or she is then able
to respond in the same way for the same reasons.
Third, instruction consists of the control of behavior through words which "refer" to other
events and objects. The resultant behavior, however, is thus behavior under the stimulus control
of words. However, if the instructions are followed, and the behavior is thereby effective, the
behavior then comes under the control of the natural stimuli and consequences to which the
instruction originally referred. There are two points to be considered: to be controlled by
instruction requires a history of reinforcement regarding instructional control; instructional
repertoires are different, i.e., under different functional control, than are repertoires established
by natural events and consequences. Skinner discussed this critical distinction later in a chapter
entitled "An Operant Analysis of Problem Solving" in Contingencies of Reinforcement (1969).
Fourth, a student can acquire self-instructional repertoires which permit the mediation of
the interval between educational instruction and later functional use. When the repertoires
consist of sets of problem solving skills, they represent an analogue to the self-control that other
agencies often try to establish. Essentially, these future-oriented repertoires are designed to deal
with novel or unanticipatable circumstances.
Countercontrol. pp. 411-412
Q69: Who countercontrols the schools, and why?
COMMENT: Skinner seems to have omitted the most obvious and frequent source of
countercontrol over the public school: the community in which it is located. Parents and
school boards can exercise large amounts of control over school activities, and usually represent
a strongly conservative force when they do.
SECTION VI: THE CONTROL OF HUMAN BEHAVIOR
XXVII. CULTURE AND CONTROL
Manners and Customs. pp. 415-419
Q1: Describe the process of induction as a basis for manners and customs.
Q2: Why are such patterns of conformity self-sustaining?
The Social Environment as Culture. pp. 419-421
Q3: What determines a culture?
Q4: Are the sustaining controls for a culture always unified?
Q5: What factors led to change in some cultural practices regarding the control of sexual
behavior?
COMMENT: Given what has been said before, you would expect Skinner to support
such changes because of the supposed reductions in aversive controls (and subsequent aversive
side effects of "by-products"). Notice, however, the ambiguity of his remark concerning the "net
result" of these changes.
The Effect of Culture Upon Behavior. pp. 421-424
Q6: For each of the cited characteristics of individual behavior, describe to what extent the
resultant behavior depends upon physical or social variables, or both.
Cultural Character. pp. 424-425
Q7: What characteristics of the social environment must exist if two cultures are to be
different?
Q8: What is necessary to establish a relationship between cultural practices and cultural
modes of behavior? Why is it difficult to do so?
XXVIII. DESIGNING A CULTURE
Q9: List some factors which introduce cultural practices "by accident".
COMMENT: "By accident" doesn't imply that a leader who introduces the practice
doesn't intend to do so, but rather that the change isn't an intentional effort to produce a better
culture based upon its intended effect. Consider, for example, the origin of the Episcopalian
Church.
Q10: What is necessary for deliberate cultural design?
Q11: How can a future goal control current efforts to achieve it?
Value Judgments. pp. 428-430
COMMENT: This section introduces Skinner's analysis of cultural values, a perspective
he has referred to and explicated several times since.
Q12: Provide a behavioral translation of (a) "You ought to take this short cut." and (b) "You
ought to give yourself up."
COMMENT: Willard Day has observed that a more complete analysis of such
statements would entail a knowledge of the factors controlling the speaker's behavior as well.
For example, "You ought to take an umbrella." means: (1) Keeping dry is reinforcing to you. (2)
An umbrella keeps you dry when it rains. (3) It is going to rain. (4) Having to pay your medical
expenses is aversive to me. (5) You may require medical attention if it rains on you.
The Survival of a Culture. pp. 430-434
Q13: What are the three kinds of selection discussed in this first paragraph?
Q14: In what sense does a cultural practice have survival value? Does its origin matter in this
regard?
Q15: Is a current culture by definition "better" than one which has perished?
COMMENT: You'd better be able to answer the above question, since if you agree with
Skinner publicly on this issue, you definitely can anticipate the question being asked.
Q16: Why does Skinner state that a culture is an experiment?
Q17: Is survival value always compatible with traditional values?
Q18: Why does behavior usually lead to survival? Can an individual who so behaves be said
to have "chosen" to survive as a value?
Q19: What is the relevance of science, and especially behavioral science, to the cultural
value of survival?
COMMENT: I suspect that this last point could be substantiated by comparing
governmental practices across several decades. Consider, for example, how many regulations are
now in effect regarding pollution as compared with those of 75 years ago. The unfortunate
aspect of this is that even if it is science which indicates survival, both by measurement and
improved practices, it also is usually science which has necessitated taking such action.
Regarding pollution, science seems to be the problem as well as the only hope for a solution.
Can We Estimate Survival Value? pp. 434-436
Q20: What must the cultural designer be able to estimate in order to be maximally effective?
Q21: In what four practical ways does a science help in the selection of survival-oriented
cultural practices?
Q22: What is left to do when these scientific practices have been employed to the fullest
extent?
XXIX. THE PROBLEM OF CONTROL
Q23: Why is a doctrine of personal freedom an insufficient protection or countercontrol against
a technology of behavior?
Q24: What are the consequences of a refusal to control?
COMMENT: As governmental control grows, as in welfare programs, our "freedom" is
correspondingly diminished, according to Skinner. If freedom is thus related to governmental
control, why should our government so strongly defend the concept?
Q25: What are some of the advantages of diversified control?
Q26: What are some of the disadvantages of controlling control (implicitly, by force)?
COMMENT: It may be instructive to consider the efforts to control the use of behavior
modification in the light of Skinner's remarks in this section. Federal and state legislation, local
supervisory boards, efforts to prevent its use either legally or through public pressure because of
its "dehumanizing" approach, are each examples of Skinner's classes of solutions to the problems
of control.
A Possible Safeguard Against Despotism. pp. 443-445
Q27: According to Skinner, where does the ultimate strength of a controller lie?
Q28: What is the role of traditional values, such as freedom, security, happiness, and
knowledge, regarding the ultimate source of strength? Do these values necessarily guarantee
cultural survival?
Q29: What may be the role of science in providing "moral values" for governments?
Who Will Control? pp. 445-446
Q30: Why is Skinner optimistic regarding the future of a science of behavior?
The Fate of the Individual. pp. 446-449
Q31: What is the central conflict between Western philosophy and a science of behavior?
Q32: What is responsible for that Western tradition?
Q33: Who or what "really" is in control?
COMMENT: This relatively "dry" scientific perspective has not been of especial
comfort to political scientists. For example, while it is true that the behavior of slaves was a
source of control over their owners, it was not a particularly exploitable one toward any
meaningful improvement in the slave's lot.
Q34: Why does even a scientific analysis of cultural behavior result in depicting a
controller-controllee relationship?
COMMENT: The last several paragraphs present Skinner's perspective on the role of
science and culture, the resultant dethroning of the concepts of individualism and
self-determination, and the eventual cultural progress that can result. The expanded version of
this treatment is in Beyond Freedom and Dignity (1971).
FINI