W. Scott Wood's Study Objectives
For B.F. Skinner's
Science and Human Behavior
Wood, W.S.
A Study Guide to Accompany B.F. Skinner's Science and Human Behavior
SECTION I: THE POSSIBILITY OF A SCIENCE OF HUMAN BEHAVIOR
I. CAN SCIENCE HELP?
The Misuse of Science. pp. 3-5
Q1:
To what does Skinner refer when he states that "science has developed unevenly"? Why does
he believe this to be so?
COMMENT: This introductory section is illuminating for two reasons. First, it clearly
reflects Skinner's social concerns, and second, it contains the first of many literary references that
will appear in this and many other of Skinner's books and articles.
Science as a Corrective. p. 5
Q2:
What does Skinner see as a solution to misapplied scientific techniques?
The Threat to Freedom. pp. 6-8
COMMENT: Several important issues are raised in this section, ideas that Skinner
returns to time and again both in this book and elsewhere. The first involves Skinner's view of
the overall conceptual nature of a science of behavior.
Q3:
What else is involved in science other than the description of facts?
COMMENT: At this point, Skinner points out that to arrive at a science that provides
both prediction and control, you have to begin by assuming that such a science is possible; in
Skinner's words, "...we must assume that behavior is lawful and determined." This assumption,
of course, leads Skinner into a direct conflict with certain traditional assumptions concerning
human nature.
Q4:
What tradition stands in opposition to the assumption that behavior is lawful and
determined?
Q5:
What are some ways of defending a free will doctrine without exactly saying so?
The Practical Issue. pp. 8-10
COMMENT: Again we see Skinner as the social critic. He appears to be demanding a
science of behavioral prediction and control because of its social necessity. Obviously, this is in
contrast to what many other psychological theorists view as the goal of a science.
Q6:
What is the practical issue?
II. A SCIENCE OF BEHAVIOR
Q7:
Skinner states that science is not to be identified with certain common features of a
science. Be able to list three.
COMMENT: An interesting feature of this section is the comment of Skinner's upon the
noncumulative and thus "unscientific" aspects of writing, art, and philosophy. Few contemporary
practitioners of any of these fields would be likely to agree that there has been no progress in
2500 years.
Some Important Characteristics of Science. pp. 12-14
Q8:
What are the essential characteristics of science?
COMMENT: Skinner understands science to be a set of behaviors that ultimately result
in effective control of the environment. He points out that the order of development of scientific
laws is similar to our own behavior during our early years, when we learn through experience to
appreciate the orderliness of our environment. For Skinner,
science is nothing more than a set of practices which increase the effectiveness of a natural
behavioral process. For him, seeking uniformity or orderliness in order to be able to better
predict and control our environment is something that we all do automatically. Science consists
merely of procedures which permit us to do this better. This, of course, stresses the importance
of the ultimate objective of environmental control and is something that a more traditional
interpretation of science as "seeking knowledge for its own sake" is somewhat opposed to. This
contrast is brought out by his characterization of science as an active process, not a passive or
contemplative one.
Behavior as a Scientific Subject Matter. pp. 14-16
Q9:
What features of behavior make it a difficult subject matter for science?
COMMENT: Notice again on p. 16 how Skinner refuses to equate science with the
abstract process that is represented by logical or mathematical statements of relationships.
Techniques of observation, experimental manipulation, and mathematical systematizing are seen
as the tools of science, not the hallmarks. The goal for Skinner remains effective prediction and
control. Remember that the rationale for this position results from Skinner's belief that science is
nothing more than an amplification of a natural behavioral process.
Some Objections to a Science of Behavior. pp. 17-22
Q10: Be able to summarize briefly the typical objections to a science that Skinner cites and be
able to give Skinner's counterarguments.
III. WHY ORGANISMS BEHAVE
COMMENT: Notice Skinner's discussion of "cause and effect" versus functional
relationship between the independent and dependent variable. He literally discounts the entire
definitional issue by asserting that both refer to the same factual core.
Q11: Why are so many spurious causal accounts of behavior available?
Some Popular "Causes" of Behavior. pp. 24-27
Q12: What principle does Skinner use to account for the emergence of astrology and body type
as causal factors for behavior?
COMMENT: After Skinner similarly disposes of the issue of "heredity" as the layman
uses it, he discusses the possibility of genetically determined behavior. He makes two quite
different but related points. The first involves the reason why the issue of genetic predisposition
is so volatile, and the second is a rationale for his own lack of interest in the topic.
Q13: What role does genetics play in a practical science of behavior?
Inner "Causes." pp. 29-31
Q14: Why does Skinner discount any particular interest in a search for physiological causes of
behavior?
COMMENT: Notice how this argument against physiological psychology is a practical
one involving a technology of control rather than any philosophic belief in the "empty" organism.
Much confusion has resulted from other theorists who don't understand Skinner's objectives in
establishing a science which not only "understands" but also is effective in the prediction and
control of behavior. We will return to this point later in this chapter.
Q15: Skinner's argument against psychic inner causes seems to take two forms. (What are
they?)
COMMENT: This material should be read carefully since most lay and/or philosophic
interpretations of behavior are of this kind. Essentially, the groundwork for an argument against
dualism is established in this section. Notice also that in all cases of discarding alternative
interpretations of behavior, Skinner points out how these interpretations might have arisen in the
first place.
Q16: How does Skinner describe a conceptual inner cause?
The Variables of Which Behavior is a Function. pp. 31-35
Q17: Where do the independent variables that provide for a scientific analysis of behavior lie?
COMMENT: Several other key issues are discussed by Skinner in this section. One,
which he almost breezes over, involves the topic of what the dependent variable of a
science of behavior ought to be. In the drinking example, he points out that it is the probability of
a single act of drinking that should be determined. Keep this objective in mind when Skinner
begins talking about a rate measure.
COMMENT: The next few pages return to the issue of psychic and physiological "inner"
causes, where Skinner refines his earlier arguments by discussing a causal chain and arguing
against a science oriented toward a "second link" analysis.
Q18: What are the links in the causal chain of behavior?
Q19: What are the essential features of Skinner's objection to inner states?
COMMENT: Again, notice that the force of Skinner's argument is against inner states
(and we can correctly assume that he discounts the psychic and is referring here only to the
physiological) because of their current limited utility in a science of behavioral control. How do
you imagine he would discuss the topic of drug effects where currently a relatively powerful
control technology exists?
A Functional Analysis. pp. 35-39
Q20: Be able to give Skinner's position on what constitutes the laws of a science of behavior.
Q21: What does Skinner mean by an analysis within the bounds of a natural science?
Q22: Briefly enumerate the potential sources of data for a behavioral analysis.
Q23: What is Skinner's view on the position that there is an essential discontinuity between
humans and animals?
Q24: What are some of the advantages of studying lower organisms?
Analysis of the Data. pp. 39-42
COMMENT: The key to an understanding of the book lies in this section. As with the
text Verbal Behavior, Skinner refers to his objective as an "extrapolation" of certain known
relationships for the purpose of gaining an understanding of complex human events. Many have
accused Skinner of really providing a "hypothesis" about human behavior, but talking about it
rather than rigorously testing it, and even being guilty of denying that he is in fact hypothesizing.
The issue isn't that simple: For Skinner, science is a behavioral chain which proceeds from
observation to prediction and control. His extrapolations represent, for him, examples of stimulus
and response generalizations which are themselves recognized processes. But they are behavioral
processes, not logical or philosophical ones. If one does not understand Skinner's concept of
science, or more accurately, scientific behavior, then he seems to be violating the "scientific
principles" of the hypothetico-deductive system by his extension of behavioral principles. On the
other hand, if you understand science as a behavioral process, not a rational or logical one (which
Skinner views only as helpful tools to sharpen scientific behavior) then his extrapolations are
completely in line with what currently is known about behavioral relationships.
SECTION II: THE ANALYSIS OF BEHAVIOR
IV. REFLEXES AND CONDITIONED REFLEXES
Man a Machine. pp. 45-47
COMMENT: This, obviously, is a very brief treatment of mechanistic philosophy of the
17th century. Descartes hardly deserves all the credit (or blame). However, Skinner subtly heads
off scholarly criticism and acknowledges a philosophic debt by his selection of the section
heading, which is also the title of Julien de la Mettrie's famed treatise on human behavior.
Reflex Action. pp. 47-49
Q1: Define the following terms: stimulus, response, reflex, and reflex arc.
COMMENT: Skinner's treatment of "spontaneity", i.e., free will, is particularly
interesting since he obviously sees the contemporary situation as analogous. That is, free will is
a "null hypothesis" that can be eliminated only by bringing more and more behavior under
demonstrable environmental control. It is interesting to speculate whether or not any advocate of
a free will position would (a) be persuaded by such evidence or (b) even be willing to admit that
such an attempt should be made.
The Range of Reflex Action. pp. 49-50
Q2:
What is the range of reflex action?
COMMENT: Notice that Skinner is delineating a reflex response as one which is almost
an invariant reaction to environmental stimuli. This distinction will be critical to his later
analysis of operant stimulus control.
Conditioned Reflexes. pp. 50-54
Q3:
Describe the process that Pavlov referred to as "stimulus substitution."
COMMENT: There are a number of terms which usually appear in a treatment of the
conditioned reflex which don't appear here. Among them are conditioned stimulus (CS) and
conditioned response (CR). A conditioned stimulus is that previously neutral stimulus now
capable of eliciting a response similar to that elicitable by the originally effective stimulus. The
response to the conditioned stimulus is called the conditioned response. It is interesting to note
that the term "conditioned" is a mistranslation from Russian, and is more correctly read as
"conditional," referring to the fact that the effectiveness of a new stimulus in eliciting a response
is conditional upon continued pairing with the originally effective stimulus.
COMMENT: There are two points that Skinner makes in this section in addition to
briefly discussing the basic reflexive (Pavlovian, classical, or "respondent") conditioning process.
The first involves the distinction between "what a child knows" and "what a scientist knows"
about a given subject area. You might keep this in mind when next someone tells you that
operant principles are just common sense. Unfortunately, in an attempt to appear nonthreatening
to the layman, behavior modifiers have tended to emphasize the common sense approach, e.g.,
referring to the reinforcement principle as "Grandma's Law," etc.
Q4:
When can a scientist effectively dispense with explanatory fictions as causal accounts for
various observations?
Q5:
What, according to Skinner, was not Pavlov's major contribution and why?
The Survival Value of Reflexes. pp. 54-56
Q6:
How can evolutionary theory provide an account for the existence of reflexes and the
process of reflex conditioning?
COMMENT: Notice how Skinner describes the evolutionary process as a selection
mechanism which gradually determines a reflexive behavioral repertoire for a given species, both
in terms of certain inherited classes of behavior as well as in susceptibility to the reflexive
conditioning process. In a sense, it is the evolutionary consequences, i.e., survivability, that
determine the behavioral process... a situation which Skinner sees as analogous to the process of
operant reinforcement. An expanded treatment of this perspective is available in his paper, "The
Phylogeny and Ontogeny of Behavior" (1970).
Q7:
How might our inherited susceptibility to reflexive conditioning go awry?
The Range of Conditioned Reflexes. pp. 56-58
Q8:
What does Skinner see as a measure of the "range" of the conditioned reflex?
Q9:
What are several ways that one uses (a) the reflex and (b) the reflex conditioning process
in the practical control of behavior?
COMMENT: Notice how Skinner has begun to use reflex behavior to describe what are
commonly called emotional responses, such as fear, anxiety, aggression, favorable attitudes, etc.
Q10: How does one eliminate inappropriate conditioned reflexes? Describe two techniques.
COMMENT: Skinner refers to several standard behavioral therapy techniques in this
section. The first one he mentions is a technique for eliminating alcoholism and smoking (p. 56)
referred to today as aversion therapy. This consists of the pairing of some noxious stimulus, such
as might evoke vomiting, with the undesirable behavior. The technique presumably results in the
behavior coming to elicit incompatible emotional responses, and thus produces a decrease in the
undesired behavior. (Much more of this will be said later.)
The other process Skinner describes involves using reflexive extinction to eliminate
undesirable conditioned reflexes. As above, this can be accomplished either immediately or "in
gradual doses." One version of this latter method has come to be called systematic
desensitization, a technique popularized by Wolpe. It involves teaching a patient certain
"relaxation techniques," then gradually exposing him to an imaginary or real hierarchy or
increasingly fearful stimuli. As he can successfully relax in the presence of each, the next is
presented until finally the inappropriate anxieties or fears are eliminated.
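As an illustration only, the graded-exposure logic just described can be sketched in a few lines of Python. The hierarchy items, the relaxation check, and the trial limit below are invented placeholders, not part of Wolpe's clinical procedure.

```python
# Illustrative sketch of graded exposure (systematic desensitization).
# Hierarchy items, the relaxation check, and trial limits are placeholders.

def desensitize(hierarchy, is_relaxed, max_trials_per_item=10):
    """Present items from least to most fearful; advance only after relaxation."""
    for item in hierarchy:
        for _ in range(max_trials_per_item):
            print("presenting:", item)          # imaginal or real exposure
            if is_relaxed():                    # patient remains relaxed?
                break                           # move on to the next item
        else:
            return item                         # stuck: hierarchy too steep here
    return None                                 # all items mastered

# Hypothetical hierarchy for a fear of dogs:
hierarchy = ["photo of a dog", "dog across the street", "dog on a leash nearby"]
print("stalled at:", desensitize(hierarchy, is_relaxed=lambda: True))
```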
V. OPERANT BEHAVIOR
The Consequences of Behavior. p. 59
COMMENT: This brief introduction merely sets the stage for the analysis of operant
behavior by distinguishing between the reflex as primarily concerned with internal physiology
and another kind of behavior "which has some effect upon the surrounding world." Skinner
points out that our concern with the latter is both practical and theoretical.
Q11: What are some special characteristics of behavior which has an effect upon the
environment?
Learning Curves pp. 59-62
Q12: What was Thorndike's discovery as he describes it?
Q13: How did Thorndike present his data?
Q14: What is Skinner's argument with this kind of data?
COMMENT: Some contemporary learning theorists believe that the successive
elimination of alternatives is the basic learning process (e.g., Staddon & Simmelhag,
Psychological Review, 1971, 78, 3-43).
Q15: How does Skinner interpret the consistency ("smoothness") and similarity of general
properties revealed by learning curves?
Operant Conditioning. pp. 62-66
COMMENT: This may be the most important section in the book in terms of Skinner's
definition of the basic operant conditioning process. It certainly is the one most frequently cited
by others who wish to describe his views on the topic. The section begins with a discussion of
probability, describing how he believes probability can best be measured, and concludes with a
basic set of definitions.
Q16: What form of data best provides a basis for the concept of probability of response?
COMMENT: For the mathematicians among you, this may hardly be adequate. It is, of
course, the means whereby Skinner justified his use of a rate measure as an index or estimate of
the probability of a single response (which, as you recall, he earlier stated should be the
dependent variable in a science of behavior). The real rationale for this argument is extremely
sophisticated, and hinges upon Skinner's interpretation of logic and mathematics. He begins with
the assumption that there is a behavioral origin for logical and mathematical concepts. In the area of
"probability," observed frequency must then be the basis for the concept. A detailed explication
of this point of view is available in Verbal Behavior (pp. 418-431), but unfortunately doesn't
contain any direct references to this particular problem of probability as a concept.
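To make the rationale concrete, a small worked example is sketched below: observed frequency per unit time is taken as the estimate of response probability. The key-peck timestamps and bin width are invented for illustration only.

```python
# Illustrative sketch: response rate as an estimate of response probability.
# The key-peck timestamps and bin width below are invented.

def rate_per_bin(response_times, session_length, bin_width):
    """Count responses in successive time bins; rate = count / bin width."""
    n_bins = int(session_length // bin_width)
    counts = [0] * n_bins
    for t in response_times:
        if t < session_length:
            counts[int(t // bin_width)] += 1
    return [c / bin_width for c in counts]      # responses per second, per bin

pecks = [1.2, 3.5, 4.1, 9.8, 10.2, 10.9, 22.0, 23.4, 40.1]
print(rate_per_bin(pecks, session_length=60, bin_width=10))
# A rising rate across bins is read as an increasing probability of the
# single response "key peck."
```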
Q17: How does Skinner solve the technical problem of providing for the experimental
observation and interpretation of frequency of behavior?
Q18: Within this environment, how does one study the "stamping in" process?
Q19: How is the resultant data described?
COMMENT: We now proceed to a discussion of several basic operant concepts and
term definitions. This material is critical, and these concepts and definitions should be mastered
before going any further in your reading.
Q20: What is an operant?
COMMENT: It is essential that the reader recognize that this is an "effect" or functional
definition, not one which is based upon any specific topography of the response. Thus the
operant is defined in terms of what it accomplishes, and not what it looks like.
Q21: Distinguish between an operant response and a reflexive response.
Q22: What is operant conditioning?
Q23: Distinguish between operant and Pavlovian conditioning.
Q24: Distinguish between reinforcer and reinforcement.
Quantitative Properties. pp. 66-68
COMMENT: Much of the preliminary material here is a rationale for the selection of an
easily recorded response, the key peck, as an operant to be studied.
Q25: What are some of the factors which determine the quantitative effect of reinforcement?
COMMENT: This latter material is relevant to the all-or-none versus incremental theory
of learning. Does learning occur gradually as a result of repeated trials (as a "learning curve"
might indicate), or is a single reinforcement sufficient to maximally increase the probability of a
response with any lesser effect due to competing factors? Skinner puts an odd twist on the old
controversy by his observation that if learning in the rat and/or pigeon is an instantaneous shift
from low to maximum probability, then the vaunted human superiority in learning must reflect
something other than the speed of conditioning.
The Control of Operant Behavior. pp. 68-69
Q26: Once conditioned by food reinforcement, what additional source of control is obtained
over the subject?
COMMENT: Notice how Skinner makes a clear distinction between deprivation and
reinforcement. Many theorists discuss deprivation as a factor which determines the effectiveness
of a reinforcer. However, from the standpoint of a manipulable independent variable which can
be used to control behavior, deprivation is a completely separate experimental procedure from
reinforcement. Again we see the distinction between a "control-oriented" and an "explanation-oriented" approach to a science of behavior. The same distinction obviously will hold true for
stimulus control, as Skinner also implies.
Operant Extinction. pp. 69-71
Q27: Define operant extinction.
Q28: How can one observe the properties of extinction?
COMMENT: For those of you who haven't read Skinner before, you are probably
wondering about his use of the term "smooth curve." Remember that for Skinner, science is a set
of behaviors that begin with observation and result in prediction and control. In order to derive
correct conclusions about causal relationships, orderliness must be observed. Thus obtaining data
that is "smooth," or orderly, is essential.
Q29: What process can interfere with orderly extinction data?
Q30: How can this effect be eliminated?
Q31: What is the general relation between reinforcement and the number of extinction
responses?
Q32: What is more important than the simple number of reinforcers given in determining the
number of responses in extinction?
Q33: Distinguish between extinction and forgetting.
COMMENT: This latter section, "The Effects of Extinction," is a good preview of what
is to come. Skinner takes a relatively common form of human behavior (or non-behavior in this
particular case) and shows how it can be analyzed solely in terms of basic operant processes. It
also is a typical example of Skinner's style: constant paraphrase, frequent literary allusions, and
the general attitude that it is all very simple if you just know the right behavioral relationships.
What Events are Reinforcing? pp. 72-75
Q34: What is the defining characteristic of a reinforcer?
Q35: Is the above a "circular" definition?
COMMENT: Actually, the question of circularity is an issue that has been discussed for
quite a while. (Some might say ad nauseam.) Given the earlier definition of an operant as a
class of behaviors that can be specified by a physical effect upon which reinforcement is
contingent, it is clear that Skinner is defining operant in terms of reinforcement, and
reinforcement in terms of its effect upon operants. Is this necessarily circular? Or if it is, is it
necessarily bad? Schick (Journal of the Experimental Analysis of Behavior, 1971, 15, 413-423)
believes that what is involved is the empirical identification of a two-term relationship, a
perfectly "acceptable" practice. Although neither operants or reinforcers can be identified
independently of one another, their interaction clearly can be identified and represents an
environmental-behavioral relationship that is readily distinguishable from the earlier discussed
stimulus-response reflex. Whether these two potential effects of environmental events, to elicit
or to reinforce, exhaust the possibilities of environmental-behavioral interactions (as Skinner
seems to imply on p. 65) is another issue.
Q36: What two kinds of events are reinforcing?
COMMENT: Remember that positive and negative refer to the direction that the
stimulus is going when it produces an increase in responding. Both positive and negative
reinforcement strengthen behavior. It is also worth noting at this point that some recent theorists,
e.g., Premack (1965), include the opportunity to engage in high probability behaviors as
reinforcing events. That is, one can empirically determine, by counting, high and low probability
responses in any organism's behavioral repertoire. A high probability behavior will reinforce a
low probability behavior when access to the high probability behavior depends upon the
occurrence of the designated low probability response. This effect is referred to as the "Premack
Principle." Thus, some say that not only environmental stimuli but also various behaviors can
function as reinforcing events.
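The Premack-style identification of reinforcers can be stated as a simple rule over baseline response counts, as in the sketch below. The activities and counts are invented for illustration.

```python
# Illustrative sketch of the Premack Principle: a higher-probability behavior
# can reinforce a lower-probability behavior when access to it is made
# contingent on that behavior. The baseline counts are invented.

baseline_counts = {            # counts from a free-operant observation period
    "watching TV": 120,
    "doing homework": 5,
    "playing outside": 60,
}

def premack_reinforcer_for(target, counts):
    """Pick a behavior observed more often than the target at baseline."""
    candidates = [b for b, n in counts.items() if n > counts[target]]
    return max(candidates, key=counts.get) if candidates else None

print(premack_reinforcer_for("doing homework", baseline_counts))
# -> "watching TV": make TV access contingent on homework to strengthen it.
```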
Q37: How do we identify and evaluate the strength of reinforcing events in other people's
lives?
Q38: Why can't you simply ask a person what reinforces him?
COMMENT: The last paragraph touches upon a number of points, all centered around
the topic of an inherited capacity to be operantly reinforced. First, large differences between
species in the nature of effective reinforcers can reasonably be expected. Second, within species
variations are more likely to be due to individual histories than genetic differences. Finally, and
in any event, the identification of reinforcers can only be made in terms of their effect.
Conditioned Reinforcers pp. 76-81
Q39: What is a conditioned reinforcer and how is it established?
Q40: What are some factors affecting the establishment of conditioned reinforcers?
COMMENT: It is an unresolved experimental issue whether or not simple pairing is
sufficient to establish a conditioned reinforcer. Some say the neutral stimulus must first be
established as a "discriminative stimulus" (Chapter VII). Another more recent point of view is
that the conditioned reinforcer must "contain information": its presence signals the availability of
reinforcement and its absence indicates no (or differential) reinforcement. In any case, the process described by Skinner
for establishing the light as a conditioned reinforcer for the pigeon (p. 76) works, no matter how
it is analyzed.
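The pairing operation itself can be sketched as a procedure, whatever the correct theoretical analysis turns out to be. The apparatus calls and trial counts below are placeholders for illustration.

```python
# Illustrative sketch: establishing and then testing a conditioned reinforcer.
# The apparatus functions and trial counts are placeholders.

def turn_on_light():
    pass                                   # previously neutral stimulus

def deliver_food():
    pass                                   # established (unconditioned) reinforcer

def establish_conditioned_reinforcer(n_pairings=50):
    for _ in range(n_pairings):
        turn_on_light()                    # light...
        deliver_food()                     # ...followed closely by food

def test_as_reinforcer(respond, n_trials=100):
    """Present the light alone after each response; a reinforcer is identified
    only by its effect, i.e., whether responding is maintained."""
    maintained = 0
    for _ in range(n_trials):
        if respond():
            turn_on_light()                # light only; no food on test trials
            maintained += 1
    return maintained
```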
Q41: How do natural contingencies produce conditioned reinforcers?
Q42: What accounts for the apparent long delays between human activity and resultant
reinforcers?
Q43: How are these intervening conditioned reinforcers important in the practical control of
human behavior?
Generalized Reinforcers. pp. 77-81
Q44: What is a generalized conditioned reinforcer?
Q45: Why is the generalized conditioned reinforcer especially useful in the control of
behavior?
Q46: On the next few pages, Skinner describes six generalized conditioned reinforcers that are
especially important in the control of human behavior. Identify and describe the way that
each is characteristically established.
Q47: What are some difficulties involved in the effective use of attention, affection and
approval?
Q48: What are some advantages of tokens?
Q49: Why does Skinner doubt that these generalized reinforcers (except, possibly, the first one)
are reinforcing in and of themselves without being established by a conditioning process?
COMMENT: The last paragraph in this section is somewhat of an enigma. Given the
preceding analysis on the nonbiological origin of the effectiveness of generalized reinforcers,
Skinner apparently can only mean one or both of the following. First, given a sufficiently long
history of pairings with enough different reinforcers, a generalized reinforcer becomes
autonomous and no longer need be "backed up" by anything else. A second possibility might be
that a sufficiently long history of pairings establishes a conditioned reinforcing effectiveness so
strongly that it is unlikely to extinguish during the lifetime of the individual.
Why is a Reinforcer Reinforcing? pp. 81-84
COMMENT: This section provides arguments against certain "theories" about why
reinforcement works, and a discussion concerning the possible biological mechanisms underlying
the capacity to be reinforced by certain kinds of stimuli.
Q50: What is Skinner's argument against defining reinforcers as pleasant or satisfying
events?
Q51: Why can't you ask someone whether he finds an event satisfying or pleasant?
Q52: Can reinforcers be reliably defined as events which reduce deprivation?
Q53: What is the relationship between deprivation, satiation, and reinforcement?
Q54: What are some disadvantages in biologically determined reinforcers?
Accidental Contingencies and "Superstitious" Behavior. pp. 84-87
Q55: What is the only property of a contingency required to produce operant conditioning?
COMMENT: There are a couple of points here. First is Skinner's reiteration of the
irrelevance of an individual's ability to "talk about" reinforcement in order for it to be effective.
The second involves Skinner's use of the term "contingency" without any explicit definition. In
his recent article, "The Role of the Environment" (1970), Skinner states it this way, "An adequate
formulation of the interaction between an organism and its environment must always specify
three things: (1) The occasion upon which a response occurs, (2) the response itself, and (3) the
reinforcing consequences. The interrelationships among them are the 'contingencies of
reinforcement.'" In other words, a contingency is the specific description of the complete
environmental and behavioral requirements for reinforcement, the details of when and where and
how much you reinforce for what specific response.
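Skinner's three-term statement translates directly into a small data structure, sketched below with invented field names and values: the occasion, the response, and the reinforcing consequence, plus a check of whether a given episode meets the requirements.

```python
# Illustrative sketch: a contingency of reinforcement as the three-term
# specification quoted above. Field names and example values are invented.

from dataclasses import dataclass

@dataclass
class Contingency:
    occasion: str        # the occasion upon which a response occurs
    response: str        # the response itself
    consequence: str     # the reinforcing consequence

    def satisfied_by(self, occasion, response):
        """Does this episode meet the requirements for reinforcement?"""
        return occasion == self.occasion and response == self.response

green_key = Contingency(occasion="green keylight on",
                        response="key peck",
                        consequence="3-s access to grain")

if green_key.satisfied_by("green keylight on", "key peck"):
    print("deliver:", green_key.consequence)
```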
Q56: Define and give an example of superstitious behavior.
Q57: In what sense does superstitious operant behavior represent a miscarriage of the
conditioning process?
COMMENT: Notice how Skinner carefully distinguishes between the origin of
superstitious beliefs, as resulting from accidental reinforcement, and their current practice, which
is the result of deliberate transmission primarily through verbal behavior.
Goals, Purposes, and Other Final Causes. pp. 87-90
COMMENT: There seem to be two important points in this section. The first is an
operant analysis of what is meant by goals, purposes, etc. The second is the way in which this
analysis is accomplished. Notice again how Skinner does not simply say "there are no final
causes" but rather goes to considerable length to point out how such ideas might have
erroneously arisen. It is one thing to have a theory; it is another to show how yours can account
for someone else's.
Q58: How do you rephrase a statement which implies that behavior is occurring because of
what will follow?
Q59: Is purpose a property of behavior?
Q60: If asked why he is in school, a student might reply, "to get a degree," or "to obtain
knowledge." How could this be reanalyzed more operantly?
MISSING: QUESTIONS 61, 62, 63, 64 and ANSWERS 61, 62, 63, 64.
VI. SHAPING AND MAINTAINING OPERANT BEHAVIOR
The Continuity of Behavior. pp. 92-95
Q65: What is meant by "reinforcing a series of successive approximations"?
Q66: How is the pigeon's key peck response different from the usual operant behavioral
objectives?
COMMENT: This material concerning what constitutes the basic elements of behavior is
rather difficult. There seem to be several points: First, that operants are functional units that
can be counted, but we shouldn't forget the underlying continuity of behavior. If we do, pseudo-problems involving transfer and response induction can arise. Skinner believes that the "true
behavioral element" that is strengthened by reinforcement is carried best by the concept of a
behavioral "atom." Presumably the point is that operants or "acts" are analogous to molecules,
which can provide a valid level of understanding, but that there is also an underlying, but yet
incompletely understood, level of behavioral analysis involving the elements of which these acts
are constructed. The rationale for this perspective lies in the difficulty operant conditioners have
in talking about the shaping process. As yet there are no quantitative statements that can
effectively be made concerning how one goes about shaping responses other than to "reinforce
successive approximations" (and then practice a lot). We can push the "molecules" all over the
place, but we construct them intuitively.
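The "reinforce successive approximations" recipe can at least be caricatured in code, as below: reinforce any response that meets the current criterion, then move the criterion toward the target. The response measure, step size, and simulated subject are invented; as noted above, no genuinely quantitative shaping rule exists.

```python
# Illustrative caricature of shaping by successive approximation: reinforce
# responses that meet the current criterion, then raise the criterion toward
# the target. The response measure, step size, and toy subject are invented.

import random

def shape(emit_response, reinforce, target=100.0, step=0.5, trials=1000):
    criterion = 10.0                              # start near baseline behavior
    for _ in range(trials):
        if emit_response() >= criterion:
            reinforce()                           # strengthen this approximation
            criterion = min(target, criterion + step)
        if criterion >= target:
            break
    return criterion

level = 10.0                                      # a toy subject's response strength
def emit_response():
    return random.gauss(level, 3.0)
def reinforce():
    global level
    level += 1.0                                  # reinforced approximations grow

print("criterion reached:", shape(emit_response, reinforce))
```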
Differential Reinforcement. pp. 95-98
Q67: What kind of contingency improves "skill?"
COMMENT: As Skinner points out, this is a section discussing how one improves
topography, the "shape of a response", not how one establishes an operant class in the first place.
The operant set of responses is already available; differential reinforcement simply refines this
class toward a more effective form by variations in reinforcement (differential reinforcement)
depending upon specific response characteristics.
Q68: How could you condition a child to be "annoying?"
The Maintenance of Behavior. p. 98
Q69: Classic learning theories fail to treat what fundamental issues?
Intermittent Reinforcement. pp. 99-106
Q70: How might one distinguish responses which act upon the physical environment
from responses which have social effects?
Q71: What characteristics of operants that are intermittently reinforced make them useful in
social control systems?
Q72: What are two primary ways of establishing various schedules of reinforcement?
COMMENT: You should remember that those schedules arranged primarily by
conditions prevailing outside the organism, such as time, are not independent of the organism's
behavior. That is, reinforcement isn't delivered just because time has passed --- that would be
called free reinforcement. Instead, the organism has to respond after a fixed (or variable) amount
of time has elapsed before reinforcement is forthcoming.
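The point of the comment, that on an interval schedule the passage of time only makes reinforcement available and a response is still required, can be shown in a few lines. The response times and interval value below are invented for illustration.

```python
# Illustrative sketch of a fixed-interval (FI) schedule: time elapsing makes
# reinforcement available, but it is delivered only for a response emitted
# after the interval. Response times and the interval value are invented.

def fixed_interval(response_times, interval):
    """Return the times at which reinforcement is delivered under FI."""
    deliveries = []
    available_at = interval                 # first interval starts at t = 0
    for t in sorted(response_times):        # each t is a response time (seconds)
        if t >= available_at:               # interval elapsed AND response emitted
            deliveries.append(t)
            available_at = t + interval     # timer restarts at reinforcement
    return deliveries

pecks = [5, 20, 35, 61, 62, 90, 125, 130, 190]
print(fixed_interval(pecks, interval=60))   # -> [61, 125, 190]
```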
Q73: In the following several pages, a number of standard schedules of reinforcement are
described. Be able to describe how each of the following schedules is programmed, and
the resulting behavior; fixed interval, variable interval, fixed ratio, and variable ratio.
COMMENT: A number of other issues contained in this section are worth mentioning,
some involving social considerations. First, notice how Skinner points out that the weekly wage
is a more complex situation than a simple fixed interval schedule (p. 101). This particular
situation has been analyzed in considerable detail by Michael (1969). Second, the analysis of
fixed ratio schedule as analogous to human piecework provides a perspective on why labor
unions were formed to prevent industry from constantly "upping the ratio."
There are some theoretical issues as well. The analysis of post-reinforcement pausing in
both fixed interval and fixed ratio schedules as resulting from discriminated non-reinforcement
intervals is important. One misstatement involves the pattern of fixed ratio responding. The
shift from nonresponding to responding is not a smooth gradient of acceleration as Skinner states
on p. 103, but rather a relatively abrupt shift from nonresponding to a rather high rate.
Q74: MISSING
A74: One on which reinforcement may be determined by either an interval or a ratio
schedule, or both.
Q75: What is a combined schedule?
A75: MISSING
COMMENT: As Skinner states these are not simple schedules. Some recent research
has elaborated a number of distinguishing features of several possible combinations. Some of
these schedules are quite well understood. Differentially reinforcing high rates (DRH) and
differentially reinforcing low rates (DRL) have quite simple effects: they produce what they
reinforce. Recent efforts have also produced schedules which tightly control the interresponse
times. One important area of schedule analysis which followed the publication of this text
involves the multiple response-reinforcement relationships called concurrent schedules of
reinforcement. For a detailed background in the role of reinforcement schedules in behavioral
control, two important sources should be consulted: Ferster and Skinner, Schedules of
Reinforcement (1957), and The Journal of the Experimental Analysis of Behavior.
VII. OPERANT DISCRIMINATION
Discriminative Stimuli. pp. 107-110
Q76: Define and describe the three-term contingency that produces discrimination.
COMMENT: In differentiating the discriminative operant from the conditioned or
unconditioned reflex, which shows a similar stimulus-response pattern, Skinner uses the term
emitted to describe the production of the operant and elicited to describe the occurrence of the
reflex.
Q77: What form of behavior control does discrimination provide?
Q78: Give examples of operants under the control of the physical environment, the social
environment, or verbal stimuli.
Q79: How do we use the discrimination process to control behavior, in addition to directly
altering probabilities (as in Q77)?
Voluntary and Involuntary Behavior. pp. 110-116
COMMENT: This section involves analyses of two traditional ways of distinguishing
between operant and respondent behavior; voluntary versus involuntary, and striped muscles
versus smooth muscles and glands.
Q80: What were three historic criteria used to distinguish the reflex from other behavior?
Q81: Why can the use of "who is in control" no longer be considered a distinguishing feature
between operants and respondents?
Q82: Can the distinction between operant and respondent be based upon the issue of control
versus lack of control?
COMMENT: In the preceding section, Skinner elaborated the distinction between the
conditioning procedures that establish conditioned reflexes and discriminative operants. In this
section, he refers to another distinguishing feature, quantitative relations. A respondent shows a
correlation between increased intensity of eliciting stimuli and increased magnitude and
decreased latency of the resulting respondent. These relations do not hold with the
discriminative operant.
Q83: MISSING
A83:
Q84: Why has the concept of will survived longer in the interpretation of operant behavior than
in the interpretation of respondent behavior?
COMMENT: Skinner now has included "will" as an inner agent. He argues against such
a construct on the same two grounds: it is impractical as a source of control, and it simply raises
another agent to be accounted for.
Q85: Can a distinction between respondent and operant behavior be based upon musculature?
Q86: Can reflexes be operantly conditioned?
COMMENT: This position is not quite so clear-cut as Skinner stated it. The research
area of biofeedback seems to support a contrary notion, at least in some cases. Nevertheless, it
still is important to distinguish between "true" reflexes and discriminated operants which
resemble them, e.g., operant vs. respondent crying in children.
Q87: What is Skinner's analysis of the social roles of the concept of personal responsibility?
Q88: How might such objectives be better obtained?
COMMENT: Obviously, the final paragraph in this section directly anticipates Beyond
Freedom and Dignity (1971).
Discriminative Repertoires. pp. 117-122
COMMENT: Skinner is suggesting that any realm of a discriminative stimulus (field)
can be broken down into smaller units to compose a continuous field, just as any operant can be
reanalyzed into behavioral elements or "atoms" (p. 94), thus emphasizing the underlying
continuity of the environment and behavior. Any potential discriminative operant can be
represented by a "point-to-point" correspondence between the discriminative stimulus field and
the units of the operant. In such a case, the "functional units" would be each field element
functioning as a discriminative stimulus for each behavioral element. Ordinarily, however, the
functional units are not so small, since point-to-point correspondence is rarely if ever established.
Q89: How is a skilled copyist distinguished from an unskilled drawer of lines?
A89: A history of differential reinforcement has provided a much larger repertoire of
responses to lines in the skilled copyist than are available to the unskilled or novice
copyist.
COMMENT: Skinner uses the example to represent many of the possible degrees of
correspondence between the stimulus field and the discriminative operant. A point-to-point
correspondence is reached when an exact copy can be reproduced. Near, but imperfect,
correspondence is represented by "individual style." The electrical engineer can emit only
discrete units to certain stimuli, for example, stereotyped symbols of resistors, batteries, etc.
Q90: What two sources of differential reinforcement are available to train the skilled copyist?
COMMENT: This issue of automatic self-reinforcement is a little complex. Skinner's
point is that if one has been taught to discriminate good copying from bad, one can react
appropriately to his own efforts as well, thus "reinforcing" himself when his copying is good.
Many of us learn to recognize "good" behavior long before we can emit it consistently, for
example, golf, and thus when we do hit a good shot, we are likely to tell ourselves so. The next
issue is in what sense praising ourselves can function as a reinforcer. Skinner's position is that
once praise has been established as a conditioned reinforcer from its prior history of being paired
with other reinforcers, it still functions as a conditioned reinforcer when self-administered. The
effect of reinforcement is independent of who administers it. An extended analysis of this
process is available in Verbal Behavior (1957, pp. 438-446).
Q91: Describe the repertoire of the singer with "poor pitch."
Q92: Give an operant definition of imitation.
Q93: What is an inverse imitative repertoire? Give an example.
COMMENT: The establishment of an imitative repertoire has become a standard
practice with current behavior modification psychologists. It is easier to shape a subject (for
example, a retarded child) to imitate, and then teach new behaviors imitatively, than it is to
try to "shape" each new behavior separately.
Attention. pp. 122-125.
Q94: What is attention?
Q95: How can we tell whether or not someone is "paying attention?"
COMMENT: This is an important section, inasmuch as it makes clear use of Skinner's
functional approach to a behavioral analysis. Notice how Skinner rules out "receptor orientation",
a topographical description, as a definition of attention, and relies, instead, upon the
discriminative relationships between the stimulus and a response. Consider this approach the
next time you read an article involving the conditioning of "study" behavior or "attentive"
behavior in the classroom.
Temporal Relations Between Stimulus, Response, and Reinforcement. pp. 125-128
Q96: Which characteristics of the natural environment are responsible for the occurrence of a)
respondent conditioning, b) operant conditioning, and c) operant discrimination?
Q97: What is the role of temporal relations in each of the above?
COMMENT: Notice that Skinner is not discussing what is commonly called "delayed"
reinforcement, either in the respondent or operant case. Instead, he is talking about the temporal
specification as part of the contingency, such that the maximum probability of response occurs
within a given stimulus situation after a specific period of time has elapsed.
Q98: What are the behavioral effects of waiting for a delayed consequence following a
discriminative stimulus (often called "expectancy" or "anticipation")?
COMMENT: Actually, only two behavioral processes are involved, respondent
conditioning being responsible for (a) and (c) and operant conditioning for (b) and (d). Skinner
extends the analysis to account for different kinds of conditioned behavior so far as they have
commonsense labels, for example, attention, anxiety, etc.
VIII. THE CONTROLLING ENVIRONMENT
The Importance Of The Environment. pp. 129-130
COMMENT: This section makes two rather important points: First, it suggests the
potential complexity of behaviors under various combinations of conditioning histories and
stimulus events, as indicated by the "X seeing Y" example. Second, Skinner implies that current
practices in clinical psychology are inadequate in this regard, a theme which will be greatly
expanded later in the book.
The Analysis of Stimuli. pp. 130-131
Q99: How do we begin to study the effects of the physical environment?
Q100: What about physical events that are undetectable by the organism?
Q101: Are all stimulus detection problems simply functions of receptive physiology?
COMMENT: The last paragraph suggests the generality of processes across stimulus
dimensions that Skinner sees in several fundamental areas of behavioral control. Earlier, in his
rationale for the study of lower organisms, he stated that important behavioral relationships are
generalizable across species. Essentially, he is implying that if we study the effects of visual
stimuli on pigeons, we will begin to understand discriminative stimulus control of human
behavior.
Induction. pp. 132-134
Q102: What is stimulus generalization (or induction)?
Q103: How does one account for this effect?
Q104: What are some human situations which demonstrate this process?
COMMENT: For those of you who view Skinner and Freud to be irreconcilable, prepare
yourself for a shock. Skinner cites Freud more often than he cites any other psychologist, and
frequently quite favorably. It is not Freud's observations but rather his "inner agents" that
Skinner disagrees with.
Q105: How does one empirically assess stimulus generalization?
COMMENT: Obviously, this is a very brief treatment of an extensive research area.
Examples of recent operant research on stimulus generalization can be found in a number of
sources, including the Journal of the Experimental Analysis of Behavior or the appropriate
section of Catania, Contemporary Research in Operant Behavior (1968) or Verhave, The
Experimental Analysis of Behavior (1966).
Discrimination. p. 134
Q106: How does Skinner explain the process of generalization, both in terms of what it is and
what it isn't?
Q107: How does one establish a discrimination? Give an example.
Abstraction. pp. 134-136
Q108: What is an abstraction, and how is it established?
Q109: In what sense is abstraction not a natural process?
COMMENT: This is an especially important section since it suggests one of the
important processes underlying Skinner's approach to science as behavior. Verbal abstractions
have isolated many important and subtle properties of nature (for example, "change"), which
probably could not have been identified without the mediation of a verbal community. It is this
refining of concepts in relationships that describe our physical and behavioral world that is the
foundation of empirical science. (Hint: etymology, the study of historical linguistic change,
especially as applied to single words.)
Some Traditional Problems in Stimulus Control. pp. 136-140
Q110: What is cross modal induction?
Q111: What are several possible behavioral accounts for this phenomenon?
COMMENT: The example here is of little help if you don't know Butler, Handel or what
the Wetterhorn is. The process, however, is a relatively important one, as it involves covert
operant mediating behavior. Notice that Butler never said aloud the word "shoulder". It
mediated a visual impression and resultant humming behavior. In accounting for somewhat
similar phenomena, others, for example Staats (1963), have relied upon classical conditioning as
the underlying process. However, the word, "shoulder" is clearly an operant. The analysis of
such covert operant responding will be more extensively considered in Section III.
Q112: Why might a pigeon conditioned to peck a 5" disk rather than a 3" disk when the two are
presented together then peck the 7" disk when presented with a 7" disk and a 5" disk?
COMMENT: Two points: First, notice how Skinner has extended the concept of
stimulus. With the earlier account of stimulus elements sharing control, he has freed the concept
of the stimulus from a particular physical event or a single dimension. It now has become a
combination of events which differentially control probability of responding. Critics of Skinner's
extension of basic principles to human behavior often misunderstand his use of the
discriminative stimulus concept (for example, Chomsky, 1957). Second, Skinner suggests that
discriminated operants, such as the pigeon's response to a relationship, are learned
discriminations, not natural ones. That is, pigeons may be trained to respond either to size or
relationship between sizes. In the natural environment, relationships may be more important, and
thus pigeons may initially respond on this basis, but this history of conditioning can be
experimentally reversed.
Q113: What is an "interpreted" stimulus?
Q114: Distinguish functionally between "sensing" and seeing, perceiving and knowing.
COMMENT: This treatment of seeing as behavior has a number of ramifications which
are more fully explicated in Chapter XVII.
IX. DEPRIVATION AND SATIATION
Q115: What unwarranted extension of the concept of a stimulus followed discovery of stimulus
controlled behavior?
Deprivation. pp. 141-143
Q116: What are the effects of deprivation of water on the probability of drinking?
COMMENT: Skinner reiterates an evolutionary "explanation" for the effects of various
kinds of deprivation upon the probability of responses which alleviate the deprived condition.
Some critics have accused Skinner of appealing to "conceptual evolutionary principles" in the
same way that he has accused others of appealing to a "conceptual nervous system."
Q117: What are some disadvantages with the concept of homeostasis?
Q118: Must "deprivation" concern itself with ingestion exclusively?
COMMENT: The logic here clearly anticipates Premack's research.
Q119: Does deprivation affect only a single response?
Needs and Drives. pp. 143-146
Q120: Under what conditions do inner events such as needs or wants add nothing to the
functional account?
Q121: How is "drive" legitimately used as a term in scientific discourse?
COMMENT: Skinner here is providing a rationale for the use of "drive" as what is
commonly called an intervening variable. Notice that he precludes the necessity of treating such a
variable as a "real" mental or physiological state (i.e., a hypothetical construct). A detailed account of hunger as a
drive is available in Behavior of Organisms (1938, pp. 341-378).
Q122: Why is a drive not a stimulus?
Q123: Why is a drive not a physiological state?
Q124: Is a drive a psychic state?
COMMENT: Skinner isn't ruling out of consideration the problem of what you "feel",
but simply delays his discussion of the problem until Chapter XVII.
Q125: Is a drive a state of strength?
The Practical Use of Drives. pp. 146-148
Q126: Be able to give examples involving the use of deprivation and satiation in the practical
control of human behavior.
Q127: How are the effects produced by operant reinforcement different from those described
above?
COMMENT: This section and the following one are a little unusual in the sense that
after all the trouble Skinner went to in elaborating the appropriate use of drives as intervening
states, he is now pointing out their relative uselessness as constructs.
Some Questions Concerning Drive. pp. 148-154
Q128: What two questions are implied by the question "how many drives are there?"
COMMENT: Skinner is pointing out that a question involving "drive" may be asked
either in terms of dependent or independent variables, and that a question involving the
intervening state is inappropriate.
Q129: What is the relationship between the effect of operant reinforcement and deprivation?
COMMENT: The effects of deprivation level on extinction responding referred to on pp.
149-150 are reported in Behavior of Organisms (1938, pp. 379-405). There is, obviously, a direct
effect of deprivation on responding: for the same schedule of reinforcement, higher deprivation
levels produce higher response rates (e.g., Clark, 1958).
Q130: What is the relationship between deprivation and conditioned reinforcers?
Q131: What is necessary to demonstrate an autonomous drive associated with a generalized
conditioned reinforcer such as attention, affection, etc.?
COMMENT: Notice that a strong reinforcing effect is not sufficient to justify a separate
drive. That would require the associated operation of deprivation or satiation. Presumably this is
because a conditioned reinforcer is effective even when the back-up reinforcer does not follow
each occurrence of the conditioned reinforcer. Effectiveness of the conditioned reinforcer will
still be maintained by only occasional pairings.
Q132: In what sense can chemical agents such as alcohol, morphine, etc., be called "acquired
drives"?
COMMENT: The issue raised by the example of "sublimation" involves the concept of
response and/or stimulus induction. Certain behavioral features of raising children are shared by
the behavior of raising and caring for pets. Operations which strengthen the probability of child
raising (e.g., T.V. campaigns, magazine articles, etc.) may also strengthen the behavior of raising
pets through this process of induction or commonality of certain features. If the strengthening
operation is one of deprivation, the induced response is strengthened, but probably won't produce
a reduction in deprivation. Why you should observe an increase in strength of the induced
response instead of the behavior appropriate to the deprivation condition (e.g., when the
appropriate behavior is "sublimated") is discussed further in Chapter XXIV.
Q133: How can questions involving "interrelated drives" be experimentally tested?
Q134: Is either the sex drive or the drive to dominate seen as primary in light of the above?
Q135: How can you experimentally assess the relative strength of drives?
Q136: What are some of the experimental difficulties involved in such questions?
Time as a Variable. pp. 154-156
Q137: In what sense can time be used as an independent variable?
Q138: How is control obtained over behavior which displays this type of periodicity?
Q139: Is time alone the relevant independent variable in certain cases of annual cycles, such as
migration?
COMMENT: You should note that the "mere passage of time" can never itself be the
exclusive independent variable. At some point, "time passing" must contact the organism.
Q140: How can predictions be made about behavior which reflects maturational processes?
Q141: What practical problem does this produce?
The Individual and the Species. pp. 156-157
Q142: As an account for behavior, why is "instinct" described as an explanatory fiction?
COMMENT: Skinner's observation that behavior is as much a part of the organism as
are its anatomical features, and is describable with respect to species status, would probably
startle many ethologists who doubt contemporary behaviorists' capacity to deal effectively with
species specific behavior. Their arguments usually center around the species specificity of
certain food-getting behaviors in rodents and the lack of justification for generalizing information
gleaned from observing this particular behavior. Skinner's position is that a shared behavioral
process justifies such extensions, such as operant behavior's capacity "to be reinforced." Thus
arbitrary operants (like lever pressing) reveal underlying common properties (like
reinforceability). Of course, other processes may be common to behavior as well, such as
discrimination, deprivation and satiation, etc.
Summary. pp. 158-159
COMMENT: This summary section is interesting in a couple of ways. First, it is the
only chapter summarization in this book. Second, of the list of seven potential questions
concerning factors which can determine the probability of responding, only the last has received
much attention as an operant research area, although (2) has resulted in some research, e.g.,
young versus old rats, etc. By and large, however, operant research has been devoted almost
exclusively to the factors Skinner describes in other chapters and refers to here in the last
paragraph: reinforcement, emotion, aversive stimulation, and punishment.
Q143: Why have so many of these variables apparently been ignored?
A143: a) From a research perspective, they can be relatively easily stabilized or "held
constant" and then more powerful variables can be explored.
b) From a perspective of deriving practical behavioral control techniques, they are
relatively trivial in comparison with other more manipulable independent variables.
X. EMOTION
What Is An Emotion? pp. 160-162
Q144: Why is an emotion, as it is characteristically used, an example of an explanatory fiction?
Q145: What are some of the difficulties in identifying emotions with (a) internal responses of
the smooth muscles and glands, (b) common expressions of facial and postural muscles?
COMMENT: You might observe that particularly in the case of internal measures, some
psychologists give an emotion an "operational" definition simply by calling it a certain pattern of
responding, e.g., a GSR of such and such value equals "anxiety."
Emotion as a Predisposition. pp. 162-163
Q146: How does Skinner define emotion?
The Responses Which Vary Together in Emotion. pp. 163-164
COMMENT: First, observe the caveat in the first paragraph. It is an example of how
Skinner uses common sense terminology in a way that is often confusing to readers. This is a
chapter on "emotions" containing words like "joy," "fear," and "anger," and yet it is a curious
blend of behavioral analysis and lay terminology. Skinner does define the term emotion to
eliminate it as an explanatory fiction (as he did with the word "drive"), but he then goes on to
identify what most people mean when they use these words.
Second, Skinner is unclear about the factors which "cluster" certain behaviors together
when the organism is said to be behaving emotionally. Presumably they do so in part because of common consequences; these consequences may have produced the behavior evolutionarily, in which case the behavior today is primarily respondent, or they may be functioning as contemporary reinforcers shaping current behavior, or both, as in the case of
"angry" behavior.
Emotional Operations. pp. 164-166
Q147: Does "frustration" produce "rage?"
Q148: In what sense do drives and emotions overlap?
COMMENT: The last sentence in this section may well have been the title of this book.
What Skinner is obviously talking about is an attempt to "force" what is commonly observed
about human behavior into a conceptual framework involving certain well established
environmental-behavioral relationships. The objective? To be able to understand better (i.e.,
talk effectively about) and alter (i.e., predict and control) human behavior.
The Total Emotion. pp. 166-167
Q149: How would one go about defining an emotion?
COMMENT: Notice that in the last three sections, Skinner has provided behavioral
interpretations of loneliness, nostalgia, an employee angered by criticism, and several phobias.
Emotions are not Causes. pp. 167-169
Q150: What is the proper subject matter of emotion?
COMMENT: Much of this material simply restates and extends the earlier comments
concerning emotion as a conceptual second link, not necessarily a psychic or physiological one, which is best interpreted as a "predisposition" to act in certain ways. When one starts talking about "predispositions to predispositions," e.g., "moods" or "dispositions," one is talking
probabilistically about probabilities. You might have observed by now that Skinner provides a
behavioral interpretation for more words than most people know.
The Practical Use of Emotion. pp. 169-170
Q151: How can emotional responses that are primarily respondent in nature be controlled?
Q152: How can larger categories of behavior predispositions be altered?
COMMENT: The last couple of paragraphs are obviously more "forced" than the earlier
material. Individual behavioral histories clearly play a major role, since prior conditioning is
the key factor determining reactions to certain words.
XI. AVERSION, AVOIDANCE, ANXIETY
Aversive Behavior. pp. 171-174
Q153: What is an aversive stimulus?
Q154: What are the physical characteristics of aversive stimuli?
Q155: What is escape behavior?
Q156: How do you study behavior under the control of aversive stimuli?
Q157: What are some of the practical advantages and disadvantages of the use of aversive
stimuli in the above manner?
COMMENT: The particular disadvantage that Skinner cites, namely that an aversive event produces emotional behavior (emotional respondents elicited by the aversive event, or some unanalyzed blend of operants and respondents called "predispositions" in Chapter X generated by the aversive event), is a rather controversial issue in contemporary behavior modification practice. These "side effects" of using aversive stimuli to control behavior are used by many to argue against utilizing aversive control. (Skinner will give many of these arguments later in this book.) Others feel that these side effects, if they exist at all, don't necessarily rule out aversive control as a legitimate behavior modification technique (e.g., Baer, Psychology Today, October, 1971).
Q158: What is aversive behavior?
Q159: Is deprivation an operation equivalent to presenting an aversive event?
COMMENT: Notice again that Skinner cites the evolutionary advantages of being
reinforceable by the withdrawal of certain conditions.
Q160: What is a conditioned aversive stimulus?
COMMENT: This analysis of tobacco and alcohol cures isn't too clear. What is intended is that some substances produce nausea, which is an aversive state whose removal is reinforcing. This nausea-inducing capacity can be transferred to other substances, like tobacco and alcohol, by classical conditioning. Then tobacco and alcohol can also produce nausea, and behavior which reduces these stimuli is strengthened. Vomiting may be one of these responses but, obviously, stopping smoking and drinking are others.
The Practical Use of Aversive Stimuli. pp. 174-175
COMMENT: Presumably Skinner means that we use negative reinforcers in several
different ways. Negative reinforcement would refer only to the operation of removing an aversive
stimulus contingent upon a response. He refers herein to other procedures as well.
Q161: List several ways that aversive stimuli can be used to control behavior, and give an
example of each.
COMMENT: Notice that Skinner is using what is called a "two-process" explanation for
the reduction or weakening of certain behaviors. Earlier he described the reduction of smoking
and drinking as a means of escape from conditioned nausea and now he is talking about escaping
the stimulation that results from engaging in a behavior that has been paired with social
disapproval, which is also a case of response reduction. Much more of this will be discussed in
Chapter XII.
Avoidance. pp. 176-178
Q162: What is avoidance behavior?
Q163: What is the "reinforcer" in maintaining avoidance behavior?
Q164: What are the practical consequences of successful avoidance?
COMMENT: Both in this section and in the preceding one, Skinner refers to the removal
of a positive reinforcer as definitionally equivalent to the presentation of a negative one. Strictly
speaking, that is not precisely the case. Skinner demonstrates that presenting some events
(positive reinforcers) and removing others (negative reinforcers or aversive stimuli) will
strengthen behavior. These events, however, are defined in terms of their effects on behavior.
That the opposite operations of removing positive reinforcers and presenting negative reinforcers
will have the same or similar effects is nowhere stated. True, Skinner calls these operations
"punishment," but this is not a definition in terms of behavioral effect as is the definition of
reinforcement. Of course other authors have defined punishment in terms of behavioral effect,
both historically (Thorndike) and more recently (Holz & Azrin, 1965). But for Skinner, any statement that removing a positive reinforcer will have the same effect as presenting a negative one would have to be demonstrated, since it will not necessarily be true by his definitions alone.
Anxiety. pp. 178-180
Q165: What is anxiety?
Q166: Is escape from anxiety equivalent to avoiding the event responsible for the conditioned
aversive stimuli which produce the anxiety?
COMMENT: Remember that for Skinner an emotion is nothing more than a conceptual
reference for a relationship between certain environmental operations and the resultant
respondent and operant behavior.
Anxiety and Anticipation. p. 180
Q167: What is anticipation?
Q168: How is anticipation behaviorally contrasted with anxiety?
Anxiety Not a Cause. pp. 180-181
Q169: How can one reduce the effects of anxiety?
COMMENT: In 1941, Estes and Skinner wrote a paper entitled "Some Quantitative
Properties of Anxiety" (Journal of Experimental Psychology, 1941, 29, 390-400.) in which they
describe the effects of a certain experimental procedure designed to produce "anxiety." Their
method first established operant performance on an FI 4 minute schedule, then superimposed a
tone followed 3 minutes later by an unavoidable shock several times each session. The result
was a gradual but ultimately almost complete cessation of responding during the tone. This
effect, later called the "conditioned emotional response" (CER), or sometimes "conditioned
suppression," has been frequently utilized as an experimental procedure, and many parameters
have been studied.
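One commonly used quantitative index of this effect in later work on conditioned suppression (it is not given in this guide or in Skinner's text, and the symbols A and B are introduced here only for illustration) is the suppression ratio, where A is the number of responses in a period immediately preceding the tone and B is the number of responses during the tone:

$$ \text{suppression ratio} = \frac{B}{A + B} $$

A value near 0.5 indicates no suppression (equal responding in the two periods), while a value near 0 corresponds to the nearly complete cessation of responding during the tone that Estes and Skinner reported.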
Using an opposite procedure to investigate anticipation has only recently received much attention. An operant base line is established, then a neutral stimulus is followed by a free positive reinforcer, in the same way that a tone is followed by a shock in the CER procedure. Results have been equivocal, with some authors finding suppression (Azrin & Hake, 1969) while other research has produced heightened responding (Henton & Brady, 1970). Several factors,
including particular base line schedule, nature and degree of conditioned and unconditioned
stimuli, and temporal parameters, are all implicated as factors determining the overall outcome of
these kinds of procedures.
XII. PUNISHMENT
A Questionable Technique. pp. 182-183
Q170: What is the objective of punishment?
Q171: Why is this a questionable technique?
Q172: What aspect of punishment causes this disadvantage?
Does Punishment Work? pp. 183-184
Q173: Why did Thorndike conclude that punishment didn't "stamp out" behavior?
COMMENT: This revision of the original Law of Effect is referred to as the "Truncated
Law of Effect" and is a popular piece of psychological trivia, frequently found in comprehensive
examinations.
Q174: What is the effect of punishment on extinction responding in animal research?
COMMENT: This is, possibly, a weak position. The theoretical formulation hinges upon Skinner's original concept of the "Reflex Reserve," a hypothetical model whereby reinforcement "stores up" responses and extinction exhausts them, somewhat in the same manner that water is stored in and drained from a water tower (Behavior of Organisms, 1938, pp. 26-28). He almost
immediately retracted the concept (formally in a paper delivered at APA) and later observed that
as a theory it was "utterly worthless in suggesting further experiments" (Cumulative Record,
1961, p.88). Operationally, however, it can be viewed simply as a predicted extinction curve and
it is in this latter sense that Skinner is discussing the effects of punishment.
The experimental evidence cited is based on Estes' (1944) "bar slap" study of the effects
of punishment, wherein the first few extinction responses were "punished." As described, the
response rate recovered fully following the cessation of punishment and the predicted total
number of extinction responses ultimately occurred. However, more recent research (e.g.,
Rachlin, 1966) clearly demonstrates that extinction curves are considerably smaller under
continued punishment. The temporary effect of punishment is seen as no different from the similarly temporary effect of reinforcement: both must be continued in order to remain effective.
The Effects of Punishment. pp. 184-185
Q175: How are positive and negative reinforcers defined?
Q176: How is punishment defined?
COMMENT: Recently a more generally accepted operant definition of punishment has been in terms of its behavioral effects: "... a consequence of behavior that reduces the future probability of that behavior" (Azrin & Holz, 1966). However, that is not to say that Skinner himself necessarily buys that approach. In 1965, Skinner stated, "punishment does not merely cancel reinforcement; it leads to a struggle for self-control which is often violent and time consuming" (The Environmental Solution, in Contingencies of Reinforcement, 1970, p. 52). A similar position is stated in Evans, R.I., B. F. Skinner: The Man and His Ideas (1968, pp. 33-34).
A First Effect of Punishment. p. 186
Q177: What is the first effect of punishment?
Q178: In what sense is this a temporary effect?
A Second Effect of Punishment. pp. 186-188
Q179: What is the second effect of punishment?
Q180: Can this incompatible behavior be evoked by other events?
A Third Effect of Punishment. pp. 188-190
Q181: What is the third effect of punishment?
COMMENT: This concludes Skinner's analysis of the effects of punishment. In essence,
the key effect of punishment is the establishment via conditioned negative reinforcement of
behavior incompatible with the previously punished responding, e.g., the avoidance behavior of
"doing something else." Punishment, then, doesn't eliminate behavior, instead it establishes the
conditions for the acquisition of replacement responding. As in the comment above, frequently
this behavior can be described as learning "not to respond" or "self-control." It is active
behavior, however, not simply the vacuum that would be left if punishment had the effect of
successfully removing a response from the organism's repertoire of behavior.
Q182: What happens when you punish someone for not doing what he is supposed to do?
COMMENT: It is in this paradigm that this account is weakest. The experimental parallel, of course, is Sidman (or nondiscriminated) avoidance, where the organism is shocked for not pressing the bar. Only by responding is punishment avoided. From Skinner's point of view, all behavior other than bar pressing must be capable of eliciting (or actually becoming) conditioned aversive stimuli. The subject learns to bar press because all other possible responses generate aversive stimulation. That means that the rat could not learn to press the bar until everything else he could do had been paired with the shock. Well, rats learn to bar press far too rapidly for that to be a completely plausible explanation. (See Herrnstein, or Anger, for a more detailed explanation of this particular issue.)
Some Unfortunate By-Products of Punishment. pp. 190-191
COMMENT: Notice how Skinner avoids the issue of elicited aggression. The problem,
of course, is that some species clearly demonstrate behavior that appears to be a reflexive
reaction to pain, while others don't seem to have any similar fixed responses.
Q183: What are the by-products of the use of punishment?
COMMENT: This is an important section in that it clearly reflects Skinner's special
concern about the use of punishment. In the first paragraph, he suggests several reasons for not
using punishment: (1) It is only temporarily effective. (2) It reduces the group's overall
effectiveness. (3) It makes people unhappy. Even if, as some would argue today, punishment can be effectively used to eliminate behavior permanently, have these other concerns been similarly eliminated?
Alternatives to Punishment. pp. 191-193
Q184: Be able to list the several alternative ways of eliminating behavior other than punishment.
COMMENT: Contrary to Skinner's hopes in the last paragraph, the recent research on
punishment has probably led more toward effective utilization of aversive stimuli than to the use
of these alternatives.
XIII. FUNCTION VERSUS ASPECT
Q185: Are traits descriptive of specific responses?
What Are Traits? pp. 195-199
Q186: What are the equivalents of traits in a functional analysis?
Q187: List, with an original example, each of the several behavioral differences resulting from
independent variables that can give rise to trait names.
COMMENT: It seems to me that Skinner omitted a rather critical set of differences,
those with respect to the independent variables of conditioned reinforcement. This could be with
respect to the degree to which certain generalized reinforcers control behavior, e.g., sociable vs
retiring, boastful vs modest, avaricious vs indifferent to wealth. Or, the difference could be with
respect to whether or not certain conditioned reinforcers have even been established; e.g.,
scholarly, sports loving, "gay" as in homosexual, etc.
Q188: What do these kinds of traits represent in a functional analysis?
Q189: How are such repertoires assessed?
Q190: What is a "process" difference?
Q191: Can these process differences be inventoried?
COMMENT: It is important to note that Skinner includes, as an explanation for rate of conditioning, the possibility that nothing more is involved than a particular history of
reinforcement.
Q192: How are traits usually quantified?
Q193: What alternative method is more appropriate to a functional analysis?
Q194: Summarize the basic categories of behavioral differences that Skinner believes give rise
to traits.
Prediction in Terms of Traits. pp. 199-202
Q195: What is the sense in which a test permits prediction?
Q196: How does this differ from prediction based upon a functional analysis?
Q197: Under what practical conditions are test results useful?
Q198: What is the disadvantage of prediction based upon trait description?
COMMENT: Notice how Skinner qualifies the observation that traits imply only
descriptions. Many have attempted to identify the causes of traits (e.g., Freud). What Skinner
means in these cases is that they haven't correctly identified the causes (as contrasted with his
own functional analysis).
Q199: Why haven't trait descriptions been especially useful in a functional analysis?
COMMENT: This is a rather important point, but it is made somewhat more difficult by
Skinner's earlier presentation of traits resulting from different independent variables and
processes. What is intended is the fact that a trait analysis only identifies behavior in terms of
what it looks like. A functional analysis identifies behavior in terms of its controlling relations.
This distinction is expressed by the chapter title. It wouldn't be a problem if different
appearances were directly correlated with separate functions, but they are not. Identically
appearing behaviors may occur in two individuals or in the same individual at different times
for completely different reasons. That is, similarly appearing behavior may represent different
functional relationships with the environment. This problem will be more fully discussed in the
next chapter.
Traits are not Causes. pp. 202-203
Q200: Why are traits not causes?
Q201: Give an example of a trait beginning as an adjective and becoming utilized as a cause.
XIV. THE ANALYSIS OF COMPLEX CASES
Oversimplification. pp. 204-205
Q202: What is a frequent criticism of behavioral principles that are based upon laboratory
research with lower organisms?
Q203: What is a common misunderstanding concerning basic behavioral principles?
COMMENT: This section, written in the early 1950's, well anticipates Chomsky's harsh
review of Verbal Behavior (1957).
Multiple Effects of a Single Variable. pp. 207-209
Q204: In what sense can a single variable have multiple effects? Give an example from the field
of punishment.
COMMENT: Skinner again restates his view of the dynamics of punishment. This
section is somewhat different from that in Chapter XII in that here he separates reflex elicitation
from an emotional operation. These were combined in his "First Effect of Punishment" section
(p. 186), since he was speaking there of the production of incompatible behavior, of which each
of these effects is an example. Notice also that he uses the traditional terminology in referring to
an unconditioned stimulus in the classical conditioning paradigm as a "reinforcing" stimulus. Its effect as such is obviously not what is meant by the same term when referring to operant
conditioning.
Q205: When can multiple effects be easily observed? Give an example.
Q206: In what sense does giving attention to a child who "needs" it weaken his demands?
Q207: Separate satiation effects from discriminative effects in the example of giving a child a
piece of candy when he hasn't asked for it.
Q208: What multiple effects are involved in the wavelike oscillation frequently observed in
extinction curves?
Q209: What is the effect of repeated exposures to extinction?
Q210: Is the emotional effect of frustration restricted to the response being extinguished? How
can your answer be demonstrated?
COMMENT: If this experiment has actually been accomplished, I haven't been able to
find it. Of course, it could have been conducted and not published, and if so, probably during
Skinner's research at Indiana University.
Multiple Causes. pp. 209-213
Q211: What is a second way in which important behavioral variables may interact? Give an
example.
Q212: How might emotional operations act in conjunction with reinforcement?
Q213: Give some examples of multiple strengthening that involve interacting discriminative
stimuli.
COMMENT: As the footnote on p. 210 suggests, the analysis of multiply determined
behavior is conducted extensively in Verbal Behavior (1957). It is especially important to
remember that words "may have many meanings." A single word (as in Skinner's example of
"house." p. 210) may be under the control of many variables. This is an example of the problem
of form versus function (Chapter XIII). Many cognitive interpretations of verbal behavior result
from a lack of understanding of the role of multiple causation in language.
The Practical Use of Multiple Causation. pp. 213-216
Q214: What is "suggestion?"
Q215: Define and give an example of the classes of suggestions that Skinner describes in this
section.
Projection and Identification. pp. 216-217
Q216: How would projective tests be categorized in terms of the above forms of suggestion?
Q217: What is the Freudian and the behavioral interpretation of projection and identification?
Q218: What is the difference between projection and identification in terms of the relationship
between the supplementary stimulus and the response?
COMMENT: It is important to understand the dynamics of Skinner's concept of multiple
causation. When Skinner says a behavior already exists in some strength but is not presently
occurring, what he intends is that certain variables that control the behavior are present. This
may be one or several of the factors already discussed; history of reinforcement, emotional
operations, discriminative stimuli, etc. But the behavior doesn't occur. Possibly some
counteracting variable is also present, a preaversive stimulus, for example. However, the
addition of one more behavioral variable, another discriminative stimulus perhaps, summates with the controlling variables already present to the point where their combined strength is sufficient to produce the behavior. Confusion can arise if the behavior is then erroneously attributed to the single variable, since it may well not cause the response at some later time when the other
variables are not present.
Multiple Variables in Perception. p. 218
Q219: What is the role of multiple causation in the field of perception?
COMMENT: You might review the section, The Analysis of Stimuli, pp. 130-131.
Variables With Incompatible Effects. pp. 218-223
Q220: What is conflict?
Q221: What is algebraic summation?
Q222: What kinds of behavior can result in algebraic summation?
COMMENT: Notice how Skinner stresses the fact that stimuli, e.g., physical events, do
not exhaust the world of behavioral control variables. Much of the weakness of a traditional S-R
approach to the interpretation of human behavior is due to an over-reliance on stimulus variables
(p. 141). Discriminative stimuli are, of course, important, but so are reinforcement contingencies
and drive and emotional operations.
Q223: What is prepotency?
COMMENT: The following section, "To Do or Not To Do," is somewhat difficult. The
essential point is that the avoidance behavior acquired to replace punished responding is not
"doing nothing," but rather is a specific behavior. Punishment does not create a negative
response tendency, but rather strengthens incompatible avoidance behaviors. Thus, the
punishment paradigm creates two types of competing response situations: The first, obviously, is
between the strength of the punished behavior and the avoidance behavior that the punishment
has established --- to go to the dentist or to do something else. The second is between the various
possible avoidance responses, i.e., why one avoids the dentist with one response rather than another. It is
the second point that is involved in the Barchester Towers example.
Chaining. p. 224
Q224: What is a chain of behavior?
SECTION III: THE INDIVIDUAL AS A WHOLE
XV. "SELF-CONTROL"
The "Self-Determination" of Control. pp. 227-230
Q1:
Why is the notion of control implicit in a functional analysis of behavior?
Q2:
What are the theoretical and practical implications of this possibility?
COMMENT: Skinner's aside on the use of statistics is not gratuitous. There is a
fundamental difference between his approach to a functional analysis of behavior based upon
demonstrable control and the usual hypothetico-deductive approach which frequently utilizes
inferential statistics, both to frame hypotheses and to test them.
Q3:
What is a typical objection to the behaving organism as it has been described so far?
Q4:
What typical observations about behavior lead to a concept of "self-control?"
Q5:
What is the proper response of a functional analysis of behavior to these apparent
contradictions?
COMMENT: The remainder of this section primarily describes the objectives of the next
few chapters. However, a number of important distinctions are made.
Q6:
How can one behaviorally distinguish between self-control and creative thinking?
COMMENT: This distinction between self-control and creative thinking will be
considerably elaborated upon in this and the next chapter.
Q7:
Why is the concept of a private event relevant to a discussion of self-control and creative
thinking?
COMMENT: Notice how Skinner's position on middle links that are "private events" is
different from his position on psychic or physiological ones. He wants to provide an account for the
private events but he disavows completely the notion of psychic middle links and he is
uninterested in physiological ones. The reason for this lies in his analysis of private events as
behavior.
"Self-Control" pp. 231
Q8:
What is self-control?
Q9:
What two responses are involved in the self-control relationship and how are they
related?
COMMENT: It is important to be able to distinguish between self-control and the other
classes of avoidance behavior that may occur and replace punished responding. In the latter case,
responses which escape the conditioned aversive stimuli generated by the incipient punished
responding, and thereby avoid the punishing stimulus, are automatically reinforced. Their only
effect on the punished behavior is to replace it. In self-control, the controlling response acts directly
upon the variable of which the previously punished response is a function.
Techniques of Control. pp. 231-240
COMMENT: This is a rather detailed section. However, to understand much that
Skinner has written since, it should be mastered in equal detail. The goal is to understand the
acquisition and maintenance of the controlling response. (Remember that in each case, the
controlled response is one which has been previously punished.)
Q10: Give examples of the five techniques of self-control through physical restraint and
physical aid, explaining how the controlling response is functionally acquired and
maintained.
Q11: Describe each of the techniques of self-control involved in the manipulation of stimuli,
and in each case explain how the consequences will maintain the behavior.
Q12: Give examples of self-control techniques utilizing deprivation and satiation effects.
Q13: Give examples of the various techniques of self-control involving the manipulation of
emotional conditions.
Q14: Describe various ways to use aversive stimulation in self-control.
Q15: Give some examples of the use of drugs as self-control techniques.
COMMENT: Obviously, this field has grown since Skinner wrote this section, and many
more examples could be given. Self-control through drugs is much simpler than self-control by
acquiring self-controlling responses. The key difference lies in the fact that self-controlling
behaviors "stay" with you in a way that drug effects don't. In other words, you will always
require the drug to produce the desired self-control if drugs are what you rely upon. Secondly,
contemporary pharmacology being what it is, you often get more than you bargain for with drugs;
physiological side effects and addiction are two well-known examples.
Q16: What is the role of self-reinforcement and self-extinction as self-control techniques?
COMMENT: This is an extremely significant section, since it clearly distinguishes
Skinner from many other contemporary behavioral psychologists who are trying to utilize self-reinforcement as a technique to establish self-controlling behavior, e.g., Homme (1967) and Kanfer (1970). Reinforcing and not-reinforcing yourself for specific behavior is a subtle issue. On the face of it, it seems little different from reinforcing or extinguishing someone else's behavior. That is, you respond, then obtain reinforcers (e.g., mow the lawn, then drink a beer). But is it really that simple a question? Its effect on lawn mowing seems clearly different from what happens if you pay a neighborhood youngster to do the job. He comes under the control of deprivation and satiation, and will ask to mow the lawn again when he needs money. Similar effects don't seem to be relevant to the self-reinforcement paradigm. In any event, Skinner seems much less certain of the effectiveness of self-reinforcement and self-extinction (and, in the next section, self-punishment) than do many others.
Q17: What is the role of self-punishment as a self-control technique?
COMMENT: The issue of self-punishment, as well as those of self-reinforcement and
self-extinction, is probably best summarized for Skinner in his statement, "The ultimate question
... is whether a practice of this sort shows the effect which would be generated by the same
stimulation arranged by others." Obviously, he is not sure they will.
Q18: How is the principle of prepotency utilized as a technique of self-control?
The Ultimate Source of Control. pp. 240-241
Q19: Who arranges for the behavior of self-control?
Q20: What are the practical advantages of a functional analysis of self-control in contrast to
the traditional conception of self-determination?
XVI. THINKING
The Behavior Of Making A Decision. pp. 242-244
Q21: How does making a decision differ from self-control as a form of self-determination?
Q22: What is the role of private events in decision-making?
Q23: What are the similarities and differences in the techniques used to accomplish self-control
and decision-making?
Q24: What is meant by the term "deciding?"
COMMENT: This distinction between deciding and the act decided upon parallels the
distinction between the controlling response and the controlled response in the self control
paradigm of Chapter XV.
Q25: Give some examples of behavior that terminate decision-making prior to the execution of
the decided upon behavior.
Origin and Maintenance of the Behavior of Deciding. p. 244
Q26: What are some of the consequences that reinforce decision-making?
Q27: What are some of the disadvantages of these consequences and how are these revealed?
Q28: Why do we see as much decision-making as we do?
The Behavior of Recall. pp. 245-246
Q29: What special circumstances in decision-making occasion the use of a self-probe?
Q30: Describe some of the techniques that are available to aid in recall.
Problems and Solutions. pp. 246-252
Q31: What is a problem?
Q32: How is the strength of this response usually revealed?
COMMENT: At one point in this section, Skinner defines the solution to a problem as
simply a response which alters the situation so that a strong response can be emitted (p. 247).
However, throughout the remainder of the chapter, he refers to this activity as "problem-solving"
and refers to the emitted strong response as the solution. The questions in this guide will adopt
the latter formulation.
Q33: What are the consequences of the emission of the behavioral solution?
Q34: What are the similarities and differences between problem-solving and self-control?
Q35: Why is the appearance of a solution no guarantee that problem-solving behavior has
occurred? Give an example.
Q36: Under what circumstances does trial-and-error learning display some minimum problem-solving?
COMMENT: The behavior Skinner described in this section as minimum problem-solving of the trial-and-error type is often observed in young children and the retarded when you
begin to teach simple discriminations, such as color names, for example.
Q37: What are two problem-solving techniques that involve the manipulation of stimuli? Give
examples.
Q38: What are two other commonly used techniques in addition to the manipulation of stimuli
for problem-solving?
Q39: What constitutes a problem's difficulty?
Q40: What makes a problem insoluble?
Having an Idea. pp. 252-254
Q41: What are some of the sources of "ideas" that do not occur as the result of deliberate
problem-solving. Cite some examples.
Originality in Ideas. pp. 254-256
Q42: What is the problem for a functional analysis that is posed by the concept of originality
and creativity in thought or behavior?
Q43: How does Skinner suggest we account for an original or novel idea?
Q44: What are the practical advantages of a functional analysis?
COMMENT: The paragraph on p. 255, describing the fact that the environment is now in better control of man, contains some extremely important ideas of Skinner's on the nature of cultural evolution. He further expands these concepts toward the goal of accelerated cultural development in his book Beyond Freedom and Dignity (1971).
Suggested Further Readings:
"Teaching Thinking" in Skinner's Technology of Teaching (1968)
"The Creative Student" also in the above text.
"An Operant Analysis of Problem Solving" in Skinner's Contingencies of Reinforcement:
A Theoretical Analysis (1969)
XVII. PRIVATE EVENTS IN A NATURAL SCIENCE
The World Within One's Skin. pp. 257-258
Q45: What is the "world within one's skin?"
COMMENT: Notice that Skinner doesn't suggest that a private event cannot be observed
by another organism, only that when it is, it is responded to differently. That is, a cavity
constitutes a completely different stimulus event for the dentist's behavior than it does for the
patient's. Behaviorally it is by no means the "same thing."
Q46: How is this realm of events distinguished from the external world?
Q47: What is the task of a functional analysis?
COMMENT: Obviously, many behavioral psychologists, both experimental and applied
(e.g., behavior modifiers) might wish at this point that Skinner had let sleeping dogs lie. He
seems to be opening the Pandora's box of inner events that a more "tough-minded" approach to
human behavior has been trying to shut for the last several decades. However, it has long been
Skinner's position that to ignore the phenomenological world --- usually called "mental" --- is to
weaken rather than strengthen the position of the behaviorist. Thus, he has taken several
opportunities to extend the concept of a functional analysis based on the prediction and control of
observable behavior into the private world of the individual. An elaboration of his own
description of the goals and rationale of this effort, "radical behaviorism" as it is frequently
called, is available in his article "The Operational Analysis of Psychological Terms" (1945, reprinted in Cumulative Record, 1961).
Verbal Responses to Private Events. pp. 258-261
Q48: What is the problem in establishing private events as discriminative stimuli for verbal
responses?
Q49: What are the three methods Skinner describes for the establishment of private events as
discriminative stimuli for verbal behavior? That is, what does the community respond to
when it reinforces discriminative behavior to private events? Give examples.
Q50: What is the problem for the listener of someone's description of subjective events?
Q51: What is the result of this imprecision for the individual who acquires such a repertoire of
responses to private events?
COMMENT: This is a rather subtle and much debated point. However, it is a
straightforward extension of what is known about responding to external stimuli. Animals not
shaped to respond differentially to certain stimulus features simply don't respond to them. Since
most verbal behavior called knowledge hinges upon verbal abstractions, failure to be taught them
results in failure to observe them. This may well apply to the private event as well as to the public
one.
Varieties of Private Stimulation. pp. 261-264
Q52: What are three kinds of private stimulation and their sources?
Q53: To which of these does an individual respond when describing his own behavior?
Q54: How can the verbal community establish self-descriptive behavior that includes responses
to private events?
Q55: What are the special advantages of covert verbal behavior?
COMMENT: Skinner's analysis of the role of covert verbal behavior is greatly extended
in Verbal Behavior (1957). One distinction made therein that is relevant to this section of Science
and Human Behavior is that between the roles of the speaker and listener when each occurs
simultaneously in the same individual. That is, we "talk to ourselves" and to the extent that this is
effective (in the sense of self-determination) the behavior is maintained. We both speak and
listen to ourselves for the same reason that we speak and listen to others. It represents a case of
publicly reinforced behavior occurring at a covert level and being reinforced in that form.
Responses to One's Own Discriminative Behavior. pp. 264-266
COMMENT: This section must be read carefully, since Skinner presents two difficult
and quite different points, although both involve discriminative repertoires. The first centers
around the question of how one learns to describe such repertoires, and the second involves the
problem raised by discriminative responding in the absence of relevant external stimuli.
Q56: Give an example of a response to a discriminative behavior.
Q57: How does the verbal community teach the individual to respond to his own discriminative
behavior?
Q58: To what does the individual ultimately respond in these cases?
Q59: Under what conditions does this prove to be a problem for the analysis of behavior?
Q60: What are the major sources of such responding?
Conditioned Seeing. pp. 266-270
Q61: In what manner does classical conditioning account for seeing something when it is not
there? Give examples.
COMMENT: Skinner is using the Pavlovian formula strictly to refer to the operational
procedures of stimulus pairing to produce a stimulus substitution effect. The earlier-discussed distinctions of behavioral type ("autonomic vs. external musculature") and stimulus function (elicits vs.
occasions) are not involved. If one effect of a stimulus is somehow to produce a discriminative
response, perhaps through contingencies of reinforcement, then presumably that effect can be
transferred to another stimulus by classical conditioning procedures.
The next several paragraphs discuss how certain fragmentary or partial stimuli can
produce a conditioned discriminative response appropriate to the original complete stimulus, thus
causing one to "see" the completed stimulus rather than the fragmentary one. The context of this
discussion, of course, is that of Gestalt phenomena and optical illusions.
Q62: What is the range of control exerted by external stimuli over discriminative responding?
Q63: What are the sources of individual differences with respect to conditioned discriminative
responding?
Q64: What is an hallucination?
Q65: What is the practical importance of conditioned discriminative responding?
Operant Seeing. pp. 270-275
COMMENT: This section is quite complex, both because of the subtleties of the
behavior(s) involved and the particular style of organization (or lack thereof).
Q66: What are some evidences of strength for a discriminative operant?
Q67: How is operant discriminative responding distinguished from classically conditioned
sensory processes?
Q68: What are the advantages and disadvantages of discriminative responding in the absence
of the external stimulus?
Q69: What are some sources of reinforcement for discriminative responding?
Q70: Is private problem-solving wholly discriminative behavior?
Q71: What are some sources of individual differences in private behaviors utilized in problem-solving?
Q72: Why is it difficult to distinguish among particular private problem-solving skills?
COMMENT: Although Skinner relied on problem-solving as a source of reinforcement
for private responding and thus as a vehicle to give many examples, he includes at the end of this
section the fact that the self-control relationship would similarly be a source of reinforcement for
private events.
Traditional Treatment of the Problem. pp. 275-280
Q73: How does one account for the verbal behavior which describes discriminative behavior?
COMMENT: Be sure to remember that both the descriptive verbal response and the
discriminative response are behaviors.
Q74: What accounts for the descriptive response in the absence of the relevant stimulus?
COMMENT: Note that although the descriptive response provides evidence for the
discriminative response, it is not an inevitable concomitant. Thus, one can assume that the
private discriminative responding also occurs in nonverbal organisms, as in the case of
animals having "dreams."
Q75: What is the traditional solution to private seeing?
COMMENT: The remainder of this section provides an analysis of the possible causes
for the traditional perspective, and the more appropriate behavioral account for the same
phenomena.
Q76: Briefly describe the problems which have resulted in the traditional sensation-perception
distinction.
COMMENT: The final part of this section, "Objections to the traditional view," is the
predictable discussion of the irrelevance of conceptualizing "sensations" and "images" as "mental
events." As such they may occupy the middle link position, but are characteristically seen as
causal. The behavioral account must not only provide a description of their occurrence, but also
trace their causes to the external world.
Q77: Are there "wholly private events," events which are knowable only to the organism?
Other Proposed Solutions. pp. 280-281
Q78: What other proposed solutions to the problem of private events are available, and are
they compatible or incompatible with the present functional analysis?
COMMENT: Two quick observations. First, additional support for Skinner's position on the nature of discriminative responding to private events is provided by those studies which demonstrate that additional control can be gained when covert behavior is instrumentally amplified, "fed back" to the organism, and reinforcement is made contingent upon certain properties thereof, e.g., Hefferline, Keenan, & Harford, Science, 1959. Second, although the sensation and
perception psychologists are characteristically unaware of Skinner's functional analysis, a rather
simple switch in their viewpoint from stimulus "tracing" to trying to identify neuro-physiological
"responding" would certainly enhance the importance of their efforts from a functional
perspective.
Additional Readings:
Skinner, B.F. The operational analysis of psychological terms. Psychological Review,
1945. Reprinted in Skinner's Cumulative Record.
Skinner, B.F. Verbal Behavior, 1957.
Skinner, B.F. Behaviorism at fifty. Science, 1963. Reprinted in Skinner's Contingencies
of Reinforcement, 1969.
Day, W.F. Radical behaviorism in reconciliation with phenomenology. Journal of the
Experimental Analysis of Behavior, 1969.
XVIII. THE SELF
Q79: What is the common interpretation of the role of the "self?"
Q80: Why are selves and/or personalities sometimes said to be multiple?
Q81: Give a brief description of the behaviors involved in the Freudian personality structures.
Q82: Why does Skinner feel that such phenomena are worth considering?
The Self as an Organized System of Responses. pp. 285-288
Q83: What is a self?
COMMENT: Notice how Skinner deals with the "explanatory fiction" of the self: by examining the facts upon which it is based. This is a completely different strategy from ignoring
it, or claiming that since it is supposedly non-physical, it has no place in science.
Q84: Identify and give examples of the several ways in which different response systems
(selves) can be organized or unified.
COMMENT: Notice that any or all of these factors can come into play in the same
individual at different times or sometimes simultaneously.
Q85: What difficulties can arise utilizing an analysis of the individual in terms of selves or
personalities?
Q86: In the next section, "Relations among selves," three basic relations between response
systems are identified and discussed. Be able to define and give a detailed example of
each of these.
COMMENT: The last paragraph is a little difficult, but rather important. Essentially,
Skinner has already described self-knowledge as a set of self-descriptive responses under the
control of contingencies deliberately imposed to produce that effect. He now is asking whether
or not other organized systems of behavior (selves), occasioned by other variables and controlled
by different contingencies, can come to display this kind of self-descriptive responding with
respect to another system of responses, e.g., does the id "know" about the superego?
The Absence of Self-Knowledge. pp. 288-292
Q87: What are several examples of situations where self-knowledge is missing?
COMMENT: The example of automatic writing is treated extensively by Skinner
in his article, "Has Gertrude Stein a Secret?", Atlantic, 1934, reprinted in Cumulative Record.
Q88: The next few paragraphs describe some simple situations where self-knowledge or
awareness may be lacking. Be able to describe these briefly.
COMMENT: Notice Skinner's description of how an adult may be able to describe a childhood situation if the visual scene can be evoked later, presumably as in conditioned seeing. This book is full of two-sentence explanations for behavioral phenomena that some spend a good
portion of their careers trying to explain.
Q89: What is Skinner's account of repressed self-knowledge?
Q90: Is the form of the punished response always important with respect to repressing self-description?
Q91: What is "rationalization," according to Skinner?
Symbols. pp. 292-294
COMMENT: Skinner is using the word "symbol" in this section somewhat differently
than in the commonsense meaning of "representing something else," as when flags symbolize
countries, etc. Instead, he is talking about the "Freudian" symbol, where whatever is being symbolized is punishable in its appropriate or accurate form.
Q92: What is a Freudian symbol?
Q93: Why might such symbols occur in art, literature, or dreams?
COMMENT: I'll bet you never thought you'd read a Skinnerian interpretation of dreams,
did you?
SECTION IV: THE BEHAVIOR OF PEOPLE IN GROUPS
XIX. SOCIAL BEHAVIOR
Q1:
What is Skinner's definition of social behavior?
Q2:
Can the laws of a social science be based upon a science of the behavior of individuals?
Q3:
Should the social sciences state their laws in the terms of a functional analysis of
individual behavior?
COMMENT: Note how Skinner describes his objectives for this section. He is going to
use the principles established from a functional analysis of individual behavior "to account for"
social behavior. The objective is to see whether or not such an analysis can, in fact, be made
without introducing new terms or presupposing new principles or processes. If it can, the
adequacy of the original analysis is further substantiated, and the social sciences will be shown a
new and simpler perspective for their subject matter. This also was his rationale for dealing with
private events, as well as with human behavior as a whole.
The Social Environment. pp. 298-304
Q4:
What kind of roles do other individuals play in social reinforcement? Give examples.
Q5:
What are some characteristic features of behavior that have been reinforced through the
mediation of others?
COMMENT: The next few paragraphs describe the effects of various schedules of social
reinforcement of behavior. It is interesting in that all of the examples are of misapplication, or,
in any event, of unhappy outcomes. It is these kinds of situations, where positive reinforcement
is badly applied or applied to the disadvantage of some individual, that led to Beyond Freedom
and Dignity (1971).
Q6:
What is a feature of social reinforcement contingencies that rarely, if ever, prevails in
inanimate nature?
Q7:
Why is it difficult to identify the physical features of such social stimuli as a smile?
COMMENT: An example of the point Skinner is making would be to consider the
redness of a ripe apple as a discriminative stimulus for picking and eating it compared with the
smile of the person you are with as discriminative for asking a favor (or something like that).
The "behavior: of the apple is naturally determined, e.g., redness always equals ripeness, whereas
the smile may merely reflect a history of being polite rather than any particular inclination to
further reinforce.
Q8:
What is it about social stimuli that sets them apart from a simple analysis in terms of
physical features? Give an example.
COMMENT: Again, it is the long history of reinforcement that can produce such effects,
but there are analogues in physical nature, since slight physical differences can be associated with
considerably different properties.
Q9:
What class of social stimulus seems not to vary to any great extent between cultures?
The Social Episode. pp. 304-309
Q10: How does one account for the behavior of two individuals in a social episode?
COMMENT: The next several paragraphs describe a number of common social
episodes, both animal and human. Several situations are considered. An important point to
remember in all of these is the role of the external environment in determining the overall
contingency, i.e., one form of social interaction (cooperation) may provide more reinforcement
than others within a given situation.
Q11: What is the behavioral nature of the leader and follower relationship?
Q12: To what extent do verbal interactions defy a functional analysis of social interactions?
COMMENT: Skinner again refers the reader to Verbal Behavior (1957) for a more
detailed explication of language. It would be difficult to exaggerate the importance of
understanding a behavioral account of language, since nonbehavioral ones are so prevalent and
misleading, as, for example, in accounts of the nature of scientific or logical thinking.
Q13: Provide a functional analysis of the following: You (A) are in the library reading.
Another student (B) comes up, and asks you for the time. You tell him, he thanks you
and goes on.
Q14: What is an autocatalytic social episode?
COMMENT: Note the last sentence in this section. Skinner's social concerns enter time
and again into the book, and are clearly articulated in the last section.
Supporting Variables in the Social Episode. pp. 309-311
Q15: Under what condition are social interactions not self-sustaining? Give an example.
COMMENT: As Skinner notes, this is an extremely important concept when considering
the social practices of the culture. Most contemporary agencies of social services depend greatly
upon external social support for the behavior of those involved, e.g., fund drives, charities, etc.
Sometimes, of course, socially established altruistic behavior is missing, or is quickly
suppressed, as in a special education classroom with "culturally disadvantaged" youngsters, and
the interlocking system becomes unstable and nonproductive. Virtually all contemporary "social
values" represent descriptions of behavior that require social support to be maintained and their
"value" is the extent to which society will, in fact, support these behaviors. Why systems are not
self-sustaining to the individuals involved, but are sufficiently important to the group as a whole
to be supported (either by prior conditioning or supplementary variables), is considered in
Chapter XXVIII and elsewhere (e.g., Beyond Freedom and Dignity, 1971).
The Group as a Behaving Unit. pp. 311-312
Q16: Provide two reasons why people may join group activities.
XX. PERSONAL CONTROL
Q17: What is the asymmetrical social relationship referred to when we say that someone is "deliberately" controlling someone else?
Control of Variables. pp. 314-315
Q18: What special advantage does personal control have that is not usually available to
organized agencies of social control?
Q19: What is usually the first objective of the exercise of personal control?
Q20: How is this typically accomplished? Give an example.
Techniques of Control. pp. 315-320
COMMENT: This is an absolutely critical section. Being able to successfully analyze
instances of control, as well as to apply these techniques when necessary, is the basis of
contemporary applied behavior theory.
Q21: Be able to enumerate each of the nine basic techniques of control listed in this section,
and give an example of each.
COMMENT: Several of these categories require further analysis.
Q22: What are some of the disadvantages of physical force as a technique of control?
Q23: What are some of the ways the stimuli can be manipulated to control behavior?
COMMENT: Notice that Skinner distinguishes between using discriminative stimuli in
isolation and using them as a source of supplementary stimulation as techniques of control.
Q24: What is one risk in the social utilization of conditioned reinforcers?
Q25: What is the distinction between aversive stimulation and punishment?
Q26: What is required if "pointing up contingencies of reinforcement" is going to be effective
as a control technique?
Q27: When is the technique most likely to be used?
COMMENT: Notice how Skinner describes the use of deprivation. It is a control
technique that permits you to strengthen behavior by deprivation alone, given that the behavior
has previously been reinforced by the event you are currently depriving the organism of. This is
a completely different utilization of deprivation than when you use it to enhance the future
reinforcing effectiveness of something you wish to use to strengthen behavior.
Q28: What are behaviors that are described as emotional?
COMMENT: It is extremely important to remember Chapter XIV while reading this
section. Probably no single technique of personal control represents an isolated instance of one
of these described categories. Many of the environmental variables have multiple effects, and
some of these may come into play when you attempt to control behavior, e.g., using positive
reinforcers may both strengthen behavior and generate a favorable predisposition toward you, as
well as permit you to exercise control via deprivation over the particular response at some later
point in time.
Objections to Personal Control. pp. 320-322
Q29: Why is deliberate control not a popular topic?
Q30: What is the effect of this?
COMMENT: Notice how one form of group countercontrol is to make the exercise of control a conditioned aversive event, and thus lessen the likelihood that individuals so
conditioned will attempt to use personal control techniques.
Q31: What are the cultural consequences of such a history of the aversive use of control?
COMMENT: It scarcely needs saying that this last paragraph anticipates the book Beyond
Freedom and Dignity (1971).
XXI. GROUP CONTROL
Q32: Why do groups act to control individuals?
Q33: What is the principal technique whereby the group exerts control over the individual?
Q34: Is this classification system foolproof? Why or why not?
Q35: How is aversive behavior controlled by group practices?
Why the Group Exerts Control. pp. 325-326
Q36: What is required to explain group control?
COMMENT: The remainder of this section provides a description of two general
formulae that account for the nature of group control; the first involves group behavior as an
emotional reaction to individual behavior, and the second suggests that group consequences are
deliberately provided to increase or decrease future instances of the same behavior.
Q37: Give an example of group countercontrol as an emotional reaction and as an example of
deliberate punishment.
COMMENT: Notice that emotional counter-aggression may be effective in suppressing
behavior, and thus also may be maintained because of its consequences.
The Effect of Group Control. pp. 327-328
Q38: What are the disadvantages and advantages of group control to the individual so
controlled?
Q39: What counterbalances the power of group control?
Justification of Group Control. pp. 328-329
Q40: How does a functional analysis of behavior deal with the ethical problem of right and
wrong, good and bad, etc.?
Q41: What sources of justification for such distinctions are frequently used?
COMMENT: Notice how a behavioral account frequently can be provided for the
behavior of those who make such distinctions along other lines.
Q42: Does an analysis of controlling practices provide a rationale for determining what the
group controlling practices should or should not be?
SECTION V: CONTROLLING AGENCIES
XXII. GOVERNMENT AND LAW
Controlling Agencies. pp. 333-335
COMMENT: The introductory paragraph in this section briefly presents Skinner's views
regarding the nature and source of social control.
Q1:
What is Skinner's objection to the various conceptions of the behaving individual as
encountered in law, economics, education, etc.?
Q2:
What is his alternative strategy?
COMMENT: Notice again how Skinner regards his own analysis as an effort to achieve
"a plausible account."
Q3:
What must one describe in the analysis of a social system? What must be known in order to
accomplish this?
The Governmental Agency. pp. 335-336
Q4:
What defines a government?
Q5:
Is the power of the ultimate agency within government always aversive?
COMMENT: Skinner is distinguishing here between the leaders and their agents in
terms of controlling techniques. The general nature of control exercised by a government may be
based upon its power to punish, but the within-agency control may well be of a different source.
Q6:
What is necessary for a government to control "with the consent of the governed"?
Q7:
Need that relationship continue once that government has established its control over the
society?
Techniques in Governmental Control. pp. 336-338
Q8:
What determines legal and illegal acts in a dictatorship? In a democracy?
Q9:
What is the net effect of governmental control? Be able to describe the behavioral
processes which account for this outcome.
COMMENT: Notice how Skinner's two-process analysis of the effect of punishment is
important to the above account.
Q10: What is obedience? Why is obedience to a verbal command based upon a history of
aversive control?
Q11: What is the advantage of obedient citizens to the controlling agency?
Law. pp. 338-341
Q12: What is a law?
Q13: How does it specify behavior?
Q14: How does the average citizen become law-abiding?
Q15: What makes the effect of punishing others as a deterrent a relatively weak technique of
control?
COMMENT: Skinner's analysis of the role of verbal processes in mediating the
effectiveness of laws and other forms of rules which govern behavior can be found in Chapter 6,
"An Operant Analysis of Problem Solving," in Contingencies of Reinforcement.
Traditional Interpretations. pp. 341-344
COMMENT: This is an important albeit unorganized section. Skinner is attempting to
provide behavioral interpretations of society's usual ways of dealing with illegal behavior, while
simultaneously attacking the traditional justifications for such.
Q16: Three reasons for punishment are revenge, rehabilitation, and deterrence. Which are
behaviorally justifiable and how?
Q17: What is the relationship between the legal concept of "responsibility" and behavioral
controllability?
Other Types of Governmental Control. pp. 345-346
Q18: Provide examples of governmental control which are not based exclusively on the power
to punish.
COMMENT: Notice Skinner's somewhat cautious optimism regarding the evolution of
social control practices toward more positively reinforcing techniques. However, many basic and
applied psychologists do not share his view of punishment's relative ineffectiveness, and
ineffective or not, societies are quick to revert to aversive control when immediate solutions are
important.
Countercontrol of Governmental Agencies. pp. 346-348
Q19: Why is the social system of the government and the governed inherently unstable?
Q20: What are some indicants of the limits of stability?
Q21: What is the effect of government "by law" regarding the stability of the system?
Justification of Governmental Practices. pp. 348-349
COMMENT: This last section is a mini tour de force for Skinner with respect to his
interpretations of some classic traditional values. The important point is that even though he sees
each of these terms as behaviorally explicable, they are still collectively inadequate as the bases
for evaluating a given society.
XXIII. RELIGION
COMMENT: Would ethical philosophers agree with the second sentence? In fact, do
you? Can you account for a negative response in behavioral terms?
Q22: What is a superstitious response?
Q23: Who composes a religious controlling agency, i.e., who are the leaders?
Q24: What is their claimed source of power?
Techniques of Religious Control. pp. 352-355
Q25: Compare religious techniques of control with legal and ethical ones.
Q26: What process is necessary to establish the power of religious threats and promises? Give
examples of both positive and negative consequences.
Q27: List some of the other techniques of behavioral control used by religious agencies,
including those of agencies which make no especial claims regarding their abilities to
intervene in supernatural outcomes.
The Behavior Controlled by the Religious Agency. pp. 355-357
Q28: How are the behavioral goals of the religious agency distinguished from the ethical
objectives of the larger group?
Q29: What process underlies self-control?
Explaining the Agency. p. 357
Q30: Why do politicians emphasize their religious affiliations?
Q31: Why do opponents of pornography often have large collections of pornographic materials
themselves?
Countercontrol. p. 358
Q32: Why do you suppose that the Mormon church is growing today? What are some reasons
why the Catholic church might be losing members and priests today?
Justification of Religious Control. p. 358
Q33: Does the justification of religious control depend upon supernatural relationships?
XXIV. PSYCHOTHERAPY
Certain By-Products of Control. pp. 359-361
Q34: Why does the group control the individual?
COMMENT: Surprisingly, Skinner omits the constructive objective of shaping
individuals to behave in ways which benefit the group. Groups not only seek to weaken
selfishness, they also seek to instill altruism.
Q35: Be able to describe some of the behavioral by-products of excessive or inconsistent
control.
Q36: Why are traditional agency reactions to such outcomes of control usually ineffective?
Emotional By-Products of Control. pp. 361-363
COMMENT: Remember that emotions or emotional responses are not inner states, but
rather are complexes of operant and respondent behaviors that vary together as a function of the
operation of certain independent variables.
Q37: What causes fear, anxiety, anger, and depression?
Q38: How are they to be eliminated?
Some Effects of Control Upon Operant Behavior. pp. 363-367
Q39: How can "self-control" miscarry? Briefly describe each of the inappropriate avoidance
responses described in this section.
Psychotherapy as a Controlling Agency. pp. 367-371
Q40: What is the social role of psychotherapy as an "agency" of control?
Q41: Why should therapy follow directly from diagnosis?
Q42: What source of control is initially available to the therapist?
Q43: How can this control be expanded?
Q44: Describe the two behavioral processes involved in successful psychoanalysis.
Psychotherapy Versus Religious and Governmental Control. pp. 371-372
Q45: Does psychotherapy ordinarily support or contradict other social agencies of control? Be
sure to consider both goals and methods in your answer.
Traditional Interpretations. pp. 372-379
Q46: What are the negative effects of considering the behaviors that necessitate therapy only as
symptoms?
COMMENT: The remainder of this section is a demonstration of Skinner's skill at
interpreting a case of psychodynamic "wish fulfillment" in operant terminology.
Other Therapeutic Techniques. pp. 379-382
Q47: What are other behavioral conditions that require therapy in addition to, or instead of,
histories of excessive or aversive control?
Q48: Why does the non-analytic or client-centered therapist wait for the client to suggest a
solution rather than provide it for him? (Behaviorally, of course, not in terms of the
therapist's traditional rationale.)
Explaining the Psychotherapeutic Agency. pp. 382-383
Q49: Why did Bandura claim that behavior modification is only effective when the client
consents? (American Psychological Association Presidential Address, 1974)
XXV. ECONOMIC CONTROL
Q50: Distinguish between goods and wealth.
Reinforcing behavior with money. pp. 384-385
Q51: What is required if a contract is going to be effective in controlling behavior?
Wage Schedules. pp. 385-391
Q52: What features of fixed ratio reinforcement lead to high rates of responding?
Q53: Why do human workers perform "throughout the interval" even when paid only at the
end?
Q54: What are the schedule parameters and effects of a salesperson receiving a salary plus
commissions?
Q55: How and why should bonuses be scheduled?
COMMENT: Skinner's review of extra-economic factors and their effects on quality of
workmanship and job attitude is often overlooked by industrial "behaviorists," who tend to
emphasize monetary contingencies exclusively, or at least are often criticized for doing so.
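For readers who find it helpful to see the schedule distinction made concrete, the following
minimal sketch (not part of Wood's guide or Skinner's text; the function names, numbers, and
response stream are hypothetical assumptions for illustration only) contrasts piecework pay,
which operates like a fixed-ratio schedule, with a salary paid once per interval, which operates
like a fixed-interval schedule.

def fixed_ratio_reinforcers(responses, ratio):
    # On a fixed-ratio (FR) schedule every Nth response is reinforced,
    # so pay grows in direct proportion to the amount of work done.
    return responses // ratio

def fixed_interval_reinforcers(session_length, interval, response_times):
    # On a fixed-interval (FI) schedule only the first response after each
    # interval elapses is reinforced; additional responding earns nothing more.
    reinforcers = 0
    next_available = interval
    for t in sorted(response_times):
        if next_available <= t <= session_length:
            reinforcers += 1
            next_available = t + interval
    return reinforcers

# Hypothetical numbers for illustration only.
print(fixed_ratio_reinforcers(responses=500, ratio=50))            # 10 "piecework" payments
print(fixed_interval_reinforcers(session_length=400, interval=100,
                                 response_times=list(range(0, 400, 10))))  # 3 "salary" payments

The sketch shows one behavioral reason piecework sustains high, steady rates while interval pay
does not: under the ratio arrangement every additional response brings the next payment closer,
whereas under the interval arrangement responding beyond one response per interval adds nothing.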
The Economic Value of Labor. pp. 391-393
Q56: What determines the economic value of labor to both the employer and the employee?
Q57: What is the advantage of money in this regard?
Q58: When is money a bribe?
COMMENT: Notice that this is not the sense in which teachers and parents use the term
"bribe" when objecting to the use of explicit reinforcers in the classroom or home. Their point is
usually that the youngsters should perform appropriately for other reasons.
Buying and Selling. pp. 393-398
Q59: What factors are relevant in determining the value of goods?
Q60: What can a professional gambler do that a slot machine cannot?
COMMENT: The conditions necessary to establish economic behavior are often
overlooked by behavior modifiers eager to set up a token economy. It is easy to establish an
effective token system only when the clients have a relatively lengthy history of reinforcement
regarding the use of money. Grade school children and institutionalized populations often do
not.
"Economics" pp. 398-400
Q61: What is the data base of the concept of the Economic Man? What are the limits of the
concept?
The Economic Agency. p. 400
Q62: What determines an economic agency as such?
COMMENT: It isn't clear exactly what Skinner intends to include as those economic
agencies representing "capital". A relatively limited definition would include banks and bank-type agencies, brokerages and various investment associations, and possibly money exchanges.
A broader definition would also include corporate agencies which do not produce goods or
services, but are directed by and for stockholders whose primary and possibly exclusive interest
is in profits.
Countercontrol. pp. 400-401
Q63: What is the major source of countercontrol against the misuse of economic power?
COMMENT: For better or worse, money (like political power) grows with success. In
countries where capital investment is broadly encouraged you see an increasing concentration of
wealth, primarily because the major sources of potential countercontrol, including organized
labor, are themselves susceptible to economic control. In other words, resistance can be bought
out. Further, the current growth of multinational corporations permits the controllers to more or
less escape any single government's effort to countercontrol them. History, unfortunately, tends
to indicate that great wealth is often countercontrolled only by revolution. Modern capitalists
have apparently learned this lesson, and seek to avoid such an outcome by avoiding excessive
exploitation, at least at home.
XXVI. EDUCATION
COMMENT: If there is a more succinct two-paragraph summary of the goals and
methods of education, I have yet to read it.
Educational Agencies and Their Techniques of Control. pp. 403-404
Q64: What distinguishes an agency as educational: goals or methods?
Q65: What maintains the professional educator?
Educational Reinforcement. pp. 405-407
Q66: What were the major sources of control available to educators?
Q67: Why does Skinner say these controls are now weakened or unavailable?
COMMENT: Skinner later broadened his objection to progressive education by pointing
out that natural consequences are rarely programmed well for effective education.
The Behavior Resulting from Educational Control. pp. 407-411
Q68: What is necessary for the acquisition of skill?
COMMENT: The section on knowledge is almost too condensed to be especially
meaningful upon a first reading. Skinner, in fact, refers the reader to Verbal Behavior, which is
possible only in later editions of Science and Human Behavior. Several key concepts are
presented and briefly discussed, however, and the reader should try to follow them.
First, according to Skinner, knowledge is not something which has some mentalistic
independent existence and is somehow "stored" away for use. Instead, knowledge is the use, or
more appropriately, is behavior. It may be primarily verbal, or it may be with respect to the
physical environment, but in either case knowledge is behavior.
Second, to understand something, in the fullest sense of the word, is to be able to do or
say the same thing for the same reasons. To merely repeat a poem is not the same thing as to
understand it. When a listener comes to understand the speaker (or writer) he or she is then able
to respond in the same way for the same reasons.
Third, instruction consists of the control of behavior through words which "refer" to other
events and objects. The resultant behavior is thus behavior under the stimulus control
of words. However, if the instructions are followed, and the behavior is thereby effective, the
behavior then comes under the control of the natural stimuli and consequences to which the
instruction originally referred. There are two points to be considered: to be controlled by
instruction requires a history of reinforcement regarding instructional control; instructional
repertoires are different, i.e., under different functional control, than are repertoires established
by natural events and consequences. Skinner discussed this critical distinction later in a chapter
entitled "An Operant Analysis of Problem Solving" in Contingencies of Reinforcement (1969).
Fourth, a student can acquire self-instructional repertoires which permit the mediation of
the interval between educational instruction and later functional use. When the repertoires
consist of sets of problem solving skills, they represent an analogue to the self-control that other
agencies often try to establish. Essentially, these future-oriented repertoires are designed to deal
with novel or unanticipatable circumstances.
Countercontrol. pp. 411-412
Q69: Who countercontrols the schools, and why?
COMMENT: Skinner seems to have omitted the most obvious and frequent source of
countercontrol over the public school, the community in which it is located. Parents and
school boards can exercise large amounts of control over school activities, and usually represent a
strongly conservative force when they do.
SECTION VI: THE CONTROL OF HUMAN BEHAVIOR
XXVII. CULTURE AND CONTROL
Manners and Customs. pp. 415-419
Q1:
Describe the process of induction as a basis for manners and customs.
Q2:
Why are such patterns of conformity self-sustaining?
The Social Environment as Culture. pp. 419-421
Q3:
What determines a culture?
Q4:
Are the sustaining controls for a culture always unified?
Q5:
What factors led to change in some cultural practices regarding the control of sexual
behavior?
COMMENT: Given what has been said before, you would expect Skinner to support
such changes because of the supposed reductions in aversive controls (and subsequent aversive
side effects or "by-products"). Notice, however, the ambiguity of his remark concerning the "net
result" of these changes.
The Effect of Culture Upon Behavior. pp. 421-424
Q6:
For each of the cited characteristics of individual behavior, describe to what extent the
resultant behavior depends upon physical or social variables, or both.
Cultural Character. pp. 424-425
Q7:
What characteristics of the social environment must exist if two cultures are to be
different?
Q8:
What is necessary to establish a relationship between cultural practices and cultural
modes of behavior? Why is it difficult to do so?
XXVIII. DESIGNING A CULTURE
Q9:
List some factors which introduce cultural practices "by accident".
COMMENT: "By accident" doesn't imply that a leader who introduces the practice
doesn't intend to do so, but rather that the change isn't an intentional effort to produce a better
culture based upon its intended effect. Consider, for example, the origin of the Episcopalian
Church.
Q10: What is necessary for deliberate cultural design?
Q11: How can a future goal control current efforts to achieve it?
Value Judgments. pp. 428-430
COMMENT: This section introduces Skinner's analysis of cultural values, a perspective
he has referred to and explicated several times since.
Q12: Provide a behavioral translation of (a) "You ought to take this short cut." and (b) "You
ought to give yourself up."
COMMENT: Willard Day has observed that a more complete analysis of such
statements would entail a knowledge of the factors controlling the speaker's behavior as well.
For example, "You ought to take an umbrella." means: (1) Keeping dry is reinforcing to you. (2)
An umbrella keeps you dry when it rains. (3) It is going to rain. (4) Having to pay your medical
expenses is aversive to me. (5) You may require medical attention if it rains on you.
The Survival of a Culture. pp. 430-434
Q13: What are the three kinds of selection discussed in this first paragraph?
Q14: In what sense does a cultural practice have survival value? Does its origin matter in this
regard?
Q15: Is a current culture by definition "better" than one which has perished?
COMMENT: You'd better be able to answer the above question, since if you agree with
Skinner publicly on this issue, you definitely can anticipate the question being asked.
Q16: Why does Skinner state that a culture is an experiment?
Q17: Is survival value always compatible with traditional values?
Q18: Why does behavior usually lead to survival? Can an individual who so behaves be said
to have "chosen" to survive as a value?
Q19: What is the relevance of science, and especially behavioral science, to the cultural
value of survival?
COMMENT: I suspect that this last point could be substantiated by comparing
governmental practices across several decades. Consider, for example, how many regulations are
now in effect regarding pollution as compared with those of 75 years ago. The unfortunate
aspect of this is that even if it is science which indicates survival, both by measurement and
improved practices, it also is usually science which has necessitated taking such action.
Regarding pollution, science seems to be the problem as well as the only hope for a solution.
Can We Estimate Survival Value? pp. 434-436
Q20: What must the cultural designer be able to estimate in order to be maximally effective?
Q21: In what four practical ways does a science help in the selection of survival-oriented cultural
practices?
Q22: What is left to do when these scientific practices have been employed to the fullest
extent?
XXIX. THE PROBLEM OF CONTROL
Q23: Why is a doctrine of personal freedom an insufficient protection or countercontrol against
a technology of behavior?
Q24: What are the consequences of a refusal to control?
COMMENT: As governmental control grows, as in welfare programs, our "freedom" is
correspondingly diminished, according to Skinner. If freedom is thus related to governmental
control, why should our government so strongly defend the concept?
Q25: What are some of the advantages of diversified control?
Q26: What are some of the disadvantages of controlling control (implicitly, by force)?
COMMENT: It may be instructive to consider the efforts to control the use of behavior
modification in the light of Skinner's remarks in this section. Federal and state legislation, local
supervisory boards, efforts to prevent its use either legally or through public pressure because of
its "dehumanizing" approach, are each examples of Skinner's classes of solutions to the problems
of control.
A Possible Safeguard Against Despotism. pp. 443-445
Q27: According to Skinner, where does the ultimate strength of a controller lie?
Q28: What is the role of traditional values, such as freedom, security, happiness, and
knowledge, regarding the ultimate source of strength? Do these values necessarily guarantee
cultural survival?
Q29: What may be the role of science in providing "moral values" for governments?
Who Will Control? pp. 445-446
Q30: Why is Skinner optimistic regarding the future of a science of behavior?
The Fate of the Individual. pp. 446-449
Q31: What is the central conflict between Western philosophy and a science of behavior?
Q32: What is responsible for that Western tradition?
Q33: Who or what "really" is in control?
COMMENT: This relatively "dry" scientific perspective has not been of especial
comfort to political scientists. For example, while it is true that the behavior of slaves was a
source of control over their owners, it was not a particularly exploitable one toward any
meaningful improvement in the slave's lot.
Q34: Why does even a scientific analysis of cultural behavior result in depicting a controller-controllee relationship?
COMMENT: The last several paragraphs present Skinner's perspective on the role of
science and culture, the resultant dethroning of the concepts of individualism and self-determination, and the eventual cultural progress that can result. The expanded version of this
treatment is in Beyond Freedom and Dignity (1971).
FINISH