Chapter 8 – Learning – ‘15
Learning: experience; know sequence, association for both classical and operant conditioning
CC:
OC:
Classical Conditioning: Pavlov, Watson, behaviorism, principles
UCS
UCR
CR
NS/CS
Terms (for both): acquisition: timing, extinction, spontaneous recovery, generalization, discrimination,
Updating: “mentalistic” → cognitive processes (predictability), biological predispositions – Garcia experiment showed:
1.
2.
Pavlov’s legacy: scope, methodology, application, influence on Watson, “Little Albert”
Versus Operant conditioning:
BF Skinner: Law of Effect, Skinner Box, shaping / successive approximations
Reinforcers: positive and negative, primary, conditioned / secondary
-immediate versus delayed (marshmallow test)
-continuous versus partial
-partial / intermittent reinforcers and effects: fixed ratio (FR), variable ratio (VR), fixed interval (FI) – “scalloped” pattern, variable interval (VI)
Punishment definition, problems with:
Updating Skinner: Cognition, cognitive map, latent learning, overjustification effect – intrinsic versus extrinsic
motivation, informational rewards versus bribes, controlling; biological predispositions
Skinner’s legacy: controversy; application of operant conditioning: at school – “teaching machines”, branching
at work: for managers; at home – parents, you:
Observational: AKA modeling -- us, monkeys; memes, mirror neurons;
-by 14 months, copy TV
*Bandura: “Bobo” experiment; lowered inhibitions, copied model;
-bad: hypocrites - antisocial; good: Gandhi - prosocial,
-TV: not an accurate world, spread coincided with rise in violence -- definitely correlated;
-consensus: does lead to violence in kids and teens; desensitizes.
Chapter 8 Notes (Complete)
Learning
I. Learning
A. Adaptability: our capacity to learn new behaviors that enable us to cope with changing circumstances
i. Unlike salmon, which carry most of the behavioral instructions they need for life in their genes, humans mostly learn from experience
B. Learning: a relatively permanent change in an organism’s behavior due to experience
i. What is learnable can potentially be taught
ii. What has been learned can potentially be changed by new learning
iii. Experience is key to learning
C. We learn by association
i. Associative learning: learning by linking two events that occur close together. The events may be two stimuli (classical conditioning) or a response and its consequences (operant conditioning)
ii. Our minds naturally connect events that occur in sequence; therefore, we associate them
iii. If, after seeing and smelling fresh bread, you eat some and enjoy it, the next time you see and
smell fresh bread your experience will lead you to expect that it will be satisfying
iv. Associations:
a. If you associate a sound with a frightening consequence, then your fear may be aroused
by the sound itself
b. If a sea snail repeatedly receives an electric shock just after being squirted, its withdrawal response to being squirted becomes stronger: it associates the squirt with the shock
c. Seals repeat behaviors that prompt people to toss them food: they associate the act with the reward
d. The animals learn something important to their survival: to predict the immediate future
D. Successful adaptation requires both nature and nurture
i. Keiko, the killer whale in Free Willy, had the genes to survive in the wild but not the experience
ii. Mexican gray wolves bred and raised in captivity were unable to survive when released into Arizona’s Apache National Forest; they did not know to run from a human with a gun, and the lone survivor had to be recaptured
iii. Of 145 reintroductions of 115 species in the 20th century, only 11% produced self-sustaining
populations in the wild
iv. Learned associations influence people
E. Conditioning: process of learning associations
i. Classical conditioning: learn to associate two stimuli and thus to anticipate events
a. Flash of lightning signals an impending crack of thunder, so we start to brace ourselves
when lightning flashes nearby
ii. Operant conditioning: learn to associate a response and its consequence = repeat acts followed
by reward and avoid acts followed by punishment
iii. Operant and classical conditioning often occur in the same situation
a. A rancher herds cattle by outfitting them with pagers; after training, the animals learn to
associate two stimuli—the beep on their pager and the arrival of food (classical) and
their hustling to the food trough with the pleasure of eating (operant)
iv. Observational learning: learn from others’ experiences and examples
v. Humans adapt to environments through operant and classical conditioning, and observation
a. Classical: learn to expect and prepare for significant events such as food or pain
b. Operant: learn to repeat acts that bring good results and avoid acts that bring bad results
c. Observational: watching others and learning new behaviors
II. Classical Conditioning
A. Classical/Pavlovian/Respondent conditioning: associating stimuli; a neutral stimulus that signals an
unconditioned stimulus begins to produce a response that anticipates and prepares for the
unconditioned stimulus.
i. Pavlov’s work laid the foundation for Watson’s idea that human behavior is mainly a bundle of
conditioned responses
ii. Behaviorism: the view that psychology should be an objective science based on observable behaviors
iii. Both Watson and Pavlov shared a disdain for mentalistic concepts, preferring observable
behaviors
B. Pavlov’s experiments
i. He noticed that when he worked with the same dog repeatedly, the dog began salivating to
stimuli associated with the food
ii. Paired various neutral stimuli, such as a tone, with food in the mouth to see if the dog would begin salivating to the neutral stimuli alone
iii. Isolated dog to eliminate influence of extraneous stimuli; dog secured to harness; saliva
diverted to measuring instrument
iv. Just before placing food in the dog’s mouth, Pavlov sounded a tone; after several pairings of the tone and food, the dog began to salivate to the tone in anticipation of the meat
v. Unconditioned response: UCR: the unlearned response: salivation
vi. Unconditioned stimulus: UCS: unlearned stimulus: food
vii. Conditioned response: CR: learned response: salivation in response to sound of tone
viii. Conditioned stimulus: CS: learned stimulus: tone
ix. Conditioned = learned; unconditioned = unlearned
C. Five major conditioning processes
i. Acquisition: initial learning of the stimulus-response relationship
a. Question of timing: How much time should elapse between the neutral stimulus (NS)
and the UCS?: generally half a second
b. Conditioning would not likely happen if the food (UCS) appeared before the tone (CS)
c. Conditioning seldom occurs when CS comes after the UCS
d. Classical conditioning is biologically adaptive: Pavlov’s tone signals an important
biological event-the arrival of food
e. If the food or bad event had already occurred, the CS would not likely signal anything
significant
f. Conditioning helps an animal survive and reproduce by preparing it for good and bad events
g. In humans, a stimulus associated with sexual pleasure can become a CS for sexual arousal
h. Associations, even those not consciously noticed, can give rise to attitudes
ii. Extinction
a. Diminishing response that occurs when the CS no longer signals an impending UCS
b. Ex: when Pavlov sounded the tone again and again without presenting food, the dogs
salivated less
c. Ex: After breaking up with a girlfriend, the smell of onion breath (CS) is no longer paired with kissing (UCS), so the conditioned response gradually fades
iii. Spontaneous recovery
a. Reappearance of a weakened CR after a rest pause: suggests that extinction suppresses
the CR but does not eliminate it
b. Ex: if Pavlov allowed several hours to elapse before sounding the tone again, the
salivation to the tone would reappear spontaneously
c. Ex: Occasionally, smelling onion breath still awakens a small version of the emotional response
iv. Generalization
a. Tendency to respond to stimuli similar to the CS
b. Pavlov and his students noticed that a dog conditioned to the sound of one tone also
responded somewhat to the sound of a different tone never paired with food
c. Generalization can be adaptive: toddlers fear moving cars and respond similarly to
trucks, motorcycles, etc.
d. Shown an angry face on a computer screen, abused children’s brain-wave responses are
dramatically stronger and longer lasting
e. Stimuli that are similar to naturally disgusting or appealing objects will, by association, evoke some disgust or liking
f. People’s emotional reactions to one stimulus generalize to similar stimuli
v. Discrimination
a. Learned ability to distinguish between a conditioned stimulus and other irrelevant
stimuli
b. Has survival value, because slightly different stimuli are at times followed by vastly different consequences
c. Pavlov’s dogs also learned to respond to the sound of a particular tone only
d. Pavlov’s and Watson’s disdain for “mentalistic” concepts has given way to a growing
realization that they underestimated the importance of cognitive processes and
biological constraints
D. Updating Pavlov’s Understandings
i. Cognitive processes
a. Early behaviorists believed that the learned behaviors of various organisms could be reduced to mindless mechanisms, so presuming cognition seemed unnecessary
b. Robert Rescorla and Allan Wagner argued that when two significant events occur closely together in time, an animal learns the predictability of the second event: if a tone always precedes a shock, and a light sometimes accompanies the tone, rats will fear the tone but not the light
c. The more predictable the association, the stronger the CR (a standard formalization of this idea appears after this list)
d. Expectancy: awareness of how likely it is that the UCS will occur
e. Conditioning occurs best when the CS and UCS have just the sort of relationship that
would lead a scientist to conclude that the CS causes the UCS
f. Classical conditioning treatments that ignore cognition often have limited success
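The notes above describe the Rescorla–Wagner idea only in words; as a minimal sketch, the standard Rescorla–Wagner learning rule (the symbols are the model’s conventional ones, not taken from these notes) can be written in LaTeX as
\[ \Delta V_{\text{CS}} = \alpha\,\beta\,(\lambda - V_{\text{total}}) \]
where V_CS is the associative strength of the CS, α and β reflect the salience of the CS and UCS, λ is the maximum strength the UCS can support, and V_total is the combined strength of all CSs present on the trial. The change ΔV is large when the UCS is surprising and shrinks toward zero as the UCS becomes fully predicted, which is why more predictable pairings yield a stronger CR and why the redundant light in the tone–shock example gains little strength.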
ii. Biological predispositions
a. Pavlov and Watson believed the basic laws of learning were similar in all animals
b. An animal’s capacity for conditioning is constrained by its biology; biological
predispositions of each species dispose it to learn the particular associations that
enhance its survival
c. John Garcia and Robert Koelling noticed that rats began to avoid drinking the water from the plastic bottles in radiation chambers
d. Garcia and Koelling gave rats a particular taste, sight, or sound and later gave them
radiation or drugs that led to nausea and vomiting
a. Even if sickened as late as several hours after tasting a particular novel flavor,
the rats thereafter avoided that flavor
b. Sickened rats developed aversions to the tastes but not the sights or sounds
c. This contradicted the behaviorists’ idea that any perceivable stimulus could
serve as a CS
e. Conditioning is even speedier, stronger, and more durable when the CS is “ecologically
relevant”
f. Birds, which hunt by sight, appear biologically primed to develop aversions to the sight
of tainted food
g. Humans, too, seem biologically prepared to learn some things rather than others
h. These cases support Darwin’s principle that natural selection favors traits that aid
survival
j. Nature prepares the members of each species to learn those things crucial to their
survival
k. Experiments revealed that conditioned taste aversion could successfully prevent baboons from raiding African gardens, raccoons from attacking chickens, and ravens and crows from feeding on crane eggs
l. Learning enables animals to adapt to their environments: discovery of biological
constraints on learning affirms this
m. Animals are generally predisposed to associate a CS with a UCS that follows
predictably and immediately
n. Adaptation also helps explain exceptions such as the taste-aversion finding
E. Pavlov’s Legacy
i. Classical conditioning is a basic form of learning
ii. Many other responses to many other stimuli can be classically conditioned in many other
organisms
iii. Classical conditioning is one way that virtually all organisms learn to adapt to their
environment
iv. Pavlov showed us how a process such as learning can be studied objectively, by isolating
elementary blocks of complex behaviors and studying them with objective laboratory
procedures
v. Applications of classical conditioning
a. Drug counselors advise addicts to steer clear of settings associated with the euphoria of previous drug use
b. Counselors sometimes provide people who abuse alcohol with experiences that may
reverse their positive associations with alcohol
c. Works on the body’s disease-fighting immune system
d. Provided a basis for John Watson’s idea that human emotions and behavior, though biologically influenced, are mainly a bundle of conditioned responses
e. “Little Albert” feared loud noises but not white rats: Watson and Rayner conditioned a
fear of rats: Albert generalized this fear to rabbits, dogs, and sealskin coats
f. Watson used his knowledge of associative learning to conceive many successful advertising campaigns
g. One patient, who had feared elevators for 30 years, forced himself to enter 20 elevators a day for 10 days, and his fear nearly vanished
F. Close-Up:
i. A rape victim came to fear her apartment, her town, and darkness; the fear is subsiding only after eleven years
ii. Her fear (CR) was most powerfully associated with particular locations and people (CS), but it generalized to other places and people
III. Operant Conditioning
A. Operant conditioning: association of behaviors with their consequences: more likely to repeat
rewarded, reinforced behaviors and less likely to repeat punished behaviors
B. Difference between classical and operant: classical conditioning forms an association between stimuli and involves respondent behavior: behavior that occurs as an automatic response to some stimulus; operant conditioning involves operant behavior: the act operates on the environment to produce rewarding or punishing stimuli
C. Skinner’s Experiments
i. Law of effect (Thorndike’s principle, on which Skinner built): rewarded behavior is likely to recur
ii. Skinner developed a “Behavioral technology” that revealed principles of behavior control
iii. Enabled him to teach pigeons such unpigeonlike behaviors as playing ping-pong and keeping a missile on course by pecking at a target on a screen
iv. Designed an operant chamber/Skinner box: soundproof, with a bar or key that an animal
presses or pecks to release a reward of food or water, and a device that records these responses
v. Shaping: a procedure in which reinforcers, such as food, gradually guide an animal’s actions toward a desired behavior (a simulation sketch follows this list)
a. After observing how the animal naturally behaves before training, you would build on
its existing behaviors
b. To condition a hungry rat to press a bar, you build on existing behaviors by using successive approximations: reward responses that are ever-closer to the final desired behavior and ignore all other responses
c. By making rewards contingent on desired behaviors, researchers and animal trainers
gradually shape complex behaviors
d. By shaping nonverbal organisms to discriminate between stimuli, a psychologist can
also determine what they perceive
e. Experiments show that some animals are remarkably capable of forming concepts; they
demonstrate this by discriminating between classes of events or objects
f. If an experimenter reinforces a pigeon for pecking after seeing a human face
(discriminative stimulus), but not after seeing other images, the pigeon will learn to
recognize human faces
g. In the shaping procedure, the trainer builds on the individual’s existing behaviors by expecting and immediately rewarding successively closer approximations of a desired behavior.
h. In everyday life we may unintentionally continually reward and shape the behavior of
others
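As noted in item v above, here is a minimal simulation sketch of shaping by successive approximations. Everything in it (the shape function, the 0-to-1 “closeness” scale, the criterion steps, and the numbers) is invented for illustration; only the reinforce-closer-approximations logic comes from these notes.

import random

# Toy model of shaping by successive approximations (hypothetical numbers).
# The "animal" emits responses whose closeness to the target behavior is a
# value between 0 (nothing like a bar press) and 1 (a full bar press).
# Only responses meeting the current criterion are reinforced, and the
# criterion is raised once the animal performs reliably at that level.

def shape(trials=2000, seed=42):
    rng = random.Random(seed)
    typical = 0.1      # the animal's typical response before training
    criterion = 0.2    # reinforce anything at least this close to a bar press
    recent_hits = 0
    for _ in range(trials):
        response = min(1.0, max(0.0, rng.gauss(typical, 0.15)))
        if response >= criterion:
            # Reinforcement: responses like this one become more typical.
            typical += 0.05 * (response - typical)
            recent_hits += 1
        if recent_hits >= 20 and criterion < 0.9:
            # Reliable performance: demand a closer approximation next.
            criterion += 0.1
            recent_hits = 0
    return typical, criterion

print("typical response and final criterion after shaping:", shape())

Running it shows the typical response drifting upward as the criterion is raised step by step, which is the point of successive approximations: each new criterion is reachable from behavior the animal already emits.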
vi. Principles of reinforcement
a. Reinforcement: any event that increases the frequency of a preceding response, not
necessarily a reward; any consequence that strengthens behavior
b. Positive reinforcer may be a tangible reward
c. Two kinds of reinforcement: positive: strengthens a response by presenting a typically
pleasurable stimulus after a response and negative: strengthens a response by reducing
or removing an aversive stimulus.
d. Negative reinforcement is not punishment; rather, it removes a punishing aversive
event
e. Primary reinforcers-innately satisfying
f. Conditioned reinforcers/secondary reinforcers: are learned: get their power through association with primary reinforcers (if a rat in a Skinner box learns that a light reliably signals that food is coming, the rat will work to turn on the light: the light becomes a secondary reinforcer)
g. Humans respond to reinforcers that are greatly delayed
h. To function effectively we must learn to postpone immediate rewards for greater long-term rewards
i. To our detriment, small but immediate consequences are sometimes more alluring than
big but delayed consequences
j. Big step to maturity is learning to delay gratification to control impulse in order to
achieve more valued rewards
k. Continuous reinforcement: desired response is reinforced every time it occurs: learning occurs rapidly: extinction occurs rapidly: rarely provided in real life
l. Partial/intermittent reinforcement: responses sometimes reinforced and sometimes not:
initial learning slower (continuous reinforcement preferable until a behavior is learned):
produces greater persistence and greater resistance to extinction
m. With intermittent reinforcement, hope springs eternal
vii. Four schedules of partial reinforcement (a simulation sketch follows this list)
a. Fixed-ratio schedules: behavior reinforced after set number of responses: piecework:
often found tiring
b. Variable-ratio schedules: provide reinforcers after an unpredictable number of
responses: commission: high rates of responding because reinforcers increase as the
number of responses increases
c. Fixed-interval schedules: reinforce the first response after a fixed time period: salary:
produces choppy stop-start pattern rather than a steady rate of response
d. Variable-interval schedules: reinforce the first response after varying time intervals:
random: tend to produce slow, steady, long-lasting responding
e. Animal behaviors differ, yet Skinner contended that these reinforcement principles of
operant conditioning are universal
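A minimal Python sketch of how the four schedules decide when a response earns a reinforcer. The function names, parameter values, and the two-second response stream are invented for illustration; only the FR/VR/FI/VI logic follows the definitions in the list above.

import random

rng = random.Random(0)

def fixed_ratio(n=10):
    # FR-n: reinforce every n-th response (e.g., piecework)
    count = 0
    def check(response_time):
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True
        return False
    return check

def variable_ratio(mean_n=10):
    # VR: reinforce after an unpredictable number of responses (e.g., a slot machine)
    count, target = 0, rng.randint(1, 2 * mean_n - 1)
    def check(response_time):
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, rng.randint(1, 2 * mean_n - 1)
            return True
        return False
    return check

def fixed_interval(period=60.0):
    # FI: reinforce the first response after a fixed time period
    ready_at = period
    def check(response_time):
        nonlocal ready_at
        if response_time >= ready_at:
            ready_at = response_time + period
            return True
        return False
    return check

def variable_interval(mean_period=60.0):
    # VI: reinforce the first response after an unpredictable time period
    ready_at = rng.uniform(0, 2 * mean_period)
    def check(response_time):
        nonlocal ready_at
        if response_time >= ready_at:
            ready_at = response_time + rng.uniform(0, 2 * mean_period)
            return True
        return False
    return check

# Usage: feed each schedule the same steady stream of responses (one every
# two seconds for ten minutes) and count how many are reinforced.
schedules = {"FR": fixed_ratio(), "VR": variable_ratio(),
             "FI": fixed_interval(), "VI": variable_interval()}
response_times = [i * 2.0 for i in range(1, 301)]
for name, check in schedules.items():
    reinforcers = sum(check(t) for t in response_times)
    print(name, "delivered", reinforcers, "reinforcers for", len(response_times), "responses")

The ratio schedules count responses while the interval schedules wait for the first response after a period of time, which is what underlies the high steady VR responding and the stop-start FI “scallop” described above.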
D. Punishment
i. Effect opposite to that of reinforcement
ii. Decreases frequency of preceding behavior usually by administering an undesirable
consequence or withdrawing a desirable one
iii. Swift and sure punishers can powerfully restrain unwanted behavior
iv. Some punishments, though unintentional, are nevertheless quite effective: A dog that has
learned to come running at the sound of the electric can opener will stop coming if its master
starts running the machine to attract the dog and banish it to the basement
v. Human punishment studies often find that spanked children are at increased risk for aggression, depression, and low self-esteem
vi. Physical punishment has drawbacks: punished behavior is not forgotten, only suppressed, and the temporary suppression may negatively reinforce the parents’ punishing behavior
vii. If punishment is avoidable, the punished behavior may reappear in safe settings
viii. The child may simply learn discrimination: different places mean different rules
ix. Physical punishment may increase aggressiveness by demonstrating that aggression is a way to
cope with problems
x. Punishment can create fear
xi. When punishments are unpredictable and inescapable, both animals and people may develop
the sense that events are beyond their control; feel helpless and depressed
xii. Even though punishment suppresses unwanted behavior, it often does not guide one toward
more desirable behavior
xiii. Punishment combined with reinforcement is usually more effective than punishment alone
xiv. Parents of delinquent youth often lack this awareness of how to reinforce desirable behavior
without screaming or hitting; training programs are available
xv. Psychologists now favor an emphasis on reinforcement
E. Updating Skinner’s Understanding
i. Skinner resisted the growing belief that cognitive processes have a necessary place in the
science of psychology and even in our understanding of conditioning
ii. Cognitive processes might be at work in operant learning
iii. Evidence of cognitive processes has come from studying rats in mazes: they develop a cognitive map, a mental representation of the environment, even when passively carried through the maze in a wire basket
iv. The animals behave as if they expected that repeating the response would soon produce the
reward
v. Cognitive map: a mental representation of the layout of one’s environment; e.g. maze
vi. Rats exploring a maze develop a cognitive map; when an experimenter then places a reward in the maze’s goal box, the rats very quickly perform as well as rats that had been reinforced with food for running the maze
vii. Latent learning: learning that becomes apparent only when there is some incentive to
demonstrate it
viii. There is more to learning than associating a response with a consequence; there is also cognition
ix. Learning can occur without reinforcement or punishment
x. Unnecessary rewards sometimes carry hidden costs
xi. Promising children a reward for a task they already enjoy can backfire. People who begin to
see the reward as their motive for an activity may lose their intrinsic interest in it:
overjustification effect: an already justifiable activity becomes overjustified by the promise of
a reward
xii. Intrinsic motivation: desire to perform a behavior effectively and for its own sake, can be
undermined by excessive rewards
xiii. Extrinsic motivation: seeking external rewards and avoiding threatened punishment
xiv. Person’s interest often survives when a reward is used neither to bribe nor to control but to
signal a job well done
xv. If a reward boosts feeling of competence after doing good work, enjoyment of task may
increase
xvi. Rewards rightly administered can motivate high performance and creativity.
xvii. An animal’s natural predispositions constrain its capacity for operant conditioning
xviii. Biological constraints predispose organisms to learn associations that are naturally adaptive
xix. Biological predispositions were more important than had been supposed
xx. “Misbehaviors” occur when animals revert to their biologically predisposed patterns
F. Skinner’s Legacy
i. Skinner argued that to manage people effectively, we should worry less about their illusions of freedom and dignity; recognizing that behavior is shaped by its consequences, we should administer rewards in ways that promote more desirable behavior
ii. Skinner’s critics objected, saying that he dehumanized people by neglecting their personal freedom and by seeking to control their actions
iii. At School
a. Skinner and others advocated the use of teaching machines and textbooks that would
shape learning in small steps and provide immediate reinforcement for correct
responses
b. Teachers should pace the material according to each student’s rate of learning and provide prompt feedback with positive reinforcement to both slow and fast learners
c. Students must be told immediately whether what they do is right or wrong and, when
right, they must be directed to the step to be taken next
d. Computers can pace the material according to each student, keeping flawless records
for the supervising teacher
iv. In Sports
a. Reinforcement principles can also enhance athletic abilities: key is to shape behavior,
by first reinforcing small successes and then gradually increasing the challenge
b. Compared with children taught by conventional methods, those trained by this behavioral method show, in both testing and game situations, faster improvement in their skills
v. At Work
a. Business managers have capitalized on psychological research: many companies now
enable their employees to share profits and to participate in company ownership; when
workers’ productivity boosts rewards for everyone, their motivation, morale, and
cooperative spirit often increase
b. Reinforcement for a job well done is especially effective in boosting productivity when
the desired performance is well-defined and achievable
c. It is wise to make the reinforcement immediate
d. Rewards need not be material, nor should they be substantial
vi. At Home
a. Many economists and psychologists believe people’s spending behavior is controlled
by its consequences (its costs and benefits)
b. In homes immediate consequences most effectively influence behavior
c. When parents cave in to protests or defiance, they reinforce such behaviors
d. Give children attention and other reinforcers when they are behaving well; target a
specific behavior
e. Ignore whining
f. When children misbehave or are defiant, do not yell or hit them
g. Can be applied to ourselves: state your goal, monitor yourself, reinforce the desired
behavior, reduce the incentives gradually
vii. Contrasting Classical and Operant Conditioning
a. Both are forms of associative learning and both involve acquisition, extinction,
spontaneous recovery, generalization, and discrimination
b. Through classical conditioning, an organism associates different stimuli that it does not
control and responds automatically
c. Through operant conditioning, an organism associates its operant behaviors with their
consequences
d. Both are influenced by cognitive processes and biological predispositions
IV. Learning by Observation
A. Observational learning: we observe and imitate others
B. Modeling: process of observing and imitating a specific behavior
C. We can glimpse the roots of observational learning in other species
D. Imitation is all the more striking in humans: so many ideas, fashions, and habits travel by imitation
that these transmitted cultural elements now have a name: memes
E. Mirror neurons: located in a frontal lobe area adjacent to the brain’s motor cortex, they provide a neural basis for observational learning; they may also serve in human language and give rise to children’s empathy and theory of mind
F. PET scans reveal that humans have mirror neurons in this brain area
G. Imitation of models shapes children’s development
H. Bandura’s experiments:
i. Albert Bandura: pioneering researcher of observational learning
ii. A child, when later frustrated, imitates aggression toward the Bobo doll after having seen the behavior modeled by an adult
iii. Observing the adult model beating up the doll lowered their inhibitions
iv. Whether we imitate a model depends partly on reinforcements and punishments
v. By looking we learn to anticipate a behavior’s consequences
vi. We are especially likely to imitate those we perceive as similar to ourselves, as successful, or as admirable
I. Applications of Observational Learning
i. Antisocial models may have antisocial effects
ii. By watching TV, children may learn that physical intimidation is an effective way to control
others, that free and easy sex brings pleasure without consequences, or that men are supposed
to be tough and women gentle
iii. Lessons we learn as children are not easily unlearned as adults, and they are sometimes visited
on future generations
iv. Intergenerational transmission of abuse could be genetic
v. Prosocial: positive, constructive, helpful behavior, can have prosocial effects; people who
exemplify nonviolent, helpful behavior can prompt similar behavior in others
vi. Parents are powerful models, observational learning of morality begins early
vii. Models are most effective when their actions and words are consistent
viii. When exposed to a hypocrite, children tend to imitate the hypocrisy by doing what the model
did and saying what the model said
ix. Wherever television exists it becomes the source of much observational learning
x. Most people have access to television and spend a major portion of their lives watching it
xi. U.S. network programs have offered about 3 violent acts per hour during prime time, and 18 per hour during children’s Saturday morning programs; in the real world, 87% of crimes are nonviolent, while on television only 13% of crimes are nonviolent
xii. Television viewers are learning about life from a rather peculiar storyteller, one who reflects
the culture’s mythology but not its reality
xiii. In one study, 6 in 10 shows featured violence; 74% of the violence went unpunished; 58% did not show the victims’ pain; and nearly half the incidents involved justified violence and nearly half involved an attractive perpetrator
xiv. Correlational studies do link violence-viewing with violent behavior: the more hours children spend watching violent programs, the more at risk they are for aggression and crime as teens and adults; homicide rates increased in the US, Canada, and South Africa after television was introduced
xv. These studies do not prove that viewing violence causes aggression
xvi. Experiments show that viewing violence can, to an extent, cause crueler behavior
xvii. The violence effect seems to stem from a combination of factors including imitation
xviii. Prolonged exposure to violence also desensitizes viewers: watching cruelty fosters indifference