Sample Study Questions for Exam 2: Chapters 5-8 in Powell et al.

These sample questions are meant to help you prepare for the exam. The list does not include every question that you should be able to answer. Instead, it should help you anticipate the types of questions you may receive, without giving you the actual questions on the exam.

Chapter 5:
1. How is the nature of the CR related to S-S versus S-R explanations of classical conditioning?
2. Explain the compensatory-response model.
3. The compensatory-response model has attracted the most attention as an application to which human problem? Identify the model's predictions for this problem.
4. Summarize the Rescorla-Wagner model.
5. Does the Rescorla-Wagner model make the single-associative-value assumption? Explain.
6. Show how the Rescorla-Wagner model explains the development of conditioned inhibition.
7. Show how the Rescorla-Wagner model explains the development of overshadowing.
8. How did John B. Watson explain the development of phobias?
9. Explain the procedure called systematic desensitization as it relates to the treatment of phobias.
10. What are some medical applications of classical conditioning? If you were the patient, which application would you prefer?

Chapter 6:
11. Provide the correct explanation of the Law of Effect, given that the textbook explains it inaccurately.
12. Explain the theoretical distinction between emitted responses and elicited responses.
13. Why is instrumental behavior normally considered to be goal-directed? What is the problem with this characterization?
14. What is the difference between a reinforcer and a reward, according to Howard Rachlin?
15. Explain what is meant by a discriminative stimulus, and identify the different types.
16. Explain the formal distinction between negative reinforcement procedures and punishment procedures.
17. When we arrange for the instrumental response to produce the reinforcer without delay, we create two important relations between the response and the reinforcer: temporal contiguity and response-reinforcer contingency. Explain them.
18. Explain the concept of conditioned industriousness and identify its causes.
19. Explain the concept of a conditioned reinforcer. How can we tell whether something is a conditioned reinforcer?
20. Studies of delay of reinforcement show that a perfect contingency between response and reinforcer is not sufficient to produce instrumental conditioning. Explain why, and include an example.
21. Extrinsic reinforcers undermine intrinsic interest only under certain conditions. Identify those conditions.
22. Extrinsic reinforcers can strengthen intrinsic interest under certain conditions. Identify those conditions.
23. Which would you prefer to receive, a conditioned reinforcer or a contrived reinforcer? Explain why.
24. Explain the procedure of shaping, distinguishing between haphazard and systematic procedures. How does a handheld clicker help?

Chapter 7:
25. Explain the difference between reinforcing a behavior and a schedule of reinforcement (without relying heavily on examples).
26. Explain the difference(s) between continuous and partial (or intermittent) reinforcement schedules.
27. Explain why schedules of reinforcement provide information about the factors that control the maintenance and performance, rather than the learning, of instrumental behavior.
28. Describe the differences between the cumulative records produced by FR, FI, VR, and VI schedules.
29. Explain why VI and VR schedules produce no postreinforcement pause, whereas FR and FI schedules do produce pauses after each reinforcer delivery.
30. Give a clear example from your own life of a variable-interval schedule with a limited hold. Do not use an example from the text.
31. The text describes an analogy between the postreinforcement pause and procrastination and identifies a way to reduce procrastination. Explain how the technique works and how it is related to FR performance.
32. Why do interval schedules generally motivate less responding than ratio schedules, even when the reinforcement rate for both is equalized? [Don't give a superficial answer.]
33. Explain the differences between interval schedules, duration schedules, and noncontingent time schedules.
34. Explain how each of the response-rate schedules functions.
35. Explain the difference between conjunctive and adjusting schedules of reinforcement.
36. What is the difference between a chained schedule of reinforcement and the L-R behavior chain that your rats are learning?
37. The delay-of-reinforcement gradient is related to the goal gradient effect. Explain how the gradient(s) affect (a) behavior and (b) learning.
38. Explain drive reduction theory as a theory of reinforcement. Who was its champion? How does incentive motivation fit in?
39. Explain the Premack Principle as a theory of reinforcement.
40. Explain the Response-Deprivation Hypothesis as a theory of reinforcement. Who was its champion?
41. Explain the Behavioral Bliss Point approach as a theory of reinforcement. Who was its champion?

Chapter 8:
42. Explain in detail what happens to instrumental behavior when an extinction procedure is implemented for an extended period of time.
43. Explain the side effects of extinction.
44. Why is the extinction effect considered to be different from unlearning?
45. Resistance to extinction is often used as a measure of response strength. Explain what factors influence resistance to extinction, and how.
46. What is the partial-reinforcement extinction effect, and what makes it intriguing?
47. Describe how the discrimination hypothesis might explain the partial-reinforcement extinction effect.
48. How does spontaneous recovery indicate that extinction does not produce unlearning?
49. How is reinforcement used in DRO schedules to decrease and eliminate behavior? Does this contradict the definition of reinforcement?
50. Explain how stimulus generalization and stimulus discrimination are used to measure stimulus control.
51. Explain the peak shift effect. What controls the direction of the shift?
52. Explain how a multiple schedule of reinforcement works.
53. Explain positive and negative behavioral contrast effects.
54. What is the difference between behavioral contrast and anticipatory contrast?
55. Tell the story of St. Neot's Margin.
56. Explain how fading was used to produce errorless discrimination learning.
57. How is targeting a useful procedure of stimulus control?
58. Explain how difficulty studying on Friday nights should be considered a problem of stimulus control.
59. What tasks have rats and pigeons been trained to do to help or kill people?
60. Explain how Guthrie's theory was an extreme version of an S-R theory.

Questions from lecture:
1. Explain the principle of trans-situationality and evaluate its validity.
2. Some theorists have considered reinforcers to be stimuli, and others have considered them to be behaviors. Briefly identify two reinforcement theories of each type.
3. Explain how drive reduction was considered to be a mechanism of reinforcement.
4. Explain why drive reduction is not considered to be a sufficient explanation of reinforcement.
5. Explain how drive reduction is based on the idea of physiological homeostasis.
6. What is the difference between primary motivation and incentive motivation?
7. How are theories of reinforcement that are based on behavioral regulation different from theories that try to determine the essential stimulus characteristics of reinforcers?
8. Why is the measurement of response probability a very important issue in Premack's theory? What seems to be the best technique for measuring it?
9. Explain Timberlake and Allison's response-deprivation hypothesis.
10. Explain the concept of a behavioral bliss point. How can it be measured?
11. Explain how reinforcement schedules are constraints on the normal allocation of behavior.
12. What is behavioral regulation theory?
13. Explain Staddon's minimum-deviation (minimal-distance) model of behavioral regulation.
14. The relative (not just absolute) amount of reinforcement is an important factor in instrumental conditioning. Describe a published experiment that demonstrates this point.
15. What are positive and negative behavioral contrast effects? How are they measured?
16. A teacher says that psychology relies too much upon animal studies and adds that "you can't tell anything about people from research on rats." How could you defend "rat psychology" against this criticism?
17. Provide a single detailed example of stimulus control in your life (outside of the classroom), clearly identifying the stimuli that control your behavior and the different responses you make in their presence.
18. Identify ways in which the squirrels' activities on campus are under stimulus control.
19. Identify and explain the stimulus control of your behavior in psychology classes.
20. Explain how stimulus control can be measured in the laboratory.
21. Explain how stimulus control depends on differential responding.
22. Explain the concept of the stimulus generalization gradient, without relying heavily on pictures.
23. Explain how a multiple schedule (Mult VI EXT) works.
24. Explain the concept of occasion setting.
25. Explain why a stimulus that becomes an Sd also becomes a conditioned reinforcer.
26. Five-year-old David gives up easily in the face of frustration. How could you develop his persistence? Provide details.
27. A student tells you that studying schedules of reinforcement is a waste of time. Give arguments for the opposing view.
28. How would Premack define a punisher? How would Timberlake and Allison define a punisher?
29. What are the effects of a delay of reinforcement on instrumental learning, and how can those effects be reduced?
30. Explain the concept of the delay-of-reinforcement gradient. What does the y-axis represent?
31. What is meant by the ABCs of functional analysis?
32. Explain what one does when one carries out a functional analysis of a behavior.
33. Explain how stimulus control can be measured in the laboratory.
34. Explain the concept of a discriminative stimulus, clearly identifying the factors that produce one.
35. Explain how instrumental stimulus discrimination procedures differ from classical conditioning stimulus discrimination procedures.
36. Explain how differential reinforcement influences the shape of the stimulus generalization gradient.

In each of the six questions that follow, identify the ABCs of each situation.
37. Give a clear example of a situation in which human behavior decreases in frequency due to positive punishment (punishment by addition).
38. Give a clear example of a situation in which human behavior decreases in frequency due to negative punishment (punishment by subtraction).
39. Give a clear example of a situation in which human behavior decreases in frequency due to habituation.
40. Give a clear example of a situation in which human behavior increases in frequency due to negative reinforcement.
41. Give a clear example of a situation in which human behavior increases in frequency due to sensitization.
42. Give a clear example of a situation in which human behavior increases in frequency due to spontaneous recovery (from operant conditioning).
43. Parents often observe that their children lose interest in playing with their Christmas toys. What learning process is most likely responsible? Explain.
44. Give two examples of associative learning processes and two examples of nonassociative learning processes.
45. Give a clear example of sensory reinforcement.
46. Give a clear example of how two people might differ in their optimal levels of sensory stimulation. What might produce this difference?
47. What is the basic difference between an observing response and a response reinforced by a conditioned reinforcer?
48. Does extinction work the same way or a different way with responding maintained by secondary versus primary reinforcers? Justify your answer.
49. Give three clear examples of social reinforcers.
50. Give three clear examples of social punishers.
51. Give three clear examples of generalized reinforcers.
52. Give three clear examples of tokens.
53. Explain the peak-shift effect and how it is measured.
54. Explain why the peak-shift effect was considered so important.
55. Explain the fading procedure. How can it be used in a clinical setting?
56. Explain the difference between an Sd and an S∆. How does this difference influence your life?
57. Explain the concept of resurgence and how it is measured. Why does it occur?
58. Explain how stimulus control procedures can allow us to study memory, which clearly cannot be measured directly. What is the y-axis when these results are plotted?
59. Explain the delayed matching-to-sample procedure. What does it measure?
60. Explain how stimulus control procedures can help you improve your study habits.
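Several of the Chapter 5 questions above (items 4-7) concern the Rescorla-Wagner model, which is compact enough to work through in code. The sketch below is illustrative only: the stimulus names ("tone", "light") and the alpha/beta learning-rate values are assumptions chosen for the demonstration, not taken from the text.

```python
# Minimal sketch of the Rescorla-Wagner learning rule. On each trial,
# the associative value V of every CS present changes by
#   delta_V = alpha * beta * (lambda - V_total),
# where V_total is the summed value of all CSs present on that trial
# (this summing is why the model rejects the single-associative-value
# assumption and can explain overshadowing and conditioned inhibition).

def rescorla_wagner_trial(values, present, lam, alpha=0.3, beta=1.0):
    """Update associative values for one conditioning trial.

    values:  dict mapping CS name -> current associative value
    present: list of CS names present on this trial
    lam:     asymptote set by the US (0 on nonreinforced trials)
    """
    v_total = sum(values[cs] for cs in present)
    for cs in present:
        values[cs] += alpha * beta * (lam - v_total)
    return values

# Simple acquisition: a single CS (tone) paired with the US.
v = {"tone": 0.0}
for _ in range(30):
    rescorla_wagner_trial(v, ["tone"], lam=1.0)
print(round(v["tone"], 2))  # value approaches the asymptote of 1.0

# Conditioned inhibition (question 6): once the tone is well trained,
# interleaved tone+light trials with no US drive the light's value
# below zero, making the light a conditioned inhibitor.
v["light"] = 0.0
for _ in range(30):
    rescorla_wagner_trial(v, ["tone"], lam=1.0)           # tone reinforced
    rescorla_wagner_trial(v, ["tone", "light"], lam=0.0)  # compound nonreinforced
print(v["light"] < 0)  # the light ends with a negative associative value
```

Overshadowing (question 7) falls out of the same update: when two stimuli are conditioned in compound from the start, they share the total associative value, so each ends up weaker than it would have been if conditioned alone.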