Chapter 1-7 assignments and quizzes

  1. Which of the following is not an example of respondent conditioning? A baby smiling at its parent after having been picked up for smiling in the past
  2. Skinner argued that internal events such as feelings, thoughts, and intentions are behaviors that need to be explained
  3. Behavior analysis seeks to define the principles and rules of behavior, apply them across species, and develop behavior management techniques
  4. Behavior analysts define culture as all the conditions, events, and stimuli arranged by other people that regulate human action
  5. Applied behavior analysis is the use of behavior principles to solve practical problems
  6. The neural basis of reward most closely involves dopamine and endogenous opiates
  7. Analysis of behavior becomes experimental when it involves the manipulation of a condition to see how behavior is affected
  8. A reflex is behavior that is elicited by a biologically relevant stimulus, while an operant is behavior controlled by its consequences.
  9. Watson’s conditioning of Little Albert used a white rat as a neutral stimulus and the sound of a hammer hitting a rail as the unconditioned stimulus.
  10. The experimental analysis of behavior is concerned with controlling and changing factors that affect behavior, a natural-science approach to understanding behavior regulation, and concerned with the principle of reinforcement
  11. The behavior of an organism is everything an organism does, including thinking and feeling
  12. How are thinking and feeling treated from a behavioral perspective? More behavior to be explained
  13. Learning refers to the acquisition of behavior, the maintenance of behavior, and the change in behavior as a result of events
  14. The context of behavior can be defined as both the physiological and environmental conditions that surround a behavior
  15. Which of the following is not a difference between Skinner and Watson? The rejection of internal events as causes of behavior
  16. A researcher who is interested in the effect of serotonin on an individual’s engagement in social behavior would be focused on immediate causation
  17. Selection by consequences occurs at three levels, what are these? Natural selection, behavior selection, and cultural selection
  18. Behavior analysts recognize the importance of biology, but tend to focus more on environment
  19. What does a duckling inherit in terms of imprinting? The capacity to be reinforced by reducing the distance between itself and a moving object
  20. According to Baer, Wolf, and Risley (1968), what is the difference between basic and applied behavior analysis? Basic research is likely to look at any behavior and any variable, and applied research looks at variables that could improve behavior
  21. In operant conditioning, a stimulus that reliably precedes an operant response (SD) is said to set the occasion for the response
  22. Functional analysis involves classifying behavior according to its response functions and analyzing the environment in terms of stimulus functions.
  23. The motivational operation (MO) (motivating operation) refers to any event that alters the reinforcement effectiveness of behavioral consequences and changes the frequency of behavior maintained by those consequences.
  24. The central question in all experiments is whether the changes in the dependent variable are uniquely caused by changes in the independent variable.
  25. The reversal design is ideally suited to show that specific features of the environment control the behavior of a single organism.
  26. One major problem is that behavior, once changed, may not return to baseline levels.
  27. Direct replication involves manipulating the independent variable in the same way for each subject in the experiment.
  28. The experimental analysis of behavior is a systematic set of tactics for the exploration of the controlling variables of behavior.
  29. A response class refers to all the topographic forms of the performance that have similar function.
  30. When the occurrence of an event changes the behavior of an organism, we may say that the event has a stimulus function.
  31. Those events that increase behavior when presented are called positive reinforcers, and those that increase behavior when removed are negative reinforcers.
  32. A researcher reports that a classroom where students received points that could be exchanged for extra recess time whenever they completed math practice problems saw an average increase of five completed homework problems per child per day compared to baseline. In terms of evaluating the effectiveness of this program, the researcher would most likely focus on changes in trend
  33. A(n) response class refers to all the forms of a behavior that have a similar function.
  34. The variable manipulated by the experimenter is the independent variable, and the measured effect is the dependent variable
  35. Any stimulus (or event) that follows a response and increases its frequency is said to have: a reinforcement function
  36. The way an individual chooses to turn on a light switch is an example of topography of response. This is determined by the function and consequences.
  37. A behavior analytic explanation of the Stroop effect would likely rely on response competition between the multiple elements of the stimulus
  38. In terms of the reversal design and behavioral experiments: The A-phase is called the baseline, the B-phase is called the experimental manipulation, and the design is used in single-subject experiments
  39. A researcher examined the possibility that additional recess time would increase the number of math facts learned over a 1-month period. The procedure tested this hypothesis using a first-grade classroom with extended recess time and a second-grade classroom with regular recess time. In this study, the number of math facts learned over 1 month would be considered the: dependent variable
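The reversal (A-B-A-B) design described above can be sketched with a few invented numbers (an illustrative sketch only; the rates below are made up, not data from the text). Baseline (A) phases alternate with intervention (B) phases, and control by the independent variable is shown when behavior rises in B and returns toward baseline in A:

```python
# Hypothetical A-B-A-B reversal design for a single subject.
# Rates are invented for illustration.
phases = [
    ("A (baseline)",     [3, 4, 3, 4, 3]),    # responses per session
    ("B (intervention)", [8, 9, 10, 9, 10]),  # behavior increases
    ("A (baseline)",     [4, 3, 4, 3, 4]),    # behavior returns to baseline
    ("B (intervention)", [9, 10, 9, 10, 9]),  # behavior increases again
]

for label, rates in phases:
    mean = sum(rates) / len(rates)
    print(f"{label}: mean rate = {mean:.1f}")
```

The pattern (low, high, low, high) is what lets a single-subject experiment attribute the change to the manipulation rather than to time or maturation; as item 26 below notes, the design fails when behavior does not reverse.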

  1. Conditioned emotional responses that include an increase in heart rate, perspiration, or a change in blood pressure are examples of respondents
  2. Which of the following would be an example of an abolishing operation (AO) for eating? Putting a lock on the fridge
  3. Negative reinforcers increase behavior when removed
  4. In terms of finding an object that is missing or hidden: a structural account points to stages of development and object permanence, while a behavioral account points to a particular history of reinforcement
  5. Baseline sensitivity means that behavior is sensitive to a low dose of drug
  6. Which of the following research questions would be the most difficult to address using a reversal design? Evaluating the effectiveness of a reading program on a child’s reading ability
  7. In terms of the Stroop effect, behavior analysts point to response competition and history of reinforcement as reasons for hesitation.
  8. The law of the threshold is based on the observation that at very weak intensities a stimulus will not elicit a response, but as the intensity of the eliciting stimulus increases, there is a point at which the response is elicited.
  9. Habituation is observed to occur when an unconditioned stimulus repeatedly elicits an unconditioned response and the response gradually declines in magnitude.
  10. When an unconditioned stimulus elicits an unconditioned response, the relationship is called a reflex
  11. Fixed-action patterns have been observed and documented in a wide range of animals and over a large number of behaviors related to survival and reproduction.
  12. Because these relationships are relatively invariant and biologically based, we refer to the eliciting event as the unconditioned stimulus
  13. The related behavior following the stimulus is called the unconditioned response.
  14. Each organism has a unique ontogenetic history or lifetime of conditioning.
  15. Respondent conditioning involves the transfer of the control of behavior from one stimulus to another by S-S pairing.
  16. Generalization is an adaptive process that allows the organism to respond similarly even when conditions do not remain exactly the same from trial to trial.
  17. Behavior relations that are based on the genetic endowment of an organism are described as phylogenetic and are present on the basis of species history.
  18. A response to the CS presented in training but not to other values of the CS demonstrates respondent discrimination
  19. In second-order conditioning, a conditioned stimulus is paired with the neutral stimulus
  20. Overall, backward conditioning appears to be ineffective except when the CS is biologically relevant
  21. McCully (1982) suggested that many overdoses may be the result of a failure of tolerance due to the absence of the CS
  22. Fixed action patterns are sequences of behavior that are phylogenetic in origin
  23. To do away with an unwanted CR one should present the CS without the US
  24. The conditioning history of an individual can be referred to as their ontogenetic history
  25. With regard to the laws of the reflex, the relation between the conditioned stimulus and the conditioned response (CS-CR) typically does not hold true for any of the laws of the reflex
  26. Complex sequences of released behaviors are called fixed action patterns
  27. Reflexive behavior is said to be involuntary and elicited
  28. Recently, Joan invited Francine to have lunch with her at her favorite restaurant. They both order Francine’s favorite dish and they both received food poisoning from some poorly cleaned lettuce. Joan develops a taste aversion to the dish, but Francine does not. Which concept below best describes why Francine did not develop a taste aversion to the food? Latent inhibition
  29. After several dates with Julie, Max finds that his heart rate increases slightly whenever she walks into the room. When visiting Julie’s house, Max experiences a similar increased heart rate when Julie’s twin sister walks into the room. This is an example of respondent generalization
  30. Overshadowing is not one of the four ways discussed in the text for pairing a CS and a US (the four include simultaneous, delayed, and trace conditioning)
  31. While all members of a species share the same phylogenetic history, each member has a unique ontogenetic history
  32. When the relationship is invariant and biologically based, the eliciting event is the unconditioned stimulus and the behavior following is the unconditioned response
  33. Drug tolerance has been shown to be a result of elicited CRs
  34. Positive reinforcers usually include consequences such as food, praise, and money. These events, however, cannot be called positive reinforcers until they have been shown to increase behavior.
  35. Operants are responses that operate on the environment to produce changes and, as a result, have an increased (or decreased) probability of occurrence.
  36. A positive reinforcer is defined as any consequence that increases the probability of the operant that produced it.
  37. Topography refers to the physical form or characteristics of the response.
  38. A contingency of reinforcement defines the relationship between the events that set the occasion for behavior, the operant class, and the consequences that follow this behavior.
  39. When an operant results in the removal of an event, and this procedure increases the rate of response, the contingency is called negative reinforcement
  40. For example, spanking a child for running onto a busy road is positive punishment if the child now stops (or turns) before reaching the road.
  41. In these examples, watching television, talking to others, and participating in classroom activities are assumed to be reinforcing events. When removal of these events contingent on fighting, telling sexist jokes, or passing notes decreases such behavior, negative punishment has occurred.
  42. Once baseline measures of behavior have been taken, the Premack principle holds that any higher-frequency (or longer duration) behavior may serve as reinforcement for any behavior of lower frequency.
  43. The procedure of withholding reinforcement for a previously reinforced response is called extinction
  44. During magazine training, when a pigeon is placed in a chamber for the first time it may demonstrate a variety of behavioral responses because of the novel features in the chamber that may serve as aversive stimuli
  45. How does in-vitro reinforcement relate to Skinner’s “atoms of behavior”? supports Skinner by showing that reinforcement exists at the neural level
  46. Although she was happy in the relationship, Joan decided to break up with her boyfriend several weeks ago. He tried contacting her for several days afterward, but after several weeks without any response from Joan, he stopped trying to contact her. Then, one day, Joan accidentally sends her ex a text message asking about his day. After this message, he starts frequently texting and calling Joan again. This reappearance of the previously extinguished behavior is an example of reinstatement of responding
  47. Which of the following best illustrates a study using the free-operant method? A rat is placed in a chamber for 1 hour with a freely available lever that delivers food after every 10th response
  48. The instrumental response is the behavior that produces the opportunity to engage in some activity
  49. Kobayashi and colleagues demonstrated that the presentation of juice in the mouth following a spike in neural activity could be used as reinforcer for activity in the lateral prefrontal cortex (LPFC)
  50. Consider the following example: “Your phone won’t allow you to make a call, so you turn it off and then back on again. After this your phone allows you to make a call.” Identify the operant in this example. Turning the phone on and off
  51. Skinner proposed that the basic datum (measure) for operant analysis should be rate
  52. In negative punishment, a stimulus is removed and as a result behavior decreases
  53. Max finds that his new dog will work really hard for bites of a doggy treat at the beginning of a training session but appears to lose interest in the treats the more that Max gives to him. This is an example of satiation
  54. One of the main criticisms of behavioral rewards and reinforcement is the idea that external rewards will lead to lower intrinsic motivation
  55. Andrew spends a lot of time playing guitar and very little time studying. The Premack principle suggests that playing the guitar could be a reinforcer for studying.
  56. An operant class is all the variations of behavior that produce the environmental changes required for reinforcement
  57. Consider the following example: “Bob is running late for work, so he drives faster than usual. As a result of his increased speed, Bob is pulled over by a police officer and receives a ticket. The next time Bob is running late for work he drives at the correct speed.” This is an example of positive punishment
  58. In terms of rewards and intrinsic motivation, Cameron et al. (2001) conducted a statistical procedure called meta-analysis, and one of the findings indicated that verbal rewards increased performance and interest on tasks.
  59. To experimentally study the probability of response, a researcher uses operant rate as the basic measure and follows the free-operant method
  60. Consider the following example: “Kendra stays out two hours past curfew. As a result of her tardiness, her parents take away her privileges to go out for two weeks. The next time Kendra goes out she makes sure to come home on time.” This is an example of negative punishment
  61. A schedule of reinforcement describes the arrangement of stimuli, operants, and consequences.
  62. A fixed ratio schedule is programmed to deliver reinforcement after a fixed number of responses have been made.
  63. Variable ratio schedules are similar to FRs except that the number of responses required for reinforcement changes after each reinforcer is presented.
  64. On fixed interval schedules, an operant is reinforced after a fixed amount of time has passed.
  65. On a variable interval schedule, responses are reinforced after a variable amount of time has passed.
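The schedule definitions in the last few items can be sketched as simple decision rules (an illustrative sketch only; the class names and structure are my own, not from the text). Each schedule answers one question: does this response, at this moment, produce reinforcement?

```python
import random

class FixedRatio:
    """FR n: reinforce every nth response (e.g., FR 20)."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """VR n: like FR, but the required count varies around a mean of n,
    changing after each reinforcer is delivered."""
    def __init__(self, mean):
        self.mean = mean
        self.count = 0
        self.required = random.randint(1, 2 * mean - 1)

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = random.randint(1, 2 * self.mean - 1)
            return True
        return False

class FixedInterval:
    """FI t: reinforce the first response after t seconds have elapsed;
    a variable-interval (VI) schedule would instead vary t around a mean."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.start = 0.0

    def respond(self, now):
        if now - self.start >= self.seconds:
            self.start = now
            return True
        return False

# An FR 20 schedule delivers exactly two reinforcers across 40 responses.
fr20 = FixedRatio(20)
outcomes = [fr20.respond() for _ in range(40)]
print(outcomes.count(True))  # 2
```

Note the structural difference: ratio schedules count responses, so responding faster produces reinforcers sooner, while interval schedules count time, so extra responses before the interval elapses earn nothing; this is the usual explanation for the higher response rates on ratio schedules noted in the next section.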

  1. Fixed-interval schedules produce a characteristic steady-state pattern of responding. There is a pause after reinforcement (PRP), then a few probe responses, followed by more and more rapid responding to a constant high rate as the interval times out. This pattern is called scalloping
  2. During extinction, the break and run pattern shows increasing periods of pausing followed by high rates of response
  3. The time between any two responses, or what is called the inter-response time, may be treated as an operant.
  4. Continuous reinforcement or CRF, is probably the simplest schedule of reinforcement. On this schedule, every operant required by the contingency is reinforced.
  5. Research shows that the PRP is a function of the inter-reinforcement interval (IRI). As the IRI becomes longer, the PRP increases
  6. A slot machine is an example of a random ratio schedule of reinforcement.
  7. Consider the following example: “Darius works at a factory. He is paid for every 20 dolls he makes.” This is an example of a fixed ratio schedule of reinforcement.
  8. The critical measure on progressive-ratio (PR) schedules is typically: the breakpoint where the organism fails to complete the requirement
  9. Schedules that generate predictable stair-step patterns are fixed ratio
  10. Infrequent reinforcement generates responding that is persistent, which is called the partial reinforcement effect
  11. Progressive-ratio (PR) schedules are frequently used to evaluate: the value of a reinforcer
  12. Consider the following example: “Jill’s boss stops by her desk periodically throughout the day to check her progress and ask questions.” This is an example of a variable interval schedule of reinforcement.
  13. Behavioral momentum: refers to behavior that persists or continues in the presence of a stimulus for reinforcement despite disruptive factors
  14. Schedules of reinforcement were first described by: Skinner
  15. Human performance on FI varies from animal data due to self-instruction
  16. Consider the following example: “A rat receives a pellet for the first response after 5 minutes regardless of how often they press the lever.” This is an example of a(n) interval schedule
  17. The shape of the response pattern generated by an FI is called a scallop
  18. A schedule that is made up of a series of alternately presented fixed-ratio (FR) schedules with the following values, FR 5, FR 10, FR 20, FR 25, and FR 40, would be best described as: a variable ratio 20 schedule
  19. The post-reinforcement pause is not caused by: pausing to consume the reinforcer
  20. When considering ratio and interval schedules of reinforcement: ratio schedules produce a higher rate of response
  21. Consider the following example: “A rat receives a pellet for pressing a lever 1 time.” This is an example of a(n) ratio schedule.
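The item above about alternating FR schedules (FR 5, FR 10, FR 20, FR 25, FR 40) can be verified with a quick arithmetic check: the equivalent variable-ratio value is the mean of the component requirements (a minimal sketch, using the values from that item):

```python
# Component fixed-ratio requirements from the alternating series.
fr_values = [5, 10, 20, 25, 40]

# The equivalent variable-ratio value is the mean requirement.
vr_value = sum(fr_values) / len(fr_values)
print(vr_value)  # 20.0 -> a VR 20 schedule
```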

  1. From a behavioral analysis standpoint, a child working hard to achieve good grades is caused by environmental contingencies
  2. Behavior is said to be in transition when it is between stable states
  3. Persons who enjoy watching the Today Show, which airs every morning at the same time, are being reinforced on a fixed interval
  4. Aversive stimuli are those events that organisms evade, avoid, or escape from.
  5. For people, conditioned aversive stimuli include threats, public criticism, a failing grade, a frown, and verbal disapproval.
  6. Any event or stimulus that decreases the rate of operant behavior is called a punisher.
  7. Positive punishment occurs when a stimulus is presented following an operant and the operant decreases in frequency.
  8. Unlike reinforcement, contingencies of punishment do not teach or condition new behavior.
  9. In escape learning, an operant changes the situation from one in which a negative reinforcer is present to one in which it is absent, for some period of time.
  10. Any event or stimulus that increases operant rate by its removal is called a negative reinforcer.
  11. Because the organism only responds when the warning signal occurs, the procedure is called discriminated avoidance.
  12. One way to place avoidance behavior on extinction is to expose the organism to aversive stimulation while preventing effective escape responses.
  13. When an ongoing stimulus is removed contingent on a response and this removal results in a decrease in the rate of behavior, the contingency is called negative punishment or omission.
  14. In terms of dropping out, Sidman (2001) indicates that one basic element is escape due to negative reinforcement.
  15. Which of the following environmental changes most influences responding during a timeout from avoidance procedure? The reduction in response effort during the timeout period.
  16. Which of the following is true regarding escape and avoidance responses? Animals acquire escape responses faster than avoidance responses because the negative reinforcer is immediately absent.
  17. Azrin, Holz, and Hake (1963) found that when pigeons were shocked with a punisher that increased in intensity gradually the birds would continue to respond, and when they were shocked with a punisher of moderate intensity the birds quit responding.
  18. Azrin, Holz, and Hake’s (1963) study on punishment and food deprivation in pigeons demonstrated that: the more food deprived the pigeons were the lower the efficacy of the punisher.
  19. The response cost procedure is an example of negative punishment.
  20. Contingencies of punishment do not teach or condition new behavior.
  21. Research on the use of skin-shock punishment in treatment of self-injurious behavior indicates that skin-shock treatment eliminates the need for physical restraint.
  22. Consider the following example: “Paige got a ticket for littering. As a result, she has to pick up trash along the highway for at least 20 hours”. This is an example of overcorrection.
  23. If a behavior analyst wants to reduce the rate of response in an organism: they can use satiation, extinction, punishment, or behavioral contrast contingencies.
  24. Skinner (1953) reported a game played by sailors in the 18th century. The game involved each boy being told to hit another boy when he was hit, a slight tap on one boy, and tying several boys in a ring.
  25. Regarding the side effects of punishment, Solomon’s (1969) solution is to search for the rules or principles governing such side effects.
  26. If wheel running is a higher frequency operant, then wheel running will reinforce drinking; if wheel running is a lower frequency operant, then wheel running will punish drinking
  27. Hackenberg and Hineline (1987) gave one group of rats an avoidance period from electric shock before food and one group after food. They found that rats respond to long-term aversive consequences in their environment.
  28. Berton and colleagues (2006) found that the regulation of avoidance behavior in mice confronted by an aversive social target requires BDNF (brain-derived neurotrophic factor) from the ventral tegmental area (VTA).
  29. In general, adjunctive behavior refers to any excessive and persistent behavior patterns that occur as a side effect of reinforcement delivery.
  30. Respondent behavior is elicited by the events that precede it, and operants are strengthened (or weakened) by the stimulus consequences that follow them.
  31. Taste aversion is another example of biological factors underlying conditioning
  32. That is, once respondent behavior is elicited and reinforced, it is controlled by its consequences and is considered operant.
  33. Operant conditioning during an organism’s lifetime selects response topographies, rates of response, and repertoires of behavior through the feedback from reinforcing consequences.
  34. Instinctive drift refers to species-characteristic behavior patterns that become progressively more invasive during training or conditioning.
  35. Thus, the temporal arrangement of signal followed by response, and the topography of the responses, both suggest respondent conditioning.
  37. There are also occasions when behavior that appears to be respondent is regulated by its consequences and is therefore operant behavior.
  38. Therefore, it appears that for some stimuli the animal is prepared by nature to make a connection, and for others it may even be contraprepared
  39. What does the evidence suggest about the operant conditioning of reflexive behavior? Reflexes can be conditioned by operant procedures in some circumstances
  40. Pierce, Epling, and their colleagues found that the reinforcing value of wheel running decreased as food consumption increased
  41. According to Falk (1977), schedule-induced or adjunctive behavior could be displacement behavior

  1. Excessive drinking is technically called polydipsia
  2. During omission training pigeons produce short duration pecks to the illuminated key and during autoshaping, pigeons produce long duration pecks to the illuminated key.
  3. When considering adjunctive behaviors, there is an increase in adjunctive behaviors when the time between reinforcement increases.
  4. In taste aversion studies that condition both quails and rats to being sick after drinking salty blue water, researchers have found that following recovery when the animals are given the option for either salty water or normal water colored blue: the rats choose the normal blue water and the quails choose the salty water
  5. Falk (1977) suggested that the adaptive function of adjunctive behavior may be to maintain the animal on the schedule during periods when it would be likely to leave or escape
  6. The observation that pigeons will peck at a key light associated with the presentation of food even when doing so cancels the delivery of reinforcement is seen as evidence that pecking at the key light is a respondent behavior
  7. The bivalent effect of wheel running in rats refers to the finding that wheel running creates taste aversions for novel foods that precede the activity and taste preferences for novel foods that follow the activity
  8. What is the relationship between adjunctive behavior and the level of deprivation? The higher the level of deprivation the greater the adjunctive behavior
  9. During the time between food reinforcers, rats engage in interim, facultative, and terminal behaviors
  10. In terms of operant conditioning of reflexive behavior, the experiment by Miller and Carmona (1967) showed: that the increased flow of saliva was accompanied by the dogs being more alert
  11. Phenomena like instinctive drift, sign tracking, and autoshaping have been analyzed as both stimulus substitution when the CS substitutes for the US, and behavior systems activated by the US and the physical properties of the CS
  12. The basic finding for activity anorexia is both that decreased food intake increases physical activity and that physical activity decreases food intake