Questions and Answers of Methods Behavioral Research
2. In clicker training with dogs, the click is a s________ reinforcer that has been established by first pairing it with f________, which is a p________ reinforcer.
1. Shaping is the creation of operant behavior through the reinforcement of s________ a________ to that behavior.
5. In most cases, the most important consequence in developing a highly effective slapshot in hockey will be the (contrived/natural) consequence of where the puck goes and how fast it travels.
4. In applied behavior analysis, although one might initially use (contrived/natural) consequences to first develop a behavior, the hope is that, if possible, the behavior will become tr________ by the n________ c________
3. You thank your roommate for helping out with the housework in an attempt to motivate her to help out more often. To the extent that this works, the thank-you is an example of a(n) ________
2. You flip the switch and the light comes on. The light coming on is an example of a(n) (contrived/natural) reinforcer; in general, it is also an example of an (intrinsic/extrinsic) reinforcer.
1. A(n) ________ reinforcer is a reinforcer that typically occurs for that behavior in that setting; a(n) ________ reinforcer is one that typically does not occur for that behavior in that setting.
4. They also found that extrinsic rewards generally increased intrinsic motivation when the rewards were (tangible/verbal), and that tangible rewards increased intrinsic motivation when they were
3. In their meta-analysis of relevant research, Cameron and Pierce (1994) found that extrinsic rewards decrease intrinsic motivation only when they are (expected/unexpected), (tangible/verbal), and
2. Running to lose weight is an example of an ________ motivated activity; running because it “feels good” is an example of an ________ motivated activity.
1. An ________ motivated activity is one in which the activity is itself reinforcing; an ________ motivated activity is one in which the reinforcer for the activity consists of some type of additional consequence that
7. Behavior modification programs in institutional settings often utilize generalized reinforcers in the form of t________. This type of arrangement is known as a t________ e________.
6. Two generalized secondary reinforcers that have strong effects on human behavior are ________ and ________.
5. A generalized reinforcer (or generalized secondary reinforcer) is a secondary reinforcer that has been associated with ________.
4. A (CS/US) that has been associated with an appetitive (CS/US) can serve as a secondary reinforcer for an operant response. As well, a stimulus that serves as a(n) ________ for an operant response can also
3. Honey is for most people an example of a ________ reinforcer, while a coupon that is used to purchase the honey is an example of a ________ reinforcer.
2. Events that become reinforcers through their association with other reinforcers are called s________ reinforcers. They are sometimes also called ________ reinforcers.
1. Events that are innately reinforcing are called p________ reinforcers. They are sometimes also called un________ reinforcers.
3. It has been suggested that delayed reinforcers (do / do not) function in the same manner as immediate reinforcers. Rather, the effectiveness of delayed reinforcers in humans is largely dependent
2. It is sometimes difficult for students to study in that the reinforcers for studying are ________ and therefore w________, whereas the reinforcers for alternative activities are ________ and therefore s________.
1. In general, the more ________ the reinforcer, the stronger its effect on behavior.
5. When Tenzing shared his toys with his brother, his mother stopped criticizing him. Tenzing now shares his toys with his brother quite often. The consequence for sharing the toys was the ________ of a
4. When Alex held the car door open for Stephanie, she made a big fuss over what a gentleman he was becoming. Alex no longer holds the car door open for her. The consequence for holding open the door
3. When Alex burped in public during his date with Stephanie, she got angry with him. Alex now burps quite often when he is out on a date with Stephanie. The consequence for burping was the ________ of a
2. Whenever Sasha pulled the dog’s tail, the dog left and went into another room. As a result, Sasha now pulls the dog’s tail less often when it is around. The consequence for pulling the dog’s
1. When Sasha was teasing the dog, it bit her. As a result, she no longer teases the dog. The consequence for Sasha’s behavior of teasing the dog was the (presentation/removal) of a stimulus, and
5. Turning down the heat because you are too hot is an example of an (escape/avoidance) response; turning it down before you become too hot is an example of an (escape/avoidance) response.
4. With respect to escape and avoidance, an ________ response is one that terminates an aversive stimulus, while an ________ response is one that prevents an aversive stimulus from occurring. Escape and avoidance
3. Karen cries while saying to her boyfriend, “John, I don’t feel as though you love me.” John gives Karen a big hug saying, “That’s not true, dear, I love you very much.” If John’s hug
2. When the dog sat at your feet and whined during breakfast one morning, you fed him. As a result, he sat at your feet and whined during breakfast the next morning. The consequence for the dog’s
1. When you reached toward the dog, he nipped at your hand. You quickly pulled your hand back. As a result, he now nips at your hand whenever you reach toward him. The consequence for the dog’s
4. Reinforcement is related to a(n) (increase/decrease) in behavior, whereas punishment is related to a(n) (increase/decrease) in behavior.
3. Within the context of reinforcement and punishment, positive refers to the (addition/subtraction) of something, and negative refers to the (addition/subtraction) of something.
2. The word positive, when combined with the words reinforcement or punishment, (does / does not) mean that the consequence is good or pleasant. Similarly, the term negative, when combined with the
1. The word positive, when combined with the words reinforcement or punishment, means only that the behavior is followed by the ________ of something. The word negative, when combined with the words
9. A bell that signals the start of a round and therefore serves as an SD for the operant response of beginning to box may also serve as a(n) (SD/CS) for a fear response. This is an example of how
8. A stimulus in the presence of which a response is punished is called a ________ for ________. It can be given the symbol ________.
7. Another way of thinking about the three-term contingency is that you ________ something, ________ something, and ________ something.
6. The three-term contingency can also be thought of as an ABC sequence, where A stands for ________ event, B stands for ________, and C stands for ________.
5. Using the appropriate symbols, label each component in the following three-term contingency (assume that the behavior will be strengthened): Phone rings: Answer phone → Conversation with friend
4. A discriminative stimulus (does / does not) elicit behavior in the same manner as a CS.
3. A discriminative stimulus is said to “________ for the behavior,” meaning that its presence makes the response (more/less) likely to occur.
2. A discriminative stimulus is usually indicated by the symbol ________.
1. The operant conditioning procedure usually consists of three components: (1) a d________ s________, (2) an o________ response, and (3) a c________.
13. Clayton stopped plugging in the toaster after he received an electric shock while doing so. This is an example of (punishment/extinction). Manzar stopped using the toaster after it no longer made
12. Weakening a behavior through the withdrawal of reinforcement for that behavior is known as ________.
11. When we chastise a child for being rude, are we attempting to punish: (a) the child who was rude or (b) the child’s rude behavior?
10. When we give a dog a treat for fetching a toy, are we attempting to reinforce: (a) the behavior of fetching the toy or (b) the dog that fetched the toy?
9. When labeling an operant conditioning procedure, punishing consequences (punishers) are given the symbol ________ (which stands for ________), while reinforcing consequences (reinforcers) are given the symbol
8. Each time Edna talked out in class, her teacher immediately came over and gave her a hug. As a result, Edna no longer talks out in class. By definition, the hug is a(n) ________ because the behavior it
7. When Moe stuck his finger in a light socket, he received an electric shock. As a result, he now sticks his finger in the light socket as often as possible. By definition, the electric shock was a
6. Reinforcers and punishers are defined entirely by their ________ on behavior. For this reason, the term reinforcer is often preferred to the term ________ because the latter is too closely associated with events
5. Eliminating a dog’s tendency to jump up on visitors by scolding her when she does so is an example of ________, while the scolding itself is a ________.
4. Strengthening a roommate’s tendency toward cleanliness by thanking her when she cleans the bathroom is an example of ________, while the thanks itself is a ________.
3. The terms reinforcement and punishment refer to the pr________ or pr________ whereby a behavior is strengthened or weakened by its consequences.
2. More specifically, a reinforcer is a consequence that (precedes/follows) a behavior and (increases/decreases) the probability of that behavior. A punisher is a consequence that (precedes/follows)
1. Simply put, reinforcers are those consequences that s________ a behavior, while punishers are those consequences that w________ a behavior.
6. Operant behavior is usually defined as a(n) ________ of responses rather than a specific response.
5. Operant responses are also simply called ________.
4. Classically conditioned behaviors are said to be e________ by the stimulus, while operant behaviors are said to be e________ by the organism.
3. The process of operant conditioning involves the following three components: (1) a r________ that produces a certain ________, (2) a c________ that serves to either increase or decrease the likelihood of the ________ that preceded
2. Operant conditioning is similar to the principle of natural selection in that an individual’s behaviors that are (adaptive/nonadaptive) tend to increase in frequency, while behaviors that are
1. Skinner’s definition of operant conditioning differs from Thorndike’s law of effect in that it views consequences in terms of their effect upon the strength of behavior rather than whether
7. Skinner originally thought all behavior could be explained in terms of ________, but he eventually decided that this type of behavior could be distinguished from another, seemingly more voluntary type of
6. Skinner’s procedures are also known as fr________ o________ procedures in that the animal controls the rate at which it earns food.
5. In the original version of the Skinner box, rats earn food by p________ a l________; in a later version, pigeons earn a few seconds of access to food by p________ at an illuminated plastic disc known as a ________.
4. The Skinner box evolved out of Skinner’s quest for a procedure that would, among other things, yield (regular/irregular) patterns of behavior.
3. According to Thorndike, behaviors that worked were st________ i________, while behaviors that did not work were st________ o________.
2. Based on his research with cats, Thorndike formulated his famous ________ of ________, which states that behaviors that lead to a(n) ________ state of affairs are strengthened, while behaviors that lead to a(n) ________ state of
1. Thorndike’s cats learned to solve the puzzle box problem (gradually/suddenly) .
3. Another name for operant conditioning is ________ conditioning.
2. Elicited behavior is a function of what (precedes/follows) it; operant behavior is a function of what (precedes/follows) it.
1. Operant behaviors are influenced by their ________.
16. Diagram an example of a classical conditioning procedure that results in an alteration (strengthening or weakening) of immune system functioning. Diagram an example of a classical conditioning
15. Define aversion therapy. What is covert sensitization?
14. Define flooding. Be sure to mention the underlying process by which it is believed to operate. Also, what is the distinction between imaginal and in vivo versions of flooding?
13. Outline the three components of systematic desensitization.
12. What is counterconditioning? Name and define the underlying process.
11. How likely would a child who had very little control over important events in her life be to later acquire a phobia (compared to a child who had more control over important events)?
10. Describe how selective sensitization and incubation can affect the acquisition of a phobia.
9. Describe how temperament and preparedness can affect the acquisition of a phobia. Be sure that your answer clearly indicates the difference between them.
8. Assuming that the look of fear in others can act as a US, diagram an example of observational learning in the acquisition of a phobia. Be sure to include the appropriate abbreviations (NS, US,
7. Briefly describe the Watson and Rayner experiment with Little Albert and the results obtained.
6. Describe the overexpectation effect and how the Rescorla-Wagner theory accounts for it.
5. Describe the Rescorla-Wagner theory. Describe how the Rescorla-Wagner theory accounts for overshadowing and blocking.
4. Describe the compensatory-response model of conditioning. How does the compensatory-response model account for drug overdoses that occur when an addict seems to have injected only a normal amount
3. Describe the preparatory-response theory of conditioning.
2. Describe stimulus-substitution theory. What is the major weakness of this theory?
1. Distinguish between S-R and S-S models of conditioning.
3. Supporting the possibility that placebo effects are classically conditioned responses, such effects are more likely to occur (following/preceding) a period of treatment with the real drug. As well,
2. Diagram the classical conditioning process in Ader and Cohen’s (1975) study of immunosuppression. Label each component using the appropriate abbreviations.
1. When Christopher entered his friend’s house, he noticed a dog dish beside the door. He soon began experiencing symptoms of asthma and assumed that the house was filled with dog dander (particles
5. Aversion therapy is sometimes carried out using ________ stimuli rather than real stimuli. This type of treatment procedure is known as ________ sensitization.
4. In general, aversion therapy is (more/less) effective when the unpleasant response that is elicited is biologically relevant to the problematic behavior.
3. A highly effective procedure for reducing cigarette consumption, at least temporarily, is r____________
2. A standard treatment for alcoholism is to associate the taste of alcohol with feelings of n________ that have been induced by consumption of an e________.
1. In ________ therapy, one attempts to reduce the attractiveness of an event by associating that event with an unpleasant stimulus.
5. Öst’s single-session procedure combines the gradualness of s________ d________ with the prolonged exposure time of f________. This procedure also makes use of p________ m________, in which the therapist demonstrates how to interact
4. Modern-day therapies for phobias are often given the general name of e________-b________ treatments.
3. For flooding therapy to be effective, the exposure period must be of relatively (long/short) duration.