Operant conditioning (also called instrumental conditioning) involves the regulation of voluntary behavior by its consequences. Thorndike conducted the first systematic studies of operant conditioning in the late 1800s. He placed cats in puzzle boxes and measured how long they took to escape to a waiting bowl of food. He found that with increasing experience, the cats escaped more quickly. Movements that resulted in release from the box, such as stepping on a panel or clawing a loop in a string, became more frequent, whereas movements that were not followed by release became less frequent. This type of operant learning is called "trial-and-error learning," because no systematic attempt is made to teach the behavior. Instead, the organism makes many mistakes, which become less likely over time, and sometimes hits on the solution, which then becomes more likely over time.
B. F. Skinner, beginning in the 1930s, greatly extended and systematized the study of operant conditioning. One of his major contributions was the invention of an apparatus called the operant chamber, which provided a controlled environment in which behavior was automatically recorded. In the operant chamber, an animal such as a rat can make an arbitrary response, such as pressing a small lever on the side of the chamber with its paws. The apparatus can be programmed to record the response automatically and provide a consequence, such as a bit of food. This technique has several advantages. First, the chamber filters out unplanned sights and sounds that could disturb the animal and affect ongoing behavior. Second, the animal is free to make the response at any time, so response rate can vary over a wide range under any experimental manipulation, making it a sensitive measure of the effects of the changes the experimenter makes. Third, the automatic control and recording mean that the procedure can be repeated in exactly the same way in every experimental session and that the experimenter's expectations about what should happen cannot influence the outcome. The operant conditioning chamber is used extensively today in experiments investigating learning in a variety of species from different perspectives.
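The logic of the operant chamber — automatic recording of each response, plus a programmed consequence — can be sketched in a short simulation. All names and numbers here are illustrative assumptions (the source describes no particular schedule); this toy version delivers one food pellet per five lever presses, a so-called fixed-ratio schedule:

```python
import time

class OperantChamber:
    """Toy model of an operant chamber (illustrative, not from the source)."""

    def __init__(self, ratio=5):
        self.ratio = ratio           # presses required per reinforcement (assumed)
        self.presses = []            # automatic response record (timestamps)
        self.pellets_delivered = 0

    def lever_press(self, timestamp=None):
        """Record a response; deliver a consequence when the schedule is met."""
        self.presses.append(timestamp if timestamp is not None else time.time())
        if len(self.presses) % self.ratio == 0:
            self.pellets_delivered += 1

    def response_rate(self, window):
        """Responses per second over the last `window` seconds of the session."""
        if not self.presses:
            return 0.0
        cutoff = self.presses[-1] - window
        return sum(1 for t in self.presses if t > cutoff) / window

# Simulate a session: 20 presses, one per second.
chamber = OperantChamber(ratio=5)
for t in range(20):
    chamber.lever_press(timestamp=float(t))

print(chamber.pellets_delivered)        # 20 presses / ratio of 5 -> 4 pellets
print(chamber.response_rate(window=10.0))  # 10 presses in last 10 s -> 1.0/s
```

Because every response is timestamped by the program rather than by an observer, the record is identical from session to session, which is the reproducibility advantage the text describes.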
One major technique for teaching new behavior is called shaping. Shaping refers to providing a consequence for successive approximations to a desired response. For example, to teach a child to tie shoelaces, a parent might start by crossing the laces, forming the loops and crossing them, and having the child do only the last part of pulling the loops tight. The parent would then praise the child. The parent could then gradually have the child do more and more of the task, until the whole task is successfully completed from the start. This approach ensures that the task is never too far out of reach of the child's current capabilities. Shaping takes place when young children are learning language, too. At first, parents and other caregivers are overjoyed at any approximation of basic words. Over time, however, they require the sounds to be closer and closer to the final, precisely spoken performance. Shaping can be used to teach a wide variety of behaviors in humans and nonhumans. The critical feature is that the requirement for the reward is gradually increased, in pace with the developing skill. If for some reason the behavior deteriorates, the requirement can be lowered until the person is once again successful and then gradually raised again through increasing levels of difficulty. For a consequence to be effective, especially while a behavior is first being learned, it should occur immediately after the behavior and every time the behavior occurs.
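The shaping loop described above — reward approximations, raise the criterion as skill grows, and ease off after failure — can be written as a toy simulation. All names and numbers are illustrative assumptions, not a model taken from the source:

```python
import random

def shape(target=1.0, step=0.1, trials=200, seed=0):
    """Toy shaping procedure: the criterion for reward tracks the learner's
    developing skill, so the task is never far beyond current ability."""
    rng = random.Random(seed)
    skill = 0.0         # learner's current ability (assumed scale 0..1)
    criterion = 0.0     # requirement for reward; starts low, raised gradually
    for _ in range(trials):
        attempt = skill + rng.uniform(-0.05, 0.05)     # noisy performance
        if attempt >= criterion:
            # Successive approximation met: reward strengthens the behavior,
            # and the requirement is raised in pace with the skill.
            skill = min(target, skill + step)
            criterion = min(target, criterion + step)
        else:
            # Behavior fell short: lower the requirement until the learner
            # succeeds again, then proceed back up through the levels.
            criterion = max(0.0, criterion - step)
        if skill >= target:
            break
    return skill

print(shape())
```

The key design choice mirrors the text: the criterion is never fixed at the final performance level but moves with the learner, dropping back whenever the behavior deteriorates, so reward remains frequent throughout acquisition.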