B.F. Skinner
Essay by review • September 2, 2010 • Essay • 1,419 Words (6 Pages) • 2,052 Views
Psychologist, born in Susquehanna, Pa. He studied at Harvard and taught there (1931-6, 1947-74). A leading behaviorist, he was a proponent of operant conditioning and the inventor of the Skinner box for facilitating experimental observations.
B. F. Skinner's entire system is based on operant conditioning. The organism is in the process of "operating" on the environment, which in ordinary terms means it is bouncing around the world, doing what it does. During this "operating," the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant -- that is, the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future."
Say you have a dog, and he's just playing around with his toys. Then you throw a toy at him, he catches it, and you give him a treat. All of a sudden the dog starts catching toys whenever you toss one in the air or at his mouth. The operant is the behavior just prior to the reinforcer, which here is the treat.
Then what if you decide to stop giving the dog treats? Well, he'll stop his little trick, which you, the owner, were enjoying. This is called extinction of the operant behavior.
Now, if you were to start offering the dog treats again, he'd most likely want to start doing the tricks again, and he'd pick them up more quickly than he learned them at first. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the dog was reinforced for performing the tricks.
Continuous reinforcement is the original scenario: Every time that the dog does the behavior (such as performing a trick), he gets a treat.
The fixed ratio schedule was the first one Skinner discovered: If the dog does the trick three times, say, he gets a goodie. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc. This is a little like "piece rate" in the clothing manufacturing industry: You get paid so much for so many shirts.
Skinner also looked at variable schedules. Variable ratio means you change the "x" each time: first it takes 3 tricks to get a treat, then 10, then 1, then 7, and so on. Variable interval means you keep changing the time period between rewards -- first 20 seconds, then 5, then 35, then 10, and so on.
In both cases, it keeps the dog on his toes. With the variable interval schedule, he no longer "paces" himself, because he can no longer establish a "rhythm" between behavior and reward. Most importantly, these schedules are very resistant to extinction. It makes sense, if you think about it: if you haven't gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad" ratio or interval.
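To make the schedules concrete, here is a tiny toy simulation of a fixed ratio versus a variable ratio schedule. This is just an illustration I'm adding -- the function names and numbers are made up, not anything from Skinner's actual experiments:

```python
import random

def fixed_ratio(n):
    """Deliver a reinforcer after every n-th response (e.g., FR-3)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # treat!
        return False
    return respond

def variable_ratio(mean, rng=random):
    """Deliver a reinforcer after a varying number of responses, averaging `mean`."""
    count = 0
    target = rng.randint(1, 2 * mean - 1)   # next payoff point, drawn at random
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count = 0
            target = rng.randint(1, 2 * mean - 1)   # re-draw: dog can't predict it
            return True   # treat!
        return False
    return respond

# Fixed ratio 3: exactly every third trick earns a treat -- a predictable rhythm.
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])   # [False, False, True, False, False, True]

# Variable ratio 5: treats come unpredictably, about once per 5 tricks on average.
random.seed(0)
vr5 = variable_ratio(5)
print(sum(vr5() for _ in range(1000)))
```

The point of the sketch is the difference in predictability: on the fixed schedule the payoff pattern is perfectly regular, while on the variable schedule no single response carries information about when the next treat comes -- which is exactly why the animal keeps responding and the behavior resists extinction.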
A question Skinner had to deal with was how we get to more complex sorts of behaviors. He responded with the idea of shaping, or "the method of successive approximations." Basically, it involves first reinforcing a behavior only vaguely similar to the one desired. Once that is established, you look out for variations that come a little closer to what you want, and so on, until you have the animal performing a behavior that would never show up in ordinary life. Skinner and his students have been quite successful in teaching simple animals to do some quite extraordinary things.
I actually saw my mother use shaping on my brother once, now that I think about it. When we were younger, my brother was afraid of the dark, so he liked to keep his bedroom door cracked open with the hall light shining through. My mother started closing the door just a little bit more every night, and after a couple of weeks he was sleeping like a baby with the door closed and the lights off.
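The shaping loop -- reinforce rough approximations first, then gradually tighten the criterion -- can be sketched as a toy simulation. Again, this is a made-up model for illustration, not an implementation of any real experiment:

```python
import random

def shape(target, start_tol, final_tol, trials_per_stage=50, rng=random):
    """Toy model of shaping by successive approximations.

    A behavior is modeled as a single number. The learner emits random
    variations around its current habit; any attempt that lands within
    the current tolerance of the target is reinforced, pulling the habit
    toward it. Each stage, the tolerance is halved, so only ever-closer
    approximations earn a reward.
    """
    habit = 0.0            # where the behavior starts (far from the target)
    tol = start_tol
    while tol >= final_tol:
        for _ in range(trials_per_stage):
            attempt = habit + rng.gauss(0, tol)    # natural variation
            if abs(attempt - target) <= tol:       # close enough: reinforce
                habit = attempt                    # habit shifts toward target
        tol /= 2                                   # raise the bar each stage
    return habit

random.seed(1)
print(shape(target=10.0, start_tol=10.0, final_tol=0.1))   # ends up close to 10.0
```

Notice the same logic as the door anecdote: each stage demands only a small step beyond the last one the learner already performs reliably, so a behavior that "would never show up in ordinary life" gets built out of ordinary variations.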
This is the same method that is used in the therapy called systematic desensitization, invented by another behaviorist named Joseph Wolpe. A person with a phobia -- say of spiders -- would be asked to come up with ten scenarios involving spiders and panic of one degree or another. The first scenario would be a very mild one -- say seeing a small spider at a great distance outdoors. The second would be a little more scary, and so on, until the tenth scenario would involve something totally terrifying -- say a tarantula climbing on your face while you're driving your car at a hundred miles an hour! The therapist will then teach you how to relax your muscles -- which is incompatible with anxiety. After you practice that for a few days, you come back and you and the therapist go through your scenarios, one step at a time, making sure you stay relaxed, backing off if necessary, until you can finally imagine the tarantula while
...