On March 20, 1904, American psychologist, behaviorist, author, inventor, and social philosopher Burrhus Frederic (B. F.) Skinner was born. His pioneering work in experimental psychology promoted behaviorism, which shapes behavior through positive and negative reinforcement, and demonstrated operant conditioning. The “Skinner box,” which he used in his experiments from 1930 onward, remains famous.
Skinner grew up in a small town in Pennsylvania as the son of a lawyer. He received his BA in English from Hamilton College in upstate New York. The young man badly wanted to become a writer: he authored several poems and short stories and built himself a study in his parents’ attic to concentrate, but it just wasn’t working out for him.
After some time, Skinner decided to go back to school, studying psychology at Harvard University and receiving his doctorate in 1931. For five years, Skinner continued his research at Harvard. Over the next ten years, he taught and researched at several universities across the United States before returning to Harvard in 1948, where he remained for the rest of his life.
During his research, Skinner noticed that classical conditioning did not account for the behaviors most of us are interested in, such as riding a bike or writing a book. His observations led him to propose a theory of how these and similar behaviors, called operants, come about. In operant conditioning, an operant is actively emitted and produces changes in the world that alter the likelihood that the behavior will occur again. Operant conditioning has two basic purposes: increasing or decreasing the probability that a specific behavior will occur in the future. These are accomplished by adding or removing one of two basic types of stimuli, positive (pleasant) or negative (aversive). That means: if the probability of a behavior is increased as a consequence of the presentation of a stimulus, that stimulus is a positive reinforcer; vice versa, if the probability of a behavior is increased as a consequence of the withdrawal of a stimulus, that stimulus is a negative reinforcer.
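The two definitions above can be condensed into a tiny lookup. The function below is only a sketch for illustration; the labels and function name are my own shorthand, not Skinner's terminology:

```python
def classify_reinforcer(stimulus_change: str) -> str:
    """Classify a stimulus by how it increases a behavior's probability.

    Presentation of the stimulus increases the behavior -> positive reinforcer.
    Withdrawal of the stimulus increases the behavior  -> negative reinforcer.
    """
    if stimulus_change == "presented":
        return "positive reinforcer"
    if stimulus_change == "withdrawn":
        return "negative reinforcer"
    raise ValueError(f"unknown stimulus change: {stimulus_change!r}")

print(classify_reinforcer("presented"))   # a pleasant stimulus is delivered
print(classify_reinforcer("withdrawn"))   # an aversive stimulus is removed
```

In both cases the behavior becomes more likely; what differs is whether the stimulus was added to or removed from the animal's environment.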
The famous Skinner box has a bar or pedal on one wall that, when pressed (for example, by a rat), causes a small mechanism to release a food pellet into the cage. So when the rat is put in the cage and (probably by accident) presses the bar for the first time, a food pellet falls into the cage. The operant in this scenario is the behavior just prior to the reinforcer, which is the food pellet, of course. Soon the rat is furiously pedaling away at the bar, hoarding its pile of pellets in the corner of the cage. A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future. If the rat is no longer given pellets when it presses the bar, it will stop doing so, which is called extinction of the operant behavior. If the pellet machine is then turned back on, however, the rat will relearn the behavior much more quickly than the first time. By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response, and punishment.
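The acquisition-and-extinction cycle described above can be sketched as a toy simulation. This is a minimal illustrative model, not Skinner's actual data or method: the press probability, learning rate, and extinction rate are hypothetical values chosen only to make the dynamics visible.

```python
import random

class SimulatedRat:
    """Toy Skinner-box model: the probability of pressing the bar rises
    when a press is reinforced with a pellet and decays when it is not.
    All rates are made-up illustrative numbers."""

    def __init__(self, p_press=0.05, learn=0.2, forget=0.1):
        self.p_press = p_press  # current probability of pressing per trial
        self.learn = learn      # how strongly a pellet reinforces pressing
        self.forget = forget    # how quickly unreinforced pressing extinguishes

    def trial(self, dispenser_on: bool) -> bool:
        pressed = random.random() < self.p_press
        if pressed:
            if dispenser_on:
                # reinforcement: move press probability toward 1
                self.p_press += self.learn * (1.0 - self.p_press)
            else:
                # extinction: press goes unrewarded, probability decays
                self.p_press -= self.forget * self.p_press
        return pressed

random.seed(0)
rat = SimulatedRat()
for _ in range(200):            # acquisition phase: pellets available
    rat.trial(dispenser_on=True)
acquired = rat.p_press
for _ in range(200):            # extinction phase: dispenser switched off
    rat.trial(dispenser_on=False)
print(f"press probability after acquisition: {acquired:.2f}")
print(f"press probability after extinction:  {rat.p_press:.2f}")
```

The design choice here mirrors the text: reinforcement only acts on trials where the operant is actually emitted, so an accidental first press is what bootstraps the learning, and withholding pellets lets the behavior fade again.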
At yovisto academic video search, you may be interested in the video lecture What positive psychology can help you become by Martin Seligman.
References and Further Reading:
-  M. Koren, B.F. Skinner: The Man Who Taught Pigeons to Play Ping-Pong and Rats to Pull Levers, at Smithsonian magazine
-  Saul McLeod: Skinner – Operant Conditioning, at Simply Psychology
-  B. F. Skinner at Wikidata
-  Pavlov and the Conditional Reflex, SciHi Blog, December 27, 2014.
-  Timeline for B. F. Skinner, via Wikidata