Burrhus Frederic Skinner (20 March 1904 – 8 August 1990) was an American psychologist, behaviourist, author, inventor, and social philosopher. He was a professor of psychology at Harvard University from 1958 until his retirement in 1974.
Considering free will to be an illusion, Skinner saw human action as dependent on the consequences of previous actions, a theory he articulated as the principle of reinforcement: if the consequences of an action are bad, there is a high chance the action will not be repeated; if the consequences are good, the probability of the action being repeated increases.
Skinner developed behaviour analysis, especially the philosophy of radical behaviourism, and founded the experimental analysis of behaviour, a school of experimental research psychology. He also used operant conditioning to strengthen behaviour, considering the rate of response to be the most effective measure of response strength. To study operant conditioning, he invented the operant conditioning chamber (also known as the Skinner box), and to measure rate he invented the cumulative recorder. Using these tools, he and Charles Ferster produced Skinner’s most influential experimental work, outlined in their 1957 book Schedules of Reinforcement.
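The core idea that reinforcement raises the rate of a response can be illustrated with a toy simulation. This is not Skinner’s apparatus or any published model; it is a minimal sketch in which a response probability is nudged upward each time the response is followed by a reinforcer, and a running total of responses plays the role of a cumulative record (all parameter names and values are illustrative assumptions):

```python
import random

def simulate_operant(trials=1000, p_reinforce=0.5, lr=0.05, seed=0):
    """Toy model: a response whose probability of being emitted is
    strengthened each time it happens to be followed by a reinforcer."""
    rng = random.Random(seed)
    p_respond = 0.1          # initial probability of emitting the response
    cumulative = [0]         # cumulative-record style running total of responses
    for _ in range(trials):
        responded = rng.random() < p_respond
        if responded and rng.random() < p_reinforce:
            # reinforcement nudges the response probability upward
            p_respond = min(1.0, p_respond + lr * (1.0 - p_respond))
        cumulative.append(cumulative[-1] + int(responded))
    return p_respond, cumulative

final_p, record = simulate_operant()
print(round(final_p, 2))  # response probability after training
```

In a plot of `record` against trials, the slope (the rate of response) grows steeper as training proceeds, which is exactly the quantity Skinner’s cumulative recorder made visible.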
Skinner was a prolific author, publishing 21 books and 180 articles. He imagined the application of his ideas to the design of a human community in his 1948 utopian novel, Walden Two, while his analysis of human behaviour culminated in his 1957 work, Verbal Behaviour.
Skinner, John B. Watson, and Ivan Pavlov are considered the pioneers of modern behaviourism. A June 2002 survey listed Skinner as the most influential psychologist of the 20th century.
Education and Later Life
Skinner attended Hamilton College in New York with the intention of becoming a writer. He wrote for the school paper, but, as an atheist, he was critical of the traditional mores of his college. After receiving his Bachelor of Arts in English literature in 1926, he attended Harvard University, where he would later research and teach. While attending Harvard, a fellow student, Fred S. Keller, convinced Skinner that he could make an experimental science of the study of behaviour. This led Skinner to invent a prototype for the Skinner box and to join Keller in the creation of other tools for small experiments.
After graduation, Skinner unsuccessfully tried to write a novel while he lived with his parents, a period that he later called the “Dark Years”. He became disillusioned with his literary skills despite encouragement from the renowned poet Robert Frost, concluding that he had little world experience and no strong personal perspective from which to write. His encounter with John B. Watson’s behaviourism led him into graduate study in psychology and to the development of his own version of behaviourism.
Skinner received a PhD from Harvard in 1931, and remained there as a researcher for some years. In 1936, he went to the University of Minnesota in Minneapolis to teach. In 1945, he moved to Indiana University, where he was chair of the psychology department from 1946 to 1947, before returning to Harvard as a tenured professor in 1948. He remained at Harvard for the rest of his life. In 1973, Skinner was one of the signers of the Humanist Manifesto II.
Contributions to Psychology
Skinner referred to his approach to the study of behaviour as radical behaviourism, which originated in the early 1900s as a reaction to depth psychology and other traditional forms of psychology that often had difficulty making predictions that could be tested experimentally. This philosophy of behavioural science assumes that behaviour is a consequence of environmental histories of reinforcement (see applied behaviour analysis). In his words:
The position can be stated as follows: what is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer’s own body. This does not mean, as I shall show later, that introspection is a kind of psychological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection. At the moment we must content ourselves, as the methodological behaviorist insists, with a person’s genetic and environmental histories. What are introspectively observed are certain collateral products of those histories.… In this way we repair the major damage wrought by mentalism. When what a person does [is] attributed to what is going on inside him, investigation is brought to an end. Why explain the explanation? For twenty-five hundred years people have been preoccupied with feelings and mental life, but only recently has any interest been shown in a more precise analysis of the role of the environment. Ignorance of that role led in the first place to mental fictions, and it has been perpetuated by the explanatory practices to which they gave rise.
Foundations of Skinner’s Behaviourism
Skinner’s ideas about behaviourism were largely set forth in his first book, The Behaviour of Organisms (1938). Here, he gives a systematic description of the manner in which environmental variables control behaviour. He distinguished two sorts of behaviour which are controlled in different ways:
- Respondent behaviours are elicited by stimuli, and may be modified through respondent conditioning, often called classical (or Pavlovian) conditioning, in which a neutral stimulus is paired with an eliciting stimulus. Such behaviours may be measured by their latency or strength.
- Operant behaviours are “emitted”, meaning that initially they are not induced by any particular stimulus. They are strengthened through operant conditioning (also known as instrumental conditioning), in which the occurrence of a response yields a reinforcer. Such behaviours may be measured by their rate.
Both of these sorts of behaviour had already been studied experimentally, most notably: respondents, by Ivan Pavlov; and operants, by Edward Thorndike. Skinner’s account differed in some ways from earlier ones, and was one of the first accounts to bring them under one roof.
The idea that behaviour is strengthened or weakened by its consequences raises several questions. Among the most commonly asked are these:
- Operant responses are strengthened by reinforcement, but where do they come from in the first place?
- Once it is in the organism’s repertoire, how is a response directed or controlled?
- How can very complex and seemingly novel behaviours be explained?
Origin of Operant Behaviour
Skinner’s answer to the first question was very much like Darwin’s answer to the question of the origin of a ‘new’ bodily structure, namely, variation and selection. Similarly, the behaviour of an individual varies from moment to moment; a variation that is followed by reinforcement is strengthened and becomes prominent in that individual’s behavioural repertoire. Shaping was Skinner’s term for the gradual modification of behaviour by the reinforcement of desired variations. Skinner believed that ‘superstitious’ behaviour can arise when a response happens to be followed by reinforcement to which it is actually unrelated. The ‘lucky socks’ some athletes wear are an example: if an athlete wins while wearing a particular pair of socks and loses a game played without them, the coincidence reinforces wearing the socks during games, and each further coincidence strengthens the superstition.
Control of Operant Behaviour
The second question, “how is operant behaviour controlled?” arises because, to begin with, the behaviour is “emitted” without reference to any particular stimulus. Skinner answered this question by saying that a stimulus comes to control an operant if it is present when the response is reinforced and absent when it is not. For example, if lever-pressing only brings food when a light is on, a rat, or a child, will learn to press the lever only when the light is on. Skinner summarized this relationship by saying that a discriminative stimulus (e.g. light or sound) sets the occasion for the reinforcement (food) of the operant (lever-press). This three-term contingency (stimulus-response-reinforcer) is one of Skinner’s most important concepts, and sets his theory apart from theories that use only pair-wise associations.
Explaining Complex Behaviour
Most behaviour of humans cannot easily be described in terms of individual responses reinforced one by one, and Skinner devoted a great deal of effort to the problem of behavioural complexity. Some complex behaviour can be seen as a sequence of relatively simple responses, and here Skinner invoked the idea of “chaining”. Chaining is based on the fact, experimentally demonstrated, that a discriminative stimulus not only sets the occasion for subsequent behaviour, but it can also reinforce a behaviour that precedes it. That is, a discriminative stimulus is also a “conditioned reinforcer”. For example, the light that sets the occasion for lever pressing may also be used to reinforce “turning around” in the presence of a noise. This results in the sequence “noise – turn-around – light – press lever – food.” Much longer chains can be built by adding more stimuli and responses.
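The structure of a chain can be made explicit with a short sketch. This is simply a data-structure rendering of the example above (the function and variable names are hypothetical): each link pairs a discriminative stimulus with a response, and each link’s stimulus doubles as the conditioned reinforcer for the response that precedes it, with a terminal reinforcer closing the chain:

```python
# Each link's discriminative stimulus doubles as the conditioned
# reinforcer for the response in the preceding link.
chain = [
    ("noise", "turn around"),
    ("light", "press lever"),
]
terminal_reinforcer = "food"

def run_chain(links, reinforcer):
    """Walk the chain, reporting what reinforces each response."""
    events = []
    for i, (stimulus, response) in enumerate(links):
        # the next link's stimulus (or the terminal reinforcer) follows the response
        follows = links[i + 1][0] if i + 1 < len(links) else reinforcer
        events.append((stimulus, response, follows))
    return events

for stimulus, response, follows in run_chain(chain, terminal_reinforcer):
    print(f"{stimulus} -> {response} (reinforced by {follows})")
# prints:
# noise -> turn around (reinforced by light)
# light -> press lever (reinforced by food)
```

Longer chains are built by appending more (stimulus, response) links; only the last response is followed by the primary reinforcer, exactly as in the “noise – turn-around – light – press lever – food” sequence.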
However, Skinner recognised that a great deal of behaviour, especially human behaviour, cannot be accounted for by gradual shaping or the construction of response sequences. Complex behaviour often appears suddenly in its final form, as when a person first finds his way to the elevator by following instructions given at the front desk. To account for such behaviour, Skinner introduced the concept of rule-governed behaviour. First, relatively simple behaviours come under the control of verbal stimuli: the child learns to “jump,” “open the book,” and so on. After a large number of responses come under such verbal control, a sequence of verbal stimuli can evoke an almost unlimited variety of complex responses.
Operant Conditioning Chamber
An operant conditioning chamber (also known as a “Skinner box”) is a laboratory apparatus used in the experimental analysis of animal behaviour. It was invented by Skinner while he was a graduate student at Harvard University. As used by Skinner, the box had a lever (for rats), or a disk in one wall (for pigeons). A press on this “manipulandum” could deliver food to the animal through an opening in the wall, and responses reinforced in this way increased in frequency. By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response (“memory”), punishment, and so on. By channelling research in these directions, the operant conditioning chamber has had a huge influence on the course of research in animal learning and its applications. It enabled great progress on problems that could be studied by measuring the rate, probability, or force of a simple, repeatable response. However, it discouraged the study of behavioural processes not easily conceptualized in such terms – spatial learning, in particular, which is now studied in quite different ways, for example, by the use of the water maze.