
"Behavioral Psychology"

The Birth of Behavioral Psychology

Around the turn of the twentieth century, Edward Thorndike attempted to develop an objective experimental method for testing the mechanical problem-solving ability of cats and dogs. Thorndike devised a number of wooden crates that required various combinations of latches, levers, strings, and treadles to open them. A dog or a cat would be put in one of these puzzle boxes and, sooner or later, would manage to escape.

Thorndike's initial aim was to show that the anecdotally reported achievements of cats and dogs could be replicated in controlled, standardized circumstances. However, he soon realized that he could measure animal intelligence using this equipment. His method was to set an animal the same task repeatedly, each time measuring how long it took to solve it. Thorndike could then compare these learning curves across different situations and species.

Thorndike was particularly interested in discovering whether his animals could learn their tasks through imitation or observation. He compared the learning curves of cats that had been given the opportunity to observe other cats escape from a box with those of cats that had never seen the puzzle being solved, and found no difference in their rate of learning. He obtained the same null result with dogs and, even when he showed the animals the methods of opening a box by placing their paws on the appropriate levers and so on, he found no improvement. He fell back on a much simpler, "trial-and-error" explanation of learning. Occasionally, quite by chance, an animal performs an action that frees it from the box. When the animal finds itself in the same position again, it is more likely to perform the same action again. The reward of being freed from the box somehow strengthens an association between a stimulus (being in a certain position in the box) and an appropriate action. Rewards act to strengthen these stimulus-response associations. The animal learned to solve the puzzle box not by reflecting on possible actions and really puzzling its way out of it, but by a mechanical development of actions originally made by chance. Thus, Thorndike concluded that the mind of a dog or cat is not capable of learning by observation but can only learn what has been personally experienced and reinforced.

By 1910 Thorndike had formalized this notion into the "Law of Effect," which essentially states that responses that are accompanied or followed by satisfaction (i.e., a reward, or what was later to be termed a reinforcement) will be more likely to recur, and those which are accompanied by discomfort (i.e., a punishment) will be less likely to recur. Thorndike extrapolated his finding to humans and subsequently maintained that, in combination with the Law of Exercise (which states that associations are strengthened by use and weakened by disuse) and the concept of instinct, the Law of Effect could explain all of human behavior in terms of the development of a myriad of stimulus-response associations.
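To make the mechanism concrete, the Law of Effect can be sketched as a toy simulation. This is an illustration only, not Thorndike's own procedure; the actions, starting weights, and learning rate below are hypothetical. An animal in a puzzle box emits responses more or less at random, and the reward of escape strengthens whichever stimulus-response association happened to produce it.

```python
import random

# Toy simulation of trial-and-error learning under the Law of Effect.
# The actions, starting weights, and learning rate are hypothetical.
ACTIONS = ["paw_latch", "pull_string", "press_treadle", "scratch_door"]
ESCAPE_ACTION = "pull_string"          # the one response that happens to open this box
weights = {a: 1.0 for a in ACTIONS}    # strength of each stimulus-response association
LEARNING_RATE = 0.5                    # how much escape (the reward) strengthens an association

def run_trial():
    """One stay in the puzzle box: respond until the escape response occurs."""
    attempts = 0
    while True:
        attempts += 1
        # Responses are emitted in proportion to their association strength.
        action = random.choices(ACTIONS, weights=[weights[a] for a in ACTIONS])[0]
        if action == ESCAPE_ACTION:
            weights[action] += LEARNING_RATE   # the reward strengthens the successful association
            return attempts

for trial in range(1, 21):
    print(f"trial {trial:2d}: escaped after {run_trial():2d} attempts")
```

Run repeatedly, the number of attempts per trial tends to fall, reproducing the shape of the learning curves Thorndike measured.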

Thorndike, his laws, and trial-and-error learning became the foundation for behavioral psychology, and the behaviorist position that human behavior could be explained entirely in terms of stimulus-response associations and the effects of reinforcers upon them. In its purest sense this new field of behavioral psychology entirely excluded cognitive concepts such as desires or goals.

John Broadus Watson, in his 1914 book, Behavior: An Introduction to Comparative Psychology, took the next major step in the development of behavioral psychology. Watson's theoretical position was even more extreme than Thorndike's. His rejection of cognition, or "mentalism," was total, and he had no place for concepts such as pleasure or distress in his explanations of behavior. He essentially rejected the Law of Effect, denying that pleasure or discomfort caused stimulus-response associations to be learned. For Watson, all that was important was the frequency of occurrence of stimulus-response pairings. Reinforcers might cause some responses to occur more often in the presence of particular stimuli, but they did not act directly to cause their learning. In 1919 Watson published his second book, Psychology from the Standpoint of a Behaviorist, which established him as the founder of the American school of behaviorism.

In the 1920s behaviorism began to wane in popularity. A number of studies, particularly those with primates (which are capable of observational, "monkey-see, monkey-do" learning), appeared to show flaws in the Law of Effect and to require mental representations in their explanation. But in 1938 Burrhus Frederic Skinner powerfully defended and advanced behaviorism when he published The Behavior of Organisms, arguably the most influential work on animal behavior of the century. B.F. Skinner resurrected the Law of Effect in more starkly behavioral terms and developed the Skinner box, an apparatus that allowed long sequences of behavior to be studied objectively, a great improvement on the individual learning trials of Watson and Thorndike.

Skinner developed the basic concept of "operant conditioning," arguing that this type of learning was not the result of stimulus-response associations. For Skinner, the basic association in operant conditioning was between the operant response and the reinforcer, with a discriminative stimulus serving to signal when the association would be acted upon.
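As a rough illustration of this scheme, and not of any specific Skinner experiment, the sketch below models a lever-pressing response whose association with a food reinforcer is strengthened only when a discriminative stimulus, a light, is on. All names, constants, and the update rule are assumptions made for the example.

```python
import random

# Toy operant-conditioning sketch: the learned association is between the
# response (lever press) and the reinforcer (food); the discriminative
# stimulus (a light) signals when pressing will actually be reinforced.
# Names, constants, and the update rule are illustrative assumptions.
press_strength = {True: 1.0, False: 1.0}   # pressing tendency with light on / off
LEARNING_RATE = 0.4

for _ in range(500):
    light_on = random.random() < 0.5                       # discriminative stimulus
    p_press = press_strength[light_on] / (press_strength[light_on] + 1.0)
    pressed = random.random() < p_press
    if pressed and light_on:
        press_strength[True] += LEARNING_RATE              # reinforced response strengthens
    elif pressed and not light_on:
        press_strength[False] = max(0.1, press_strength[False] - 0.1)  # unreinforced response weakens

print(f"pressing tendency, light on : {press_strength[True]:.2f}")
print(f"pressing tendency, light off: {press_strength[False]:.2f}")
```

Over many simulated trials, pressing comes under the control of the light: the response is emitted freely when the light is on and falls away when it is off.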

It is worth briefly comparing trial-and-error learning with classical conditioning. In the 1890s, Pavlov, a Russian physiologist, was observing the production of saliva by dogs as they were fed. He noticed that saliva was also produced when the person who fed them appeared, even when that person carried no food. This is not surprising. Every farm boy for thousands of years has realized that animals become excited when they hear the sounds that indicate they are about to be fed. But Pavlov carefully observed and measured one small part of the process. He paired a sound, a tone, with feeding his dogs, so that the tone occurred several times right before and during the feeding. Soon the dogs salivated to the tone as they did to the food. They had learned a new connection: tone with food, or tone with saliva response.
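A minimal sketch of that pairing process shows how repeated tone-food pairings build up the tone's power to elicit salivation on its own. The saturating update rule and the constants are assumptions for illustration, not Pavlov's own model.

```python
# Toy model of classical conditioning: each tone-food pairing increases the
# associative strength of the tone, which drives the conditioned response
# (salivation to the tone alone). Rule and constants are illustrative assumptions.
MAX_STRENGTH = 1.0   # asymptote of the tone-food association
RATE = 0.3           # fraction of the remaining gap closed by each pairing

strength = 0.0       # before training the tone is a neutral stimulus
for pairing in range(1, 11):
    strength += RATE * (MAX_STRENGTH - strength)   # each pairing closes part of the gap
    print(f"pairing {pairing:2d}: salivation to tone alone = {strength:.2f}")
```

The curve rises quickly at first and then levels off, the negatively accelerated acquisition curve typical of conditioning experiments.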

In classical conditioning, a neutral stimulus becomes associated with an involuntary response, such as salivation or increased heart rate. But operant conditioning involves voluntary actions (such as lifting a latch, following a maze, or aiming and firing a weapon) with reinforcing or punishing events serving to alter the strength of association between the stimulus and the response.

The ability of behavioral psychology to turn voluntary motor responses into a conditioned response is demonstrated in one of Watson's early experiments on maze learning, using rats in a type of maze that was simply a long, straight alley with food at the end. Watson found that once the animal was well trained at running this maze, it did so almost automatically, or reflexively. Once started by the stimulus of the maze, its behavior became a series of voluntary motor responses largely detached from stimuli in the outside world. This was made clear when Watson shortened the alleyway, which caused well-trained (i.e., conditioned) rats to run straight into the end wall. This became known as the Kerplunk experiment, and it demonstrates the degree to which a set of behaviorally conditioned, voluntary motor responses can become reflexive, or automatic, in nature. Only a few decades after Watson ran these early, simple experiments, the world would see the tenets of behaviorism used to instill the voluntary motor responses necessary to turn close-combat killing into a reflexive and automatic response.



© 1999 by Academic Press. All rights of reproduction in any form reserved.

