Law of effect

From Wikipedia, the free encyclopedia

The law of effect states that “responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.”[1] This notion is similar to evolutionary theory: if a certain character trait provides an advantage for reproduction, then that trait will persist.[2]

History[edit]

This principle, discussed early on by Lloyd Morgan, is usually associated with the connectionism of Edward Thorndike, who said that if an association is followed by a “satisfying state of affairs” it will be strengthened, and if it is followed by an “annoying state of affairs” it will be weakened.[3][4]

The modern version of the law of effect is conveyed by the notion of reinforcement as it is found in operant conditioning. The essential idea is that behavior can be modified by its consequences, as Thorndike found in his famous experiments with hungry cats in puzzle boxes. The cat was placed in a box that could be opened if the cat pressed a lever or pulled a loop. Thorndike noted the amount of time it took the cat to free itself on successive trials in the box. He discovered that during the first few trials the cat would respond in many ineffective ways, such as scratching at the door or the ceiling, finally freeing itself with the effective press or pull through trial and error. With each successive trial, it took the cat, on average, less and less time to escape. Thus, in modern terminology, the correct response was reinforced by its consequence, release from the box.[5]

Definition[edit]

Initially, the cat’s responses were largely instinctual, but over time, the lever-pressing response was strengthened while the others were weakened.

The law of effect is the belief that a pleasing after-effect strengthens the action that produced it.[6]

The law of effect was published by Edward Thorndike in 1905 and states that when an S-R association is established in instrumental conditioning between the instrumental response and the contextual stimuli that are present, the response is reinforced and the S-R association bears sole responsibility for the occurrence of that behavior. Simply put, this means that once the stimulus and response are associated, the response is likely to occur without the stimulus being present. It holds that responses that produce a satisfying or pleasant state of affairs in a particular situation are more likely to occur again in a similar situation. Conversely, responses that produce a discomforting, annoying or unpleasant effect are less likely to occur again in the situation.

Psychologists have been interested in the factors that are important in behavior change and control since psychology emerged as a discipline. One of the first principles associated with learning and behavior was the Law of Effect, which states that behaviors that lead to satisfying outcomes are likely to be repeated, whereas behaviors that lead to undesired outcomes are less likely to recur.[7]

Thorndike’s Puzzle-Box. The graph demonstrates the general decreasing trend of the cat’s response times with each successive trial

Thorndike emphasized the importance of the situation in eliciting a response; the cat would not go about making the lever-pressing movement if it was not in the puzzle box but was merely in a place where the response had never been reinforced. The situation involves not just the cat’s location but also the stimuli it is exposed to, for example, the hunger and the desire for freedom. The cat recognizes the inside of the box, the bars, and the lever and remembers what it needs to do to produce the correct response. This shows that learning and the law of effect are context-specific.

In an influential paper, R. J. Herrnstein (1970)[8] proposed a quantitative relationship between response rate (B) and reinforcement rate (Rf):

B = k Rf / (Rf0 + Rf)

where k and Rf0 are constants. Herrnstein proposed that this formula, which he derived from the matching law he had observed in studies of concurrent schedules of reinforcement, should be regarded as a quantification of the law of effect. While the qualitative law of effect may be a tautology, this quantitative version is not.
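Herrnstein's hyperbola above can be sketched numerically. The function below is a minimal illustration of the formula B = k·Rf / (Rf0 + Rf); the parameter values (k = 100 responses/hr, Rf0 = 20 reinforcers/hr) are hypothetical, chosen only to show the curve's shape, not drawn from any experiment.

```python
def response_rate(rf: float, k: float = 100.0, rf0: float = 20.0) -> float:
    """Herrnstein's hyperbolic law of effect: B = k*Rf / (Rf0 + Rf).

    k   -- asymptotic (maximum) response rate
    rf0 -- reinforcement rate at which responding reaches half of k
    rf  -- obtained reinforcement rate
    """
    return k * rf / (rf0 + rf)

# Response rate rises steeply when reinforcement is scarce and
# saturates toward k as reinforcement becomes abundant.
for rf in (5, 20, 80, 320):
    print(f"Rf = {rf:>3}/hr  ->  B = {response_rate(rf):.1f} responses/hr")
```

Note that when Rf equals Rf0 the predicted rate is exactly half of k, which is why Rf0 is often read as a "half-maximum" constant.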

Example[edit]

An example is often seen in drug addiction. When a person uses a substance for the first time and receives a positive outcome, they are likely to repeat the behavior due to the reinforcing consequence. Over time, the person's nervous system will also develop a tolerance to the drug. Thus only an increased dosage of the drug will provide the same satisfaction, making it dangerous for the user.[9]

Thorndike’s law of effect can be compared to Darwin’s theory of natural selection, in which successful organisms are more likely to prosper and survive to pass on their genes to the next generation, while the weaker, unsuccessful organisms are gradually replaced and “stamped out”. It can be said that the environment selects the "fittest" behavior for a situation, stamping out any unsuccessful behaviors, in the same way it selects the "fittest" individuals of a species. In an experiment that Thorndike conducted, he placed a hungry cat inside a "puzzle box", where the animal could only escape and reach the food once it operated the latch of the door. At first the cat would scratch and claw in order to find a way out; then, by accident, it would activate the latch and open the door. On successive trials the behaviour of the animal would become more habitual, to the point where the animal would operate the latch without hesitation. The occurrence of the favourable outcome, reaching the food source, strengthens the response that produces it.

Colwill and Rescorla, for example, trained all rats to obtain food pellets and liquid sucrose in consistent sessions on identical variable-interval schedules.[10]

Influence[edit]

The law of effect provided a framework on which B. F. Skinner built, almost half a century later, the principles of operant conditioning, “a learning process by which the effect, or consequence, of a response influences the future rate of production of that response.”[1] Skinner later used an updated version of Thorndike’s puzzle box, called the operant chamber, or Skinner box, which has contributed immensely to the modern understanding of the law of effect and how it relates to operant conditioning, by allowing researchers to study the behavior of small organisms in a controlled environment.

References[edit]

  1. ^ a b Gray, Peter. "Psychology". Worth, NY. 6th ed. pp. 108–109.
  2. ^ Schacter, Gilbert, Wegner. (2011). "Psychology Second Edition" New York: Worth Publishers.
  3. ^ Thorndike, E. L. (1898, 1911) "Animal Intelligence: an Experimental Study of the Associative Processes in Animals" Psychological Monographs #8
  4. ^ A. Charles Catania. "Thorndike's Legacy: Learning, Selection, and the Law of Effect", pp. 425–426. University of Maryland, Baltimore.
  5. ^ Connectionism. Thorndike, Edward. Retrieved Dec 10, 2010.
  6. ^ Boring, Edwin. Science. 1. 77. New York: American Association for the Advancement of Science, 2005. 307. Web.
  7. ^ "Law of Effect". eNotes.com. Retrieved 2012-08-02. 
  8. ^ Herrnstein, R. J. (1970). On the law of effect. Journal of the Experimental Analysis of Behavior, 13, 243-266.
  9. ^ Carlson, Neil R., et al. (2007). Psychology: The Science of Behaviour. New Jersey, USA: Pearson Education Canada. p. 516.
  10. ^ Nevin, John (1999). "Analyzing Thorndike's Law of Effect: The Question of Stimulus–Response Bonds". Journal of the Experimental Analysis of Behavior. p. 448.