Characteristics passed on from one generation to another are the products of heredity.
Heredity is responsible for determining one's physical makeup, including eye color, hair color, facial features, and height. The factors of heredity in effect blend the genetic makeup of the individual's parents and other ancestors. Sometimes these characteristics do not surface for a generation or two. A child with red hair may in fact be exhibiting this characteristic from a grandparent or great-grandparent. The tiny particles that carry genetic information are called genes, which are part of larger particles called chromosomes. At conception, twenty-three chromosomes are supplied by the father and twenty-three chromosomes are supplied by the mother. The thousands of genes transmitted create both similarities to and differences from the child's parents.
There is much debate as to how important heredity is in determining human development. Scientists who emphasize the importance of heredity are called nativists. Those who emphasize the social aspects of development are called environmentalists. The debate as to which is more important in human development, heredity or environment, pre-dates the study of psychology.
A person who is easy to get along with or who is always pessimistic is said to have these as personality traits. Traits are a person's feelings, attitudes, and values in different situations. However, in determining traits, which is more important: heredity or environment? Psychologists for the most part take the middle position; they see heredity and environment as contributing factors in personality traits. For example, a boy who grows to be seven feet tall (as the result of heredity) will not necessarily be a good basketball player unless his environment teaches him to play basketball.
Environment plays a role in human behavior as well. Events in a person’s life can mold behavior, too. Scientists have long studied responses to events. Two theories are presented in the following reading sections. Both represent how events may cause certain behaviors in animals and humans.
Classical Conditioning
Pavlov
In the early 1900's, a Russian physiologist conducted a series of experiments dealing with digestion. This physiologist was named Ivan Pavlov. Part of his experiments dealt with measuring the amount of saliva released by a dog in response to food. Salivation is an automatic response (behavior), or a reflex. Pavlov and his fellow experimenters began to notice that the dogs were salivating in situations other than the presentation of food. The sight of food, the technicians' white lab coats, and even the sound of footsteps caused the dogs to salivate. Pavlov, being a good scientist, became interested in why the dogs would salivate to things other than food. His work would completely change the study of learning and form the basis for one of the most influential schools of psychology.
Pavlov developed what is now referred to as classical conditioning. Simply put, classical conditioning (or learning by association) is the process by which an originally neutral stimulus (NS) is paired with another stimulus which elicits (brings out) the desired response. After this association is repeated, the animal (or human!) begins to respond to what used to be the neutral stimulus. Pavlov paired the sound of footsteps with the only stimulus that naturally made the dog salivate: food. The salivation is still unconditioned because the food is still present in the experiment.
The experiment begins!
Step 1 – Food (Unconditioned Stimulus), the dog salivates (Unconditioned Response)
Step 2 – Footsteps alone (Neutral Stimulus), no salivation
Step 3 – Footsteps paired with food, the dog salivates (still an Unconditioned Response, because the food is present)
Repeat Step 3 over and over.
Step 4 – Footsteps with no food, the dog salivates (Conditioned Response)
Because the dog has associated the sound of footsteps with the food, he or she salivates ("anticipating" the food). Now the formerly neutral stimulus is a Conditioned Stimulus (CS) and the salivation is a Conditioned Response (CR).
Does this mean that "once classically conditioned, the response stays"? No, classically conditioned responses do not last forever. The conditioned response will fade if the original unconditioned stimulus (UCS) is not presented occasionally. This is called extinction. In Pavlov's experiment, if he continually presented the footsteps without ever again presenting the food, the dog would stop salivating at the sound of the footsteps. In effect, he or she would be unlearning the association.
Sometimes the conditioned response will reappear without any explanation: after extinction, the dog may once again salivate at the sound of footsteps, without any retraining. This is called "spontaneous recovery." And sometimes the animal (or human) will make a conditioned response to a stimulus similar to the one it was originally conditioned to. This is called making a "generalization." Perhaps the dog would salivate to the sound of a distant hammer (which sounds like footsteps!).
Finally, when the animal (or human) learns to respond only to the original conditioned stimulus, and not to stimuli that merely resemble it, it is making a "discrimination."
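The acquisition-and-extinction cycle described above is mechanical enough to sketch as a toy simulation. The code below is an illustration only: the `strength` variable, the learning and decay rates, and the salivation threshold are invented for this sketch and are not part of Pavlov's work. It shows associative strength rising with each paired trial (Step 3) and fading when the conditioned stimulus is presented alone (extinction).

```python
# Toy model of classical conditioning: the associative strength between
# a neutral stimulus (footsteps) and an unconditioned stimulus (food).
# All numbers are illustrative, not drawn from Pavlov's data.

class ConditioningModel:
    def __init__(self, learn_rate=0.3, decay_rate=0.2, threshold=0.5):
        self.strength = 0.0          # association starts at zero (NS)
        self.learn_rate = learn_rate
        self.decay_rate = decay_rate
        self.threshold = threshold   # strength needed to elicit salivation

    def paired_trial(self):
        """Step 3: footsteps presented together with food."""
        self.strength += self.learn_rate * (1.0 - self.strength)

    def footsteps_alone_trial(self):
        """Footsteps with no food: an extinction trial."""
        self.strength *= (1.0 - self.decay_rate)

    def salivates_to_footsteps(self):
        """Step 4: does the CS alone elicit the CR?"""
        return self.strength >= self.threshold

dog = ConditioningModel()
assert not dog.salivates_to_footsteps()   # footsteps are still neutral

for _ in range(5):                        # repeat Step 3 over and over
    dog.paired_trial()
assert dog.salivates_to_footsteps()       # footsteps are now a CS

for _ in range(10):                       # footsteps alone, no food
    dog.footsteps_alone_trial()
assert not dog.salivates_to_footsteps()   # extinction: the CR has faded
```

Spontaneous recovery and generalization are not modeled here; the sketch only captures the basic rise and fall of the association.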
Although we have only discussed classical conditioning in dogs, it occurs in all types of animal life, from the simplest to the most complex, including humans. That is why "humans" has been added in parentheses each time animals have been mentioned.
Pavlov's work in the area of classical conditioning, although a pioneering influence in the study of learning, was greatly limited by the fact that his work dealt only with reflexive behavior in a strictly experimental situation. That is, Pavlov's dogs acted in a passive way to stimuli that were presented to them. Certainly human learning involves much more complex behaviors, because those behaviors are motivated by internal stimuli. Psychologists call this type of behavior operant behavior, because the response made operates on (affects) the environment.
Thorndike
Edward L. Thorndike was the first psychologist to study operant behaviors in cats. Actually, he studied operant behaviors before Pavlov began his research on classical conditioning. Thorndike’s “law of effect” was formulated after experiments in which cats had to brush against a string in a box in order to escape from the box through a door which led to food. The “law of effect” states that a behavior (in this case, brushing against a string) tended to be repeated if it was followed by a “good effect” (escape through the door to food).
The work of Pavlov and Thorndike created the basis for the work conducted by John B. Watson, the founder of behaviorism. Behaviorism, one of the major schools of psychology, holds that only observable behavior should be studied. Other aspects of psychology, such as the mind, cannot be studied scientifically because they are not tangible (see later readings on Freud).
John B. Watson
John B. Watson, the "father of behaviorism," is noted for his famous experiment involving Albert, an 11-month-old boy. In this experiment, Watson taught Albert to be afraid of white rats. He did this by making a loud noise every time Albert saw the white rat. Albert, who had no fear of white rats at the beginning of the experiment, soon began to associate the loud noise with the white rat and would cry when the rat was presented, even without the noise. Soon, Albert showed fear of anything that had white fur.
Do you recognize Watson's experiment for what it is? Another example of Pavlov's classical conditioning! It fits the Pavlovian model. Albert was not afraid of the rat; therefore, the rat was a neutral stimulus (NS) because it did not cause fear. Watson knew that a loud noise would make Albert afraid. Therefore, the noise was an unconditioned stimulus (UCS) that brought about fear, an unconditioned response (UCR).
When Albert showed fear at other furry objects, this was an example of generalization; anything with white fur (like the rat) caused Albert to be afraid.
How could Watson get rid of Albert's response? Easy: by presenting only the rat, without the loud noise. This is called "extinction." Unfortunately, Watson never got the chance; Albert's mother panicked and took Albert away from Watson before he could extinguish little Albert's fear.
B. F. Skinner
B. F. Skinner, considered by many the most influential American psychologist who ever lived, expanded on Thorndike’s “law of effect” when he wrote his theory of operant conditioning. Operant conditioning is based on the idea that any behavior that is rewarded will be repeated. Thorndike’s “good effect” became known as “reinforcement” by Skinner. A “reinforcement” is any event which occurs after a behavior that increases the likelihood that the behavior will happen again.
Skinner studied white rats in an apparatus which has come to be known as the “Skinner box.” The rat was kept in the box and while exploring around would touch a bar that dropped a food pellet into a cup. After several more explorations and food rewards, the rat began to press the bar to receive the food.
There are two types of reinforcement: positive and negative. Positive reinforcement is self-explanatory: the event which occurs is in some way pleasant or desirable and therefore causes a repeat of the behavior. The food was a positive reinforcement (reward) to a hungry rat. A compliment such as "you look nice today" is a positive reinforcement for dressing up. A good grade on a test is positive reinforcement for studying. The examples are endless. However, when an unpleasant stimulus is taken away (and this is what the animal or human wants!), this is negative reinforcement.
Skinner demonstrated negative reinforcement by placing a rat in a cage with an electrified floor. A mild shock was delivered until the rat touched a bar to turn it off. The rat soon learned to press the bar to turn off the shock. Skinner used negative reinforcement to study two types of training: escape and avoidance. Escape training is what occurred when the rat learned to turn off the shock; the rat, in effect, was learning to escape the unpleasant stimulus. In avoidance training, a buzzer sounded before the shock was delivered, and the rat would learn to turn off the shock before it occurred. In effect, the rat "avoided" the electrical shock completely if it touched the bar before the shock was delivered.
In addition to positive and negative reinforcers, psychologists are also interested in primary and secondary reinforcers. Primary reinforcers are reinforcing in and of themselves. For example, the food that Skinner’s rat received could be called a primary (positive) reinforcer. A secondary reinforcer is any event or object that reinforces (brings about behavior) only when paired with a primary reinforcer. Money would be a good example of a secondary reinforcer.
Operant conditioning is used in dealing with complex behaviors common in humans and animals. Animal trainers are very familiar with the techniques used in operant conditioning. They use a process called shaping in which little “bits” of desired behavior are rewarded and eventually lead to the complete behavior. This is also called learning by successive approximations. Translated, that means the animal (or human) approximates behaviors in successive steps until it is complete.
Skinner taught a bird to walk a "figure eight" by rewarding the bird's first head movement in the right direction and then withholding the food reward until the bird stepped in the correct direction. Gradually, Skinner demanded more steps in the right direction before giving the reward, until the bird had walked in a figure eight. This is the same technique used to train Lassie or Benji, the famous T.V. and movie dogs. Rewards can be presented in different ways. These "ways" are called "schedules."
One final note: if an experimenter wishes to present the reward on a time schedule, giving a reward every 10 seconds, every 30 seconds, or every minute (whatever time the experimenter wants), then the experimenter is using an "interval schedule." If the reward instead depends on the number of responses the organism makes, the experimenter is using a "ratio schedule." The term "ratio" refers to the number of responses required for each reward given. A "fixed ratio" means a reward is given for a set number of responses. For example, a 3 to 1 ratio (3:1) would mean one reward for every three responses. The experimenter could also vary the number of responses needed to get a reward. This would be a "variable ratio" schedule.
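The difference between the two kinds of schedules can be made concrete with a short sketch. The function names and the specific numbers below are invented for illustration: an interval schedule looks at elapsed time, while a ratio schedule counts responses.

```python
# Toy reinforcement schedules. A fixed-ratio schedule rewards every
# Nth response; a fixed-interval schedule rewards the first response
# after a set amount of time has elapsed since the last reward.
# Names and numbers are illustrative only.

def fixed_ratio_rewards(responses, n):
    """Reward once for every n responses (a 3:1 ratio means n=3)."""
    return responses // n

def fixed_interval_rewards(response_times, interval):
    """Reward the first response after each `interval` seconds elapses."""
    rewards = 0
    next_available = interval
    for t in sorted(response_times):
        if t >= next_available:
            rewards += 1
            next_available = t + interval
    return rewards

# A 3:1 fixed ratio: one reward for every three responses.
print(fixed_ratio_rewards(9, 3))                      # prints 3

# A 10-second fixed interval, with responses at 4s, 11s, 12s, and 25s:
# only the responses at 11s and 25s are rewarded.
print(fixed_interval_rewards([4, 11, 12, 25], 10))    # prints 2
```

Notice that on the ratio schedule the organism controls how fast rewards arrive (respond faster, earn more), while on the interval schedule extra responses between rewards earn nothing, which is why the two schedules produce different patterns of behavior.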
Lesson 3 Review
Directions: Follow the instructions in each section below.
Part A: Define each of the following in your own words as much as possible.
1. Define each of the following:
A. Heredity
B. Nativists
C. Environmentalists
D. Traits
E. Ivan Pavlov
F. Classical conditioning
G. Reflex
H. Extinction
I. Discrimination
J. Generalization
K. E.L. Thorndike
L. J. B. Watson
M. Behaviorism
N. B. F. Skinner
O. Operant conditioning
P. Positive reinforcement
Q. Escape
R. Avoidance
S. Primary reinforcer
T. Secondary reinforcer
2. Explain how T.V. and movie animal “stars” learn to do the extraordinary tricks asked of them.
3. A high school student decides to drop a class because the teacher assigns a term paper. Why is this behavior an example of “escape conditioning?” What action on the part of the school’s administration or a counselor would reinforce this behavior? What is this called?
4. Explain John B. Watson’s experiment with “Little Albert.”
Part B: Provide the correct label for the following statements.
5. A rat learns to turn off a shock before it starts.
6. An effect that is a reinforcer in and of itself.
7. Something that reinforces behavior only when paired with a primary reinforcer.
8. Learning through successive approximations.
9. The first psychologist to study operant behavior.
10. That school of psychology that prefers to study only observable behavior.
Part C: Identify the scientist for each theory listed below.
11. Law of Effect
12. Skinner Box
13. Father of Behaviorism
14. Classical Conditioning
15. Little Albert experiment
16. May be considered the most influential American psychologist