Stephen B. Hood,  Ph.D.

University of South Alabama



Review of Classical Conditioning


1.           Environmental changes may be thought of as stimuli.  Behavioral events may be thought of as responses.


2.           Certain parts of our behavior may be elicited by stimuli.  This kind of predictable relationship between a stimulus and a response is called a reflex.


3.           In a reflex, the first event is the stimulus.  The second event is the response.


4.           In a reflex, both the unconditioned and the conditioned stimuli precede the response.


5.           In reflex (classical) conditioning, a previously neutral (conditioned) stimulus is paired with an unconditioned stimulus.  After conditioning has occurred, the conditioned stimulus comes to control the response which originally was controlled by the unconditioned stimulus.


6.           After conditioning, we may substitute the conditioned stimulus for the unconditioned stimulus and elicit a similar response.
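The pairing process in frames 5 and 6 can be sketched as a toy simulation.  This is a minimal illustration only, assuming a simple saturating update rule (roughly in the style of Rescorla-Wagner); the names ClassicalConditioning, pair, and cs_elicits_response are illustrative, not part of these notes.

```python
class ClassicalConditioning:
    """Toy model: repeated CS-UCS pairings give the CS control of the response."""

    def __init__(self, learning_rate=0.3):
        self.learning_rate = learning_rate
        self.associative_strength = 0.0   # strength of the CS -> response link, 0..1

    def pair(self):
        """One CS-UCS pairing: the CS gains associative strength."""
        self.associative_strength += self.learning_rate * (1.0 - self.associative_strength)

    def cs_elicits_response(self, threshold=0.5):
        """After conditioning, the CS alone can elicit the response."""
        return self.associative_strength > threshold


cc = ClassicalConditioning()
print(cc.cs_elicits_response())   # before pairing: False
for _ in range(5):
    cc.pair()                     # repeated CS-UCS pairings
print(cc.cs_elicits_response())   # after conditioning: True
```

With a learning rate of 0.3, five pairings raise the associative strength to about 0.83, past the illustrative 0.5 threshold, so the conditioned stimulus alone now elicits the response.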



Operant Behavior


1.           Example:  A child calls.  His mother runs to him, picks him up and cuddles him.


In this example, a response by an organism (child) produced a change in the external environment (the mother went to him, picked him up and cuddled him).


2.           Definitions of an "operant".


               a)           Something which acts upon the environment.


               b)           A response that is modified by the environmental consequences of its appearance.


               c)           A response which is emitted in the presence of a stimulus which we may or may not be able to identify.  However, if the response in question can be controlled and shaped by its consequences, then it is an operant.


3.           When environmental consequences result in an increase in the frequency of occurrence of the response, then we have an example of positive reinforcement.


4.           If a response is followed by a punishing stimulus, then there will be a decrease in the frequency of occurrence of the response.  This is referred to as punishment.


5.           If a stimulus is aversive, a response may result in its reduction or removal.  When a response occurs more frequently because it results in the reduction or removal of an aversive stimulus, we have an example of negative reinforcement.


Negative reinforcement:  the increased frequency of occurrence of those responses which result in the reduction or removal of aversive stimuli.


6.           Law of Effect: 


When a response is followed by pleasurable consequences, the frequency of occurrence of that behavior will increase.


When a response is followed by punishment, the frequency of occurrence of that behavior will decrease.


               This has been referred to as the "pleasure-pain" dichotomy.


Whereas in classical conditioning the paradigm was S - R, in operant conditioning the paradigm becomes R - S; it is referred to as a contingency.


7.           To be contingent means to be dependent upon.  In a contingency, the occurrence of a stimulus is dependent upon the occurrence of a response.


Therefore, operant behavior is behavior that can alter, change, or produce an effect upon the environment.


Dialing a telephone number, for example, results in the ringing of the telephone corresponding to that number.


8.           When the consequences of an operant behavior increase the frequency of that behavior, we call such consequences reinforcing stimuli.  When we positively reinforce a behavior we make it stronger.  Thus, a reinforcing stimulus makes a response stronger; we increase the probability of the response occurring again (more frequently).


9.           To be (most) effective, the reinforcing stimulus must immediately follow the response.
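The contingency described in frames 3 through 8 can be sketched as a toy model in which consequences adjust the probability of the response that produced them.  The class name Operant and the step size of 0.1 are illustrative assumptions, not part of these notes.

```python
class Operant:
    """Toy model: consequences modify the probability of a response."""

    def __init__(self, probability=0.2):
        self.probability = probability   # probability the response is emitted

    def reinforce(self, amount=0.1):
        """Reinforcing consequence: response probability increases."""
        self.probability = min(1.0, self.probability + amount)

    def punish(self, amount=0.1):
        """Punishing consequence: response probability decreases."""
        self.probability = max(0.0, self.probability - amount)


lever_press = Operant()
for _ in range(3):
    lever_press.reinforce()               # each response is followed by food
print(round(lever_press.probability, 2))  # 0.5: the response is now stronger
```

The same object illustrates punishment: calling punish() lowers the probability of the response, just as reinforce() raises it.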



Escape Conditioning


10.        An aversive (loud) noise is one from which we would tend to turn away, or otherwise escape.  This is escape behavior:


                              The stimulus in this case is aversive noise.

                              The response is putting our hands over our ears.


                              In escape conditioning, the reinforcement follows the operant response.


Therefore, in escape conditioning, the removal of an aversive stimulus is reinforcing.  Stimulus removal is contingent upon a response.  If the removal of a stimulus strengthens the preceding behavior, we call the stimulus an aversive stimulus.


11.        A negative reinforcing stimulus is one from which one tries to escape.  It is called an aversive stimulus.

An increase in response strength occurs when a positive reinforcing stimulus is presented after a response.


An increase in response strength occurs when a negative reinforcing stimulus is terminated or removed after a response.


                              Stated in other words:


                              Response strength increases when a response is followed by the presentation of a positive reinforcing stimulus.


                              Response strength is increased when a response is followed by the removal of a negative reinforcing stimulus or aversive stimulus.
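Points 10 and 11 can be summarized in a short sketch of a single escape trial: the response removes the aversive stimulus, and the response is thereby strengthened.  The function name trial and the increment of 0.1 are illustrative assumptions, not part of these notes.

```python
def trial(aversive_present, response_strength):
    """One escape trial: if the escape response occurs while the
    aversive stimulus is present, the stimulus is removed and the
    response is strengthened (negative reinforcement)."""
    if aversive_present and response_strength > 0.0:
        aversive_present = False                               # escape: stimulus removed
        response_strength = min(1.0, response_strength + 0.1)  # response strengthened
    return aversive_present, response_strength


noise, strength = True, 0.3        # loud noise on; hands-over-ears response
noise, strength = trial(noise, strength)
print(noise)                       # False: the aversive noise is removed
print(round(strength, 1))          # 0.4: the escape response is stronger
```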



 General Considerations Relative to Learning:


The term learning is broader than the term conditioning since it covers a range of processes from the classical conditioned reflex to human problem solving.  Definitions of learning may be of two types:  factual definitions and theoretical definitions.  Factual definitions relate the phenomenon of learning to observable events in the physical world.  Theoretical definitions describe the essential conditions or the basic processes believed necessary for learning.


Regarding factual definitions, learning refers to a more or less permanent change in behavior which occurs as a result of practice.  Both the dependent variable (changes in behavior) and the independent variable (practice) are reasonably objective.  The term learning itself has the status of an intervening variable.  Learning is restricted to relatively permanent changes in behavior, not short term changes.




               Learning need not include performance.  For example, suppose you studied a road map to learn how to drive to Boston.  You might have learned the route, but until you actually drive to Boston successfully, without getting lost, you would not have shown your learning through performance.  Learning, therefore, is not just performance; rather, it is a change in behavior potentiality.  Through learning, you acquire the capability to perform, although this capability may remain latent.

               Learning refers to long-term changes in the organism produced by practice.  Performance refers to the translation of learning into behavior.  Learning sets the upper limit on consistent behavior and performance.





Practice alone does not produce learning; it produces only fatigue or extinction.  In order to ensure learning, there must also be reinforcement.  Therefore, a redefinition of learning would be:  Learning is a relatively permanent change in behavior potentiality which results from reinforced practice.






               1.           Acquisition:

                              a.           probability of occurrence

                              b.           latency of response

                              c.           rate (speed) of responding

                              d.           response magnitude

                              e.           resistance to extinction


               2.           Extinction:


                              a.           decrease in response strength with nonreinforcement.

                              b.           decrease in response strength with punishment.


               3.           Spontaneous Recovery:


                              a.           A response which has been extinguished recovers some of its strength with rest.

                              b.           Extinction:  a decrease in previously acquired behavior --

                              c.           Spontaneous recovery:  an increase in previously extinguished behavior.
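Extinction (item 2) and spontaneous recovery (item 3) can be sketched with two toy rules: one decays response strength on nonreinforced trials, and one restores part of the lost strength after rest.  The decay and recovery parameters are illustrative assumptions, not values from these notes.

```python
def extinguish(strength, trials, decay=0.3):
    """Each nonreinforced trial reduces response strength (extinction)."""
    for _ in range(trials):
        strength *= 1.0 - decay
    return strength


def rest(strength, original, recovery=0.4):
    """After rest, the response recovers part of its lost strength
    (spontaneous recovery)."""
    return strength + recovery * (original - strength)


original = 1.0
after_extinction = extinguish(original, trials=5)
after_rest = rest(after_extinction, original)
print(after_rest > after_extinction)   # True: some strength returns with rest
print(after_rest < original)           # True: the recovery is only partial
```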


4.           Reinforcement Schedules


a.           Intermittent/Random

b.           Fixed Ratio                     (Ratios are based upon behaviors)

c.           Fixed Interval                (Intervals are based on time)    


d.           Variable Ratio  

                             e.           Variable Interval
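The four schedules in item 4 can be sketched as decision rules: ratio schedules count responses, interval schedules count time, and the variable forms draw the criterion at random around a mean.  The function names and parameter values below are illustrative assumptions, not part of these notes.

```python
import random


def fixed_ratio(responses, ratio=5):
    """Fixed ratio: reinforce every `ratio`-th response (based on behavior)."""
    return responses >= ratio


def fixed_interval(elapsed, interval=60):
    """Fixed interval: reinforce the first response once `interval`
    units of time have elapsed (based on time)."""
    return elapsed >= interval


def variable_ratio(responses, mean_ratio=5):
    """Variable ratio: the required response count varies around a mean."""
    return responses >= random.randint(1, 2 * mean_ratio - 1)


def variable_interval(elapsed, mean_interval=60):
    """Variable interval: the required wait varies around a mean."""
    return elapsed >= random.uniform(1, 2 * mean_interval - 1)


print(fixed_ratio(5))        # True: the fifth response is reinforced
print(fixed_interval(30))    # False: the interval has not yet elapsed
```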


