Schedules of Reinforcement
Continuous reinforcement is the exception rather than the rule; behavior is more often reinforced intermittently.
Continuous: every response is reinforced
Intermittent: only some instances reinforced
Schedules of reinforcement:
Primarily concerned with intermittent relations (between behavior and its consequences)
Reinforcement schedules: rules for arranging consequences (i.e., they describe how consequences are arranged)
Defined by contingencies:
Different schedules=different conditions under which responses produce reinforcement.
Importance of schedules lies mainly in their ability to produce orderly and predictable patterns of behavior.
Schedules as baselines
Schedules as independent variables
The most common measure of schedule performance is rate of response (responses per unit time).
Cumulative records:
Useful in the study of schedules
Show patterns of responding over time
A pip (small hatch mark) usually indicates delivery of a reinforcer
Real-time, continuous record of rate of response
A steeper slope indicates a higher rate of responding
Steady state behavior
Same/Similar patterns across organisms.
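The rate measure described above (responses per unit time) can be sketched in a few lines of Python; the response timestamps here are hypothetical, made up for illustration:

```python
# Hypothetical data: timestamps (in seconds) of each response in a session.
response_times = [1.2, 2.0, 2.5, 3.1, 4.8, 5.0, 6.7, 8.9, 9.4, 10.0]

def response_rate(times, session_duration):
    """Rate of response: number of responses per unit time."""
    return len(times) / session_duration

print(response_rate(response_times, session_duration=10.0))  # 1.0 responses/s
```

On a cumulative record, this same quantity appears as the slope of the line: the steeper the record, the higher the rate.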
TYPES OF SCHEDULES:
Ratio schedules: Sr+ depends on a certain number of responses
Interval schedules: Sr+ depends on the first response after a certain time period
FR- Fixed Ratio Schedule
Generates a “break-and-run” pattern: a “run” or burst of responses followed by a “break” or pause
Post-reinforcement pause (PRP)
Increases with ratio size
Fixed ratio: “FR”
The number of responses required follows the abbreviation
E.g. Schedule requires thirty lever presses for every reinforcer.
FR 30 (every 30 responses)
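The FR contingency amounts to a counter that resets each time the response requirement is met. A minimal sketch (the function name is my own, not standard terminology):

```python
def fixed_ratio(ratio):
    """Return a respond() function implementing a fixed-ratio schedule:
    every `ratio`-th response produces reinforcement."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == ratio:
            count = 0
            return True   # reinforcer delivered
        return False      # response emitted, no reinforcer
    return respond

respond = fixed_ratio(30)               # FR 30
outcomes = [respond() for _ in range(90)]
print(outcomes.count(True))             # 3 reinforcers in 90 responses
```

Because the requirement is constant, reinforcement falls on exactly every 30th response, which is the regularity behind the predictable break-and-run pattern.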
VR- Variable Ratio Schedule
Sr+ depends on a certain number of responses
Ratio requirements change unpredictably from one Sr+ to the next
E.g., VR 30: the number of responses required per reinforcer averages 30
Little or no PRP
Variable ratio: “VR”
The average number of responses required follows the abbreviation
E.g., on average the schedule requires...
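The VR contingency can be sketched the same way as FR, except the requirement is redrawn at random after each reinforcer so that it averages out to the nominal value. Drawing each requirement uniformly from 1 to 2×mean−1 is one simple assumption for illustration, not the only way VR values are arranged in practice:

```python
import random

def variable_ratio(mean_ratio, seed=0):
    """Generator for a variable-ratio schedule: yields True when a response
    is reinforced. The ratio requirement changes unpredictably from one
    reinforcer to the next but averages `mean_ratio`."""
    rng = random.Random(seed)
    # Assumption: requirements drawn uniformly from 1..(2*mean - 1),
    # which makes the long-run average requirement equal to mean_ratio.
    new_requirement = lambda: rng.randint(1, 2 * mean_ratio - 1)
    requirement, count = new_requirement(), 0
    while True:
        count += 1
        if count >= requirement:
            yield True                     # reinforcer delivered
            requirement, count = new_requirement(), 0
        else:
            yield False

schedule = variable_ratio(30)              # VR 30
outcomes = [next(schedule) for _ in range(30_000)]
print(outcomes.count(True))                # ~1,000 reinforcers (30,000 / 30)
```

Because the next reinforcer can come after any response, there is no reliable "safe" point to pause, which is consistent with the little-or-no-PRP pattern noted above.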