Successive approximation. See shaping.

Successive discrimination. A procedure used to train differential responding. The researcher arranges the presentation of the SD and SΔ so that one follows the other. For example, a multiple schedule is programmed such that a red light signals VI food reinforcement, and this is followed by a green light that indicates that extinction is in effect.

Superstitious behavior. Behavior that is accidentally reinforced. For example, a parent may inadvertently strengthen aggressive behavior when a child is given his or her allowance just after fighting with a playmate. Switching from one alternative to another may be accidentally reinforced on a concurrent schedule if the alternative schedule has a reinforcement setup. In this case, the organism is accidentally reinforced for a change from one schedule to another.

Symbolic matching. In a matching-to-sample task, symbolic matching involves the presentation of one class of stimuli as the sample (e.g., geometrical forms) and another set of stimuli (e.g., different line angles) as the comparisons. Reinforcement depends on an arbitrary relation (e.g., triangle = vertical).

Symmetry. When stimulus class A is shown to be interchangeable with stimulus class B (if A = B then B = A), we may say that the organism shows symmetry between the stimulus classes. After a form-to-angle discrimination (e.g., triangle = vertical) is trained, a reversal test is conducted without reinforcement using line angles as the sample and geometric shapes as the comparisons (e.g., vertical = triangle). An organism that passes the reversal test is said to demonstrate symmetry of angles and forms.

Systematic replication. A way to increase the generality of an experimental finding by conducting other experiments in which the procedures are different but are logically related to the original research. An experiment is conducted with rats to find out what happens when food pellets are presented contingent on lever pressing. The observation is that lever pressing increases when followed by food pellets. In a systematic replication, elephants step on a treadle to produce peanuts. The observation is that treadle pressing increases. Both experiments are said to show the effects of positive reinforcement contingencies on operant behavior. See also direct replication.

Tacting. A class of verbal operants whose form is regulated by specific nonverbal discriminative stimuli. For example, a child may see a cat and say, "kitty." The word tact comes from the more familiar term contact. Tacting is verbal behavior that makes contact with the environment.

Tandem schedule. A tandem schedule is two or more basic schedules (CRF, FR, FI, VI, VR) presented sequentially in which only the final link ends with primary reinforcement (or in some cases extinction) and the component schedules are not signaled by discriminative stimuli. In other words, a tandem schedule is the same as an unsignaled chain schedule.

Taste aversion learning. When a distinctive taste (e.g., flavored liquid) is paired with nausea or sickness induced by a drug, X-ray, or even physical activity, the organism shows suppression of intake of the paired flavor.

Temporal pairing. In respondent conditioning, the pairing of the CS and US in time.

Terminal behavior. On a schedule of reinforcement, as the time for reinforcement gets close, animals engage in activities related to the presentation of the reinforcer. For example, a rat will orient toward the food cup.

Textual behavior. A class of verbal operants regulated by verbal stimuli where there is correspondence between the stimulus and response, but no topographical similarity. The most common example of textual behavior is reading out loud. The child looks at the text SEE DICK, SEE JANE and emits the spoken words "See Dick, see Jane." The stimulus and response correspond, but the stimulus is visual and the response is vocal.

Time sampling. A method of recording used mostly in applied behavior analysis. Behavior is sampled over a long time scale. The idea is to make observations at specified times throughout the day. For example, a patient on a psychiatric ward may be observed every 30 min, as a nurse does the rounds, and instances of psychotic talk may be recorded.

Token economy. A reinforcement system based on token reinforcement; the contingencies specify when, and under what conditions, particular forms of behavior are reinforced. The system is an economy in the sense that tokens may be exchanged for goods and services, much like money is in our economy. This exchange of tokens for a variety of back-up reinforcers ensures that the tokens are conditioned reinforcers. Token economies have been used to improve the behavior of psychiatric patients, juvenile delinquents, pupils in remedial classrooms, medical patients, alcoholics, drug addicts, prisoners, nursing home residents, and retarded persons.

Tolerance. When more of a drug (US) is needed to obtain the same drug effects (UR), we talk about drug tolerance. In respondent conditioning, the counteractive effects to CSs are major components of drug tolerance.

Topography. The physical form or characteristics of the response. For example, the way that a rat presses a lever with the left paw, the hind right foot, and so on. The topography of response is related to the contingencies of reinforcement in the sense that the form of response can be broadened or restricted by the contingencies. The contingency of reinforcement may require only responses with the left paw rather than any response that activates the microswitch; under these conditions, left-paw responses will predominate. Generally, topography is a function of the contingencies of reinforcement.

Total behavioral output. To solve the matching equation for the absolute rate of response (Ba), it is important to recognize that Ba + Be is equal to the total behavioral output for a given situation. Because Ba represents lever pressing and Be represents all other activity, the sum must equal all the behavior of the animal in the experimental setting. It is convenient to express this sum as the value k, the total behavioral output. The quantity k may now be substituted into the matching equation:

Ba/k = Ra/(Ra + Re)

When each side of the equation is multiplied by k, the absolute response rate (Ba) is expressed as:

Ba = k(Ra)/(Ra + Re)

In this equation, rate of response rises as a hyperbolic function of the rate of reinforcement (Ra) and rate of extraneous reinforcement (Re); the value k sets the limit or maximum on this function. See quantitative law of effect.
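The hyperbola above can be sketched directly in code; the particular values of k, Ra, and Re below are illustrative assumptions, not values from the text:

```python
def absolute_response_rate(ra, k=100.0, re=20.0):
    """Hyperbolic matching equation: Ba = k * Ra / (Ra + Re).

    k  -- total behavioral output (the maximum response rate)
    ra -- rate of scheduled reinforcement (Ra)
    re -- rate of extraneous reinforcement (Re)
    All parameter values here are illustrative.
    """
    return k * ra / (ra + re)

# Response rate rises toward the limit k as Ra grows.
for ra in (5, 20, 80, 320):
    print(ra, round(absolute_response_rate(ra), 1))
```

As Ra increases relative to Re, Ba approaches but never exceeds k, which is why k is described as setting the maximum on the function.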

Trace conditioning. A respondent conditioning procedure in which the CS is presented for a brief period, and after some time the US occurs. Generally, as the time between the CS and US increases, the conditioned response becomes weaker. When compared to delayed conditioning, trace conditioning is not as effective.

Transition-state performance. Behavior that is changing from one state to another as a function of a change in contingencies of reinforcement. For example, when CRF contingencies are changed to FR 10, responding is at first erratic but eventually stabilizes. See also steady-state performance.

Transitivity. When stimulus A = stimulus B and stimulus B = stimulus C, if an organism responds to stimulus A as equal to stimulus C, it is said to show transitivity. For example, if the written words one, two, and three are equivalent to the arithmetic numbers 1, 2, and 3, and these arithmetic numbers are equivalent to the sets {X}, {X,X}, and {X,X,X}, it logically follows that the words one, two, and three are equivalent to the sets {X}, {X,X}, and {X,X,X}; the relationship is transitive. An organism is said to show transitivity when it passes tests for transitivity after training for symbolic matching of stimulus class A (angles) to stimulus class B (geometric forms) and B (geometric forms) to C (intensity of illumination).
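The logic of the words-numerals-sets example can be sketched as a composition of trained relations (the stimulus labels below are hypothetical stand-ins for the text's example):

```python
# Trained relations: A (written words) -> B (numerals),
# and B (numerals) -> C (sets of items).
a_to_b = {"one": 1, "two": 2, "three": 3}
b_to_c = {1: "{X}", 2: "{X,X}", 3: "{X,X,X}"}

# The derived, untrained A -> C relation follows by composition,
# which is what a passed transitivity test demonstrates.
a_to_c = {word: b_to_c[num] for word, num in a_to_b.items()}
print(a_to_c["two"])  # {X,X}
```

The key point is that a_to_c is never trained directly; it emerges from the two trained relations.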

Trend, as in baseline drift. A trend is a systematic decline or rise in the baseline values of the dependent variable. A drift in baseline measures can be problematic when the treatment is expected to produce a change in the same direction as the trend.

Trial-and-error learning. A term coined by Thorndike (1898, 1911) that he used to describe results from his puzzle box and maze learning experiments. Animals were said to make fewer and fewer errors over repeated trials, learning by trial and error.

Two-key procedure. On a concurrent schedule of reinforcement, the alternative schedules are presented on separate response keys.

Unconditioned reinforcer. A reinforcing stimulus that has acquired its properties as a function of species history.

Unconditioned response (UR). All organisms are born with a set of reflexes (US → UR). These relationships are invariant and biologically based. The behavior elicited by the US is called the unconditioned response (UR).

Unconditioned stimulus (US). All organisms are born with a set of reflexes (US → UR). These relationships are invariant and biologically based. The eliciting event is called the unconditioned stimulus (US).

Undermatching. In the generalized matching equation, the exponent a takes on a value less than 1. This result is described as undermatching and occurs when changes in the response ratio are less than changes in the reinforcement ratio. See also generalized matching law.
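A minimal sketch of undermatching using the generalized matching law in log form, log(Ba/Bb) = a·log(Ra/Rb) + log(bias); the values of a and bias below are illustrative assumptions:

```python
import math

def log_response_ratio(ra, rb, a=0.8, bias=1.0):
    # Generalized matching law (log form):
    # log(Ba/Bb) = a * log(Ra/Rb) + log(bias).
    # a < 1 models undermatching; a and bias here are illustrative.
    return a * math.log10(ra / rb) + math.log10(bias)

# With a 4:1 reinforcement ratio and a = 0.8, the predicted response
# ratio is smaller than 4:1 -- undermatching.
response_ratio = 10 ** log_response_ratio(4, 1)
print(round(response_ratio, 2))  # about 3.03
```

With a = 1 and bias = 1 the equation reduces to strict matching, so the shortfall of the response ratio below 4:1 is due entirely to the exponent being less than 1.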

Variable interval (VI). A schedule of reinforcement in which one response is reinforced after a variable amount of time has passed. For example, on a VI 30-s schedule, the time to each reinforcement changes but the average time is 30 s.
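One common way to program a VI schedule (an assumption here, not specified in the text) is to draw each interreinforcement interval from a distribution whose mean equals the schedule value:

```python
import random

random.seed(0)  # reproducible illustration

# VI 30 s: each interval to reinforcement varies,
# but the intervals average 30 s over the session.
intervals = [random.expovariate(1 / 30) for _ in range(10_000)]
mean_interval = sum(intervals) / len(intervals)
print(round(mean_interval, 1))  # close to 30
```

The exponential distribution is one conventional choice because it makes the next reinforcement setup equally likely at any moment; other interval distributions with the same mean would also satisfy the definition.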

Variable ratio (VR). A response-based schedule of reinforcement in which the number of responses required for reinforcement changes after each reinforcer is presented. The average number of responses is used to index the schedule. For example, a rat may press a lever for reinforcement 50 times, then 150, 70, 30, and 200. Adding these response requirements for a total of 500, then dividing by the number of separate response runs (5), yields the schedule value, VR 100.
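The schedule-value arithmetic in this example is simply the mean run length:

```python
# Response runs from the example in the text: 50, 150, 70, 30, 200 presses.
runs = [50, 150, 70, 30, 200]
vr_value = sum(runs) / len(runs)  # 500 / 5
print(f"VR {vr_value:.0f}")  # VR 100
```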

Verbal behavior. Verbal behavior refers to the vocal, written, and gestural performances of a speaker, writer, or communicator. This behavior operates on the listener, reader, or observer, who arranges for reinforcement of the verbal performance. Verbal behavior often has indirect effects on the environment. This contrasts with nonverbal behavior, which usually results in direct and automatic consequences. When you walk toward an object, you come closer to it. Verbal behavior, on the other hand, works through its effects on other people. To change the position of a lamp, the speaker states "Lift the lamp at the back of the room" to a listener, who is inclined to respond. Although verbal behavior is usually equated with speaking, vocal responses are only one of its forms. For example, a person may emit gestures and body movements that indirectly operate on the environment through their effects on others. A frown sets the occasion for others to remove some aversive event, while a smile may signal the observer to behave in ways that produce positive reinforcement.

Verbal community. The contingencies that regulate verbal behavior arise from the practices of people in the verbal community. The verbal community refers to the customary ways that people reinforce the behavior of the speaker.