Operational Definition
Clear, observable, measurable description of behavior. Ex: “Aggression = hitting with an open hand.”
Direct Measurement
Behavior is observed and recorded directly. Ex: counting tantrums.
Indirect Measurement
Data from interviews or rating scales. Ex: parent questionnaire.
Product Measure
Measure permanent result of behavior. Ex: number of worksheets completed.
Occurrence (Frequency)
How often behavior occurs. Ex: 6 hand raises.
Duration
How long the behavior lasts. Ex: crying for 4 minutes.
Latency
Time between instruction and response. Ex: 8 seconds to comply.
Interresponse Time (IRT)
Time between two responses. Ex: 20 seconds between questions.
Continuous Measurement
Behavior measured during entire observation. Ex: event recording.
Discontinuous Measurement
Behavior measured during samples of time. Ex: interval recording.
Interval Recording
Record if behavior occurs during interval. Ex: talking during 10-sec intervals.
Time Sampling
Observe only at specific moments. Ex: observe every 5 minutes.
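As a quick sketch of how interval recording is scored, the hypothetical code below marks a 10-second interval as "occurred" if the behavior was seen at any point inside it (partial-interval recording); the event times and observation length are made-up example data:

```python
# Sketch: partial-interval recording over a hypothetical 60-second
# observation split into 10-second intervals.
event_times = [3, 14, 16, 47]  # seconds at which the behavior was observed
interval_len = 10
n_intervals = 6

# Score an interval True if any event falls inside it.
scored = [any(i * interval_len <= t < (i + 1) * interval_len
              for t in event_times)
          for i in range(n_intervals)]

print(scored)                                    # [True, True, False, False, True, False]
print(f"{sum(scored)}/{n_intervals} intervals")  # 3/6 intervals
```

Whole-interval recording would instead require the behavior to span the entire interval, which tends to underestimate occurrence just as partial-interval tends to overestimate it.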
Efficiency Measurement
Measures speed/cost of learning. Ex: trials to criterion = 12.
Validity
Measurement captures what it is intended to measure. Ex: on-task behavior as an indicator of attention.
Reliability
Consistency of measurement. Ex: IOA = 92%.
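An IOA percentage like the 92% above is simple arithmetic. A minimal sketch of interval-by-interval IOA, assuming two observers' hypothetical yes/no records for the same ten intervals:

```python
# Sketch: interval-by-interval IOA with hypothetical observer data.
obs_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
obs_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]

# Agreements / total intervals * 100.
agreements = sum(a == b for a, b in zip(obs_a, obs_b))
ioa = agreements / len(obs_a) * 100
print(f"IOA = {ioa:.0f}%")  # observers agree on 9 of 10 intervals -> 90%
```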
Representative Data
Data reflects real behavior across settings/times. Ex: home and school data.
Graphing Data
Display data visually. Ex: line graph.
Equal-Interval Graph
Equal spacing on axes. Ex: frequency across days.
Bar Graph
Compare categories or totals. Ex: skills mastered.
Cumulative Record
Shows total responses over time. Ex: cumulative clicks.
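A cumulative record is just a running total, so the curve never decreases and a steeper slope means a higher response rate. A sketch with hypothetical per-session response counts:

```python
# Sketch: building a cumulative record from hypothetical session counts.
import itertools

session_counts = [2, 5, 3, 0, 5]          # responses per session
cumulative = list(itertools.accumulate(session_counts))
print(cumulative)                          # [2, 7, 10, 10, 15]
# The flat step (10 -> 10) shows a session with zero responding.
```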
Data Interpretation
Analyze trends, level, and variability. Ex: behavior decreases after treatment.
Procedural Integrity
Accuracy of implementing procedures as designed. Ex: 95% of steps followed correctly.
Dependent Variable (DV)
Behavior being measured. Ex: tantrums.
Independent Variable (IV)
Intervention applied. Ex: reinforcement schedule.
Internal Validity
Change caused by the IV. Ex: behavior changes only after treatment.
External Validity
Results generalize to others/settings. Ex: works in different classrooms.
History (Threat)
Outside events affect behavior. Ex: new teacher.
Maturation (Threat)
Natural growth causes change. Ex: child ages.
Single-Case Design
Individual serves as own control. Ex: A-B-A-B.
Repeated Measures
Continuous data collection over time. Ex: daily data.
Prediction
Expected behavior without intervention. Ex: baseline trend continues.
Verification
Shows baseline responding would have continued without the IV. Ex: behavior returns to baseline in reversal.
Replication
Repeating effect strengthens control. Ex: multiple phases.
Group Design
Compare groups statistically. Ex: treatment vs control.
Reversal Design
IV removed and reintroduced. Ex: A-B-A.
Multiple Baseline Design
IV introduced at different times. Ex: across behaviors.
Multielement (Alternating Treatments) Design
Rapid alternation of conditions. Ex: compare two treatments.
Changing-Criterion Design
Gradual change in performance criteria. Ex: increase goals weekly.
Comparative Analysis
Compare interventions. Ex: praise vs tokens.
Component Analysis
Identify effective parts of intervention. Ex: which step works.
Parametric Analysis
Vary the value (intensity or dosage) of the IV to find its effect. Ex: compare different amounts of reinforcement.
Ethics Principles
Benefit others; act with integrity and respect. Ex: client dignity.
Ethical Risk
Harm to client/profession. Ex: data falsification.
Professional Development
Maintain competence. Ex: CEUs, supervision.
Confidentiality
Protect client information. Ex: secure records.