Issues in multivariate assessment of a large-scale behavioral program.
Large-scale school assessments collapse when you ignore hidden classroom forces, so measure context too, not just student skills.
01 Research in Context
What this study did
Herrnstein et al. (1979) wrote a think-piece, not an experiment. They looked at big school programs that try to measure many skills at once.
The authors asked: what hidden traps appear when you test hundreds of kids across many classrooms?
What they found
The paper lists problems, not numbers. It warns that community pressure, teacher stress, and unseen social forces can twist your data.
They say: if you only count easy behaviors, you miss the real story.
How this fits with other research
McMillan (1973) set the stage. That earlier paper told us to watch for sequence effects in small single-subject designs. Herrnstein et al. widen the lens to whole schools.
Cameron et al. (1996) later confirmed one of those worries. They showed that peer sociometric ratings bounce around too much to be trusted for a single child. This backs the 1979 warning that hidden variables can wreck large data sets.
Cohen et al. (1993) and Fàbregues et al. (2022) keep the thread alive. Both shout that tools must fit real classrooms and mixed-methods must be clear. The 1979 paper reads like their early blueprint.
Why it matters
Next time you plan a district-wide social-skills screen, add cheap proxy measures for teacher stress and classroom climate. Track them along with student scores. If the hidden numbers drift, you will know your main data may be noisy before you make big decisions.
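One way to act on that advice is a drift check on the context measures. The sketch below is a hypothetical illustration, not anything from the 1979 paper: it takes a series of cheap context ratings (say, a 1-5 teacher stress score collected at each probe session), treats the first few sessions as a baseline, and flags sessions whose rating drifts beyond a z-score threshold, so student data from those sessions can be interpreted with caution.

```python
# Hypothetical sketch: flag probe sessions where a cheap context proxy
# (e.g., a 1-5 teacher stress rating) drifts from its baseline, so
# student-score data from those sessions can be treated with caution.
from statistics import mean, stdev

def flag_drift(context_ratings, baseline_n=4, threshold=2.0):
    """Return indices of sessions whose context rating deviates more
    than `threshold` standard deviations from the baseline mean
    (baseline = the first `baseline_n` sessions)."""
    baseline = context_ratings[:baseline_n]
    mu, sd = mean(baseline), stdev(baseline)
    if sd == 0:
        sd = 1e-9  # flat baseline: avoid divide-by-zero
    return [i for i, r in enumerate(context_ratings[baseline_n:], start=baseline_n)
            if abs(r - mu) / sd > threshold]

# Teacher stress ratings, 1 (calm) to 5 (overwhelmed), one per session
ratings = [2, 2, 3, 2, 2, 5, 2, 4]
print(flag_drift(ratings))  # -> [5, 7]: sessions whose scores may be noisy
```

The threshold and baseline length are arbitrary here; the point is simply that a few lines of bookkeeping turn "hidden numbers drift" into a concrete flag you can check before trusting the main data.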
Add one quick teacher stress rating sheet to your next class-wide probe.
03 Original abstract
Several social and research issues directly affected the development and implementation of multivariate assessment in a large community-based applied research program. Examples are drawn from experiences of the Preparation through Responsive Educational Programs Project for disruptive and skill deficient adolescents in suburban, rural, and urban junior high school settings, focusing on the assessment of academic and social skill development and long-term skill maintenance. The social context altered both project treatment and follow-up plans, requiring assessment of potentially unintended effects and decreasing consistency across sites. Future community acceptance of such programs may depend on the investigators' adaptation to diverse community pressures for program conduct and assessment and the measurement of phenomena that are not always directly observable.
Journal of Applied Behavior Analysis, 1979 · doi:10.1901/jaba.1979.12-593