School Discipline in Virginia – Part 4 – The False Legend of PBIS Effectiveness

Catherine P. Bradshaw, Senior Associate Dean for Research & Faculty Development and Professor, UVa School of Education and Human Development

by James C. Sherlock

To discover the origins of the legend that Positive Behavioral Interventions and Supports (PBIS) is effective, we have to dig into the interlocking government and ed school interest groups that fund and publish “studies” that validate their views.

The goal of the ed schools is always to capture the attention, funding and approval of the federal Department of Education (DOE) for their latest bright ideas.

The method is to use DOE’s own seemingly limitless grant money and its bureaucracy’s predisposition to progressive causes to fund studies conducted by progressive “educators” that prove progressive theory.

Where is the anti-trust division of the Justice Department when we need it?

The legend of PBIS effectiveness rests perhaps most heavily on a famous study conducted in Maryland, the results of which were reported in 2010. The abstract claimed that the schools in the trial experienced significant reductions in student suspensions and office discipline referrals compared to the control group.

That worked until the Department’s Institute of Education Sciences (IES) What Works Clearinghouse (WWC) took a look at that study more than a decade later and found that, while the study design and execution met scientific standards, it offered:

  • “No Statistically Significant Positive Findings”; and
  • that the evidence for that finding was strong.

Oops.

Many of Virginia’s school divisions have gone down the PBIS rabbit hole and continue to do so at great cost, both in time and money and in opportunity costs, i.e., the forgone chance to try interventions actually proven to work.

We’ll trace that 2010 report.

We will find that the study’s leader, now a professor at the University of Virginia’s ed school, is on the inside of IES, chairing a What Works Clearinghouse (WWC) practice guide on positive behavior support.

Yeah, my take is the same as yours.

The swamp is eternal.

Procedural fidelity in behavior analysis. The term Positive Behavioral Interventions and Supports (PBIS) was introduced in the 1997 amendments to the Individuals with Disabilities Education Act (IDEA).

When PBIS is implemented in the schools, it is referred to as School-Wide Positive Behavioral Interventions and Supports, sometimes abbreviated SWPBIS but most often shortened to PBIS, as I have done in this series.

The Journal of Applied Behavior Analysis (JABA, Feb. 1, 2023) has just raised the profile of a major long-time issue:

Procedural fidelity is the extent to which independent variables are implemented as designed. Despite 40 years of discussion about the importance of procedural fidelity for behavioral research, reporting of fidelity data remains an uncommon practice in behavior-analytic journals.

It decries the historically lax standards for publication.

That same JABA article notes that procedural fidelity has long been a problem in school special education studies.

That 2023 JABA report says, basically, that published behavioral research has not always been scientific. It specifically calls out school psychology for “a lack of guidelines and practical considerations that may be significant barriers to including fidelity data.”

The authors also find that “scholars may feel particularly confident when behavior changes in the expected direction.”

That is shorthand for “developers and supporters of educational theory should not be ones who test it.”

But those are the people who are nearly always funded by the federal DOE. That is the dirty secret.

The Foundational Study of PBIS effectiveness. The foundational study was led by Professor Catherine Bradshaw, a highly honored developmental psychologist who was at Johns Hopkins at the time of that study.

She is now “Senior Associate Dean for Research & Faculty Development” at UVa’s School of Education and Human Development. 

She is also chairing a What Works Clearinghouse (WWC) practice guide on positive behavior support.

Professor Bradshaw’s “Examining the Effects of Schoolwide Positive Behavioral Interventions and Supports on Student Outcomes: Results From a Randomized Controlled Effectiveness Trial in Elementary Schools” was published in the Journal of Positive Behavior Interventions, v12 n3 p133-148, July 2010.

It accelerated the adoption of PBIS across the country and is still a go-to reference for PBIS supporters.

The abstract of Professor Bradshaw’s report, which is all most people will read, claimed:

“School-level longitudinal analyses indicated that the schools trained in SWPBIS implemented the model with high fidelity and experienced significant reductions in student suspensions and office discipline referrals.” [Emphasis added.]

It is the word significant with which WWC disagreed more than 11 years later.

We note that PBIS required a lot of training and a lot of support during the trial.

Schoolwide positive behavioral interventions and supports teams attend an initial 2-day summer training and annual 2-day booster training events.

All intervention schools receive at least monthly on-site support and technical assistance from a trained behavior support coach.

Professional development and technical assistance were provided to the behavior support coaches through state-coordinated training events conducted four times each year.

That is one hell of a training and on-site support load, showing how difficult PBIS is to implement competently. So it had better work.

Professor Bradshaw’s study pre-dated WWC’s review of it by more than a decade. WWC came, last year, to a starkly different conclusion than the one reflected in the abstract of her 2010 work.

WWC reported that, while her study met IES standards, even with all of that professional training and support it offered “no statistically significant positive findings.”

Click “more outcomes” on the student discipline findings to see that IES found no statistically significant differences between the PBIS schools and the control group schools on either school-level suspension rate or office disciplinary referrals.

That challenges the claims in the original study abstract.

Again click “more outcomes” where available and find:

  • No statistically significant differences on reading and math scores;
  • None in interpersonal competencies, student behavior, or student discipline.

So after the massive pre-trial and in-trial professional supports to the PBIS schools and the resultant focus on improving the targeted performance measures in those schools, PBIS failed to show significant academic or behavioral improvements.

Postscript. Professor Bradshaw is now on the inside of WWC, taking a lead role.

I personally feel little suspense waiting for the WWC practice guide whose development she is chairing, the one on positive behavior support.

If it suggests PBIS by name, we will have to see how WWC defines PBIS in that practice guide.  PBIS is a framework.

It didn’t score any points in Professor Bradshaw’s 2010 study.

Updated Feb. 4 at 2030 with clarifications about Professor Bradshaw and her 2010 study. No conclusions were changed on those subjects.