Mo’ Money Is Not the Answer

by John Butcher

It’s been a while since I sent Jim a bang per buck analysis of school performance. Now that the 2016 SOL data are out, I’ll try to get back in the groove.

In the past I have plotted the raw division SOL pass rates vs. the annual disbursements per student. But comparing bang-for-the-buck between different school systems is a tricky business. We know, for example, that poverty impacts academic performance. As shown in the scatter graph below, economic disadvantage explains about 39% of the variation in 2016 reading test scores.


To level the playing field this year, I’ve adjusted each division’s pass rate to eliminate the effect of economic disadvantage. (I can offer an explanation in the comments, if you’d like to know the details.)
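The post leaves the details of the adjustment for the comments, but one common approach is to regress pass rate on percent economically disadvantaged (ED) and project every division to zero ED, keeping each division's residual. A minimal sketch of that idea, with made-up division numbers (this is my assumption about the method, not necessarily the author's exact procedure):

```python
# Sketch of one plausible ED adjustment: fit pass rate vs. percent ED,
# then remove the fitted ED effect from each division's actual rate.
# All numbers below are invented for illustration.
import numpy as np

# hypothetical divisions: (percent economically disadvantaged, reading pass rate)
ed = np.array([10.0, 25.0, 40.0, 55.0, 70.0])
pass_rate = np.array([92.0, 85.0, 78.0, 72.0, 63.0])

# ordinary least squares: pass_rate ~ intercept + slope * ed
slope, intercept = np.polyfit(ed, pass_rate, 1)

# R^2: share of pass-rate variation explained by ED
predicted = intercept + slope * ed
ss_res = np.sum((pass_rate - predicted) ** 2)
ss_tot = np.sum((pass_rate - pass_rate.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# adjusted rate = what the division would score at 0% ED, keeping its
# residual (its over- or under-performance relative to the ED trend)
adjusted = pass_rate - slope * ed
```

Because the fitted slope is negative (more disadvantage, lower scores), the adjustment raises every division's rate, and a division that already scores high despite high ED can land above 100%.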


You might notice that six divisions show corrected pass rates exceeding 100%. That is because their pass rates were high in the first instance and considerably higher than their average ED would predict.

That rising tide floats all boats: The adjustment also raises the City of Richmond from an actual 60% pass rate to an adjusted 79%.

As to cost, VDOE will not post the 2016 data until sometime this spring, so we’ll have to make do with 2015 data for disbursements per student (using end-of-year enrollment).

On that basis, here are the 2016 division average reading pass rates, corrected for the economic disadvantage of the division’s student body, plotted vs. the 2015 division disbursements per student.


The fitted line slopes slightly upward with disbursement, but the R² is essentially zero. That is, spending more per student is not correlated with better pass rates.
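That "slight slope but no correlation" reading comes straight from the R²: a fitted line can tilt one way or the other while explaining almost none of the scatter. A quick sketch with invented numbers (not the actual division data) shows the pattern:

```python
# Sketch of the slope-vs-R^2 distinction: a near-flat cloud where the
# fitted slope is positive but the line explains almost nothing.
# Spending and pass-rate figures are invented for illustration.
import numpy as np

spending = np.array([9000.0, 10000.0, 11000.0, 12000.0, 13000.0])  # $/student
adj_pass = np.array([88.0, 84.0, 90.0, 85.0, 89.0])  # adjusted pass rates

slope, intercept = np.polyfit(spending, adj_pass, 1)
pred = intercept + slope * spending
r2 = 1 - np.sum((adj_pass - pred) ** 2) / np.sum((adj_pass - adj_pass.mean()) ** 2)

# slope comes out slightly positive, but r2 is tiny: the line explains
# almost none of the variation, i.e. no meaningful correlation
```

With data like these, the slope is positive but R² lands near 0.03, which is why a tilted fitted line, by itself, proves nothing about spending and scores.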


Here the slope might suggest a negative correlation between pass rate and disbursement, except that the R² again tells us that there is no correlation.
The five subject average tells the same story.


These data support the same conclusion as the earlier, unadjusted numbers: At the division level in Virginia, larger disbursements per student do not correlate with better (or worse) SOL performance.

The data are here.


7 responses to “Mo’ Money Is Not the Answer”

  1. I like the analytics, but as you say they can be tricky, so I’d ask whether you are using cost data ONLY for SOL-instruction monies.

    Many schools in Virginia add local discretionary money over and above the required SOQ match, and those monies are spent on courses that are not SOL-tested. So you might be showing schools spending more money on courses that are not SOL-tested, and that money would, of course, not affect SOL scores.

    maybe you did compensate for that and I missed it.

    if not, then the more relevant analytic that would be fascinating would be to see how effective SOL money is for SOL subjects for the various schools. In other words, are some schools better than others at the same SOL-only instruction?

  2. John,

    Thank you for the great post. Your conclusions follow what I’ve believed instinctively and anecdotally for years, while not always understanding the process. I’ve seen meta-analyses, such as the U. Chic. study years ago showing that class size, which I’ve always considered a reasonable metaphor for spending, does not matter much until you get well below 15. But with your post, now I actually get to learn something. Not being a mathematician/statistician, it would be helpful if I could understand some of the terms.

    What does R² refer to? Would that figure have to be somewhere over .30 to be meaningful? More generally, what does the mathematical formula shown in each graph represent?

    Does your conclusion about more or less spending per pupil hold only within school divisions or across divisions? If not, what would be needed to arrive at such a conclusion across divisions? I go back to my time on a school board in Lake Forest, IL, where our average teacher salary in the early 90’s was the highest in the state at well north of 60K, while Paul Adams’ Providence St Mel School in the middle of the west side Chicago ghetto was paying an average of around 19K. Paul’s results were better than our upscale North Shore results in terms of net educational advances (meaning the amount of improvement in students caused by the school over when they started). 99.7% of his kids went to college; 92% of ours did. It was the ultimate proof to me that money was not the answer, albeit without the kind of proof you offer here.


    • Oh I LIKE that idea… in fact I like ANY idea that pulls the parents closer to the school as a central component of their and their kids’ lives.

      Schools do parent-involved things like carnival “field” days, holiday pageants… pizza nights… academic contests and games… so that a culture of education is fostered, often for families whose generations have yet to attain it. Once it becomes embedded in the family values, the younger generation, when they grow up and have families of their own, have it and raise their kids that way.

      These are things that differentiate “good” schools that are not known when folks just look at SOL scores alone…

      and these are things that I wonder would be done by charter/choice schools – both the parent-involving activities as well as SOL type academic performance transparency.

  3. re: “Does your conclusion about more or less spending per pupil hold only within school divisions or across divisions? If not, what would be needed to arrive at such a conclusion across divisions?”

    that’s an excellent question that I have asked before on issues like this – and never get an answer…

    If you want some really interesting data – do it by school within each division: funding, staffing, and SOL scores.

    you’ll see some pretty surprising differences… but I doubt seriously if you’ll ever be able to get funding per school…

  4. You have to adjust the cost data for cost of living differences by location across Virginia IMO.

    • and you have to differentiate between money spent on SOL courses and money spent on courses that are not tested for SOLs.

      if you don’t do that – what you’re going to see is that wealthier school districts are going to be spending a lot more money on added-value courses – none of which are SOL-tested – so it will indeed look like “mo money” does not result in higher SOL scores… sort of a “duh” concept when you think about it… but apparently not in the original premise here.
