Digging Deeper on the Link Between Spending and Educational Achievement


A few days ago I posted data showing that K-12 spending in Virginia was 10.2% lower in 2014 than it was in 2008, yet National Assessment of Educational Progress (NAEP) scores for Virginia students had climbed over the same period. Given that Virginia schools showed they could do more with less, I asked, is it possible that money is not the main constraint holding back educational achievement?

In a blog post yesterday, the Commonwealth Institute argues that the cuts have hurt academic performance, particularly in poor school districts where budget cuts have hit the hardest. While overall performance has improved since the recession, it has suffered for low-income students, black students and English-as-a-second-language students.

For example, since 2007, 4th grade reading scores on the National Assessment of Educational Progress (NAEP) have gone up for all Virginians on average by 2 points, increasing 4 points for White students and 9 points for students that do not qualify for free or reduced lunch. Yet, they have fallen by 5 points for Black students, 2 points for students that receive free or reduced lunch, and an astounding 21 points for English language learners. Average eighth grade math scores show a similar divergence.

That’s a fair point worth closer examination. I’d like to dig deeper — are there other variables that could explain the divergent performance between racial/ethnic groups? The Commonwealth Institute divides the school districts into quintiles ranked by the size of the state budget cuts. How directly do those budget cuts correlate with declining test scores for poor, black and ESL students? We can’t conduct that kind of analysis with NAEP scores, but we can with SOL scores. I’d love to see that analysis.

— JAB


4 responses to “Digging Deeper on the Link Between Spending and Educational Achievement”

  1. I would want to see how much time a parent is spending with the child, how much TV they are allowed to watch, and whether a father was in the home or not.

  2. I agree, those are probably the most critical variables of all. But unless they changed significantly between 2008 and 2014, they don’t explain the decline in poor, black and ESL performance during that time.

    • and if you’re going to make that a premise – you have to measure it – get the metrics and demonstrate it. It does no one any good to make unsubstantiated conjecture. Keep in mind – many rich folks in this country send their kids to private 24/7 boarding schools to get a “good” education where the parent is not involved…

      how does that “work”? would you be in favor of such schools for low-income families who do not want to do that parental part?

  3. you’ve got two or three “big” issues to dig into.

    1. – first, note this on your funding chart: “Combined state and local school funding per student.”

    why is this important?

    because, first, we don’t know whether the State short-funded or the locality did. In Virginia the State typically “funds” the SOQs AND requires a local match.

    So which side short-funded?

    2. – do you know what funding is used to pay for what things that are tested?

    In Virginia – where most school districts ADD more local funding than the state requires – that extra money typically does not go to pay for the core SOL subjects that are tested but rather for other things.

    If schools cut local funding, were they cutting funding for SOL-tested subjects? Probably not.

    3. – NAEP testing is not done at all schools. In Virginia it’s done at 99 schools out of hundreds. To give an idea how few that is – Henrico County alone has more than 70 elementary schools, including several that failed to get accreditation this year. How many of those schools administered the NAEP, and were any of them the ones that failed accreditation?

    so basically there are huge flaws in looking at total funding – which pays for things that are not tested – and then trying to use NAEP testing data to represent the overall performance of schools in general.

    I support efforts to drill down on performance data and am frustrated by the lack of transparency, but pressing forward without that transparency leads to badly flawed analysis and conclusions.
