Big Data and Power to the People

by James A. Bacon

In the previous post, John Butcher brought to light some incredibly important data long secreted in the Virginia Department of Education — Student Growth Percentile (SGP) scores. There are two aspects to this story. First, the data will bring unprecedented accountability to Virginia schools and school divisions. Second, it is a cautionary tale of how the educational establishment resists transparency that makes that accountability possible.

Brian Davison, who works in business intelligence and software management, had two children in the Loudoun County public school system when he filed a Freedom of Information Act request in 2014 to obtain the scores. The Loudoun County school superintendent rejected the request, but Davison won in a lawsuit filed against the Virginia Department of Education in Richmond Circuit Court.

The publication of raw SOL scores never resulted in much accountability. SOL numbers are so heavily influenced by the socio-economic status of students (accounting for about 55% to 60% of the variability between schools) that school administrators could plausibly argue that they aren’t responsible for low scores — the economic disadvantage of their student body is. SGP scores get around that excuse by comparing each student’s progress against that of students who started from a similar place, regardless of socio-economic status. In effect, they measure educational value added.
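The SGP idea can be illustrated with a toy calculation. The data and the peer-matching rule below are invented for illustration; real SGP models use quantile regression over several years of prior scores, but the gist is the same: rank a student’s current score only against students who started from a similar point.

```python
import random

random.seed(0)

# Invented cohort: (prior_score, current_score) pairs for 1,000 students.
cohort = []
for _ in range(1000):
    prior = random.gauss(400, 50)
    cohort.append((prior, prior + random.gauss(20, 30)))

def growth_percentile(prior, current, cohort, band=10):
    """Percentile rank of `current` among students whose prior score was
    within `band` points of `prior` -- a crude stand-in for the
    conditional percentiles that real SGP models estimate."""
    peers = [cur for (pri, cur) in cohort if abs(pri - prior) <= band]
    below = sum(1 for cur in peers if cur < current)
    return round(100 * below / len(peers)) if peers else None

# Two students who both started at 400; the one who gained more
# earns the higher growth percentile, no matter where either started.
print(growth_percentile(400, 400, cohort))
print(growth_percentile(400, 460, cohort))
```

Because the comparison group is restricted to academic peers, a disadvantaged student who starts low but grows fast can post a high SGP even while his raw SOL score stays below average — which is exactly why SGPs blunt the socio-economic excuse.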

As Butcher’s post clearly shows, some school systems out-perform the norm by wide margins, while others under-perform. An analysis of individual schools probably would show the same thing, as would an analysis of individual teachers.

There are many idiosyncratic reasons why a particular student might lag or surge ahead in his or her performance in a given year, so one has to be extremely careful in drawing conclusions from small numbers. That makes the data somewhat problematic for assessing the performance of teachers, especially young teachers who have taught only a year or two. However, after enough years, a teacher should have taught enough students that statistically valid conclusions can be drawn about his or her effectiveness. Another issue: Data should be anonymized in order to protect the privacy of school students.
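The sample-size point can be made concrete. Assuming, purely for illustration, that individual students’ SGPs scatter around a teacher’s true average with a standard deviation of about 28 percentile points, the uncertainty in that average shrinks with the square root of the number of students taught:

```python
import math

STUDENT_SD = 28  # assumed spread of student SGPs around a teacher's true average

# One class of ~25 students vs. several years' worth of classes.
for n in (25, 75, 150, 300):
    se = STUDENT_SD / math.sqrt(n)
    print(f"{n:3d} students -> standard error of mean SGP ~ {se:.1f} points")
```

After one class of 25, a teacher’s average SGP is uncertain by roughly plus-or-minus 5.6 points; after a dozen classes it narrows to under 2 points, which is when comparisons across teachers begin to mean something.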

In very rough terms, teachers, principals and administrators account for 40% to 45% of the variability in student performance. No one expects them to perform miracles, but there is little doubt that they can do better. A critical step is identifying which teachers are consistently doing well and which ones are doing badly in order to incentivize the good ones to stay and the bad ones to leave. Another step is identifying which principals are doing a good job, like Tina McCay at Goochland Elementary School (mentioned here). Finally, voters need data to judge the performance of senior school division administrators and school board members.

I doubt the story will end here. There will be endless haggling over how to interpret the numbers — and that’s how it should be. But make no mistake: This is a game changer. Citizens and parents now have a tool of unprecedented power to cut through the dodging and weaving, the hedging and prevaricating, to hold educators accountable. Now let’s go out and use it!


11 responses to “Big Data and Power to the People”

  1. no one teacher is going to be found responsible for bad scores for a whole school or even bad scores for whole groups … and as Jim is pointing out – other players – principals and administrators – are going to be on the hook for patterns that span classes and groups – rather than, say, asserting that all the 3rd and 4th grade teachers are “bad”. Someone is going to ask WHY all of them are “bad” to start with.

    so I see this as not a tool to identify individual “bad” teachers so much as a tool to identify school and school district policies for kids that are behind and what has been done about it beyond claiming an individual teacher or entire groups of teachers are responsible.

    Take Lynchburg, for instance, the stomping grounds of Hill City Jim – where performance – across the board in most (not all) of the schools is less than wonderful with accreditation problems for many of the schools.

    If you take the data – SGP data (which I’m quite sure VDOE already has) – then do you think the problem is going to be identified as most of the school system being inhabited by “bad” teachers?

    take a place like Henrico where some of the schools are very good and some are not good. Do we think the SGP data is going to identify at the “bad” schools that “bad” teachers are the reason?

    My suspicion is that at the end of the day this durable canard about “bad teachers” is going to disappear in a cloud of smoke.

    • I would draw a parallel with good/bad teachers, schools and school divisions with good/bad reporters, editors and newsrooms (a phenomenon with which I am familiar). There are good and bad journalists. But it’s the responsibility of the editors to bring the best out in them, just as it’s the job of the executive editor to create an overall environment and culture for the newsroom and make sure his/her editors are doing their jobs. You have to have accountability at all levels.

      When it comes to schools, I wouldn’t want to pre-judge where the problems are. I would dare say that each school division is distinct, with strengths and deficits at different levels of the organization. What this tool gives us for the first time is the ability to home in on the hot spots, wherever they may be.

  2. How sad that Loudoun County had to be sued in order to cough up this data. This is another recurring theme in Virginia – willful opaqueness. Everybody thinks VPAP is such a great site. Then you see that it is overseen by a council of vested interests. Then you have to enter a Captcha token in order to prove you are “not a bot”. Why would a bot bother VPAP? I don’t know for sure but it might be that letting anybody easily get ahold of all that data might make the kind of statistical analysis performed by John Butcher a lot easier. And, after the McDonnell affair, God only knows what secrets a thorough analysis of that data might reveal.

  3. re: bots – they eat up resources and some are “probes”… for future potential attacks, spam, etc.

    re: “However, after enough years, a teacher should have taught enough students that statistically valid conclusions can be drawn about his or her effectiveness. Another issue: Data should be anonymized in order to protect the privacy of school students.”

    I do not see where the name of a particular teacher is any less a privacy issue than the name of the students – in fact, it is just as severe a violation of privacy as the students’. where did this idea come from that the public can see the performance record of ANY employee to start with?

    If that’s the motivation behind the FOIA – it’s, in my view, totally wrongheaded and inappropriate… nutty…

    what this data is going to show instead is how a particular school does with its economically disadvantaged, minority, and regular students compared to other schools – in the same district or other districts – and I suspect this is why VDOE and the schools themselves are opposed to providing the data, and I’d be more than shocked if they provided the data in a form that makes it possible to identify specific teachers.

    I predict no school is going to try to explain SGP data for a whole school, or even a grade class, in terms of one or two lower-performing teachers, much less a group of them. All that would do is cause more questions as to why the Principal and Administration did not act – at the time they started seeing adverse data.

    sorry – if that’s the idea behind this – it’s just plain wrong.

  4. “Where did this idea come from that the public can see the performance record of ANY employee to start with?”
    These are teachers we’re talking about. These are employees of the public; paid by your taxes and mine. Why shouldn’t we hold them publicly, individually, accountable?
    Yes, that will expose them to criticism, and defense, from the politically-motivated. Why is that not a good thing?

  5. I have a deeper problem with this mania for empirical statistics. School systems should be constantly evaluating performance, of course, and it is wrong to keep public school stats away from the public.

    But I wonder if the teachers’ unions and organizations have a point about taking this all too far. Are you teaching the child or teaching the test? I know brilliant people who are academic whizzes but do poorly at multiple choice or other exams.

    Intensive data drilling might work for software programmers (like the man who sought the Loudoun stats) but the fetish for numbers ignores the realm of ideas which are the heart (or brains) of education. How can stats measure creativity? Expressiveness? Deep thinking?

    I don’t think they can. What’s truly scary is that the numbers maniacs want to use data to assess individual teachers. This would hold them to a standard few others have to face. What is also discouraging about blog items like this is that they tend to jump off from a view that teachers are lazy and incompetent, especially if they are at public schools.

    Where did that idea come from? It’s like saying all lawyers are crooks.

  6. “It’s like saying all lawyers are crooks.”

    Where do you get that? It’s like saying some lawyers are really good, some really suck and some are in between. Most consumers would like to have that information.

    Parents would like to see the same information about teachers. The problem I foresee is that if everyone knew who the best teachers were, a lot of pushy parents would start demanding that *their* precious child be assigned to their class. I don’t know how you’d sort that out. Maybe you’d keep the teacher data confidential to everyone except the school administration, which could use the data to manage for better performance.

    • “Parents would like to see the same information about teachers. The problem I foresee is that if everyone knew who the best teachers were, a lot of pushy parents would start demanding that *their* precious child be assigned to their class.”

      So the problem is that it would actually give the consumers better access to choice?

      I do agree the teacher data should be kept confidential, mostly because publicizing something that I imagine would have a decent amount of year-to-year volatility can lead to poor decision making by both the public and administration.

      Instead, keep the data confidential, but open to administrative review under two conditions:

      1) Biennial or triennial performance reviews on a rolling basis to look at teacher outputs over time and see who’s doing well and who needs assistance.

      2) In the instance of parental complaints about teacher performance.

  7. And the SGP is a type of value-added measurement, which the American Statistical Association has spoken out against as a useful indicator of anything. So, we should all be taking anything it has to say about individual teachers – or much of anything – with a grain of salt.

    http://dianeravitch.net/2014/04/12/breaking-news-american-statistical-association-issues-caution-on-use-of-vam/

    “Most VAM studies find that teachers account for about 1% to 14% of the variability in test scores, and that the majority of opportunities for quality improvement are found in the system-level conditions. Ranking teachers by their VAM scores can have unintended consequences that reduce quality.” The ASA points out: “This is not saying that teachers have little effect on students, but that variation among teachers accounts for a small part of the variation in scores. The majority of the variation in test scores is attributable to factors outside of the teacher’s control such as student and family background, poverty, curriculum, and unmeasured influences.”

  8. The problem I have with restricting disclosure of this data is far less than the problem I have with compelling a lot of awkward explanations of the data. People understand that the teacher must deal with the kids he’s dealt each year; but what about the stats for that teacher year after year after year? The public are not all idiots; we are all familiar with data that shows high short-term variability yet supports long-term conclusions, and people eventually come to understand that difference — or ought to, even if they try to ignore the trends. Take the weather, for instance, versus long-term climate.
