Bacon's Rebellion

Education Data Reporting is Very Expensive, Time-Consuming, Important and Flawed in Virginia

by James C. Sherlock

In government, there is seldom a dispassionate, objective assessment of what works and what doesn’t. Just eternal programs joined by new ones.

We citizens all hope that in governance and budgeting Virginia rewards programs that prove effective and efficient and enforces a cut line for those that do not. But we know that happens far less often than it should.

We also know that the Governor, his budget people and the General Assembly would appreciate objectively measured cut lines. So would agency budget directors.

In education, the government needs three things it largely does not have:

Data. The last time Virginia’s educational data collection system was inspected, Virginia’s Office of the State Inspector General agreed with me. And that report covered Direct Aid to Education alone. It was a bloodbath with respect to data quality.

I hope some gains have been made since then, but the data requirements also have exploded. Nothing has fixed the very heavy burden of the reporting. Based upon my investigations and reporting, the data quality situation is still very bad.

Virginia’s rapidly proliferating K-12 education programs are a big part of the problem. The number of new programs with reporting requirements approved by the Virginia Board of Education (BOE) in the past year and a half must have crushed the all-time record. If administrative overhead is on the BOE’s checklist for new program approvals, there is no evidence of it being honored.

The data VDOE receives appear, from my research and reporting, to be late and worse than useless for purposes of program assessment. I say worse than useless because they often look good enough to use, and they are used.

I will discuss the problem and then suggest an approach that attempts to remedy it at both ends: the reporting burden on the schools that generate the data and the usefulness of the data to the decision makers who receive them.

I will segue to a terrific op-ed by Matt Hurt in The Roanoke Times. Its title, “Adding more and more initiatives worsens education outcomes,” captures the point.

We started out with the expectation to make sure our students could read, write, and do math, and then all kinds of other things were added.

In recent years, the following initiatives (not an all-inclusive list) have been added to educators’ plates by the Virginia General Assembly and the Virginia Board of Education, on top of everything that was already there. As of fall 2021, the list included: through-year “growth” assessments; balanced assessment plans; implementation of new educator evaluation standards, new social emotional standards, cultural competency training and new computer science standards; implementation of model policies concerning the treatment of transgender students; the Virginia Kindergarten readiness program; federal programs tied to pandemic relief funds; revisions to locally awarded verified credits; a new assessment program for students with significant cognitive disabilities; and implementation of the Virginia inclusive self-assessment and action planning tool.

These are mostly Virginia ed-school — primarily UVa and VCU — bright ideas. Some are actually trying to help. Most are trying to enforce woke doctrine at the K-12 level. They don’t care whether their programs work and don’t want them measured.

In the absence of reliable data about what is going on in the schools, VDOE and the BOE are more easily influenced by the “expert” panels they convene, which are drawn largely from the education schools with a dog in the fight, again dominated by UVa and VCU.

To that I will add that VDOE last year almost instantly created a very large online education program, Virtual Virginia.

Measurable objectives. Matt tells the story of the Superintendent of Wise County Schools who, without the money to hire extra staff to manage all of those programs and their reporting, was working 12-hour days, and even those were not enough. Then he writes:

Our leadership in Richmond (governor, board of education and General Assembly) really need to carefully consider our educational priorities and develop a unified hierarchy which contains measurable objectives. Any program or initiative not aligned to the top one or two priorities should be tabled until the top priorities are met. (Emphasis added)

How does VDOE get good data for those “measurable objectives”?

Measurable objectives have to be defined. Once they are defined, the data requirements can be programmed into reporting software. So can basic quality checks on the data.
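Here is a minimal sketch of that idea, assuming hypothetical field names and ranges rather than any actual VDOE schema: the objective implies a data requirement, and the requirement becomes code that flags incomplete or implausible entries at the point of entry.

```python
from dataclasses import dataclass

# Hypothetical example only: the fields and ranges below are illustrative,
# not VDOE's actual reporting schema.
@dataclass
class ReadingProficiencyRecord:
    school_id: str
    grade: int
    students_tested: int
    students_proficient: int

def validate(record: ReadingProficiencyRecord) -> list[str]:
    """Return a list of data-quality problems; an empty list means the record passes."""
    problems = []
    if not record.school_id:
        problems.append("missing school_id")
    if record.grade not in range(0, 13):
        problems.append(f"grade {record.grade} out of range")
    if record.students_tested < 0:
        problems.append("students_tested cannot be negative")
    if not 0 <= record.students_proficient <= record.students_tested:
        problems.append("students_proficient must be between 0 and students_tested")
    return problems

# A bad record is flagged at the school, at entry time,
# rather than discovered (or missed) later in Richmond.
bad = ReadingProficiencyRecord(school_id="", grade=3, students_tested=80, students_proficient=95)
print(validate(bad))
```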

Nearly all of the data that wind up at the Governor’s office, the General Assembly and the federal government (and at school districts for district-initiated reports) originate in individual schools. The data requirements are overwhelming the schools. We cannot trust the results in the aggregate, even if and when measurable objectives are set.

So we need to integrate and greatly simplify the process.
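To make “integrate and simplify” concrete, here is a hypothetical sketch, not a proposed VDOE design: each school enters a record once into a shared store, and division and state roll-ups are computed from that single source instead of being re-keyed and re-compiled by hand at every level.

```python
from collections import defaultdict

# Hypothetical single-entry records; the school names and numbers are invented.
records = [
    {"division": "Wise County", "school": "Elementary A", "tested": 80, "proficient": 52},
    {"division": "Wise County", "school": "Elementary B", "tested": 65, "proficient": 41},
    {"division": "Henrico County", "school": "Elementary C", "tested": 120, "proficient": 90},
]

# Division and state reports are derived automatically from the same records
# the schools entered once; no separate manual compilation at each level.
division_totals = defaultdict(lambda: {"tested": 0, "proficient": 0})
for r in records:
    division_totals[r["division"]]["tested"] += r["tested"]
    division_totals[r["division"]]["proficient"] += r["proficient"]

for division, t in division_totals.items():
    print(f'{division}: {100 * t["proficient"] / t["tested"]:.1f}% proficient')

state_tested = sum(r["tested"] for r in records)
state_proficient = sum(r["proficient"] for r in records)
print(f"Statewide: {100 * state_proficient / state_tested:.1f}% proficient")
```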

What do we do? Virginia has 132 school districts that want (or, in the smaller districts, maybe not) to get their hands on the data from their district in order to massage them.

Do it. Not rocket science, but that approach will work.

These foundational principles will ensure the reports generated are more accurate and complete than they are now. And they will be generated with a huge reduction in work hours at every level.

How do we accomplish that?

VDOE will need to hire a contractor to develop and run the architecture at this scale. The Beltway is thick with them.

That contractor can document the enterprise architecture at every level, then test and continuously improve it in a pilot.

More complete, more accurate, faster and cheaper data with far less human work are pretty good goals for data gathered at the scale of over 2,100 public schools in Virginia.

Eliminating programs that don’t make the cut becomes possible because decision makers, having defined measurable objectives, will be able to rely on the data to assess programs against them.
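As a worked illustration of such a cut line, with invented program names, scores and threshold: once the objective is measurable and the data are trustworthy, the assessment itself is mechanical.

```python
# Hypothetical cut line: keep a program only if it moves proficiency by at
# least 2 percentage points. Programs, scores and threshold are invented.
CUT_LINE = 2.0

programs = {
    "Early reading tutoring": {"baseline": 61.0, "current": 66.5},
    "New initiative X": {"baseline": 62.0, "current": 62.4},
}

for name, scores in programs.items():
    gain = scores["current"] - scores["baseline"]
    decision = "keep" if gain >= CUT_LINE else "table or eliminate"
    print(f"{name}: gain {gain:+.1f} points -> {decision}")
```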

Laws? This may take a law change. If so, change it.

Money? It will be expensive to set up and run. But, translating staff time into costs, it will be less expensive than what we do now.

Ask the federal Department of Education, an end user for much of the data, to fund it as a demonstration project for the rest of the states. I think they would leap to do it.

Results. Done properly, this approach can greatly reduce the data entry requirements for schools and divisions, eliminate manual report compilation by schools and divisions in most instances, and improve the quality of reports.

I don’t know who will oppose it, but some group will.

Ask them why.
