Virginia Reading Test Scores Plunge

by James A. Bacon

Reading scores of Virginia students taking the National Assessment of Educational Progress (NAEP), a national standardized test, plummeted this year, and math scores declined as well. The average reading scores of Virginia fourth- and eighth-grade students on the national tests fell by four and six points, respectively. Average math scores fell by two points for both grades.

In releasing the results, Superintendent of Public Instruction James Lane acknowledged that Virginia has a big problem. “The latest NAEP results — coupled with the declines we have seen during the last several years on our state reading tests — underscore the importance of the efforts already underway at the state and local levels to strengthen reading instruction for all students,” he said.

He implied that factors other than the Progressive “racial equity” policies he has championed are to blame. Rather, he said, “We must … recognize that Virginia’s schools are enrolling increasing numbers of students whose learning is impacted by poverty and trauma.”

The solution: $950 million for “equitable supports and services for all of the students who need them.”

The NAEP tests, based upon a representative sampling of approximately 4,600 4th graders and 4,300 8th graders, are used to compare the average educational performance of the 50 states and the District of Columbia. Virginia students have traditionally scored above the national average, and they did so again in 2019. Virginia is one of 17 states that saw declines in fourth-grade reading performance and one of 31 states that saw declines in eighth-grade reading.

NAEP defines “proficiency” on the tests as “solid academic performance … over challenging subject matter.” By this measure, Virginia 4th-grade scores, which had been trending higher since 1992 before plateauing in 2013, also took a sharp dive, as can be seen in the graphs shown here.

Earlier this month the state Board of Education proposed new public schools Standards of Quality (SOQs) that include increased funding for reading specialists and the creation of an “equity fund” that would provide an additional $131.9 million in state support for schools serving significant numbers of children in poverty. The fund would pay to recruit and retain experienced and effective teachers and other professional staff in high-poverty schools and provide additional intervention and remediation services for students.

Bacon’s bottom line: Declining NAEP scores over the past two years do little to inspire confidence in the Northam administration’s diagnosis of what ails Virginia’s educational system and how to improve it. But Lane has gamely used the failure as justification for entrusting the system with another $950 million in tax dollars.

Lane blames the “increasing numbers of students whose learning is impacted by poverty and trauma.” While it is true that Virginia public school students come disproportionately from economically disadvantaged backgrounds, Lane presented no evidence of a surge in the number of poor kids or “disabled” kids suffering from poverty-related trauma taking the NAEP exams between 2017 and 2019.

What does the data say?

According to Virginia Department of Education data, the percentage of Virginia public school students classified as “disadvantaged” taking the English Standards of Learning exams did tick up slightly between the 2016-17 school year and 2018-19: from 39.3% to 41.7%, continuing a long-term trend. However, it is also true that if we focus just on disadvantaged students, the SOL English pass rate declined during that same period from 66.7% to 64.7%. In other words, the problem is bigger than the fact that there are more disadvantaged students.
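A rough back-of-envelope decomposition illustrates why the composition shift alone cannot explain the decline. The disadvantaged share and disadvantaged pass rates below are the VDOE figures cited above; the non-disadvantaged pass rate is a hypothetical placeholder (VDOE does not supply it here), chosen only to make the arithmetic concrete:

```python
# Decompose the overall SOL pass-rate change into a composition effect
# (more disadvantaged test-takers) and a within-group effect (disadvantaged
# students passing at a lower rate). Disadvantaged figures are from VDOE;
# the non-disadvantaged rate is a HYPOTHETICAL placeholder for illustration.

share_2017, share_2019 = 0.393, 0.417   # disadvantaged share of test-takers (VDOE)
dis_2017, dis_2019 = 0.667, 0.647       # disadvantaged pass rate (VDOE)
adv = 0.85                              # assumed non-disadvantaged pass rate (hypothetical)

def overall(share, dis_rate, adv_rate):
    """Blended pass rate for a given mix of students."""
    return share * dis_rate + (1 - share) * adv_rate

total_change = overall(share_2019, dis_2019, adv) - overall(share_2017, dis_2017, adv)
# Counterfactual: the mix shifts, but the disadvantaged pass rate holds steady.
composition_effect = overall(share_2019, dis_2017, adv) - overall(share_2017, dis_2017, adv)
within_group_effect = total_change - composition_effect

print(f"total change:        {total_change:+.2%}")
print(f"composition effect:  {composition_effect:+.2%}")
print(f"within-group effect: {within_group_effect:+.2%}")
```

Under this (illustrative) assumption, the within-group decline accounts for the larger share of the drop, which is the point of the paragraph above: even with an unchanged student mix, pass rates among disadvantaged students fell.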

Regarding poverty-related trauma, Lane was likely referring to the commonly accepted view that traumatic events in childhood can cause emotional disturbances later in life that affect behavior and learning. VDOE’s Special Education Child Count data set indicates that the number of children in Virginia public schools classified as “disabled” (which encompasses those with trauma-related disabilities as a sub-set) actually declined slightly over the 10-year period between 2006 and 2016 (the most recent year for which published data is available): from 173,000 students to 169,000 students.

More recent data found in the VDOE’s “Test Results Build-a-Table” database suggest that the percentage of disabled students who took the English SOLs increased slightly, from 15.3% of the test-taking population in 2016-17 to 15.8% in 2018-19. I question, however, whether these modest changes explain the nose-dive in reading test scores. At the very least, the link between poverty/trauma and plunging test scores over a two-year period is less than self-evident.

I find it interesting that Lane did not take credit for one nugget of positive news in the otherwise dismal results: The gap in black-white NAEP scores shrank slightly between 2017 and 2019 — from 27 points to 24 points. Normally, that would be considered good news. A possible reason for omitting that positive datum: It would undercut the Northam administration’s case that the reading problem is concentrated in school districts with large populations of poor students and requires remedies focused on those districts.

Whatever the reasons for the sharp drop in scores, it is clear that Virginia’s educational system has taken a wrong turn. Virginians need to take a deep, data-driven look at what has gone wrong, and we need to be willing to revise or even abandon worn-out nostrums in light of what works and what doesn’t.


7 responses to “Virginia Reading Test Scores Plunge”

  1. NAEP has always been a curious thing to me because the “representative sample” is not the same each time and I have no idea how they determine what schools to test from year to year.

    This is not only about the economically disadvantaged – it’s about “average” kids also. In the quest to get their kids on a college track, most parents do NOT want their kids to get “bad” grades… they want their kids to have a good QCA, so they steer away from the tougher courses and tougher teachers.

    All except the Asians….

    The public schools pretty much give the parents what they want – it’s the easier path.

    What would be very COOL would be for the non-public schools to give their own kids the equivalent of the NAEP and release those scores to the public so we can decide if alternatives to public schools are better.

  2. This is about too much time with digital screens and too little time with books, magazines and other printed matter. That problem is besetting education in all settings. Reading scores are down when reading itself is down. The single best thing about my early education was living overseas with no TV. NAEP is the best we’ve got, the one test that will give you apples-to-apples comparison results state by state.

  3. Here is an important question on this NAEP reading Proficiency Chart, which shows that ONLY 33% OF VIRGINIA’S EIGHTH GRADERS ARE ABLE TO READ AT OR ABOVE THE NATIONAL 8TH GRADE LEVEL.

    If that is true, then why should we believe it’s true that 12th-grade kids in Virginia pass Virginia’s Standards of Learning (SOL) at far more than double the NAEP rate (at 8th grade), namely:

    “Reading: 78% pass rate, down 2 percentage points from the previous year.
    Writing: 76% pass rate, down 2 percentage points.”


    In fact, what normally happens after the 8th grade is that disadvantaged kids and other poor learners (whether advantaged or not) fall even further behind their grade-level achievement. This happens for well-known reasons. Thus the majority of American kids are nowhere even close to “college ready” after they “graduate” from 12th grade, assuming they did not drop out of schooling altogether before then.

    In short, what do 12th grade NAEP proficiency charts tell us about Virginia students who graduate? And how do those figures compare to Virginia’s own SOL charts, and what do the latter have to do with telling us about College readiness? Can we believe them? If so, why?

  4. “In 2015, thirty-seven percent of twelfth-grade students nationally performed at or above the Proficient achievement level in reading, according to NAEP test results.

    These test results include the following percentage breakdowns for students whose parents had varying educational levels:

    18% pass rate for students whose parents did not finish high school.

    24% pass rate for students whose parents did finish high school.

    36% pass rate for those whose parents had some education after high school.


    What a remarkable record of gross failure. No wonder most kids learn nothing in college. Now, if we compare Virginia students’ proficiency rates in 12th grade to their grade level, we will see how honest or dishonest Virginia’s SOL testing is. Good luck finding it.

    Now, too, we know why 12th-grade NAEP testing results are so hard to find, and often are not published at all, including since 2015.


    • Given these NAEP test results, the key question is: can Virginians trust the reported results of the Virginia Department of Education’s SOL tests? The answer is a resounding NO. These tests cannot be trusted. Why? Because they are grossly inflated.

      For example, compare the following SOL results with NAEP results reported in Jim Bacon’s August 13, 2019 post entitled “Latest SOLS: More Declines in Reading, Writing”:

      “Here are the top-line results for the state:

      Reading: 78% pass rate, down 2 percentage points from the previous year.
      Writing: 76% pass rate, down 2 percentage points.
      Math: 82% pass rate, up 5 percentage points.
      Science: 81% pass rate, unchanged
      History/social science: 80% pass, down 4 percentage points

      Asians, as usual, out-performed all other racial/ethnic groups, followed by whites, Hispanics, and blacks. Despite a heavy emphasis by the Northam administration to address racial inequities in schools, the black-white achievement gap grew wider last year in reading and writing, while remaining the same for science.

      VDOE instituted two main changes to its testing. First, it reduced the number of tests high school students must pass to graduate. Under the revised regulations, explains the VDOE press release, “students who meet the testing requirement in a content area do not have to take another test in the subject unless additional testing is required for the school to comply with federal testing requirements. Previously, high school students continued to take end-of-course tests even if they had already earned the credits in the content area necessary to graduate.”

      “The reduction in high school testing is most apparent in history where there is not a federal requirement that students take at least one test in the subject in high school,” VDOE spokesman Charles Pyle told Bacon’s Rebellion. The Every Student Succeeds Act “requires that students take at least one test in reading, math and science during high school.”

      Second, VDOE introduced new tests and standards for math. Some educators have expressed concern that the math standards were watered down. (See “Did the State Reduce the Rigor of Math SOLs?”)

      School Superintendent James Lane said VDOE staff will collaborate with school divisions to address the achievement gaps in reading, especially in the elementary grades. VDOE will work with schools and divisions that did not see declines in reading performance in order to identify best practices and successful strategies for improving reading skills. The effort will include a review of the effectiveness of interventions to assist young readers not reading at grade level.

      “School divisions must ensure that all children receive research-based reading instruction — beginning in kindergarten — that addresses their specific needs, and that students are reading at grade level by the end of the third grade,” Lane said. “This includes making sure that students read a variety of challenging content, including non-fiction and literature that expands vistas and vocabularies. We must meet students where they are, but we must also move them to where they need to be: reading at grade level or above and ready for success in the 21st century.”

      Obviously, VA’s SOL numbers are bogus. They inflate real test results by a factor of two. Surely this is an effort to buttress repeated claims within VA’s educational cartel that some 70% of Virginia’s high school graduates are “college ready” when only some 37% could possibly meet that test even under NAEP’s watered-down definition of “college ready.”

      Simply put, the Virginia Department of Education’s SOL test results grossly mislead parents, students, and the public paying the bills that support a failing system.

      • Yup. Ever since Virginia set higher standards for its SOLs four years ago, leading to plummeting test scores, the educational establishment has been loosening the standards piece by piece. It is very difficult to make year-to-year comparisons in SOL scores for the purpose of tracking trends because no two years are directly comparable. Without the NAEP, we wouldn’t know what was happening.

    • In his 2016 book Why Knowledge Matters: Rescuing Our Children From Failed Educational Theories, published by Harvard Education Press, eminent UVA Professor E. D. Hirsch Jr. explains in great detail, and with authority, why these state standards and tests are so inherently unfair to the students taking them. They retard and hobble students learning how to read; in practice that flaw often fatally undermines students’ ability to read well with good comprehension. At the same time, the tests hobble students’ ability to acquire the knowledge and vocabulary that are the one critical and essential key to educating students of all grades, and to allowing them to advance through the full education cycle their talents would otherwise allow.

      This extract is taken from pages 22 to 26 of Professor E. D. Hirsch Jr.’s book:

      “The better reading tests are technically reliable and valid. The Gates-MacGinitie reading test, the Iowa Test of Basic Skills, and the National Assessment of Educational Progress (NAEP) are good measures of average reading abilities of large groups of students. … All these well-calibrated tests are probing the average level of a person’s reading fluency and vocabulary size. Such well-established tests are the means by which we confidently know about the average reading abilities of our students. That is how we know that our nine- and thirteen-year-olds have improved and our seventeen-year-olds have not.

      But these tests as currently used by the schools have hindered, not raised, mature reading skills. That is to say, they are technically valid but educationally invalid … if a test actually hurts education, then it is ultimately invalid for schooling. Current reading tests, by giving the misleading appearance that they are testing generalized how-to skills that don’t exist, cause schools to engage in self-defeating practices. They are consequentially invalid.

      The defect lies less in the tests themselves than in the scientific shortcomings of the empty state standards on which they are based. The standards have misled the schools regarding the nature of reading skill. By focusing on main-idea finding, the standards promote the myth that there is a generalized main-idea-finding skill which, if practiced and developed, will enhance reading ability. But if that were true of mature reading ability, the current generation of students would be performing better on the tests, for they all have been well schooled in main-idea finding.

      The test questions about main idea and inference making imply the misleading message that they are probing all-purpose strategies and skills of predicting, summarizing, and “inferencing.” But they are doing no such thing. The tests are probing knowledge and vocabulary. … Once decoding has been mastered and fluency attained, relevant knowledge becomes the chief component of reading skill. Every cognitive scientist … [knows that]. … The test makers are implying a lie. By the form of their questions they suggest that they are probing formal skills. But, no matter how well trained students become in main-idea finding, the student with the smaller relevant vocabulary and knowledge is the one who will fare worse on the test. [An] obsessive focus on the invalid tests narrows … schooling, pushing out [knowledge-building subjects such as history], social studies, science and the arts, … [while] placing unfruitful stress on the [students]. …

      Defective Reading Standards

      State tests in math have been based on specific content standards, but the situation is vastly different in reading, where test makers in their own defense can rightly say: ‘How is it possible to create a test that encourages the imparting of concrete knowledge when the standards on which such tests must be based are content-free and encourage the teaching and testing of general skills?’

      The makers of standards have decided that while it is politically feasible to create a definite content guide in math, fierce controversy would follow if they created a definite content guide in reading. So, American makers of standards have felt themselves forced into content cop-outs in reading. … I’ll illustrate this with a few examples comparing the two standards. Here are some current Texas math standards… Here are some Texas reading standards … My own state of Virginia is more forthright about the inherent repetitiousness and emptiness of its reading standards:

      Grade 2: “Identify the main idea.”
      Grade 3: “Identify the main idea.”
      Grade 4: “Identify the main idea.”
      Grade 5: “Identify the main idea.”
      Grade 6: “Identify the main idea.”
      Grade 7: “Identify the main idea.”
      Grade 8: “Identify the main idea.”

      … Such standards are not just empty; they are deeply flawed. The notion that skill in finding the main idea can take the place of content is worse than empty, it’s actively misleading. There is NO reliable main-idea-finding skill. If readers understand a passage, they will reliably answer the test question about the main idea. If they don’t understand the passage, they won’t. Moreover, in good complex writing there isn’t a single main idea. What is the main idea of the Pledge of Allegiance? Aren’t there at least three? These empty standards were created out of political expediency. The makers of standards and tests have built up an artificial construct, politically painless for the makers of the standards and of the tests, but based on a faulty and unproductive picture of reading comprehension …” END QUOTE.
