Will the nation’s “gold standard” test turn to dross?

As the Huffington Post just reported, the National Assessment of Educational Progress (NAEP), which often is billed as the “gold standard” for education testing, is making a major transition from its former paper-and-pencil testing format to a new, digital administration. However, the NAEP’s transition to digital testing has many thoughtful observers holding their breath.

Not long ago, the Partnership for Assessment of Readiness for College and Careers (PARCC), which is one of the two major Common Core State Standards testing consortia, announced that results from the same tests were different for students who used paper and pencil test forms versus those who completed the PARCC assessment online. Students who completed PARCC online scored lower, raising some serious questions about whether online testing really is as transparent as some would suggest.

The NAEP could easily be heading into similar, clouded waters.

I am told that the NAEP will use a specific type of digital tablet operating in a special, closed network in each school for its testing, which begins in early 2017. However, it seems likely that students who already have experience with the specific tablets to be used will have an advantage over students who are unfamiliar with these devices and lack practice in their operation. Just think about the adjustment period most of us have experienced after sitting down with a new computer, tablet, or smartphone. How would you like to go through such an adjustment under testing conditions?

When it comes to digital testing, the differences from paper-and-pencil results can be considerable.

For example, Education Week reports that in Rhode Island, “42.5 percent of the students who took the PARCC English/language arts exam on paper scored proficient, compared with 34 percent of those who took the test by computer.”

An official from that state cited varying degrees of student readiness for technology as a major reason for the differences.

Getting closer to the heart of things, a 2001 study based on the NAEP itself concluded that administering the assessment digitally made the NAEP test much harder. This study also showed that specific familiarity with the technology used was an issue, as well.

A more recent, 2014 study involving digital administration of a NAEP Writing Assessment also found that students with more digital savvy did notably better.

I am told that the NAEP will run some backup pencil-and-paper testing in 2017 in hopes of working around any problems that show up. But the quality and quantity of that backup testing could become issues in their own right. After all, as the Huffington Post article points out, digital testing allows new approaches that simply are not available with paper-based tests. So even companion paper-and-pencil testing might not be able to recover good trend lines if the majority of the NAEP testing effort in 2017 goes awry.

Thus, as we approach the opening of the 2017 NAEP math and reading testing window, a lot of people, myself included, are holding their breath to see if the nation’s “gold standard” turns to dross. And it could happen right at the time when the Common Core State Standards effort is maturing and the nation badly needs solid answers about how well this controversial set of education standards is actually working.