According to the state comptroller, both Questar Assessment Inc. and the Tennessee Department of Education (TDOE) share the blame for the massive failure of the 2018 state assessment tests given in April, largely due to a lack of internal and external communication and a lack of proper monitoring procedures.

This past April, the TNReady online testing process for Tennessee students experienced an unparalleled failure with myriad problems.

According to Tullahoma City Schools Director of Curriculum Susan Fanning, tests were given to the wrong students, some students’ work was not saved before the system crashed, and more.

“Last year, with the online piece, it crashed,” she said. “It shut down. It froze. Tests got lost. Wrong tests were given to students when they would log in.”

The report, which was released on Wednesday, also gives recommendations on how all parties involved in administering successful student assessments can improve future testing.

“At minimum,” the report states, those involved must “improve the implementation of online testing platforms, the procurement of vendor services and internal and external communication between all parties.”


Findings

Included in the comptroller’s findings were numerous failures on the part of the state department of education, including not “adequately” monitoring system changes from previous vendors, an insufficient “annual work plan” with Questar that made it “less effective for contract management,” and a lack of overall checks on the vendor that would have ensured testing went smoothly.


Internal and external communication

One of the main problems the audit found was a lack of proper communication between all parties involved with the online testing.

According to the report, there were four primary departmental groups tasked with the administration of the TNReady tests. The four groups were Content, Assessment and Design; Assessment Logistics; Information Technology; and Psychometrician.

These internal groups did not check in with one another, which contributed to the systemwide failures of the testing platform, according to the report, because any one group’s decisions could have an impact on one or more other groups.

For example, Questar made a change to the online testing platform involving a text-to-speech accommodation, which was a contributing factor in the testing issues this spring. Department officials were not notified of the change, nor was it “communicated … (to) top leadership or relative groups that needed to know.”

“This omission highlighted our concern about the lack of proper internal communication among department staff, key department groups and top management,” the report states.

Additionally, the audit states the four key groups’ lack of sufficient communication and different monitoring methodologies made it difficult to gather “all relevant information needed to evaluate the project as a whole.”

External communication was also poor, according to the report, as the department’s expectations for platform changes on Questar’s end were never clearly communicated.

Again referencing the change to the text-to-speech component, the report states “Questar should have formally communicated this change to top department management,” though it failed to do so.

Overall, the report admonishes the department for conducting contract communications through phone calls and “emails that the department failed to retain.”

“This lack of documentation impacted our ability to fully understand what occurred, when it occurred and how it ultimately affected the testing experience of students and school personnel,” the report states.

While phone conversations clearly need to happen, the report adds, there should be a process in the future to fully document them in order to “adequately capture all key contract decisions made.”

Documentation will be vital moving forward, according to the report.

“Without a formal written communication process … the department cannot ensure transparency, nor can management ensure that critical decisions … have been properly vetted and approved,” it states. “The department’s ability to achieve accountability and transparency is vital for maintaining the trust of all stakeholders.”


Implementation concerns

When it comes to the “implementation of online testing” for student assessments, the report states “the department’s push to implement online tests may have been overly ambitious.”

While TDOE is required to administer student assessments each year, it is not required to use online testing procedures. Given the state’s consistent problems with online testing, the report states the department cornered itself into entering contracts with companies without taking “adequate time to respond to and resolve potential issues with assessment vendors,” which led to widespread failures of the system.

The example highlighted was the 2015-2016 school year testing vendor, Measurement Inc. Online testing also experienced a multitude of failures during that school year, prompting TDOE to abruptly cancel its contract with the vendor and “enter into an emergency procurement option” to get a new online testing vendor for the next school year.

“We have concerns that the department has proceeded with large-scale procurements involving millions of dollars under intense time constraints,” the report reads.

Given the concerns and consistent failures of the online testing portions, the auditor believes the department “should reconsider the timing of full implementation for online assessments.”

The comptroller’s office put out a survey to teachers and administrators about this past year’s TNReady assessments, asking them for their thoughts on the whole process.

Overall, the survey responses indicated, the more problems unfolded and the longer they continued, the more frustrated teachers and students became.

“Some respondents believed that online testing problems could result in students not performing to the best of their abilities on assessments, especially if they lack confidence in the process,” the report reads.

One comment from a teacher supplied in the report backs up that concern.

“After the first [two] days of nothing but problems, many students gave up,” the comment reads. “When they took subsequent parts of the tests, they flew through them, clicking any answer at all simply to get the test finished before it crashed again. Students were so discouraged by the entire process that [they] gave up trying.”


Procurement

The audit raised two concerns about the negotiation of vendor contracts.

First, the original request for proposals (RFP) was too “broad in scope.” Because the original RFP encompassed “all aspects of assessment” – including the test administration, scoring, reporting, analysis and test development – the audit claims the “breadth of the RFP” may have limited the number of vendors that could have bid.

Only five proposals were received from the original RFP, including Measurement Inc. and Questar. Other bidders were CTB/McGraw Hill, Vantage Labs (USA) LLC and NCS Pearson Inc.

Second, the audit was concerned with how quickly the department entered into a contract with Questar after Measurement Inc. failed in the 2015-2016 school year.

Because the department wanted to quickly sign a new contract in order to have online testing ready for the next school year, it chose Questar, in part, “because its proposal was the next highest score from the original RFP process.” Questar was also “one of the only vendors that bid on the original RFP that could deliver the assessment on such a short timeframe,” according to Commissioner of Education Candice McQueen, the report states.

Rather than immediately contracting with Questar, “the department could have considered the option of issuing a new RFP for online assessments,” according to the report, and signed an emergency procurement agreement only for paper testing in the interim.

“In addition to directly addressing the issues the department had with Measurement Inc., a new RFP event would have given new vendors the opportunity to respond,” the report states.


Other findings

In addition to the implementation, procurement and communication problems, the audit also found that Questar “failed to sufficiently staff customer support, resulting in lengthy call wait times and high rates of abandoned calls for districts seeking assistance.”

According to the report, customer service representatives at Questar’s “tactical operations center” were available between the hours of 7 a.m. and 4:30 p.m. Central time. Questar’s Chief Operating Officer explained that Questar had eight full-time/seasonal staff members and 12 temporary staff members dedicated to the Tennessee testing window.

However, during the first four days of testing, between April 16 and April 19, wait times for technical support calls were nearly an hour. With such long waits, the number of abandoned calls ranged from 87 on April 19 to 206 on April 17.

The TNReady survey echoed these claims, with multiple comments about the technical support line being “ridiculous.”

“Once we were testing, wait times to ask questions were from 15 minutes upwards to 55 minutes,” one comment reads. “That is a long time for a student to sit waiting to resume.”

The audit’s recommendation to TDOE includes working with Questar and future vendors on establishing “clear benchmarks for acceptable customer support experiences.”

The benchmarks need to address both call wait times and problem resolution times. Call centers should also be “appropriately staffed” in order to keep wait times low.

The report also faults Questar for failing to “adequately track and inform districts about the progress of test recoveries, leaving districts in the dark about whether students had completed their tests.”

When students experienced testing failures, teachers had little to no information as to whether the students’ answers had been submitted properly.

According to one survey comment from a teacher, “I don’t know if students who were kicked out of the system were able to submit their tests, because there was no way to check that.”

According to initial communication from Questar, student test data was supposed to take between 24 and 48 hours to become visible in the testing platform; however, that window grew to 72 hours “due to the volume of requests received.” Even then, neither the department nor Questar was able to give districts the information they needed.

The audit recommends that, in the future, adequate test recovery systems be stipulated in the contract. There should also be a procedure in place for test vendors to retrieve testing information should any problems occur.

The full audit can be found on the comptroller’s website, www.comptroller.tn.gov/sa.

Erin McCullough may be reached at emccullough@tullahomanews.com.