ATP/CCSSO Publication Cited in GAO Report
The Operational Best Practices for Statewide Large-Scale Assessment Programs served as the basis for a survey on cheating and test irregularities in state assessments.
The U.S. Government Accountability Office (GAO) this week released the results of a 50-state survey focusing on the oversight and resources in place at the state level to address testing irregularities, security breaches, and cheating by test takers and school officials. According to the report, in the past two years 40 states have detected potential cheating on standardized exams, with 32 of those states confirming at least one instance of cheating, and 32 states canceling or invalidating scores from individual test takers, schools, or districts.
“To address our objectives, we designed and administered a web-based survey of testing administrators in the 50 states and the District of Columbia. We conducted our survey from November 2012 to January 2013, and received a response rate of 100 percent. We relied on the Operational Best Practices for Statewide Large-Scale Assessment Programs guide as a basis to design the survey,” Linda Calbom, GAO Western Regional Director, reported in describing the scope and methodology of the study.
In summary, the survey indicated that although all states reported having policies and procedures that reflected leading practices to prevent testing irregularities, states varied in which categories of leading practices they actually used. “For example,” Calbom states in the report, “22 states reported having all of the leading practices for security training, but four states reported having none of the practices in this category.”
She added, “Although state officials reported having a variety of security policies and procedures in place, many reported feeling vulnerable to cheating at some point during the testing process.”
The full GAO report is available on the GAO website in both HTML and PDF formats.
NCES Releases Report on Testing Irregularities
The National Center for Education Statistics released the results of its study of state and local testing irregularities on February 12:
Based on comments submitted in response to the Department of Education’s Request for Information (January 17, 2012) and input from the NCES Symposium held on February 28, 2012, the report represents the Department’s effort to “gather and share information about practices and policies that have been used to prevent, detect and respond to irregularities” in K-12 statewide assessments. The investigation began in response to the teacher/administrator cheating scandals in Atlanta, Washington, DC, and other cities and districts around the country.
Although ATP submitted a lengthy response to the RFI, and several ATP members (College Board, Pearson, Caveon) participated in the Symposium, as did CCSSO, the report fails to mention or cite the CCSSO/ATP Operational Best Practices for Statewide Large-Scale Assessment Programs (2010). In fact, the report states that there is no “library of best practices” that could help SEAs and LEAs prevent, detect, and respond to irregularities. In its comments, ATP told the Department that the Operational Best Practices addresses test security in a myriad of situations that arise between a state and its service provider. ATP also noted, however, that the document is not aimed directly at the local agencies and districts where these cheating problems arise — the entities that must implement practices and policies adopted by their individual states and deal with disciplinary issues created by their employees (e.g., teachers, principals, district testing officials).
The report’s only recognition of ATP is its adoption of ATP’s definition of “technology-based assessments.” ATP pointed out that it is not accurate to call all non-paper-based assessments “online,” since many tests are not actually delivered over the Internet. The concept of “technology-based assessments” is a primary focus of the upcoming revision of the Operational Best Practices, expected to be released this summer. Indeed, ATP and CCSSO announced this week the opening of a public comment period on the new version of the document; the public comment period runs through April 6, 2013 (see related story on the ATP Homepage).