Three Procedures to Ensure Accurate Results on Standardized Assessments
By Caroline Fahmy
State departments of education and testing vendors work diligently to ensure that results from standardized state assessments are accurate. After all, the purpose of assessing students is to act on the results, whether for accountability or for student learning, and relying on data that is incomplete or invalid undermines any action taken.
Assessment programs have strict standardized procedures in place to check data and to help district and school site testing coordinators manage the implementation of the assessment process. Over many years of processing large-scale student assessments, we have found three procedures that are especially helpful in ensuring that test results are more accurate.
- Train district and school personnel.
One measure districts can take to make an assessment program successful is to make sure the correct people have the necessary training. Each assessment program will have specific procedures for administering tests, sharing required data files, and, if the test is paper-based, handling and shipping test materials. Knowing the details of these procedures before the assessment process begins is crucial for success.
Before testing begins, districts can make sure that the test vendor has accurate contact information for the district assessment coordinator. This helps to ensure that the right person will be receiving procedural information and updates, and that there will be no time delays due to information getting lost or directed to the wrong person.
It’s also important for district and school assessment coordinators to take advantage of state- and vendor-provided training. Sometimes this will involve in-person training sessions, but online tutorials may offer another training option as well. Even ongoing assessment programs can have changes in procedure from year to year, so training remains important even for veterans of the process. For example, boxes of paper tests that are waiting to be scored could sit for a long time if no one at the district realizes that shipping procedures have changed.
- Provide valid student data.
Districts can improve the accuracy of their assessment results by providing valid and complete student-level data to the test vendor. Duplicate, missing, or incorrect student ID numbers, for example, can cause scores to be matched with the wrong students, and no one wants to discover that after testing. Missing student demographic information (such as gender, ethnicity, or primary language) makes it difficult to report and evaluate test results for subgroups.
Assessment vendors run many checks to ensure that the student-level data is valid and complete. Often, vendors create software that flags possible errors (such as missing critical data or an invalid student ID) and gives authorized district administrators the opportunity to review and edit the data through an online program. This review step is especially important for improving the validity of student data.
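To make the idea of these automated checks concrete, here is a minimal Python sketch of the kinds of flags such software might raise. The field names, the ten-digit ID rule, and the error messages are hypothetical, not the specification of any actual vendor's system.

```python
def validate_records(records):
    """Flag duplicate or missing student IDs and missing demographic fields.

    Each record is a dict; returns a list of (record_index, problem) pairs
    for a district administrator to review. All rules here are illustrative.
    """
    errors = []
    seen_ids = set()
    required = ("student_id", "gender", "ethnicity", "primary_language")
    for i, rec in enumerate(records):
        # Flag any required field that is absent or blank.
        for field in required:
            if not rec.get(field):
                errors.append((i, f"missing {field}"))
        sid = rec.get("student_id")
        if sid:
            # Hypothetical format rule: IDs are exactly ten digits.
            if not (sid.isdigit() and len(sid) == 10):
                errors.append((i, "invalid student_id format"))
            elif sid in seen_ids:
                errors.append((i, "duplicate student_id"))
            else:
                seen_ids.add(sid)
    return errors
```

Real vendor software would apply many more rules, but the pattern is the same: every record is checked against the published requirements, and each problem is flagged for human review rather than silently dropped.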
- Prevent file-processing errors.
District staff can help to prevent processing errors by preparing data files that exactly meet the published specifications or file record layout(s).
When computers process data files, details are very important. The software expects each data field to follow the rules specified in the file record layout. When the data does not follow those rules, the assessment reports can be inaccurate or delayed.
For example, a leading zero in a data field could get lost in file translation so that the computer misinterprets a county, district, or school code in that data field. As a result, the computer might assign those students to an incorrect school or district. Similarly, if the data columns are switched, the computer might be looking for gender information and actually find a student’s middle initial. The reports for the gender subgroup would be wildly inaccurate, and student-level reports would contain incorrect names.
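Both failure modes above can be shown in a few lines of Python. The record layout and field values here are invented for illustration; the point is that parsing codes as numbers destroys leading zeros, while treating every field as text against a defined layout preserves them.

```python
import csv
import io

# Hypothetical record: school code, student ID, last name, middle initial.
row = "05,0123456789,Smith,J"

# Naive numeric parsing: the leading zero in the school code is lost,
# so codes "05" and "5" become indistinguishable.
as_numbers = [int(f) if f.isdigit() else f for f in row.split(",")]
# as_numbers[0] is now the integer 5, not the code "05"

# Parsing every field as text, mapped against the published layout,
# keeps each code exactly as it appears in the file.
layout = ("school_code", "student_id", "last_name", "middle_initial")
record = dict(zip(layout, next(csv.reader(io.StringIO(row)))))
# record["school_code"] is the string "05", preserved intact
```

The column-switching error is visible in the same sketch: if the `layout` tuple does not match the actual column order of the file, `record["middle_initial"]` would silently hold a last name, which is exactly why files must match the published specification field for field.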
Student information systems (SIS) have simplified many data-gathering and storage tasks, but they do not always export files in ways that meet state assessment specifications. Districts must take steps to ensure that the files they are transferring to the test vendor or the state meet the exact specifications of each assessment program. Although states and vendors often try to keep record layouts the same for ongoing programs, small changes are possible, and, therefore, districts may need to adjust their procedures.
Assessment coordinators should feel comfortable contacting the test vendor for assistance with questions about data file formatting. Sometimes talking with an actual person is the quickest and most effective way to resolve questions that arise before, during, or after test administration. It is much better to act on correct information than on inaccurate assumptions.
Effective implementation of an assessment program comes down to accurate data. Any problems with student data may increase processing time and can compromise the validity of the results and, thus, any interventions or decisions based on them.
For the last 22 years, Caroline Fahmy has served as president and CEO of Educational Data Systems, responsible for the success, growth, and development of the company. For 11 years prior to that, she was vice president, responsible for the company’s finance and accounting departments and directing the statewide assessment division. She has managed many large-scale assessment and research projects, including multiple statewide assessments in Maine, Massachusetts, Oregon, and California, as well as the research and development of new software products and services.