Passing the Test of Time

College of Education researcher Yang Liu is studying what response times reveal about test and survey takers' behavior. (Photo by John T. Consoli)

Generations of test takers have scratched their heads, chewed their pencils and finally, hesitantly taken a stab and hoped for the best. Or, having done all their homework, quickly and confidently moved from question to question. In either case, all that mattered was the answer—did they get it right or wrong?

But thanks to modern technology, how test takers and survey respondents answer questions—specifically, how long they take—has become a new area of research, one that may help identify unmotivated survey respondents whose answers are unlikely to be valid, or even suggest new tactics for catching test cheaters.

As assessments and surveys have largely shifted from pen and paper to computerized forms, evaluators have gained the ability to analyze individuals’ response times to questions.

“When students take a test, it’s not just the student’s ability that matters. It’s not just that one single factor that’s going to affect test taking behavior,” said Yang Liu, an assistant professor in the University of Maryland College of Education. “Analyzing response times helps us to better understand what students are doing when they are taking the test—such as whether they are affected by fatigue or randomly guessing.”

With support from a $185,314 grant from the National Science Foundation, Liu is developing a more flexible statistical model that incorporates these digitally recorded response times into an analysis of individual performances.
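
Liu's model is still in development and the article doesn't spell out its form, but a common starting point in the psychometric literature is the lognormal response-time model (van der Linden, 2006), in which a person's log response time on an item is normally distributed around the item's "time intensity" minus the person's "speed." The Python sketch below simulates data from that standard model purely for illustration; every variable name and numeric value here is an assumption for demonstration, not a detail of the NSF-funded project.

    import numpy as np

    rng = np.random.default_rng(0)

    # Lognormal response-time model (illustrative, after van der Linden, 2006):
    #   log T_ij ~ Normal(beta_j - tau_i, sigma_j^2)
    #   tau_i  = person i's working speed (higher means faster)
    #   beta_j = item j's time intensity (higher means a slower item)
    n_persons, n_items = 500, 20
    tau = rng.normal(0.0, 0.3, size=n_persons)          # person speed
    beta = rng.normal(4.0, 0.5, size=n_items)           # time intensity, in log seconds
    sigma = np.full(n_items, 0.4)                       # item residual spread

    log_t = rng.normal(beta[None, :] - tau[:, None], sigma[None, :])
    seconds = np.exp(log_t)                             # simulated response times

    # Crude method-of-moments recovery (exploits E[tau] = 0):
    beta_hat = log_t.mean(axis=0)                       # estimates each beta_j
    tau_hat = (beta_hat[None, :] - log_t).mean(axis=1)  # estimates each tau_i

In a framework like this, each test taker's estimated speed can be analyzed jointly with an item response model of their accuracy—the kind of combined analysis of response times and performance the grant describes.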

Assessing response times to test questions may aid in detection of cheating and other aberrant test-taking behavior, he said.

“If students have pre-knowledge of a question, their behavior will be different from a typical student who hasn’t seen the question before,” Liu said. “The student’s familiarity with the question might not be reflected in their answer, but may be indicated by their response times.”
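
To make that idea concrete: under a lognormal response-time model like the one sketched above, an answer given far faster than the model predicts for that person and that item shows up as a large negative standardized residual, which is one plausible signal of item preknowledge. The toy function below illustrates that flagging logic only; it is not a method from Liu's grant, and the cutoff of -4 and all numbers are arbitrary assumptions.

    import numpy as np

    def flag_fast_responses(log_t, tau_hat, beta_hat, sigma_hat, cutoff=-4.0):
        """Boolean (person, item) mask of responses much faster than expected."""
        expected = beta_hat[None, :] - tau_hat[:, None]  # model-implied mean log RT
        z = (log_t - expected) / sigma_hat[None, :]      # standardized residuals
        return z < cutoff                                # very negative = suspiciously fast

    # Toy check with made-up numbers: plant one anomalously fast answer.
    rng = np.random.default_rng(1)
    log_t = rng.normal(4.0, 0.4, size=(100, 10))
    log_t[0, 0] = 1.0                                    # person 0 answers item 0 in ~3 seconds
    mask = flag_fast_responses(log_t,
                               tau_hat=np.zeros(100),
                               beta_hat=np.full(10, 4.0),
                               sigma_hat=np.full(10, 0.4))
    print(np.argwhere(mask))                             # flags the planted response at (0, 0)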

“Understanding test takers’ behaviors and how their performance reflects their abilities is foundational to developing and evaluating assessments,” said College of Education Dean Jennifer King Rice.

By the end of the two-year NSF grant period, Liu hopes to release an open-source software package with a theoretical model for assessing response times that will be freely available to researchers, testing companies and others. Liu will also work with Patricia Alexander, a professor in the College of Education, to demonstrate the degree to which the proposed model improves the measurement of relational reasoning: the ability, central to human intelligence, to draw logical conclusions about how things relate to each other, such as patterns in test questions.

“Existing models that account for response times are usually developed for specific purposes,” Liu said, pointing out that his software would apply in different contexts, from college or graduate school entrance exams like the SAT or GRE—where fast answers seem to indicate higher performance—to surveys, where quick responses often mean respondents aren’t really paying attention.

“The development of this model is designed to deepen understanding of the role of response time in different cognitive tests, which may benefit the practice of test development,” said Liu.

This article originally appeared in Maryland Today.

November 26, 2018

