An Evaluation of DIF Tests in Multistage Tests for Continuous Covariates

Type of publication Peer-reviewed
Publication form Original article (peer-reviewed)
Authors Debelak Rudolf, Debeer Dries
Project Evaluation of Score-Based Tests in Educational Measurement

Journal Psych
Publisher MDPI AG
Volume (Issue) 3(4)
Page(s) 619 - 639
Title of proceedings Psych
DOI 10.3390/psych3040040

Open Access

Type of Open Access Publisher (Gold Open Access)


Multistage tests are a widely used and efficient test design that aims to provide accurate ability estimates while keeping the test relatively short. They typically rely on the psychometric framework of item response theory. Violations of the item response model and of other assumptions underlying a multistage test, such as differential item functioning (DIF), can lead to inaccurate ability estimates and unfair measurements. There is thus a practical need for methods that detect such model violations. This study compares and evaluates three methods for detecting DIF with regard to continuous person covariates in data from multistage tests: a linear logistic regression test and two adaptations of a recently proposed score-based DIF test. While all three tests show a satisfactory Type I error rate, the score-based tests show greater power against three types of DIF effects.
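To illustrate the general idea behind a logistic regression DIF test for a continuous covariate, the following is a minimal sketch, not the authors' implementation: a no-DIF model (item response predicted from ability alone) is compared against an augmented model that adds the covariate and its interaction with ability, using a likelihood ratio test. All variable names and simulated effect sizes are illustrative; in practice, an observed (rest) score typically stands in for the latent ability.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(1)

def neg_loglik(beta, X, y):
    # Negative Bernoulli log-likelihood with a logit link
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta)) - y @ eta

def fit(X, y):
    # Maximum likelihood fit of a logistic regression
    res = minimize(neg_loglik, np.zeros(X.shape[1]), args=(X, y), method="BFGS")
    return res.x, -res.fun  # estimates, maximized log-likelihood

# Simulated responses to one item with uniform DIF: the item's difficulty
# shifts with a continuous covariate z (e.g. age). Effect sizes are made up.
n = 2000
theta = rng.normal(size=n)          # ability
z = rng.normal(size=n)              # continuous person covariate
eta = theta - (-0.5 + 0.6 * z)      # 2PL-style predictor with DIF in difficulty
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

ones = np.ones(n)
X0 = np.column_stack([ones, theta])                    # no-DIF model
X1 = np.column_stack([ones, theta, z, theta * z])      # + uniform and nonuniform DIF terms
_, ll0 = fit(X0, y)
_, ll1 = fit(X1, y)

lr = 2.0 * (ll1 - ll0)       # likelihood ratio statistic
p = chi2.sf(lr, df=2)        # two added parameters -> 2 degrees of freedom
```

With a DIF effect of this size the test should reject clearly; under no DIF, `p` would be approximately uniform, which is what the Type I error evaluation in the study checks.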