Statistics and Its Interface
Volume 12 (2019)
Number 4
A method of Q-matrix validation based on symmetrised Kullback–Leibler divergence for the DINA model
Pages: 537–547
DOI: https://dx.doi.org/10.4310/SII.2019.v12.n4.a4
Abstract
Q-matrix validation is one of the most vital parts of cognitive diagnosis, since a misspecified Q-matrix may seriously degrade model fit and lead to incorrect classification of examinees. In this paper, we propose a symmetrised Kullback–Leibler divergence (SKLD) based method, combined with K-means clustering, to validate a misspecified Q-matrix. Three simulation studies are conducted to evaluate the sensitivity and specificity of the proposed method compared with methods based on the log odds ratio (LOR) and the item discrimination index (IDI). The results show that the SKLD-based method efficiently identifies and corrects misspecified elements of the Q-matrix while retaining the correct ones. Moreover, two real data sets are employed to further illustrate the performance of the SKLD-based method.
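To make the core quantity concrete, here is a minimal sketch of the symmetrised Kullback–Leibler divergence, SKLD(p, q) = KL(p‖q) + KL(q‖p), applied to the two Bernoulli response distributions that a DINA item induces (one for examinees who master all required attributes, one for the rest). The slip and guessing values below are hypothetical, not taken from the paper, and this sketch is not the authors' full validation procedure (which further combines SKLD with K-means clustering).

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p||q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def skld(p, q):
    """Symmetrised KL divergence: KL(p||q) + KL(q||p)."""
    return kl(p, q) + kl(q, p)

# Under the DINA model, an item j answers correctly with probability
# 1 - s_j for examinees mastering all required attributes and g_j otherwise,
# where s_j is the slip and g_j the guessing parameter (values here are
# illustrative assumptions).
s, g = 0.2, 0.15
masters = [1 - s, s]      # [P(correct), P(incorrect)] for masters
non_masters = [g, 1 - g]  # same for non-masters
print(round(skld(masters, non_masters), 4))  # → 2.0286
```

A large SKLD between the two distributions indicates a highly discriminating item; comparing such divergences across candidate rows of the Q-matrix is what lets misspecified entries stand out.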
Keywords
Q-matrix, misspecification, DINA model, SKLD, IDI, LOR, K-means clustering
This work is supported by the National Natural Science Foundation of China (grant number 11571069).
Received 13 June 2018
Accepted 7 April 2019
Published 18 July 2019