CITE Open Conference Systems, Empowering Communities, Innovating Learning, Learning to Innovate

A Psychometric Analysis of Information Literacy: Insights from the Information Literacy Performance Assessment in Hong Kong
Frank Reichert, James Zhang, Maggie Zhao, Jimmy de la Torre, Nancy Law

Last modified: 2017-05-10


Digital literacy is an important capacity for students’ learning in our rapidly changing world. In Hong Kong, the Information Literacy Performance Assessment (ILPA) addressed the importance of students’ digital literacy early on. However, this rich source of assessment data on digital literacy has remained underexploited. Our analysis therefore examines the ILPA data to acquire new knowledge about the dimensionality of digital literacy and to provide insights that can guide future studies in the field, especially their conceptualization, operationalization, and data collection.

The ILPA assessed primary and secondary school students’ information literacy in several domains using a seven-dimensional theoretical framework. The applicability of that framework, however, has not been tested with the ILPA data; instead, a unidimensional model was applied for practical purposes, without testing for unidimensionality. The present analysis addresses this deficit by examining students’ performance in technical information literacy, exploring its dimensionality across both groups of students, and examining the differences between primary and secondary school students.

Applying unidimensional item response theory (IRT), we first fit a unidimensional model of information literacy. Because this model lacks empirical fit to the data, we subsequently apply exploratory and multidimensional item response theory (MIRT) to examine multidimensional models of information literacy. The dimensionality of information literacy is then determined on the basis of multiple fit indices. The resulting model is further examined in multiple-group analyses that compare its applicability among primary and secondary students, in order to test for invariance in students’ technical information literacy performance. We achieve this through the stepwise application of parameter constraints on means, variances, slopes, and thresholds.
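To illustrate the two classes of models compared here, the probability of a correct response under a unidimensional two-parameter logistic (2PL) model, and its compensatory multidimensional (MIRT) extension in which the slope becomes a vector of loadings, can be sketched as follows. This is a generic sketch for exposition only, not the authors’ estimation code; all function names and parameter values are illustrative.

```python
import numpy as np

def irf_2pl(theta, a, b):
    """Unidimensional 2PL item response function:
    P(correct) = 1 / (1 + exp(-a * (theta - b))),
    where a is the item slope (discrimination) and b the threshold (difficulty)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def irf_mirt(theta_vec, a_vec, d):
    """Compensatory MIRT extension: one slope (loading) per latent dimension,
    combined linearly with the ability vector; d is the item intercept."""
    z = np.dot(a_vec, theta_vec) + d
    return 1.0 / (1.0 + np.exp(-z))

# Under the 2PL model, an average-ability student (theta = 0) answering an
# item of matching difficulty (b = 0) has a 50% success probability.
p_uni = irf_2pl(theta=0.0, a=1.2, b=0.0)

# Under the MIRT model, ability on several dimensions contributes jointly,
# so a strength on one dimension can compensate a weakness on another.
p_multi = irf_mirt(theta_vec=np.array([0.5, -0.5]),
                   a_vec=np.array([1.0, 0.8]), d=0.0)
```

The invariance tests described above amount to constraining the slope (`a`) and threshold (`b`) parameters, as well as the latent means and variances, to be equal across the primary and secondary school groups, and checking whether model fit deteriorates at each step.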
Our analyses carry important implications for future studies on digital literacy for students’ learning in Hong Kong and other countries. On the one hand, the results of the dimensional analysis can guide future studies in deciding which specific aspects (dimensions) of information literacy should be assessed; they also inform instrument development and item calibration, improving the accuracy of digital literacy testing. On the other hand, our multiple-group analyses help researchers disentangle age effects in the development of digital literacy, which is meaningful for researchers who intend to develop age-invariant assessments of digital literacy.