Transformer-Based Deep Neural Language Modeling for Construct-Specific Automatic Item Generation

Björn E. Hommel, Franz-Josef M. Wollang, Veronika Kotova, Hannes Zacher, Stefan C. Schmukle
Published in: Psychometrika (2021)
Algorithmic automatic item generation can be used to obtain large quantities of cognitive items in the domains of knowledge and aptitude testing. However, conventional item models used by template-based automatic item generation techniques are not ideal for the creation of items for non-cognitive constructs. Progress in this area has been made recently by employing long short-term memory recurrent neural networks to produce word sequences that syntactically resemble items typically found in personality questionnaires. To date, such items have been produced unconditionally, without the possibility of selectively targeting personality domains. In this article, we offer a brief synopsis on past developments in natural language processing and explain why the automatic generation of construct-specific items has become attainable only due to recent technological progress. We propose that pre-trained causal transformer models can be fine-tuned to achieve this task using implicit parameterization in conjunction with conditional generation. We demonstrate this method in a tutorial-like fashion and finally compare aspects of validity in human- and machine-authored items using empirical data. Our study finds that approximately two-thirds of the automatically generated items show good psychometric properties (factor loadings above .40) and that one-third even have properties equivalent to established and highly curated human-authored items. Our work thus demonstrates the practical use of deep neural networks for non-cognitive automatic item generation.
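The conditional-generation approach described in the abstract can be illustrated with a minimal sketch: each training item is prefixed with a control token naming its target construct, so that a causal language model fine-tuned on such sequences can later be prompted with the token alone to produce items for a requested personality domain. The construct labels, items, and helper function below are illustrative placeholders, not the authors' actual training data or code.

```python
# Sketch: formatting personality items for conditional fine-tuning
# of a causal transformer. A control token (e.g. "<extraversion>")
# is prepended to each item; prompting the fine-tuned model with
# that token alone then requests construct-specific items.

def make_training_sequence(construct: str, item: str,
                           eos_token: str = "<|endoftext|>") -> str:
    """Prepend a construct control token to an item and append EOS."""
    return f"<{construct}> {item} {eos_token}"

# Hypothetical labeled items for demonstration:
corpus = [
    ("extraversion", "I am the life of the party."),
    ("neuroticism", "I worry about things."),
]

sequences = [make_training_sequence(c, i) for c, i in corpus]
for s in sequences:
    print(s)
```

At generation time, the fine-tuned model would be conditioned on a construct token such as `<extraversion>` and decoded until the end-of-sequence token, yielding candidate items for that domain.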
Keyphrases
  • neural network
  • psychometric properties
  • deep learning
  • machine learning
  • big data
  • working memory