Validation of two measures for assessing English vocabulary knowledge on web-based testing platforms: brief assessments.

Lee Drown, Nikole Giovannone, David B. Pisoni, Rachel M. Theodore
Published in: Linguistics Vanguard: A Multimodal Journal for the Language Sciences (2023)
Two measures for assessing English vocabulary knowledge, the Vocabulary Size Test (VST) and the Word Familiarity Test (WordFAM), were recently validated for web-based administration. An analysis of the psychometric properties of these assessments revealed high internal consistency, suggesting that stable assessment could be achieved with fewer test items. Because researchers may use these assessments in conjunction with other experimental tasks, their utility may be enhanced if they are shorter in duration. To this end, two "brief" versions of the VST and the WordFAM were developed and submitted to validation testing. Each version consisted of approximately half of the items from the full assessment, with novel items across each brief version. Participants (n = 85) completed one brief version of both the VST and the WordFAM at session one, followed by the other brief version of each assessment at session two. The results showed high test-retest reliability for both the VST (r = 0.68) and the WordFAM (r = 0.82). The assessments also showed moderate convergent validity (ranging from r = 0.38 to 0.59), indicative of assessment validity. This work provides open-source English vocabulary knowledge assessments with normative data that researchers can use to foster high-quality data collection in web-based environments.