Detail view
Title | Rapid guessing rates across administration mode and test setting
Authors | Kröhne, Ulf; Deribo, Tobias; Goldhammer, Frank
Original publication | Psychological test and assessment modeling 62 (2020) 2, pp. 147-177
Document | Full text (867 KB)
Document license | German copyright law
Keywords (German) | Test; Assessment; Innovation; Validity; Technology-based testing; Design; Test construction; Testing procedure; Effect; Behavior; Log file; Experiment; Student; Comparative study
Subdiscipline | Empirical educational research
Document type | Article (journal)
ISSN | 2190-0493
Language | English
Year of publication | 2020
Review status | Peer review
Abstract (English) | Rapid guessing can threaten measurement invariance and the validity of large-scale assessments, which are often conducted under low-stakes conditions. Comparing measures collected under different administration modes or in different test settings requires that rapid guessing rates also be comparable. Response time thresholds can be used to identify rapid guessing behavior. Using data from an experiment embedded in an assessment of university students as part of the National Educational Panel Study (NEPS), we show that rapid guessing rates can differ across modes. Specifically, rapid guessing rates are found to be higher for unproctored individual online assessment. It is also shown that rapid guessing rates differ across different groups of students and are related to properties of the test design. No relationship between dropout behavior and rapid guessing rates was found. (DIPF/Orig.)
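The abstract refers to response time thresholds for identifying rapid guessing. As a minimal illustration of that general idea only (not the authors' actual procedure or thresholds), the following Python sketch flags a response as a rapid guess when its response time falls below an item-specific threshold and computes a simple rapid guessing rate; the item names, threshold values, and data are hypothetical.

```python
# Minimal sketch of response-time-threshold flagging of rapid guessing.
# Item names, thresholds, and response times below are hypothetical and
# are NOT the thresholds or data used in the study.

def is_rapid_guess(response_time_s: float, threshold_s: float) -> bool:
    """Flag a response as a rapid guess if it is faster than the item's threshold."""
    return response_time_s < threshold_s

def rapid_guessing_rate(response_times: dict, thresholds: dict) -> float:
    """Share of a test taker's responses flagged as rapid guesses.

    response_times: item -> response time in seconds
    thresholds:     item -> threshold in seconds (e.g. derived from the
                    item's response time distribution)
    """
    flags = [is_rapid_guess(response_times[item], thresholds[item])
             for item in response_times]
    return sum(flags) / len(flags) if flags else 0.0

# Hypothetical usage: one test taker, three items
thresholds = {"item1": 3.0, "item2": 5.0, "item3": 2.0}       # seconds
response_times = {"item1": 1.2, "item2": 14.8, "item3": 6.3}  # seconds
print(rapid_guessing_rate(response_times, thresholds))        # -> 0.333...
```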
Further articles in this journal | Psychological test and assessment modeling, year 2020
Checksums | Checksum comparison as proof of integrity
Date of entry | 05.01.2022
Citation | Kröhne, Ulf; Deribo, Tobias; Goldhammer, Frank: Rapid guessing rates across administration mode and test setting - In: Psychological test and assessment modeling 62 (2020) 2, pp. 147-177 - URN: urn:nbn:de:0111-pedocs-236307 - DOI: 10.25656/01:23630