
Challenging the design principle of mixed-worded questionnaire scales

The project focuses on the design principle of combining both positively and negatively worded items in questionnaire scales (e.g., “I usually do well in mathematics” and “I am just not good at mathematics”). The working assumption is that mixed-worded scales can have unintended consequences for data quality.

This project addresses the questionnaire design principle of combining positively and negatively worded items within the same, mixed-worded scale, for example “I usually do well in mathematics” and “I am just not good at mathematics” in the mathematics self-concept scale of TIMSS (Trends in International Mathematics and Science Study).

While this mixed-wording principle is intended to maintain respondents’ attention and thereby yield more valid responses, we argue that it might have unintended consequences.

Building on previous work (Steinmann et al., 2022a, 2022b), we reason that some respondents might mishandle the mixed item wording due to a lack of attention or a lack of reading, language, or cognitive skills.

Such respondents may then give inconsistent responses, that is, agree (or disagree) with both the positively and the negatively worded items despite their opposite wording.
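
To make this concrete, the following is a minimal sketch of how such inconsistent response patterns could be flagged. The column names, example data, and response coding (1 = “agree a lot” to 4 = “disagree a lot”) are assumptions for illustration only, not the project’s actual variables or analysis code.

```python
import pandas as pd

# Hypothetical responses to one positively and one negatively worded item,
# on an assumed 4-point scale: 1 = agree a lot ... 4 = disagree a lot.
df = pd.DataFrame({
    "pos_item": [1, 2, 4, 1, 3],   # e.g., "I usually do well in mathematics"
    "neg_item": [4, 3, 1, 1, 4],   # e.g., "I am just not good at mathematics"
})

def agrees(x):
    # Treat 1-2 as agreement, 3-4 as disagreement.
    return x <= 2

df["agrees_pos"] = agrees(df["pos_item"])
df["agrees_neg"] = agrees(df["neg_item"])

# Inconsistent: agreeing with both items, or disagreeing with both,
# despite their opposite wording.
df["inconsistent"] = df["agrees_pos"] == df["agrees_neg"]
print(df)
```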

The aim of this project is to investigate the phenomenon of inconsistent respondents using TIMSS 2019 data from grades 4 and 8.

Specifically, we investigate two research questions: Which students are more likely to respond inconsistently to mixed-worded questionnaire scales? Which countries have larger shares of inconsistent respondents to mixed-worded questionnaire scales? 

We test the hypotheses that students who are younger, have lower achievement scores, and do not (frequently) speak the test language at home are more likely to respond inconsistently to mixed-worded questionnaire scales.

At the country level, we test the hypotheses that there are larger shares of inconsistent respondents in grade 4 as compared to grade 8 samples, in countries with lower mean achievement levels, and with larger shares of students who do not (frequently) speak the test language at home. 
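
As an illustration only, one way to probe the student-level hypotheses is a logistic regression of the inconsistency flag on the hypothesized predictors; country-level shares could then be obtained by aggregating the flag within each country sample. The variable names and simulated data below are hypothetical, and a real TIMSS analysis would additionally account for plausible values, sampling weights, and the clustered sampling design.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Hypothetical student-level data for illustration.
students = pd.DataFrame({
    "inconsistent": rng.integers(0, 2, n),          # 1 = flagged as inconsistent
    "age": rng.normal(10.5, 0.5, n),                # grade 4 example
    "math_achievement": rng.normal(500, 100, n),    # TIMSS-like scale score
    "speaks_test_language": rng.integers(0, 2, n),  # 1 = (almost) always at home
})

# Which students are more likely to respond inconsistently?
model = smf.logit(
    "inconsistent ~ age + math_achievement + speaks_test_language",
    data=students,
).fit()
print(model.summary())
```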

The study speaks to the concern that mixed wording might introduce measurement non-invariance across groups of students and across countries, reduce cross-cultural comparability, and compromise the validity of the data.

The study therefore questions whether mixed-worded scales should be used in questionnaires when the target population includes respondents who might have poor reading, language, or cognitive skills.

The project is of interest to everyone developing, evaluating, or using questionnaires in empirical analyses.

International Association for the Evaluation of Educational Achievement (IEA). 

  • Partner institutions

    • Oslo Metropolitan University
    • University of Oslo, Professor Dr. Johan Braeken, Centre for Educational Measurement (CEMO)