Graded Response Method: Does Question Type Influence the Assessment of Critical Thinking?
Abstract
The Graded Response Method (GRM) is an alternative to multiple-choice testing in which students rank options according
to their relevance to the question. GRM requires students to discriminate among statements and draw inferences, making it a
cost-effective critical thinking assessment for large courses where grading open-ended answers is not feasible. This study
examined critical thinking assessment in GRM versus open-ended and multiple-choice questions composed from
Bloom’s taxonomy in an introductory undergraduate course in anthropology and archaeology (N = 53 students).
Critical thinking was operationalized as the ability to assess a question with evidence to support or evaluate
arguments (Ennis, 1993). We predicted that students who performed well on multiple-choice questions from Bloom’s
taxonomy levels 4-6 and on open-ended questions would also perform well on GRM questions involving similar concepts,
and that high-performing students on GRM would have higher course grades. The null hypothesis was that question type
would have no effect on the assessment of critical thinking. In two quizzes there was only a weak correlation between GRM
and open-ended questions (R² = 0.15); however, there was a strong correlation on the exam (R² = 0.56). Correlations were
consistently higher between GRM and multiple-choice questions from Bloom’s taxonomy levels 4-6 (R² = 0.23, 0.31, 0.21)
than levels 1-3 (R² = 0.13, 0.29, 0.18). GRM is a viable alternative to multiple-choice testing for assessing critical thinking
without added resources or grading effort.
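The abstract does not describe how ranked GRM responses are scored. As a purely hypothetical illustration of how such an item could be graded (the function, the partial-credit rule, and the data below are assumptions, not the authors' method), one simple approach compares each option's rank against an instructor key:

```python
# Hypothetical GRM scoring sketch (the paper does not give its scoring rule):
# full credit for an option ranked in its key position, half credit if it is
# off by one position, nothing otherwise. Returns a 0-1 fraction.

def score_grm_response(key, response):
    points = 0.0
    for option, key_rank in key.items():
        distance = abs(response[option] - key_rank)
        if distance == 0:
            points += 1.0
        elif distance == 1:
            points += 0.5
    return points / len(key)

# Key: most relevant statement gets rank 1, least relevant rank 4.
key = {"A": 1, "B": 2, "C": 3, "D": 4}
student = {"A": 1, "B": 3, "C": 2, "D": 4}  # student swapped B and C
print(score_grm_response(key, student))  # 0.75
```

Any rule that rewards closeness to the key ranking would serve the same purpose; the half-credit-for-near-miss scheme here is only one plausible choice.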
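The abstract likewise does not state how the reported R² values were obtained. A standard approach, and a reasonable assumption here, is to pair each student's score on one question type with their score on the other and square the Pearson correlation coefficient. The sketch below shows that calculation with placeholder scores; the data are illustrative only, not study data:

```python
# Minimal sketch: R² between paired per-student scores on two question types,
# computed as the squared Pearson correlation coefficient.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical scores (0-10) for the same students on GRM and open-ended items.
grm_scores = [7, 5, 9, 6, 8, 4, 7, 6]
open_ended_scores = [6, 5, 8, 7, 7, 5, 6, 5]

r = pearson_r(grm_scores, open_ended_scores)
print(f"r = {r:.2f}, R^2 = {r**2:.2f}")  # R² = proportion of shared variance
```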
DOI: https://doi.org/10.5430/jct.v8n1p1
Copyright (c) 2019 Sherry Fukuzawa, Michael deBraga
This work is licensed under a Creative Commons Attribution 4.0 International License.