
Developing Multiple-Choice Questions for Anatomy Examinations

  • Andrew R. Thompson
  • Polly R. Husmann

Abstract

Multiple-choice questions (MCQs) can be an efficient and reliable way to assess student learning, but they are difficult and time-consuming to construct. When developing an MCQ, one must avoid common flaws (e.g., negative phrasing or grammatical cues) that can lead to poor question performance. At the same time, the question must evaluate content that is driven by course and/or session learning objectives. Questions can be targeted at specific cognitive levels, on both written and laboratory examinations, by using Bloom’s taxonomy. Along with these general guidelines, anatomy educators teaching in healthcare fields face additional challenges in item construction, such as writing a question stem and answer options that are plausible and highlight clinically relevant information. Following an examination, it is critical to review psychometric data on question performance to help identify overlooked flaws or questions that were not written clearly. This chapter guides the reader through all the necessary steps, from the initial development of a question to interpreting its performance on an examination. While becoming a good question writer takes time and practice, the content outlined in this chapter provides a solid foundation that can be built upon with practical experience.
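As a concrete illustration of the post-examination psychometrics referred to above, the sketch below computes two commonly reported item statistics: item difficulty (the proportion of students answering correctly) and a corrected point-biserial discrimination index. This is a minimal example under stated assumptions, not material from the chapter itself; the use of numpy, the function name item_statistics, the 0/1 scored response matrix, and the sample data are all illustrative.

    import numpy as np

    def item_statistics(responses):
        """Item difficulty and corrected point-biserial discrimination.

        responses: array of shape (n_students, n_items); 1 = correct, 0 = incorrect.
        """
        responses = np.asarray(responses, dtype=float)
        # Difficulty: proportion of students who answered each item correctly.
        difficulty = responses.mean(axis=0)
        total = responses.sum(axis=1)
        discrimination = np.empty(responses.shape[1])
        for j in range(responses.shape[1]):
            # Corrected point-biserial: correlate the item score (0/1) with the
            # total score on the remaining items, excluding the item itself.
            rest = total - responses[:, j]
            discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
        return difficulty, discrimination

    # Hypothetical scored responses: 5 students x 3 items.
    scores = [[1, 1, 0],
              [1, 0, 0],
              [1, 1, 1],
              [0, 0, 0],
              [1, 1, 1]]
    p, rpb = item_statistics(scores)
    print("difficulty:", p)        # e.g., the first item was answered correctly by 80% of students
    print("discrimination:", rpb)  # values near or below zero flag items worth reviewing

In practice, items with extreme difficulty values or with discrimination near or below zero are the ones an item analysis would typically flag for review, which mirrors the post-examination review step described in the abstract.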

Keywords

Item writing · Assessment · Item analysis · Practical examinations · Discrimination index


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Medical Education, University of Cincinnati College of Medicine, Cincinnati, USA
  2. Anatomy and Cell Biology, Indiana University School of Medicine, Bloomington, USA
