AI Summary of Peer-Reviewed Research

This page presents an AI-generated summary of a published research paper. The original authors did not write or review this article. [See full disclosure ↓]

Publishing process signals: STRONG, reflecting the venue and review process.

Student-created multiple-choice items were comparable to faculty items

Research area: Social Sciences / Education / Innovative Teaching Methods

What the study found

Student-created multiple-choice questions from a general chemistry 1 course had a number of item-writing flaws comparable to, and in some cases lower than, faculty-created questions.

Why the authors say this matters

The authors suggest that learner-sourcing, where students create assessment items, may help address instructors' concerns about poor item quality, and that training students to identify item writing flaws could support the use of student-created items on exams.

What the researchers tested

The researchers evaluated student-created multiple-choice items from a general chemistry 1 course using the Item Writing Flaws Evaluation Instrument (IWFEI), a tool for identifying problems in test-question writing. They compared these items with faculty-created multiple-choice items that had already been used on exams.

What worked and what didn't

The data set showed that student-created items contained no more item-writing flaws than faculty-created items, and in some cases fewer. The abstract does not report detailed breakdowns of which specific flaws were more or less common.

What to keep in mind

The abstract does not describe detailed limitations; the findings are based on a single general chemistry 1 course and the specific set of items studied, so how far they generalize to other courses is unclear.

Key points

  • Student-created multiple-choice questions had a number of item-writing flaws comparable to, and sometimes lower than, faculty-created questions.
  • The study examined items from a general chemistry 1 course.
  • Researchers used the Item Writing Flaws Evaluation Instrument (IWFEI) to assess question quality.
  • The comparison was made against faculty-created items that had already been used on exams.
  • The authors suggest training students to identify item writing flaws may help support learner-sourcing.

Disclosure

Research title:
Student-created multiple-choice items were comparable to faculty items
Authors:
Emmanuel Luna Jimenez, Jared Breakall, Christopher Randles
Institutions:
University of Central Florida, Snow College
Publication date:
2026-01-28
OpenAlex record:
View
AI provenance: This post was generated by OpenAI. The original authors did not write or review this post.