AI Summary of Peer-Reviewed Research
This page presents an AI-generated summary of a published research paper. The original authors did not write or review this article. See the full disclosure below.
Publication Signals show what we were able to verify about where this research was published. Rating: MODERATE. Core publication signals for this source were verified. Publication Signals reflect the source's verifiable credentials, not the quality of the research.
- ✔ Peer-reviewed source
- ✔ Published in indexed journal
- ✔ No retraction or integrity flags
Overview
This study operationalizes Classical Test Theory (CTT) and Item Response Theory (IRT) frameworks within spreadsheet software to support classroom-based psychometric analysis. The work demonstrates that standard spreadsheet applications can be configured to compute and interpret item-level diagnostic indices beyond aggregate test scores, with potential applicability to competence-based assessment design and item refinement in educational settings.
Methods and approach
Two datasets were constructed to illustrate the methodological workflow: a primary simulated dataset comprising 40 students and 20 dichotomous items, and a secondary dataset of 10 students and 10 items for interpretability demonstration. Spreadsheet formulas were developed to compute CTT indices (item difficulty, discrimination indices, corrected item–total correlations) and descriptive statistics. IRT parameter estimation was implemented using forward-calculation approaches for 1-parameter logistic (1PL), 2-parameter logistic (2PL), and 3-parameter logistic (3PL) models. Item Characteristic Curves (ICCs) were generated to visualize the relationship between latent ability and item response probability.
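The CTT indices described above can be sketched in a few lines of code. The snippet below is an illustrative Python equivalent of the spreadsheet formulas, not the authors' actual workbook: it uses hypothetical simulated data matching the primary dataset's dimensions (40 students, 20 dichotomous items) and computes item difficulty, an upper-lower discrimination index, and corrected item-total correlations.

```python
import numpy as np

# Hypothetical simulated data mirroring the paper's primary dataset:
# 40 students x 20 dichotomous (0/1) items. The response-generation
# scheme is an assumption for illustration only.
rng = np.random.default_rng(0)
responses = (rng.random((40, 20)) < 0.6).astype(int)

# Item difficulty (p-value): proportion of correct responses per item.
difficulty = responses.mean(axis=0)

# Corrected item-total correlation: correlate each item with the
# total score computed from the remaining items (item excluded).
total = responses.sum(axis=1)
corrected_r = np.array([
    np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
    for j in range(responses.shape[1])
])

# Upper-lower discrimination index: difference in p-values between
# the top 27% and bottom 27% of examinees ranked by total score.
order = np.argsort(total)
k = max(1, int(round(0.27 * len(total))))
low, high = responses[order[:k]], responses[order[-k:]]
discrimination = high.mean(axis=0) - low.mean(axis=0)
```

Each of these quantities maps directly onto a spreadsheet formula (e.g. difficulty is a column-wise `AVERAGE`), which is the core of the paper's argument for accessibility.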
Key Findings
Spreadsheet-based computations successfully produced CTT and IRT indices consistent with psychometric theory. The approach enabled identification of item functioning patterns, including weak-performing items and problematic distractors. ICC visualization facilitated interpretation of item discrimination and guessing parameters. Results indicated that spreadsheets can support construction of small-scale item banks aligned with competence-based curricula and enable teachers to move beyond reliance on total scores for assessment interpretation.
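The ICCs mentioned above follow a standard logistic form. As a minimal sketch (the specific parameter values are illustrative assumptions, not taken from the paper), the 3PL model gives the probability of a correct response as a function of latent ability, with the 1PL and 2PL models recovered by fixing the guessing and discrimination parameters:

```python
import numpy as np

def icc_3pl(theta, a=1.0, b=0.0, c=0.0):
    """3-parameter logistic ICC: P(correct | ability theta).

    a: discrimination (slope at the inflection point)
    b: difficulty (ability at which the curve inflects)
    c: pseudo-guessing (lower asymptote)
    Setting c=0 gives the 2PL; additionally fixing a gives the 1PL.
    """
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Illustrative parameter values (assumed, not from the paper):
theta = np.linspace(-4.0, 4.0, 81)
p = icc_3pl(theta, a=1.2, b=0.0, c=0.2)
# At theta == b the curve sits halfway between c and 1, i.e. 0.6 here,
# and it flattens toward the guessing floor c for low-ability examinees.
```

Plotting `p` against `theta` reproduces the kind of curve used in the paper to diagnose weak discrimination (shallow slope) and guessing effects (elevated lower asymptote).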
Implications
The demonstration establishes spreadsheets as a viable tool for psychometric analysis by in-service teachers, particularly in resource-constrained or low-technology educational contexts. By reducing barriers to psychometric literacy among practicing educators, the approach supports evidence-based item revision and test development. The accessibility of spreadsheet-based analysis aligns with SDG 4 objectives by promoting equitable access to assessment methodologies without dependence on specialized statistical software.
Disclosure
- Research title: Operationalising CTT and IRT in Spreadsheets: A Methodological Demonstration for Classroom Assessment
- Authors: António Faria, Guilhermina Lobato Miranda
- Publication date: 2026-02-24
- DOI: https://doi.org/10.3390/analytics5010012
- Image credit: Photo by cottonbro studio on Pexels
- Disclosure: This post was generated by Claude (Anthropic). The original authors did not write or review this post.


