AI Summary of Peer-Reviewed Research

This page presents an AI-generated summary of a published research paper. The original authors did not write or review this article. [See full disclosure ↓]

Publishing process signals: STRONG — reflects the venue and review process.

Students valued an AI learning assistant but had ethical concerns

Research areas: Engineering Ethics, Computer Science Applications, Higher Education Practices and Engagement

What the study found

Students valued the Educational AI Hub, an AI-powered learning framework, for its accessibility and comfort, but they also had ethical concerns about its use. Nearly half of the students said it was easier to use than asking instructors or teaching assistants for help.

Why the authors say this matters

The authors conclude that understanding how students engage with generative AI in higher education is important for responsible adoption. The study suggests that usability, ethical transparency, and faculty guidance are important for promoting meaningful AI engagement.

What the researchers tested

The researchers studied the Educational AI Hub in undergraduate civil and environmental engineering courses at a large R1 public university. They used a mixed-methods design with pre- and post-surveys, system usage logs, and qualitative analysis of students' AI interactions.

What worked and what didn't

The tool was most helpful for completing homework and understanding concepts. Views on its instructional quality were mixed, and ethical uncertainty about institutional policy and academic integrity was a key barrier to fuller engagement. Students generally saw AI as a supplement rather than a replacement for human instruction.

What to keep in mind

The abstract does not describe limitations in detail beyond the study's scope. The findings come from 71 students in two courses, drawing on more than 600 AI interactions and 100 survey responses.

Key points

  • Students valued the AI assistant for accessibility and comfort.
  • Nearly half said it was easier to use than asking instructors or teaching assistants for help.
  • The tool was most helpful for homework and understanding concepts.
  • Students had mixed views on the assistant's instructional quality.
  • Ethical uncertainty about policy and academic integrity limited engagement.

Disclosure

Research title:
Students valued an AI learning assistant but had ethical concerns
Authors:
Ramteja Sajja, Yusuf Sermet, Brian Fodale, İbrahim Demir
Institutions:
Tulane University, University of Iowa
Publication date:
2026-02-06
OpenAlex record:
View
AI provenance: This post was generated by OpenAI. The original authors did not write or review this post.