Exploring Generative AI assisted feedback writing for students’ written responses to a physics conceptual question with prompt engineering and few-shot learning

posted in: reading
Abstract: Instructor feedback plays a critical role in students’ development of conceptual understanding and reasoning skills. However, grading students’ written responses and providing personalized feedback can take a substantial amount of time. In this study, we explore using GPT-3.5 to write feedback on students’ written responses to conceptual questions, using prompt engineering and few-shot learning techniques. In stage one, we used a small portion (n=20) of the student responses to one conceptual question to iteratively train GPT. Four of the responses, paired with human-written feedback, were included in the prompt as examples for GPT. We tasked GPT with generating feedback for the other 16 responses, and we refined the prompt over several iterations. In stage two, we gave four student researchers the 16 responses along with two versions of feedback, one written by the authors and the other by GPT. The students were asked to rate the correctness and usefulness of each piece of feedback and to indicate which one was generated by GPT. The results showed that the students tended to rate the human-written and GPT-written feedback equally on correctness, but they all rated the GPT-written feedback as more useful. Additionally, the success rates in identifying GPT’s feedback were low, ranging from 0.1 to 0.6. In stage three, we tasked GPT with generating feedback for the rest of the student responses (n=65). The feedback was rated by four instructors based on the extent of modification needed if they were to give the feedback to students. All of the instructors rated approximately 70% of the feedback statements as needing only minor or no modification. This study demonstrates the feasibility of using generative AI as an assistant for generating feedback on students’ written responses with only a relatively small number of examples. AI assistance can be one solution for substantially reducing the time spent grading students’ written responses.

arxiv.org/abs/2311.06180v1
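The core technique the abstract describes, placing a handful of human-written response/feedback pairs in the prompt as few-shot examples and then asking the model to write feedback for a new student response, can be sketched roughly as follows. This is a minimal illustration only, not the authors’ actual prompt: the system instruction, the example pairs, and the model and temperature settings are assumptions made to keep the example runnable.

```python
# Minimal sketch of few-shot feedback generation with the OpenAI chat API.
# The prompt wording and the example response/feedback pairs below are
# hypothetical stand-ins, not the pairs used in the paper.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a physics instructor. Given a student's written response to a "
    "conceptual question, write brief, personalized feedback that addresses "
    "the correctness of the reasoning and suggests how to improve it."
)

# Stand-ins for the four human-written response/feedback pairs that the
# authors included in the prompt as few-shot examples.
FEW_SHOT_EXAMPLES = [
    ("The ball slows down because the force of the throw runs out.",
     "Good start, but no force 'runs out': after release, only gravity and "
     "air resistance act on the ball. Revisit Newton's first law."),
    ("Both objects hit the ground at the same time because gravity "
     "accelerates them equally.",
     "Correct. You could strengthen this by noting it assumes air "
     "resistance is negligible."),
]

def generate_feedback(student_response: str) -> str:
    """Ask the model for feedback on a single student response."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    for response, feedback in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": f"Student response: {response}"})
        messages.append({"role": "assistant", "content": feedback})
    messages.append({"role": "user", "content": f"Student response: {student_response}"})

    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
        temperature=0.2,
    )
    return completion.choices[0].message.content

if __name__ == "__main__":
    print(generate_feedback(
        "Heavier objects fall faster because they have more gravity."
    ))
```

In the study itself, the prompt was iteratively refined against the 16 held-out responses in stage one before being applied to the remaining 65 responses in stage three.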

Ryan Watkins