TY - GEN
T1 - Making the Most of Repetitive Mistakes: An Investigation into Heuristics for Selecting and Applying Feedback to Programming Coursework
AU - Howell, Roger
AU - Wong, Shun H
N1 - © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
PY - 2019/1/16
Y1 - 2019/1/16
N2 - In the acquisition of software-development skills, feedback that pinpoints errors and explains means of improvement is important in achieving a good student learning experience. However, it is not feasible to manually provide timely, consistent, and helpful feedback for large or complex coursework tasks, and/or to large cohorts of students. While tools exist to provide feedback to student submissions, their automation is typically limited to reporting either test pass or failure or generating feedback to very simple programming tasks. Anecdotal experience indicates that clusters of students tend to make similar mistakes and/or successes within their coursework. Do feedback comments applied to students' work support this claim and, if so, to what extent is this the case? How might this be exploited to improve the assessment process and the quality of feedback given to students? To help answer these questions, we have examined feedback given to coursework submissions to a UK level 5, university-level, data structures and algorithms course to determine heuristics used to trigger particular feedback comments that are common between submissions and cohorts. This paper reports our results and discusses how the identified heuristics may be used to promote timeliness and consistency of feedback without jeopardising the quality.
KW - computer aided feedback
KW - coursework assessment
KW - static analysis
KW - technology-enhanced learning
UR - https://ieeexplore.ieee.org/document/8615128
UR - http://www.scopus.com/inward/record.url?scp=85062059520&partnerID=8YFLogxK
DO - 10.1109/TALE.2018.8615128
M3 - Conference publication
SN - 978-1-5386-6523-7
T3 - 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)
SP - 286
EP - 293
BT - Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering, TALE 2018
A2 - Lee, Mark J.W.
A2 - Nikolic, Sasha
A2 - Wong, Gary K.W.
A2 - Shen, Jun
A2 - Ros, Montserrat
A2 - Lei, Leon C. U.
A2 - Venkatarayalu, Neelakantam
PB - IEEE
T2 - 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)
Y2 - 4 December 2018 through 7 December 2018
ER -