In the acquisition of software-development skills, feedback that pinpoints errors and explains how to improve is important to a good student learning experience. However, manually providing timely, consistent, and helpful feedback is not feasible for large or complex coursework tasks, or for large cohorts of students. While tools exist to provide feedback on student submissions, their automation is typically limited to reporting test passes and failures, or to generating feedback for very simple programming tasks. Anecdotal experience indicates that clusters of students tend to make similar mistakes, and achieve similar successes, in their coursework. Do feedback comments applied to students' work support this claim and, if so, to what extent? How might this be exploited to improve the assessment process and the quality of feedback given to students? To help answer these questions, we have examined feedback given to coursework submissions on a UK level 5 (university-level) data structures and algorithms course, to determine the heuristics used to trigger particular feedback comments that are common across submissions and cohorts. This paper reports our results and discusses how the identified heuristics may be used to promote timeliness and consistency of feedback without jeopardising its quality.
|Title of host publication||Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering, TALE 2018|
|Editors||Mark J.W. Lee, Sasha Nikolic, Gary K.W. Wong, Jun Shen, Montserrat Ros, Leon C. U. Lei, Neelakantam Venkatarayalu|
|Number of pages||8|
|Publication status||Published - 16 Jan 2019|
|Event||2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE) - Wollongong, Australia|
Duration: 4 Dec 2018 → 7 Dec 2018
|Name||2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)|
|Conference||2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)|
|Period||4/12/18 → 7/12/18|
|Bibliographical note|© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.|
- computer aided feedback
- coursework assessment
- static analysis
- technology-enhanced learning
Howell, R., & Wong, S. H. (2019). Making the Most of Repetitive Mistakes: An Investigation into Heuristics for Selecting and Applying Feedback to Programming Coursework. In M. J. W. Lee, S. Nikolic, G. K. W. Wong, J. Shen, M. Ros, L. C. U. Lei, & N. Venkatarayalu (Eds.), Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering, TALE 2018 (pp. 286-293). (2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)). IEEE. https://doi.org/10.1109/TALE.2018.8615128