Rubric for the Quality of Answers to Student Queries about Code
Novice programmers need adequate support to succeed in their courses. Providing this support requires both pedagogical content knowledge and general pedagogical knowledge, and these requirements apply to all support staff, e.g., instructors and teaching assistants (TAs). Here we focus on support in the form of answers to student queries about code. We have developed a rubric to assess the quality of the answers provided by support staff. In this paper, we present the theoretical framework behind the rubric, the full rubric itself, and two evaluation approaches. First, we evaluated the rubric internally by using it to assess 85 written answers from TAs; from this set, we present two sample excerpts together with their rubric-based evaluations. Second, our external evaluation comprised interviews with experts (n=13), which we analyzed using qualitative content analysis. These interviews revealed positive aspects, aspects that could be improved, and further areas of application, such as support for reflection.
Thu 21 Mar (displayed time zone: Pacific Time, US & Canada)
10:45 - 12:00 | Assessment & Grading (Papers) at Meeting Rooms B110-112 | Chair(s): Amanpreet Kapoor (University of Florida, USA)

10:45 (25m, Talk) Diverging Assessments: What, Why, and Experiences — Amin Sakzad (Monash University), David Paul (University of New England), Judy Sheard (Monash University), Ljiljana Brankovic (University of New England), Matthew P. Skerritt (RMIT University), Nan Li (University of Wollongong, Australia), Sepehr Minagar (Monash University), Simon, William Billingsley (University of New England)

11:10 (25m, Talk) Mechanical TA 2: Peer Grading With TA and Algorithmic Support

11:35 (25m, Talk) Rubric for the Quality of Answers to Student Queries about Code — Svana Esche (Technical University of Darmstadt)