Investigating Patterns of Pre-service Teachers’ Written Feedback on Procedure-based Mathematics Assessment Items

This study investigates patterns exhibited by pre-service teachers (PSTs) while practicing feedback in response to students’ solutions on a procedure-based mathematics assessment. First, we developed an analytical framework for understanding mathematics PSTs’ written feedback. Second, we looked into how a learning module on a multimedia platform influenced PSTs’ feedback, and identified the ways in which PSTs subsequently revised their initial written feedback. From this analysis, we derived an operational list of emergent patterns of PSTs’ written and content-specific feedback. Our findings suggest that PSTs, in general, are likely to improve their written feedback when they engage in reviewing student work, and have the chance to write and revise their feedback. We also discuss some patterns of PSTs’ initial feedback that did change with practice, and other patterns that persisted, suggesting the need for further guidance and practice in providing students with written feedback.


INTRODUCTION
Assessment is the process of gathering and interpreting evidence related to student learning, towards the goal of improving instruction. Therefore, a teacher's assessment practices serve as a crucial link among teaching strategies, learning activities, and learning outcomes, and ultimately promote a productive cycle of teaching and learning in the classroom (Black & Wiliam, 1998; Hattie & Timperley, 2007). In mathematics education, research has indicated that effective formative assessment strategies enable teachers to shift their aims from merely correcting and grading students' work to achieving a better understanding of their thinking and moving them forward in their learning (Allsopp et al., 2008; Collins, 2012).
Rather than the numerical scores and letter grades of summative assessments, well-designed formative assessments offer non-evaluative, specific, timely, and personalized feedback that helps each student meet learning goals; the ability to provide such feedback is a critical pedagogical skill (Gearhart & Saxe, 2004; Jenkins, 2010). To that end, it is important that mathematics teacher educators provide pre-service teachers (PSTs) with meaningful opportunities to develop knowledge of the nature and purposes of formative assessment and to provide each student with personally relevant and informative feedback that guides him/her through difficulties in understanding. In particular, descriptive and detailed written feedback from the teacher can give the student concrete steps for making progress. However, although there is a large body of research on student error analysis (see Bray, 2011; Peng & Luo, 2009; Radatz, 1979; Rushton, 2018), and a variety of strategies for formative assessment in mathematics classrooms have been suggested (Gallego-Arrufat & Dandis, 2014; Hernandez-Martinez et al., 2011; Jaafar & Lin, 2017; Schoenfeld, 2015; Taras, 2010), few studies have examined mathematics teachers' written feedback. Among the few, Bee and Kaur (2014) found that mathematics teachers have several weaknesses in this area, and little attention has been paid to how teacher education programs can support PSTs' development of written feedback skills.
To address this research gap, we posited that practicing content-specific written feedback can help engage PSTs in analysing student work and devising teaching strategies to help learners overcome misconceptions. As a beginning step in this line of inquiry, in this study, we investigated the patterns PSTs exhibited in their efforts to provide detailed and descriptive feedback on students' mathematical solutions to procedure-based assessment items.
As both researchers and mathematics teacher educators, we asked ourselves, "What common and uncommon patterns do pre-service teachers demonstrate in their written feedback on students' mathematical problem solutions?" In particular, we wondered, "What factors, such as purpose or need, are related to the way pre-service teachers provide written feedback?" and "What factors might influence pre-service teachers to improve their written feedback skills?" To pursue these questions, we first developed a framework for analysing PSTs' written feedback comments by reviewing the literature on formative assessment and feedback practices and developing an operational list of emergent patterns of written feedback. Second, we engaged PSTs in an online module designed to foster emerging feedback skills on the LessonSketch platform (available at www.lessonsketch.org), after which they revised their initial feedback. Third, we identified the ways in which the PSTs improved their written feedback through revision as well as their needs for further improvement. The findings from this analysis can inform teacher educators' efforts to promote pre-service teachers' understanding of what does and does not work in formative assessment and to develop their skills in providing pedagogically effective written feedback.

THEORETICAL FRAMEWORK
To determine changes in the ways in which pre-service teachers provide written feedback on students' solutions to procedure-based mathematics items, we took into account the novice status of our participants and developed the theoretical framework on which we based our hierarchy of levels of written feedback accordingly.

FUNCTION OF FEEDBACK
Overall, in forming our framework, we presumed that, as mathematics teacher candidates, PSTs should learn to provide learners with content-specific feedback that goes beyond general comments on the correctness or incorrectness of their solutions and guides them to clearer understanding. Toward this end, we focused on three specific functions of feedback: cognitive development, meta-cognitive development, and learning motivation (Narciss, 2008). We also assumed that PSTs have limited information on the characteristics of learners and would therefore benefit from an emphasis on learners' cognition, as shown in their mathematical work, as the basis for feedback. Thus, first, the content of feedback should be informative as well as evaluative. Second, the feedback should function as a tool to nurture learners' meta-cognition and guide them on the path to further learning. Third, feedback should motivate students to engage in deep learning.
To assess whether and how our participants incorporated these three functions in their written feedback, we formulated the continuum of effectiveness of teacher feedback shown in Table 1.
Below, we provide a brief review of the literature on teacher feedback in general and in mathematics education specifically. In reviewing the literature, we found that early in their programs PSTs may find the concept of feedback as communication challenging because they have had limited opportunities to teach, assess, and use feedback to communicate with students. Thus, we acknowledged the need for instruction on teacher feedback to begin in a controlled context, with a focus on how to provide written feedback on students' solutions.

FORMATIVE ASSESSMENT PRACTICE
The research related to educational assessment has focused on conceptualizing its nature and the various contexts in which it occurs (Black & Wiliam, 1998). Over time, this focus has shifted from evaluating learning outcomes to using assessment data to inform instruction and providing students with feedback to enhance their learning processes (Santos & Cai, 2016; Taras, 2010). Black and Wiliam (1998) defined formative assessment as serving two purposes: first, as a means of collecting evidence of students' understanding or learning progress, and second, as a source of information to improve teaching and provide effective learning activities. Similarly, Bennett (2011) asserted that formative assessment informs teachers of what students already know, what their knowledge gaps and misconceptions are, and whether and how they reflect on and monitor their own progress.
Traditionally, mathematics assessments have been perceived as assessments of learning (Dandis, 2013), in which the teacher alerts students to their mistakes through marking and grading and generates summative data on student performance for reporting purposes. However, with the emerging approach to assessment as a way to enhance learning, there has been increasing effort to share a variety of formative assessment strategies and practices (Earl, 2013; Schoenfeld, 2015). The goal of formative assessment in mathematics is to help students improve their learning by giving them feedback. In this vein, researchers also recognize the importance of students' perceptions of the feedback they receive as a way to glimpse the teacher's point of view (Bansilal, James, & Naidoo, 2009). Because how students view feedback is critical to their learning, much emphasis has been placed on the quality of teachers' feedback (Sadler, 1998), as well as on teachers' ability to understand the state of each student's knowledge by employing strategies to elicit student understanding and methods to interpret students' products (Bishop et al., 2014; Leatham et al., 2015). Thus, formative assessment feedback is related to the effectiveness of teachers' questioning and, more generally, to the teacher's role as a facilitator of learning, which depends on his/her ability to observe students' learning and attend to their thinking (Lee, 2018; Lee & Cross Francis, 2018; Peterson & Leatham, 2009; Stockero, Rupnow, & Pascoe, 2015).

EFFECTIVE FEEDBACK PRACTICE
The purpose of feedback has often been explained as either informing learners about their current status of learning in comparison with a lesson's objective, or encouraging them to narrow the gap between their present and ideal performance (Sadler, 1989, cited in Santos & Cai, 2016). Research has shown that students' learning improves when they get informative and constructive feedback on their work that clearly relates to the learning goals (Crisp, 2007; Gregory & Kuzmich, 2004). Feedback has been found to significantly affect students' learning achievement by giving them information they can use to monitor their work, such as what they are doing well, where they need to improve, and what they should do next (Callingham, 2008; Volante, 2010).
In addition, feedback is more effective when it presents achievable goals with a high degree of sensitivity to self-esteem (McFarlin & Blascovich, 1981) or provides new viewpoints or interpretations with encouragement for students to offer their own independent and creative ideas (Abtahi, 2014). By contrast, the impact of feedback on learning achievement is low when it is focused on praise, rewards, or punishment (Hattie & Timperley, 2007). More specific to mathematics education, Dias and Santos (2010) found a close relationship between the nature of tasks and the length of teachers' written feedback, suggesting that mathematics teachers might adapt the way they provide feedback to whether the purpose of the task is to evaluate knowledge (i.e., short feedback) or to support problem-solving or discovery (i.e., long feedback). More recently, Santos and Cai's (2016) characterization of feedback as a communication process emphasized the importance of feedback as a dialogical process.

WRITTEN DESCRIPTIVE FEEDBACK
Written descriptive feedback, usually in combination with a letter grade, has been the primary method of teacher feedback on writing tasks in language arts education (Goldstein, 2006). Hyland (1998) argued that to be effective, such feedback should be clear and specific and followed by an opportunity for students to use it in revising their writing.
Because students' mathematical work is also largely written, the same principle applies: informative written feedback can improve students' learning. Hence, the efficacy of written feedback is of ongoing interest to mathematics education researchers (Sadler, 2010). To date, mathematics education researchers have largely weighed in on the efficacy of written feedback specific to student errors on tests and assignments, rather than on the more general issue of how different types of written feedback might affect student learning. Students' learning can be positively affected by feedback that is based on analysis of their work and explicitly related to learning goals and criteria for success (Shepard, 2006). Thus, more research is warranted on the ways in which the teacher education curriculum affects how pre-service teachers conceptualize written feedback and develop feedback skills.

Participants and Contexts
Participants in this study were 42 elementary PSTs and 40 secondary PSTs at two sites, a large southern university and a large southwestern university in the US. All were in either their junior or their senior year of teacher preparation programs and enrolled in either an elementary or a secondary mathematics methods course that was jointly designed and taught by the authors. Both mathematics methods courses were designed to support PSTs' development of knowledge needed for teaching upper elementary or secondary mathematics, which included content, pedagogical, and curriculum knowledge relevant to the teaching and learning of mathematics.

Data Sources and Implementation
A learning module was implemented in two sections of each of the elementary and secondary mathematics methods courses towards the end of the fall semester of 2015 and again in the spring semester of 2016. The module included a series of five tasks (see Figure 1) that was intended to lead students to greater understanding of the nature and uses of constructive formative feedback and give them the opportunity to use this understanding in revising their initial feedback on a hypothetical student's work. The study did not utilize control groups. In the first task of the module, PSTs read graphic frames in a comics format which scripted a conversation between a methods professor and a pre-service teacher about feedback (see Figure 2). Then the PSTs described important attributes of written feedback to students' mathematical work.
In the second task, the PSTs reviewed the strengths and weaknesses of a sample student's solution to the question: What is the slope of the line defined by the equation 8x + 2y = 5? The PSTs then composed written feedback for the student. In the third task, the PSTs compared their feedback with the feedback of other PSTs on the same work (see Figure 3). In the fourth task, the PSTs reviewed a scripted dialogue (see Figure 4) designed to help them reflect on meaningful feedback comments. In the last task, the PSTs revised their initial feedback.
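For the reader's reference, the target answer to this assessment item can be obtained by rewriting the equation in slope-intercept form; the following derivation is supplied here as context and was not part of the module materials:

```latex
8x + 2y = 5 \;\Longrightarrow\; 2y = -8x + 5 \;\Longrightarrow\; y = -4x + \tfrac{5}{2},
```

so the slope is the coefficient m = -4. The hypothetical student's solution, quoted in the results, reported the slope as -4x, conflating the slope with the full x-term of the slope-intercept form.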

DATA ANALYSIS
To analyse the changes between PSTs' initial and revised feedback, we used an inductive content analysis approach (Grbich, 2007). Initially, we organized the raw data into a spreadsheet, read the responses, and created open codes for each PST's response. After that, we created main categories and identified their sub-categories. Then, drawing on the literature about effective feedback practices as well as levels of feedback skills, we developed an analytical framework (see Table 2). Finally, we revised the initial analytical framework to accommodate the following two conditions: (a) the task setting was technology-based, and (b) PSTs provided a sample of feedback in response to student solutions to a procedure-based mathematics assessment item. More specifically, because the PSTs were to provide written feedback on the LessonSketch platform using a keyboard, visual representations or algebraic notations could not be included, so the criteria for each level focused on the sophistication of verbal responses. Also, sub-category 1B was added because the students' solutions were to a procedure-based assessment item.
We interpreted the data using mixed methods. To answer the first research question, we coded PSTs' feedback using the revised framework (see Table 2). To ensure reliability of coding, we asked a third

Frequencies of Feedback Levels
Before the elementary PSTs had completed the feedback module on LessonSketch, their most common level (see Figure 2) was level 2 (36%), followed by level 3 (30%) and level 4 (17%). For example, when initially asked to provide feedback, elementary PSTs at level 2 gave such comments as, "The answer was incorrect because slope is the m of the equation, not mx," which simply identified the error without guidance or encouragement. Those at level 3 provided such feedback as, "Good job! You almost got the correct answer and you showed your work so I could see what you were thinking. However, the variable x is not a part of the slope. In y = mx+b, m is the slope and x represents the independent variable on the graph," which acknowledged what the student did right before correcting the error.
After the elementary PSTs had completed the module, level 5, which had not been represented earlier, now had the highest frequency (33%), again followed by level 3 (26%) and 4 (17%). For instance, PSTs at level 4 provided feedback such as, "You set up the equation by dividing each side of the equation by the same number in order to find the slope and clearly showed your thought process by setting up the equation in a y = mx+b format. However, you seem to have difficulty distinguishing between coefficients and variables." Such detailed feedback emphasized the student's valid thinking and was informative as to what to improve without specifying the answer.
For secondary PSTs, before the module, level 3 (47%) had the highest frequency, followed by level 2 (24%). For example, before completing the module, secondary PSTs at level 2 provided such cursory feedback as, "your answer is not in the right form," and PSTs at level 3 provided directive feedback such as, "Not quite there yet. The question asks for a slope value and you wrote how many x's. Do you need an x as part of your answer?" However, after the module, the levels with the highest frequencies were level 3 and level 4 (see Table 3). For example, PSTs at level 4 provided explanatory feedback such as, "It's great you know how to isolate the y. But it looks like you don't know how to state the slope from the two actual points. It must be in number," while PSTs at level 5 provided expanded feedback that invited further reflection and dialogue, such as, "Hey great job of converting the standard form into the slope-intercept form. The slope should be in numbers. What is the difference between -4 and -4x? What does -4 tell you about the line graph? What does -4x tell you about the graph of y? Think about these questions and come see me during lunch. Glad to give the points back to you if you can answer these questions successfully."

Changes in PSTs' Feedback Levels
Overall, more than half of the PSTs (58% of the elementary PSTs and 59% of the secondary PSTs) demonstrated improvement in providing feedback after completing the module, while 40% of PSTs remained at their pre-module levels (see Table 4). These results demonstrate that the module helped most PSTs learn to investigate students' thinking and move them forward in their learning, rather than simply provide praise and fix errors. Also, the module's revision process appeared to help the PSTs develop their abilities to provide feedback that focused on students' strengths and areas of improvement for further learning by encouraging students to reflect on their thinking. More specifically, PSTs who demonstrated an increase in their feedback levels had progressed beyond simply pointing out students' errors and providing correct answers to offering feedback that gave students credit when due while focusing on opportunities for new learning or encouraging students to reflect on their own thinking. For example, an elementary PST who improved from Level 3 to Level 5 provided the following statements:

I would tell the student that he began his work with the correct steps, but I would ask him to include the formula that he used (y = mx+b). I would tell him that his answer was nearly correct, but that slope is always rise over run so his answer should be -4/1. (Pre-module level, Level 3)

I would tell the student that he began his work with the correct steps, but I would ask him to include the formula that he used (y = mx+b). I would tell him that his answer was very close to being correct. I would ask him to go back to his class notes and review the meaning of slope. By doing so I would hope that the student realizes that slope should be rise over run and that he should indicate the answers in that manner in the future. I would also ask him to identify what the x means in his answer so he can realize why it shouldn't be included in the answer. I changed my feedback because I had included the correct answer for one of them. However, I learned that to guide students' cognitive process it's better to ask guiding questions so they are able to think about their thinking rather than just giving them the answer. (Post-module level, Level 5)

As another example, a secondary PST who improved from Level 2 to Level 4 provided the following statements:

Just let him know the slope value is -4, not -4x. (Pre-module level, Level 2)

I never knew some students might think -4x is the slope. If you use the formula, rise over run, it is easy to see the slope should be a number. But the question is in the standard form, and he does know how to convert it to a different form. That's going to be his strength. Maybe he knew the concept and everything, just was not careful at the end. Or maybe he doesn't know what slope really is. I should ask him to review the concept of slope and try the question again. Should I give him two questions? (1) What is the slope when you have (2, 5) and (5, 11)? (2) Find the slope of this function, 3x - 4y + 10 = 0. (Post-module level, Level 4)
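For the reader's reference, the two follow-up questions posed in this PST's revised feedback work out as follows; these answers are supplied here as a check and were not part of the PST's response:

```latex
\text{(1)}\quad m = \frac{11 - 5}{5 - 2} = \frac{6}{3} = 2; \qquad
\text{(2)}\quad 3x - 4y + 10 = 0 \;\Longrightarrow\; y = \tfrac{3}{4}x + \tfrac{5}{2}, \ \text{so } m = \tfrac{3}{4}.
```

Both questions again yield a numeric slope, reinforcing the distinction between -4 and -4x that the original item targeted.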

DISCUSSION
Teacher educators and researchers have an interest in practice-based education for PSTs because future educators can benefit from envisioning how to enact what they learn in their university methods courses. Grossman, Hammerness, and McDonald (2009) identified three common components of practice-based preparation programs: (1) decomposing complex practices of the profession into constituent parts; (2) demonstrating the professional practices in real settings by using written cases, video cases, artefacts, or animations that emphasize specific aspects of interactions between students and teachers; and (3) enacting approximations of professional practices. The module implemented in this study provides authentic experiences by using scenario-based feedback tasks in mathematics methods courses. The tasks are designed to involve PSTs in a situation in which the teacher reviews students' mathematical work and provides written feedback intended to help them learn and to motivate them to pursue further progress.
Overall, we found that a majority of both elementary and secondary PSTs achieved higher levels of competence in their ability to craft feedback comments that supported students' learning and reflective thinking. They also responded positively to opportunities to review peers' comments and to revise their own. While this study supports a curriculum that nurtures PSTs' emerging written feedback skills and supports their development of strategies that encourage students' conceptual learning, we do not argue that our particular modules are suitable for all PSTs. Rather, we argue more generally in favour of engaging PSTs in reviewing students' mathematical solutions and practicing writing and revising feedback to achieve higher levels of response. In designing such a curriculum, what is particularly important is the development of curricular materials based on the following ideas. First, the curriculum should provide materials that encourage PSTs to develop an interest in the teaching skills necessary to provide productive feedback; we used comic-based visuals and informal dialogues to motivate our participants. Second, the curriculum should focus on opportunities for teacher educators and PSTs to co-construct various feedback comments with clear reference points to students' solutions; this practice should also be accompanied by discussion of how these comments communicate awareness of students' strengths and weaknesses and how that awareness prompts students' further thinking and reasoning. Although it may feel awkward at first to balance the amounts of attention given to strengths, weaknesses, and future directions, we believe it is very important for PSTs to acquire these basic skills. Third, the focus of the curriculum should be on how high-quality feedback helps students identify the next steps in their own learning.
We believe this is a rich area in which PSTs may struggle at first but gradually improve as they recognize their role in guiding students to pursue learning beyond the mastery of procedures. We found that the module's revision process, in particular, challenged PSTs to provide feedback that leveraged students' strengths and to use their curricular and content knowledge to craft comments about specific ways for students to reflect on their learning and improve.
Traditionally, mathematics teacher educators have relied largely on a lecture-seminar format, which typically involves assigning research articles followed by either round-table discussion or having students write reflection papers; this is likely to be the way most PSTs are engaged during methods courses. Given the current disconnect between learners' need for effective feedback and teachers' base knowledge of how to provide it, as well as the level of interest in developing PSTs' skills for providing effective feedback, we recommend providing multiple, meaningful opportunities for PSTs to practice crafting teacher comments on their own, which can then serve as the basis for discussion in a methods course. Such discussion can also support a model for incorporating theoretical knowledge into the re-construction of comments, thereby establishing a pattern of improving feedback skills through revision. Thus, we suggest a shift in the structure of teacher education curriculum regarding assessment: from a model that considers written comments a teacher's duty to a model that views composing written feedback as a valuable opportunity for teachers to develop knowledge, so that the curriculum incorporates a cycle of opportunities for PSTs to compose teacher comments, engage in the analysis of various teacher comments, and reflect on their own comments through revision.
As for the specific module in our study, we caution that some PSTs may respond negatively to the comics format of the graphic frames we utilized. However, LessonSketch, as an online multimedia platform, was helpful in creating optimal learning conditions; for example, we found such negative reactions were ameliorated when students viewed the graphics privately rather than in a whole-group setting, where objections to the comics display arose. We also found some differences in the development of feedback practices between elementary and secondary PSTs. Initially, most PSTs had level 2 or 3 feedback skills, but a number of elementary PSTs were able to demonstrate level 5 after they had completed the module. The secondary PSTs' progress was more incremental, suggesting that they tended to stay within the scope of the mathematical concept at hand rather than promote metacognition or new learning. Our hunch is that secondary PSTs may perceive feedback as an opportunity to engage in immediate content learning (i.e., levels 3 and 4) while overlooking the role of feedback as a way to motivate students to revisit the work and think on their own (i.e., level 5 and beyond). This result suggests the need for a differentiated approach based on PSTs' current levels, designed to move them to higher levels. For example, those who struggle to write comments that motivate students to reflect on their learning could benefit from a feedback task that asks them to promote students' metacognition instead of content knowledge.

IMPLICATIONS
This research has implications for designers of mathematics education courses, as well as for researchers pursuing a deeper understanding of PSTs' written feedback skills. For example, we found that a majority of PSTs easily advanced past levels 1 and 2 when they realized their feedback was too general and consisted mostly of praise. This rapid advancement suggests that the teacher education curriculum should focus on helping PSTs achieve higher level feedback skills. It should also focus on helping PSTs compose appropriate feedback comments for various purposes (i.e., descriptive vs. evaluative vs. affective) and revise their strategies depending on how students are likely to respond.
Our future studies will investigate whether and how procedure-based learning tasks could restrict opportunities for teachers to provide quality feedback, and whether and how open-ended items (see Yachina, Gorev, & Nurgaliyeva, 2015, for a use of open tasks to assess meta-subject skills) can better provide a meaningful space for PSTs to develop written feedback skills. On a related note, we plan to refine our research to identify the types of methods-course content that directly contribute to PSTs' development of feedback skills by enabling them to (1) create teacher messages that motivate students to think more deeply about mathematics and (2) go beyond the correctness of a solution. Ultimately, we are interested in researching the types of learning opportunities in teacher education through which PSTs can develop the skills necessary to examine student work and plan for the next steps in their learning. These learning opportunities will highlight the importance of written feedback as an integral part of effective and affective teacher language that motivates students to refine and extend their thinking and reasoning.

ACKNOWLEDGEMENT
This work was funded in part by NSF Grant DRL-1316241 to D. Chazan and P. Herbst, while the corresponding author was a LessonSketch Research and Development Fellow. All opinions are those of the author and do not necessarily represent the views of the PIs of the grant or of the National Science Foundation.

Disclosure statement
No potential conflict of interest was reported by the authors.

Notes on contributors
Mi Yeon Lee -Arizona State University, Tempe, Arizona, USA.