Costa Rican Preservice Mathematics Teachers’ Readiness to Teach

Mathematics teachers' knowledge for teaching mathematics has been broadly studied in recent years, and many theoretical frameworks and instruments have been created to measure and improve the knowledge and competencies needed to teach mathematics. The knowledge gained from these studies has been crucial in understanding and determining what mathematics teachers should learn. In Costa Rica, there is a lack of regulations regarding the training that mathematics teachers receive and the knowledge and competencies they acquire in the different teacher education programs. This study investigates the knowledge for teaching mathematics of Costa Rican preservice teachers using the Teacher Education and Development Study in Mathematics (TEDS-M) instrument to identify strengths and weaknesses in their training. A mixed-methods analysis of the responses of 79 participants revealed that they were well prepared in cognitive application skills but showed weaknesses in the development of reasoning skills. Additionally, the solutions highlighted significant deficiencies in participants' monitoring of their own work and in their ability to provide feedback on students' work. We hope that our findings can inform universities and policy makers seeking to improve the quality of teacher education programs.


INTRODUCTION
The quality of mathematics teaching offered in Costa Rican secondary schools has been questioned in recent years (Programa Estado de la Nación [PEN], 2019). One of the reasons for this is the poor performance demonstrated by students in national and international tests. However, the latest reports (PEN, 2019; Román & Lentini, 2018) have pointed out major weaknesses in Costa Rican teacher policies ranging from teacher training, particularly the lack of qualification frameworks and quality standards, to their hiring and performance assessment. In Costa Rica, there are eight public and private universities that offer programs for becoming a mathematics teacher, which vary in content, duration, and quality. Nevertheless, as stated by Román and Lentini (2018), "there is no reference framework that guides teacher training programs around minimum skills and common goals" (p. 22), and there is little control over the implications of the differences in the quality of teacher preparation.
Most studies conducted in Costa Rica with mathematics teachers have focused on in-service teachers. A very representative one is the diagnosis conducted by the Ministry of Public Education (MEP) in 2010 with the participation of 1,733 in-service mathematics teachers, which revealed worrying results (MEP, 2011). In this study, in-service mathematics teachers from public institutions solved multiple-choice items about topics in the secondary school curriculum; the items were part of the national test for students in the last year of high school. The results revealed that 43.4% (N=1,733) of the participants performed below average, evidencing heterogeneous results and differences in mathematical knowledge. Moreover, the mathematics teachers who graduated from public universities obtained better scores than those from private institutions, suggesting differences in the teacher education programs (TEPs). Other studies (e.g., Alfaro et al., 2013; Chaves, 2003) have revealed deficiencies in TEPs, such as the lack of connections between mathematics and education courses. Thus, it is necessary to investigate the contents, quality of training, and effectiveness of the different mathematics TEPs in Costa Rica, which are reflected in the knowledge for teaching mathematics that preservice teachers have at the end of their university training. This need is consistent with the research interest in the mathematics education community regarding the content and acquisition of the knowledge a teacher must have to teach mathematics (Carrillo, 2011).
Many studies have investigated the professional knowledge or competencies necessary for teaching mathematics (Blömeke & Delaney, 2012; Kaiser et al., 2017). Studies have focused on identifying and distinguishing categories of mathematics knowledge with the intention of finding ways to develop it (e.g., Carrillo et al., 2018) or to measure it (e.g., Tatto et al., 2008). Some studies for measuring mathematics teachers' professional knowledge include the German Cognitive Activation in the Classroom (COACTIV) project (Kunter et al., 2013), the Learning Mathematics for Teaching Project (LMT) at the University of Michigan (Hill et al., 2008), and the international comparative study of the International Association for the Evaluation of Educational Achievement (IEA), the Teacher Education and Development Study in Mathematics (TEDS-M). Each study has its own theoretical framework, but all of them were built on the categories defined by Shulman (1986) in his seminal work on teacher content knowledge (Kaarstein, 2014). The three studies mentioned above also differ in context, type of participants (COACTIV and LMT with in-service teachers and TEDS-M with preservice teachers), expert members, and project goals. However, all of them have offered insights into the categories of content knowledge needed for teaching mathematics. As Kaarstein (2014) stated in her comparison study, all three frameworks coincide in including "(i) knowledge about content and student; (ii) knowledge about content and teaching/instruction; (iii) knowledge about planning for teaching the content; and (…) (iv) curricular knowledge" (p. 40). The frameworks and results from these studies provide a good starting point for investigating mathematics teachers' knowledge.
In this study, we used the TEDS-M instrument, since it assesses the preparation of secondary school mathematics teachers in training, as well as the level and depth of the teaching knowledge related to mathematics and the mathematical knowledge that they achieve at the end of their TEP (Tatto et al., 2008). We aimed to describe future Costa Rican secondary school teachers' knowledge for teaching mathematics, fill the knowledge gap on this subject, and determine the strengths and weaknesses of the existing TEPs. This information may be useful for policy makers to modify their teacher policies and for universities to update their TEPs, with the ultimate goal of improving the quality of mathematics education in the country.

MATHEMATICS TEACHERS' PROFESSIONAL KNOWLEDGE AND COMPETENCES
The study of the knowledge and competencies necessary for teaching mathematics, as well as the means and tools to develop them, is very complex; thus, there is no single way to define or operationalize them. As Hoover et al. (2016) stated, there is no "theoretically grounded, well defined, and shared conception" (p. 3) of mathematical knowledge for teaching. However, many frameworks developed on this issue take into consideration the categories of content knowledge for teaching defined by Shulman (1986). According to Shulman (1986), content knowledge for teaching is one of the domains into which teacher knowledge can be divided; others include, for instance, knowledge of "individual differences among students, of generic methods of classroom organization and management, of the history and philosophy of education, and of school finance and administration" (p. 10). Shulman (1986) proposed dividing content knowledge for teaching into three categories: content knowledge, pedagogical content knowledge, and curricular knowledge. Frameworks such as the Mathematics Knowledge for Teaching developed by Ball et al. (2008), the Professional Knowledge of Secondary School Mathematics Teachers, the knowledge for teaching mathematics (Tatto et al., 2008), and, more recently, the Mathematics Teacher's Specialised Knowledge model by Carrillo et al. (2018) are grounded in Shulman's (1986) categories.
The knowledge for teaching mathematics (Tatto et al., 2008) is the framework of the TEDS-M. The study aims to survey the effectiveness of TEPs around the world, and to that end the expert team developed a framework inspired by the teacher education standards of the participating countries and previous work (e.g., Ball et al., 2008; Schmidt et al., 2007). On the basis of Shulman's (1986) teaching knowledge categories, the TEDS-M study includes the facets of mathematical content knowledge (MCK), mathematical pedagogical content knowledge (MPCK), and curricular knowledge, which are divided into different subdomains.
The MCK has the content subdomains of algebra, numbers, data, and geometry and the cognitive subdomains of knowing, applying, and reasoning (Figure 1), which were defined following the Trends in International Mathematics and Science Study (TIMSS) conceptualization (Tatto et al., 2008). The knowing subdomain includes lower-order thinking skills (Thompson, 2008), such as recalling definitions and properties, carrying out algorithmic procedures, recognizing mathematical objects, and classifying them according to their properties. In the applying subdomain, participants are expected to apply the knowledge from the knowing subdomain (Hsieh et al., 2014), which includes the skills to select appropriate solution strategies or methods for routine problems and to use different representations of mathematical objects depending on the context. The most demanding tasks fall under the reasoning subdomain, which requires the participants to analyze situations and provide justifications for given statements or their own work; making generalizations and solving nonroutine problems are also part of this subdomain (Tatto et al., 2008).
The framework of the MPCK was built under the assumption that teacher competencies are linked to classroom situations; therefore, to decide which features of teaching mathematics were fundamental, the TEDS-M team designed the MPCK problems based on the standards of national teacher training programs (Blömeke & Delaney, 2012). According to Kaiser et al. (2017), reaching a consensus about the MPCK that accommodates contextual differences was a big challenge, and therefore there are fewer items on this facet than on MCK. In the TEDS-M framework, the MPCK and the curricular knowledge facets were merged, and three subdomains were defined (Figure 1). In the first subdomain, the framework includes the elements of mathematical curricular knowledge, such as setting appropriate learning goals, knowledge of different assessment formats, and comprehensive knowledge of the curriculum. This subdomain was combined with planning knowledge for the teaching and learning of mathematics, a subdomain related to the "preactive" actions that teachers must perform before teaching, including planning classroom activities, predicting the typical responses and misconceptions of students, and linking teaching methods and instructional design. The third subdomain, enacting mathematics for teaching and learning, encompasses the interactive elements of the teacher's role. For example, it includes the teachers' analysis and evaluation of students' solutions to mathematical exercises, the process of explaining mathematical concepts and procedures, their ability to generate fruitful questions, and the provision of feedback (Tatto et al., 2008).
Similar to other frameworks (e.g., Ball et al., 2008; Carrillo et al., 2018), the TEDS-M framework attempts to separate the MCK and MPCK categories. However, these categories are not mutually exclusive (Döhrmann et al., 2012; Kaarstein, 2014), and MPCK generally requires MCK (Potari & da Ponte, 2017). Kilpatrick et al. (2015) consider that mathematics teaching cannot be conceived as merely knowing the mathematics and knowing how to teach; instead, it is a more complex process. Considering this, they developed the Mathematical Understanding for Secondary Teaching (MUST) framework.
The MUST framework was developed from the analysis of classroom episodes of prospective and practicing secondary teachers as well as teacher educators at the tertiary level. Kilpatrick et al. (2015) acknowledge that the mathematical understanding required for teaching mathematics in secondary schools is different from that required in other professions. Notably, this framework conceptualizes mathematical understanding instead of mathematical knowledge, making it more flexible, given that "knowledge may be seen as static and something that cannot be directly observed, whereas understanding can be viewed as the use of the knowledge one has (…) Also, because of its nature, a teacher's understanding grows and deepens on the course of his or her career" (Kilpatrick et al., 2015, p. 10).
The MUST framework presents elements of mathematical understanding useful for secondary teachers from three perspectives: mathematical proficiency, mathematical activity, and mathematical context of teaching (Figure 2). These three perspectives allow different aspects and characteristics of a specific phenomenon to be observed, which means that they are interactive. According to Kilpatrick et al. (2015), the mathematical understanding of secondary teachers can be characterized by understanding of the overall mathematical capacities relevant for teaching, competencies to enact the actions typical of the teaching job, and the settings in which teachers will use their mathematical capacities and practice those actions.
The mathematical proficiency perspective is multifaceted and includes aspects of the mathematical knowledge and skills required by mathematics teachers, such as those defined for students' mathematical proficiency in earlier studies (Kilpatrick et al., 2001). However, this perspective also reflects the in-depth knowledge that teachers must have not only of high school mathematics but also of the mathematics learned before and after it; to foster students' mathematical proficiency, teachers need a comprehensive understanding of all of it. Table 1 lists some skills corresponding to the aspects of the mathematical proficiency perspective relevant for this study.
The aspect of productive disposition refers to noticing the importance of mathematical activity and being able to recognize and use it in situations outside the classroom. On the other hand, having historical and cultural knowledge of mathematics (Kilpatrick et al., 2015) improves teachers' conceptual understanding in such a way that they can be aware of the epistemology of mathematical ideas and design the best way to teach them. Kilpatrick et al. (2015) conceive the perspective of mathematical activity as the actions that are performed with mathematical objects: noticing, reasoning, and creating. Within these three actions, they identify specific elements. Mathematical noticing involves recognizing the structures of mathematical systems, symbols, and arguments, as well as noticing connections within and outside of mathematics. Mathematical reasoning includes the skills of justifying and proving, conjecturing, generalizing, restricting, and extending; the teacher should always keep the teaching activity in mind and design classes using these actions so that students gain a better understanding. Finally, the mathematical creating aspect includes the skills to represent, define, and manipulate mathematical objects in the most appropriate way according to the learning situation (Kilpatrick et al., 2015). As future teachers were not asked to demonstrate these skills in the TEDS-M test, they were not analyzed in the solutions.
The third perspective, the mathematical context of teaching, is about bringing to action the knowledge of the mathematical proficiency perspective and the skills of the mathematical activity into the classroom, focused on helping the students develop their mathematical understanding. The aspects of this perspective are exploring mathematical ideas, accessing and understanding the students' mathematical thinking using appropriate questions or analyzing their discourse, knowing and using the curriculum to plan the classes, assessing the mathematical knowledge of learners, determining their level of understanding, and reflecting on the mathematics in one's practice (Kilpatrick et al., 2015). All these aspects are directly related to the teaching activity in the classroom; thus, in the questionnaire responses, we could identify only one of them.
Using the knowledge for teaching mathematics and the MUST frameworks, we analyzed future Costa Rican teachers' responses to the TEDS-M questionnaire. The first framework provided a general overview of the participants' cognitive and teaching-related skills from the MCK and MPCK perspectives. On the other hand, the MUST framework allowed us to gain insights into the mathematical understandings and professional competences evidenced by the participants, providing the opportunity to identify the weaknesses and strengths in their TEPs.

Research Questions
This study evaluated the knowledge for teaching mathematics of future Costa Rican secondary teachers, considering their cognitive and teaching-related skills as well as their mathematical understanding for teaching. To do this, we used the MUST framework in addition to the TEDS-M theoretical framework to provide a more detailed and in-depth analysis. With this work, we hoped to answer the following questions:

1. What is the performance shown by Costa Rican preservice mathematics teachers in the TEDS-M questionnaire?
   a. What is their performance in the knowing, applying, and reasoning subdomains?
   b. What is their performance in the enacting and curriculum and planning skills?
2. How are mathematical understanding competences shown in Costa Rican preservice mathematics teachers' responses to the TEDS-M questionnaire?

METHODOLOGY
The present study includes qualitative and quantitative research methods; thus, it follows the characteristics of mixed-methods research. In this research method, the qualitative strand compensates for the weaknesses of the quantitative part and vice versa, leading to the "best data explanation and best understanding for the studied research phenomena" (Maarouf, 2019, p. 3). Considering that, we chose mixed-methods research because it allows us to answer the research questions and approach the research problem from a more complete point of view.

Research Context
Secondary mathematics teachers in Costa Rica are trained in programs that include content in mathematics, pedagogy, and mathematics pedagogy. Depending on the university, a teacher education program leading to a bachelor's degree can take from two and a half years in private institutions to four years in public institutions. Thus, the programs vary between universities in duration, but also in the contents covered and the quality of instruction (Alfaro et al., 2013). Differences in TEPs are not monitored during or after teacher training. The main hiring entity, the Costa Rican Ministry of Education, does not conduct interviews or evaluations of content knowledge or pedagogical skills for future teachers (Román & Lentini, 2018). In this context, it is challenging to have a picture of teachers' knowledge and teaching skills when they finish the programs and go to the classrooms to teach.

Table 1. Aspects of mathematical proficiency considered in this study (original source: Kilpatrick et al., 2015)
Conceptual understanding (Knowing why): understand and use mathematical concepts in various contexts; monitor one's own and students' work; understand, identify, and use connections in mathematics; formulate proofs; remember and reconstruct methods.
Procedural fluency (Knowing how and when): quickly recall and accurately execute procedures and algorithms.
Strategic competence (Knowing heuristics): select strategies for solving problems; have a flexible approach; generate, evaluate, and implement problem-solving strategies; know various solution strategies.
Adaptive reasoning: provide valid explanations and justifications.

Participants and Data Collection
The sample of this study comprises 79 participants who were preservice mathematics teachers enrolled in TEPs in Costa Rica. We considered only preservice teachers in the last year of their bachelor's or licentiate degree. Most participants were men (44 of 79), and the average age was about 24 years. The preservice teachers were informed that participation in the study was voluntary, that their performance on the questionnaire would not affect their grades, and that the data would be treated confidentially. For selecting the sample, the eight Costa Rican universities, public and private, with active mathematics TEPs were invited to participate. However, our sample was reduced to the preservice teachers of only four TEPs, due to lack of interest in participating or logistical constraints. Two of the TEPs belong to the same university; nevertheless, as they are on different campuses, we considered them to be two universities, because, although the study program was the same, the teacher educators, learning opportunities, and number of students were different. Therefore, we counted four universities, coded as A, B, C, and D. The questionnaire was administered to seven groups, and the distribution of the participants by university can be seen in Table 2. The researcher's role in data collection was that of instrument administrator. The instrument was completed with paper and pencil, and the participants had three hours to complete the four parts. In this article, we only discuss Part 3. The data were coded according to the TEDS-M guidelines (Brese & Tatto, 2012).

Instrument
The questionnaire used for data collection presents 13 exercises that are subdivided into 31 items. The items correspond to those released by the TEDS-M study (Brese & Tatto, 2012), except item MFC703, which was excluded due to problems with the intelligibility of the task. The items are classified into the domains of MCK and MPCK and into the subdomains of content (algebra, numbers, geometry, and data), cognitive skills (knowing, applying, and reasoning), and teaching-related skills (enacting, and curriculum and planning). The items had three formats: constructed response (seven items), multiple choice (two items), and complex multiple choice (22 items) (Table 3).
The items were translated from English to Spanish by the author, and both the translation and the contextualization were validated by three Costa Rican mathematics educators not related to the study. IEA granted the respective permissions to use the items in this study, which had already been validated by international experts. We did not intend to replicate the TEDS-M study; there were many differences between that study and ours, for instance, the time allotted for solving the test, that must be considered before comparing our results with TEDS-M data.

Data Analysis
We performed quantitative and qualitative analyses of the preservice teachers' responses to the TEDS-M items. For the quantitative analysis, responses were coded using the TEDS-M keys and scoring guide (Brese & Tatto, 2012), and statistical tests were performed. In addition, using the participants' overall score on all the items, that is, the MCK and MPCK items together, we divided them into quartiles: Q1 contains the participants with scores below 58.8; those with scores from 58.8 up to, but not including, 73.53 were in Q2; Q3 comprises the preservice teachers with scores from 73.53, inclusive, up to 79.41; and Q4 contains all the participants with scores equal to or above 79.41. The qualitative analysis used directed content analysis (Hsieh & Shannon, 2005), that is, an analysis guided by the theory of the MUST framework as a lens to study the participants' solutions. For the content analysis, we read the solutions to the 13 exercises in the 79 questionnaires and identified solution strategies, content knowledge and procedural difficulties, as well as other characteristics of importance, such as the additional drawings and annotations made in tasks that did not require a constructed response. Next, following the definitions and skills associated with each perspective of the MUST framework, the annotations were matched to the relevant aspects. Thereafter, within each aspect, the annotations were analyzed and merged according to their characteristics. Consequently, four aspects of the mathematical proficiency perspective and one of the mathematical context perspective were evident in the solutions.
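The quartile assignment described above can be sketched as a small helper. The cut-offs (58.8, 73.53, 79.41) are the ones reported in this study, while the example scores are hypothetical:

```python
def assign_quartile(score):
    """Assign a participant to a quartile using the cut-offs
    reported in this study: Q1 < 58.8 <= Q2 < 73.53 <= Q3 < 79.41 <= Q4."""
    if score < 58.8:
        return "Q1"
    elif score < 73.53:
        return "Q2"
    elif score < 79.41:
        return "Q3"
    return "Q4"

# Hypothetical overall scores (percentage of points on the MCK and MPCK items)
scores = [45.0, 58.8, 70.6, 73.53, 76.5, 79.41, 91.2]
print([assign_quartile(s) for s in scores])
# → ['Q1', 'Q2', 'Q2', 'Q3', 'Q3', 'Q4', 'Q4']
```

Note that the lower boundary of each quartile is inclusive, matching the description in the text.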

RESULTS
First, we determined the quantitative results for the cognitive subdomains and the teaching-related skills using descriptive statistics and statistical tests, with which we intend to answer research questions 1a and 1b. For the second research question, we report the results of the qualitative analysis of the participants' solutions according to the categories of the MUST framework. Given the structure of the questionnaire, some aspects of mathematical understanding for secondary teaching could not be observed, such as those related to the mathematical activity perspective.

Preservice Teachers' Performance on Cognitive and Teaching-Related Skills Subdomains
As detailed in Table 3, there are 22 MCK items in the questionnaire across the reasoning (n=6), applying (n=10), and knowing (n=6) categories, according to the TEDS-M categorization. Of the nine MPCK teaching-related skills items, five correspond to the enacting category and four to the curriculum and planning one. Considering the average performance on the questionnaire, including the MCK and MPCK parts, we divided the participants into quartiles, as explained before, to observe the patterns by subdomain, with the aim of gaining insights into the Costa Rican preservice teachers' strengths and weaknesses in the cognitive areas. In addition, we tested for significant differences in performance by university.
Overall, the participants performed better on the applying items, followed by the knowing ones (Figure 3). The cognitive domain with the lowest rates was reasoning. Considering the competences assessed by each cognitive domain, as defined in the theoretical framework, it makes sense that the reasoning tasks were more demanding than the applying ones, especially because they required dealing with nonroutine problems and writing proofs. However, by the same logic, the knowing items should have been easier to solve than the applying ones, but this was not the case for our participants.
The general pattern of performance is repeated in all quartiles except the fourth, where the participants performed slightly better in the reasoning subdomain than in the applying subdomain, which suggests that they acquired skills at the higher cognitive levels. On the other hand, the difference between the reasoning subdomain and the other two in quartiles one and two is large, which suggests little engagement of these groups with the more demanding tasks.
The Wilcoxon signed-rank test revealed that the results on the applying items were significantly higher than on the reasoning ones (Z=-3.45, p<0.05). Similarly, the outcomes on the knowing items were significantly higher than on the reasoning ones (Z=-2.4, p<0.05). The difference between applying and knowing was not statistically significant. Thus, performance in the reasoning subdomain was significantly lower than in the other two.
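As an illustration only, a paired comparison of this kind can be run with SciPy's `wilcoxon`; the per-participant scores below are simulated (the study's raw data are not reproduced here), with reasoning constructed to be systematically lower than applying:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
# Hypothetical paired subdomain scores for 79 participants
applying = rng.normal(70, 10, 79)
reasoning = applying - rng.normal(8, 5, 79)  # simulated systematic deficit

# Paired, nonparametric comparison of the two subdomains
stat, p = wilcoxon(applying, reasoning)
print(f"W={stat:.1f}, p={p:.2e}")
```

With a consistent deficit of this size across 79 pairs, the test rejects the null hypothesis of equal distributions, mirroring the direction of the result reported above.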
Regarding the MPCK subdomains, the participants performed better in the curriculum and planning area. This pattern, as observed in Figure 4, is the same in all the quartiles except Q3, where the enacting subdomain is slightly higher. In Q1, the results show a difference of 17 points between the variables, a greater gap than in the remaining quartiles. However, the Wilcoxon signed-rank test revealed that the differences between the enacting and the curriculum and planning areas were not statistically significant. Notably, the MPCK competences are difficult to assess in a paper-and-pencil questionnaire, because that knowledge is closely tied to classroom situations. According to Döhrmann et al. (2012), the TEDS-M study had a stronger focus on reporting about the MCK facets than about the MPCK ones; therefore, there are fewer items measuring the latter. Nevertheless, from an overall perspective, the participants performed better in the curriculum and planning items, which assessed whether they could identify the prior knowledge required to teach a certain topic, than in those where they were asked to think about secondary students' possible difficulties, assess solutions, or understand the reasons for mistakes.
To identify whether the pattern was the same across all TEPs, we tested whether there were statistically significant differences in subdomain performance between universities. A Kruskal-Wallis H test revealed that the distributions of the applying and the curriculum and planning subdomains did not differ significantly between universities. Nevertheless, the same test revealed that the distributions of results in the knowing subdomain (χ²(3, N=79)=9.093, p≤0.05), the reasoning subdomain (χ²(3, N=79)=17.242, p≤0.001), and the enacting subdomain (χ²(3, N=79)=9.821, p≤0.05) differed significantly among universities. Figure 5 shows the average performance in the cognitive and teaching-related skills subdomains by university.
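A minimal sketch of this between-program comparison with SciPy's `kruskal`; the four score samples and their group sizes are hypothetical stand-ins for the programs A-D, not the study's data:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)
# Hypothetical subdomain scores for four TEPs (illustrative sizes and means)
univ_a = rng.normal(75, 8, 20)
univ_b = rng.normal(64, 8, 25)
univ_c = rng.normal(60, 8, 14)
univ_d = rng.normal(68, 8, 20)

# Kruskal-Wallis H test: do the four samples come from the same distribution?
h, p = kruskal(univ_a, univ_b, univ_c, univ_d)
print(f"H={h:.3f}, p={p:.4f}")
```

A small p-value, as in the knowing, reasoning, and enacting results above, indicates that at least one program's score distribution differs from the others; the test itself does not say which one, so a post hoc pairwise comparison would be needed for that.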
Regarding the cognitive subdomains, the average performance by university follows the same pattern as the performance by quartiles (Figure 5), with applying as the best category, followed by knowing and reasoning. Only Univ. A presented a different pattern, with the best results in knowing, followed by reasoning and then applying, and it held the best performance of all universities in the first two subdomains. The Wilcoxon signed-rank test revealed that the only significant differences between subdomains within the same university were between the applying and reasoning subdomains at universities B (Z=-2.1, p<0.05) and D (Z=-3.27, p<0.001). From these results one can infer that the mathematics TEPs in Costa Rica focus on applying competencies and need to work more on reaching higher cognitive levels such as reasoning.
Considering the teaching-related skills subdomains, the pattern is also the same as in the analysis by quartiles, with better results in the curriculum and planning subdomain than in the enacting one, except for Univ. C. The Wilcoxon signed-rank test showed that the participants from Univ. C performed significantly better in the enacting subdomain than in the curriculum and planning one (Z=-2.02, p<0.05). The same test revealed that, in the case of Univ. B, the curriculum and planning performance was significantly better than the enacting one (Z=-2.384, p<0.05).
The results in this section have revealed that the Costa Rican preservice teachers are good at solving items in the applying cognitive subdomain and that they are better at curriculum and planning tasks than at enacting ones. At the same time, the analysis has indicated that the participants have a weakness in the reasoning items. Moreover, the outcomes indicate that there are significant differences in performance between TEPs, specifically in the reasoning, knowing, and enacting subdomains. These can be interpreted as differences in the quality of the programs offered in Costa Rica.
The following section will allow us to perform a more detailed analysis of the respondents' understandings through the qualitative analysis of their answers using the MUST framework.

Analysis of Preservice Teachers' Solutions Using MUST Framework
As described in the section "mathematics teachers' professional knowledge and competences," the mathematical understanding for secondary teachers (MUST) framework has three perspectives. We will analyze the preservice teachers' responses, trying to identify the aspects of each perspective and describing how those aspects are present. First, we will comment on the mathematical understanding perspective, with the aspects conceptual understanding, procedural fluency, adaptive reasoning, and strategic competence. Then we will discuss one aspect of the mathematical context for teaching perspective, namely assessing the mathematical knowledge of learners.

Conceptual understanding
The aspect of conceptual understanding is the most evident in the participants' solutions; four skills were tracked in this category and are discussed in the following paragraphs.

Understand and use mathematical concepts in various contexts:
The skill of understanding and using mathematical concepts in various contexts in a proper way was observed in different situations. First, in the use of mathematical properties and definitions. In Exercise 704, a geometry task that requires determining the lengths of the segments of a parallelogram, the participants used the parallelogram properties to investigate the lengths of the rhomboid's segments and to measure the angles. Similarly, in Exercise 610, they needed to refer to the definition of an irrational number to decide whether given numbers, among them √2 and 22 divided by 7, belong to that set of numbers. Here the preservice teachers demonstrated that they were clear about the other numbers; however, deciding whether "the result of dividing 22 by 7" was always irrational was not easy for them. When reviewing their annotations in the questionnaire, we observed that they did the division but did not notice that there were repeating decimals, nor did they remember that, by definition, all the numbers p/q (p, q ∈ ℤ, q ≠ 0) are rational. Another example of this case was evident when the participants used the "zero-absorption property" as part of the proof of Exercise 814. The exercise consists of deciding whether it is true that if the result of operating two 4×4 matrices, using an operation that multiplies input by input, is zero, then at least one of the matrices must necessarily be the null matrix. Thus, the idea of using the zero-absorption property was correct, but it should have been applied to the multiplication of the inputs and not to the matrix operation, as many participants did.
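The rationality of 22 divided by 7, which the participants struggled with, follows in one line from the definition they overlooked:

```latex
\frac{22}{7}\in\mathbb{Q}\quad\text{since}\quad
\mathbb{Q}=\left\{\tfrac{p}{q} : p,q\in\mathbb{Z},\ q\neq 0\right\},
\qquad\text{and indeed}\quad \frac{22}{7}=3.\overline{142857}.
```

No division is needed: the number is already written as a quotient of integers with a nonzero denominator, which is exactly the definition of a rational number.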
Another way to demonstrate this skill was showing full understanding of a situation by explaining it in their own words or by using general representations when doing operations. For the former case, in Exercise 804, one participant explained his choice for how many possible ways there are to choose 2 and 8 students out of 10 by writing "it is the same to choose the 2 that stay or the 2 that leave" (P78), showing a deep understanding of the situation. For the latter, Exercise 711 included a proof about adding functions. Here, the participants needed to use the explicit form of a linear function to make computations, even though the proof could be performed without them, because the property must hold for all linear functions with the given characteristics. In this sense, the participants exhibited poor understanding of the situation and concepts in the task, as well as poor abstraction skills. The last aspect observed in this skill is the understanding shown when assessing several solution attempts for the same task, as in Exercises 802 and 709. In item 802, the participants exhibited their understanding of the concepts when rewriting the statement "If the square of any natural number is divided by 3, then the remainder is only 0 or 1" in a symbolic language easier to compute with. Item 709 (Figure 6; Brese & Tatto, 2012) tests their understanding by asking them to point out the specific reason why one of the given solution attempts did not work. For instance, P12 pointed out that Leon's option "showed that it is true for the first n numbers; it means, conjecture," thus concluding that the attempt is invalid. For the same case, P25 wrote "there is no test for all; it is missing," justifying why the attempt was incorrect. To reach these conclusions, the participants must have a clear understanding of proofs, where the condition has to be true "for all" cases.
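P78's observation in Exercise 804 is an instance of the symmetry of binomial coefficients: choosing the 2 students who leave determines the 8 who stay, so both counts coincide:

```latex
\binom{10}{2}=\binom{10}{10-2}=\binom{10}{8}=\frac{10!}{2!\,8!}=45.
```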

Monitor one's own and students' work:
This skill has two directions and requires the preservice teacher to review the correctness of their procedures, choose the most convenient wording, and check that the results make sense in context. Similarly, they must be able to monitor students' solutions. From the analysis of participants' solutions, it was possible to observe that they were strict on the items that required reviewing students' possible solutions to the exercises, identifying and pointing out why the procedures were not valid, as we already mentioned P12 and P25 did. Nevertheless, when it comes to monitoring their own work, there is evidence of various flaws. For example, one participant gave values to the interior angles of a parallelogram that did not add up to 360° (P15). It is also evident in Exercise 604, a word problem solved by a system of equations, where participants obtained values for the variables that exceed the total indicated in the statement but did not notice it (P12 and P68).
There was also an issue with the words they chose to refer to mathematical objects. In Exercise 806, where they had to describe why students made a mistake inferring information from a histogram, P75 wrote that the students were thinking that "each graphic represented a country," when the correct statement was "each bar represented a country." On the other hand, P63 and P69 attributed the error to the students answering that there were seven countries represented because that was the number of countries in Central America. This explanation is erroneous for two reasons: first, this is not the number of countries in Central America, and second, the title of the graph indicated that the sample was from Central America and South America. Some preservice teachers are not paying attention to the way they carry out their procedures or express themselves about mathematical objects, or they give answers without checking their justifications. Considering that students learn from what teachers do and how they do it, these failures in teachers' monitoring of their own performance are not favorable for student learning.

Understand, identify, and use connections in mathematics:
Understanding the way mathematics is built, as well as understanding, verbalizing, identifying, and using connections between concepts and variables, are crucial skills for teachers. Concerning this skill, we observed that participants were capable of posing and using connections in many cases, as in Exercise 704 (Figure 7), where many participants (i.e., P17, P19, P36, and P48) connected the results obtained from different formulas (perimeter, Pythagoras, trigonometric identities) to find the solution to the problem. Nevertheless, in other cases, even though they could pose many different equations from a statement, they became confused and were not able to connect them to reach the goal (i.e., P58, P72, P77, and P80). It seems that they were just writing equations without a clear purpose for them.
Connections are also made when linking the same mathematical object or situation across different languages. Here there were examples of associations between functions and the situations they model, as in Exercise 710, where the participants concluded from the statement "the height h of a ball t seconds after it is thrown into the air" that the situation can be modeled by a quadratic function and not by an exponential one. Another example of this type of connection is in Exercise 610, where they had to evaluate whether "the diagonal of a square with side of length 1" and "the result of dividing the circumference of a circle by its diameter" are always irrational numbers. The participants made connections between the definition of an irrational number and the geometric formulas that allowed them to identify the number each situation represented. In this case, many participants (i.e., P3, P12, P22, P39, and P47) made annotations of the formula for the circumference of the circle, drawings of a square with its diagonal, and the property of the 45-45-90 special triangle.
On the other hand, the participants exhibited flaws in making connections between natural or word language and symbolic language. In this sense, we observed that some preservice teachers interpreted the statement "Peter has 6 times as many marbles as David" in Exercise 604 A1 as P=6+D instead of P=6×D (i.e., P13, P49, and P70). This type of mistake is common in secondary students; therefore, teachers are expected not only to avoid such mistakes themselves but also to implement strategies so that their students understand better and do not make them.

Formulate proofs:
The skill of formulating proofs requires a thorough understanding of the mathematical knowledge, the connections between results, and the logic behind the proof. The questionnaire had two exercises that required the participants to formulate proofs and two where they had to decide whether some proof attempts were valid. In addition, some participants (P36, P49, and P64) used the structure of a proof to organize their thoughts in Exercise 704 on the side lengths of the parallelogram. In Exercise 711, they had to prove that if two linear functions intersect at a point P on the x-axis, the graph of their sum also passes through point P. The solutions to this exercise showed that the participants had difficulty using the hypothesis in a useful and efficient manner. Most participants who failed to provide a valid proof used the facts f(p)=0 and g(p)=0 to set the functions equal (f(p)=g(p)), but the efficient strategy was to add them, since the aim was to know whether the point belonged to the sum function (f(p)+g(p)=0). This caused their proofs to be long and difficult to follow; in some cases, the proofs did not even reach the goal (Figure 8). In the same exercise, three participants (P13, P21, and P63) tried to reason by contradiction using specific functions, without noticing that their examples did not meet the conditions given in the statement; thus, the proof attempt was invalid.
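For Exercise 711, the efficient argument the participants missed can be written in a few lines, with no need for the explicit form of the functions:

```latex
\text{Let } f,g \text{ be linear functions with } f(p)=0 \text{ and } g(p)=0.
\text{ Then}\\
(f+g)(p)=f(p)+g(p)=0+0=0,\\
\text{so the graph of } f+g \text{ also passes through } P=(p,0).
```

The hypothesis is used exactly once, by substitution, which is why equating f(p)=g(p) leads nowhere: it discards the information that both values are zero.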
Another mistake regarding proof strategies was noticed in Exercise 814, when proving that the only way for the result of operating two 4×4 matrices, with an operation that multiplies input by input, to be zero is that one of the matrices is the zero matrix. The statement is false and requires a counterexample to disprove it, as P79 did (Figure 9A). However, as presented in Figure 9B, the reasoning of preservice teachers P7, P16, P26, P29, P49, and P71 was that since the statement held when using a zero matrix, it must be true. Thus, they did not consider all the cases and overgeneralized. These examples highlight that some participants are not proficient in formulating proofs, even without considering mathematical rigor but only the way the participants reasoned.
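A counterexample of the kind P79 gave can be checked mechanically. The sketch below (the matrices are chosen here for illustration and are not taken from P79's answer) exhibits two nonzero 4×4 matrices whose input-by-input product is the zero matrix:

```python
N = 4

# A is nonzero only in the top-left entry; B only in the bottom-right,
# so every position has a zero factor in at least one of the two matrices.
A = [[1 if (i, j) == (0, 0) else 0 for j in range(N)] for i in range(N)]
B = [[1 if (i, j) == (N - 1, N - 1) else 0 for j in range(N)] for i in range(N)]

# Input-by-input ("entrywise") product: C[i][j] = A[i][j] * B[i][j]
C = [[A[i][j] * B[i][j] for j in range(N)] for i in range(N)]

is_zero = all(c == 0 for row in C for c in row)
a_nonzero = any(a != 0 for row in A for a in row)
b_nonzero = any(b != 0 for row in B for b in row)
print(is_zero and a_nonzero and b_nonzero)
```

Since neither factor is the null matrix yet the product is, a single example of this shape refutes the statement, whereas checking that the claim holds for a zero matrix, as P7 and others did, verifies only one case.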

Procedural fluency
Procedural fluency is the aspect most practiced by students in schools, and it is related to carrying out procedures correctly, flexibly, and efficiently. The facets of this skill analyzed in the future teachers' responses consist of quickly recalling and accurately executing procedures and algorithms. In different exercises, participants exhibited proficiency recalling formulas and relations, such as the perimeter and Pythagoras formulas, the 30-60-90 special triangle relation, and the law of sines, as well as posing a system of equations using relations given in a word problem. Proficiency requires not only recalling the formulas but also using them in the correct context and without calculation mistakes. However, some participants made computation errors very similar to those made by high school students. For example, P22 failed to sum the algebraic expression "x+x+2x+2x=68x=6", and P55 made a mistake expanding the special product of the square of a difference (Figure 10). As mentioned, these errors may relate to poor monitoring of their own work.
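For reference, the special product P55 mishandled has the expansion (we do not know the exact form of P55's error, only that the expansion was wrong):

```latex
(a-b)^2=a^2-2ab+b^2.
```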

Adaptive reasoning
This skill is about providing explanations and justifications for mathematical decisions, whether solving or evaluating the solution of a problem, and it is a very important part of mathematical thinking. The participants' solutions made it possible to observe this facet at two moments: when they took the role of teachers who had to explain students' reasoning and when they were the ones who had to justify their own procedures. The first situation occurred in two exercises. In Exercise 604B, the preservice teachers had to explain why one word problem is more difficult for students than another. Although 73.4% (n=79) of the respondents provided valid reasons, such as the use of fractions or the difficulty students have making calculations with rational numbers, some of their explanations were not easy to understand (see examples in Table 4).
Another example of the skill of evaluating students' solutions was observed when the preservice teachers made annotations such as "it is necessary to prove it for all [numbers]" (P25) when referring to a proof that only tested a few numbers. Considering that the participants will be teachers who have to provide feedback to students and parents and to understand and diagnose the difficulties their students might face in an exercise, the word choice and the depth of their explanations must improve.
On the other hand, when they had to explain their own work, there were cases in which they were not successful, resulting in proofs composed of unconnected equations, and others in which their understanding was so clear that they could write mostly in natural language, as the examples in Figure 11 show, even though the answer was not correct or complete.

Strategic competence
The strategic competence skill is related to heuristics and their implementation for solving problems. It includes the skills of selecting strategies for solving problems, having a flexible approach, generating, evaluating, and implementing problem-solving strategies, and knowing various solution approaches. These four skills were observed in the preservice teachers' work.

Table 4. Examples of participants' solutions to Exercise 604B
- because the students have to "use the same variable for two different characteristics" (P33)
- because "a variable x can be related at the same time to two other different situations" (P47)
- because it "involves 2 times the same variable" (P49)
- because it "refers to how much one has with respect to two people" (P79)

The ability to select strategies was shown, for example, in the parallelogram exercise, where the participants chose convenient, practical, and valid strategies to find the side length. Using the fact that they had a right triangle formed by one side of the parallelogram and the two bisector lines, they chose strategies such as the Pythagoras formula or the trigonometric ratios (see Figure 7). They also exhibited this skill when choosing the proof strategy of the counterexample in Exercise 814 (see Figure 11), instead of trying to test many cases. When solving problems in mathematics, it is important to have a flexible approach, which means being able to see the problem from different perspectives or to solve an easier problem first. In this sense, some participants demonstrated this skill by dividing the parallelogram figure or extracting the triangles from it as a strategy to observe properties and relationships more clearly, as highlighted in Figure 7. Similarly, they could solve the problem in different steps, noticing that different strategies could be used to find different values.
However, when implementing the solution strategies, they were not always able to connect all the parts they generated, resulting in long chains of equations without a clear direction. Figure 12 is an example of this situation, in which participants could not successfully connect their equations.

Assess the mathematical knowledge of learners
Understanding the way students think in mathematics allows the teacher to know how they are interpreting mathematical content and how they are using it in practice. Some evidence of this skill is revealed in the items that required preservice teachers to evaluate students' work or analyze the reasons for the mistakes made. For instance, when they had to explain why students wrongly interpreted a histogram about the frequency of the adult female literacy rate in Central and South American countries, the respondents provided different reasons. It is possible to distinguish between the participants who explained the students' responses by pointing out that they only counted the bars (29 of 79) and those who made explicit the fact that the student assumed that each bar represented a country (32 of 79). Among the responses, P80, P52, and P23 stand out for referring to the fact that the student in the example did not pay attention to the axes of the histogram (the y-axis was frequency and the x-axis adult female literacy rate). P71 went further, adding that the student probably thought "that each bar represented a country, and the highest rate would be the highest column." Another reason, mentioned by P63 and P69, is that the student could have thought about the number of countries in Central America, ignoring that the study also included South America. From these explanations, it is possible to observe that the future teachers tried not only to think about the most obvious possibilities but also to explain typical errors, such as not reading the axes or the title of the graph, and confusions more related to the context, as in the case of the Central American countries.
When item 604B asked why one word problem was more difficult for secondary students than the other, they mentioned points such as the data being "less explicit" (P77) or that "students generally have trouble interpreting the relations of proportionality, regarding the translation from natural language to mathematical" (P80). These explanations are closer to the typical errors students face when solving word problems than to the structure of the equation systems involved in the solution of the problem.

DISCUSSION
Given the lack of information on the knowledge for teaching mathematics and the professional competences of Costa Rican preservice teachers, we evaluated the cognitive and teaching-related skills covered in the TEDS-M questionnaire of 79 future mathematics teachers, as well as their mathematical understanding for teaching evidenced in the solutions to the items.
The quantitative analysis of the cognitive subdomains revealed that preservice teachers were better prepared to solve items in the applying domain than in the knowing and reasoning domains. However, their performance in the reasoning domain, the one with the highest cognitive demand, was unsatisfactory. The reasoning domain was also the one with the lowest performance in the international TEDS-M study (Hsieh et al., 2014). In the mathematics school curriculum, five crucial processes must be practiced in the classroom: reason and argue; pose and solve problems; communicate; connect; and represent (MEP, 2012). Accordingly, the teacher has to be proficient not only in applying but also in knowing and especially reasoning to guide the students to achieve mathematical proficiency. The fact that the preservice teachers in the highest quartile had a different pattern of performance, achieving a higher score in reasoning than in the other subdomains, and the big gap between them and the other three quartiles, make it evident that 75% of the respondents have been left behind in the development of their reasoning skills during their TEP.
Additionally, when observing the pattern of results in the cognitive subdomains by university, it is possible to infer that the TEPs of universities B, C, and D have an approach focused on applying. The descending pattern applying-knowing-reasoning is also followed by Chile and the United States according to the TEDS-M results (Hsieh et al., 2014), so it may be associated with cultural factors. On the other hand, the pattern of university A, descending from knowing to reasoning to applying, is not consistent with any of the patterns found by Hsieh et al. (2014). Nonetheless, the finding that the reasoning subdomain has the worst performance in the analysis by quartiles and by TEP should be a wake-up call for universities to urgently improve teacher preparation in reasoning skills. Notably, the reasoning, knowing, and enacting subdomains were significantly different across the TEPs. This result is consistent with the issue addressed by Román and Lentini (2018) about the heterogeneity of the TEPs and the lack of control of TEP quality (Alfaro et al., 2013). However, our results are an attempt to describe how the TEPs differ.
The qualitative analysis allows us to get closer to the participants' understandings and to have a clearer panorama of their strengths and weaknesses. Regarding the positive aspects, we can highlight the understanding and correct use of properties, definitions, and representations to solve and prove mathematical tasks. The participants could also make numerous connections between them and, sometimes, put them to good use. In addition, many participants exhibited good performance in writing proofs. All of these skills are of daily use in the mathematics classroom. Other strengths shown by Costa Rican preservice teachers include proficiency in recalling formulas, providing explanations of their work, and selecting valid strategies for solving problems.
More related to teaching skills, the respondents showed that they could monitor students' work by reviewing solution attempts, identifying the correct ones, and pointing out the mistakes in the incorrect ones, which demonstrates conceptual understanding and understanding of students' ways of thinking. On the same topic, they were also able to provide various explanations of the students' difficulties, including the most common ones on mathematical content and some more atypical ones related to beliefs or context. All these skills are expected from mathematics teachers, as mentioned in several frameworks (e.g., Ball et al., 2008; Carrillo et al., 2018). Therefore, it is a good sign that Costa Rican future teachers have them. However, we also found weaknesses worth considering before drawing conclusions.
The analysis of participants' solutions revealed errors, from very basic ones such as computing mistakes to more complex ones in the structure of proofs, for instance, misunderstanding the zero-absorption property and consequently using it incorrectly. Some mistakes are like those committed by students, such as not checking that the answer makes sense given the problem statement or incorrectly translating an algebraic expression from natural language to symbolic language. The complex response items evidenced the participants' problem of posing numerous equations without a clear idea of what they would be useful for, resulting in large chains of equation systems that sometimes did not lead to an answer. It is evident that the preservice teachers were not monitoring their work, an important step when solving problems (Polya, 1945), and also had problems with what Schoenfeld (1985) calls control in the problem-solving process, which has to do with a correct and efficient use of heuristics.
Other mistakes observed were related to reasoning by contradiction when the task did not allow it, or to overgeneralizing results after trying just one case, as observed in other studies with preservice teachers (Demir et al., 2018). These reasoning actions are considered in the MUST framework as part of the mathematical activity perspective (Kilpatrick et al., 2015). Teachers have to come up with conjectures and generalizations spontaneously while providing explanations in class or persuading students that a solution path is incorrect. Therefore, future teachers must be proficient in these skills, first in their role as solvers, in order to translate them into their teaching role. Last but not least is the participants' weak ability to provide feedback, even though they could identify the students' mistakes.
As mentioned above, they were not very good at explaining those mistakes. The word choice and the depth of the explanations were difficult to comprehend, and students and parents would have trouble gaining insights from them. The Costa Rican preservice teachers' gap regarding giving feedback was already pointed out in a previous article, where 56% of them stressed that they rarely or never had the opportunity to learn to do this type of assessment activity (Alfaro Víquez & Joutsenlahti, 2020). Many of the weaknesses noticed in the qualitative analysis are consistent with the preservice teachers' performance in the reasoning subdomain.
In conclusion, we found that the preservice mathematics teachers who participated in this study are proficient in the applying subdomain, which means they are good at solving routine problems, representing mathematical information, selecting solving strategies, and implementing instructions (Tatto et al., 2008). However, they fell short in the reasoning subdomain, which requires analyzing, generalizing, integrating, justifying, and solving nonroutine problems (Tatto et al., 2008). The respondents also exhibited weaknesses in teaching-related skills, such as monitoring their own work or providing clear and useful feedback, and an alarming number of basic mistakes in computations, in verifying answers, and in translating algebraic expressions from natural to symbolic language. Moreover, although they could pose numerous equations by using connections between contents, these equations were posed without a clear purpose for reaching the answer.
Our findings suggest that universities need to make adjustments in their TEPs to address the problems identified. Actions to remediate these issues are important to close the teachers' knowledge and skill gaps before they start teaching (PEN, 2019). Similarly, the results on the heterogeneity in the performance of the universities call for policy makers to establish quality standards for the training of mathematics teachers and to implement hiring policies that ensure the quality of the teachers who go into the classrooms. This study had some limitations. First, we only assessed the knowledge for teaching mathematics from a cognitive perspective and left situated aspects aside. To have a complete picture of the knowledge required to teach mathematics and the professional competencies of preservice teachers, it is necessary to complement this study with others where the participants can be observed in practice. As Kaiser et al. (2017) stated, there is a complex interaction between the knowledge-based and the situated competence facets, and to better understand teachers' professional knowledge, both facets need to be considered. Therefore, as teaching practice is crucial to becoming a true expert in teaching mathematics, future studies could be performed on that topic. Second, it would be important to discuss these results with teacher educators with the purpose of identifying causes and proposing ways to improve mathematics teacher preparation in Costa Rica. Another interesting investigation would be to apply the TEDS-M questionnaire to in-service teachers and observe whether teaching experience makes a difference in the results. Finally, it is essential to obtain information about future teachers prepared in private universities, since studies have indicated that those TEPs cover fewer mathematics topics and offer weaker pedagogical knowledge than public universities (Alfaro et al., 2013).
Funding: This research was funded by the University of Costa Rica, through the postgraduate study grant for the first author OAICE-CAB-160-2016.