International Electronic Journal of Mathematics Education (www.iejme.com)

REASONING ABOUT VARIATION: STUDENT VOICE

This paper reports one recent study, part of a project investigating tertiary students' understanding of variation. The students completed a questionnaire prior to, and at the end of, an introductory statistics course; this paper focuses on interviews of selected students, designed to determine whether more information could be gathered about the students' reasoning. Clarification during interviews reinforced the researchers' interpretation of responses. Prompting assisted students to develop better quality responses, but probing was mostly useful for assisting students to re-express reasoning already presented. Cognitive conflict situations proved challenging. The diversity of activities that students identified as assisting the development of their understanding presents a challenge for educators planning teaching sequences. Both educators and researchers need to listen to students to better understand the development of reasoning.


INTRODUCTION
Statistical reasoning, involving reasoning with statistical ideas and making sense of statistical information, depends heavily on the understanding of basic underlying concepts (Garfield, 2002), and one important underlying concept is variation (Wild & Pfannkuch, 1999). Since 1999 there has been increasing interest in research into reasoning generally, and into reasoning about variation in particular. Critical to this research have been detailed studies into the understanding of variation. For the purposes of this paper, variation is defined as the describing or measuring of a characteristic that is liable to change (Reading & Shaughnessy, 2004, p. 202). The research focus on reasoning about variation has necessitated more qualitative research into levels of reasoning (e.g., Watson, Kelly, Callingham & Shaughnessy, 2003). This often involves the coding of responses to open-ended questions, with related uncertainty of interpretation. Accompanying this trend has been increasing use of interviewing as a research methodology.

RESEARCH INTO REASONING ABOUT VARIATION
Recognition of the lack of acknowledgement of variation by students in responses to a national test item prompted Shaughnessy, Watson, Moritz and Reading (1999) to develop the "gumball task." This provided the stimulus for an ever-expanding platform of research over the last decade investigating students' reasoning about variation. Subsequent research developments are now explained in the light of key research papers and major statistics education research events that have allowed researchers to share their work. Six key stages are elaborated.
First, understanding of the concept of variation itself was the focus of research. The "gumball task" was taken forward by a number of researchers (e.g., Torok & Watson, 2000). Noticeably, interviews formed part of the research methodology, essential for probing deeper understanding. The 2001 Second International Research Forum into Statistical Reasoning, Thinking and Literacy (SRTL2) turned attention to 'reasoning' about variation by providing in-depth qualitative analyses of students' responses (Reading, 2002; Shaughnessy, 2002; Watson, 2002). Early research into the importance, meaning and understanding of variation was summarised by Meletiou (2002).
Second, research into reasoning about variation was expanded into many different statistical contexts. Much research reported at the Sixth International Conference on Teaching Statistics (Phillips, 2002) included appreciation of the influence of variation. In fact, Watson and Callingham (2003) explicitly incorporated appreciation of variation in the more cognitively advanced levels of their six-level statistical literacy construct. Concurrently, researchers showed increasing interest in developing frameworks to describe how students worked with variation (e.g., Watson, Kelly, Callingham & Shaughnessy, 2003; Reading & Shaughnessy, 2004).
Third, a coming-of-age for the research occurred with reasoning about variation being the focus of the Third International Research Forum on Statistical Reasoning, Thinking and Literacy (SRTL3) (Lee, 2003). This forum increased the scope of research beyond school-aged students and formalised the frameworks for reasoning about variation. Work with tertiary-level students and teachers provided a broader picture of the reasoning. The various prescriptions for explaining reasoning were being more formally described and links with informal reasoning investigated. Two issues of the Statistics Education Research Journal (volumes 3(2) and 4(1)) took a number of the SRTL3 studies to a broader audience. Three themes emerged: a link between the tool used and the reasoning observed; the interconnection of reasoning about variation with all parts of the statistical investigation cycle; and the need to reflect in teaching both understanding of phenomena, and learning to reason about variation using tools (Pfannkuch, 2005). Reflection on the SRTL3 papers led Garfield and Ben-Zvi (2005, p. 93) to conclude that "understanding of variability is much more complex and difficult to achieve than prior literature suggests."
Fourth, there was a rapid increase in research designed to refine cognitive development hierarchies. Analysis of school students' appreciation of variation from point expectation and from distributional expectation in chance settings (Watson & Kelly, 2004a) was supplemented with interviews to determine developmental progression (Watson & Kelly, 2004b). Research to develop hierarchies expanded beyond school level with Canada's (2006) proposed framework for examining pre-service teachers' reasoning about expecting, displaying, and interpreting variation, and Reading and Reid's (2005) hierarchy based on tertiary student responses to minute paper tasks. Case studies of the latter were provided by Reid and Reading (2005) to illustrate their hierarchy of No, Weak, Developing, and Strong Consideration of Variation.
Fifth, the research base broadened and deepened to include linking to, and contributing to, research into the understanding and learning of other statistical concepts. This was particularly evident at the Fourth International Research Forum on Statistical Reasoning, Thinking and Literacy (SRTL4) (Makar, 2005), which focused on reasoning about distribution but included many explicit indications of the fundamental importance of variation to reasoning about distribution. In fact, Reading and Reid (2006b) used previously identified cognitive development in reasoning about variation to propose foundations for tertiary students' reasoning about distribution. The complexity of the connection between variation and distribution was flagged by Pfannkuch and Reading (2006, p. 4) as a major conceptual obstacle in statistics.
Sixth, and finally, there has been further expansion of the scope of research involving reasoning about variation. Research shared at the Seventh International Conference on Teaching Statistics (ICOTS7) (Rossman & Chance, 2006) included: a growing focus on conditions to support improvements in reasoning about variation; insights into the way variation was viewed; and continuing work into developing cognitive hierarchies that involved students completing questionnaires supplemented with interviews. A focused review of research into students' statistical learning and reasoning by Shaughnessy (2007) provided an overview of various models of reasoning (pp. 966-967) and, in particular, an emerging conceptual model to describe student reasoning about variation progressing "from ikonic, to additive, to proportional, and finally to distributional" (p. 976). Another addition to research has been the proposition of a hypothetical learning trajectory (Garfield, delMas & Chance, in press) to assist in the development of reasoning about variability as students progress towards more formal reasoning.
Although the research into reasoning about variation has developed rapidly, much of the qualitative work is exploratory in nature. Future possible directions for research into reasoning about variation include exploring tools that support the development of reasoning, and the reasoning about variation that is needed to assist reasoning about more advanced statistical concepts such as formal inference. Following is the detail of a study that sought to provide more information about tertiary students' reasoning about variation, by exploring the possibility that extended information about reasoning could be provided through the use of certain interview techniques such as clarifying, prompting and probing.

THE STUDY
This study continues a previously reported project, Understanding of Variation (Reading & Reid, 2005), which assessed development in understanding of variation and identified teaching strategies that assisted that development. The focus was on tertiary students studying an introductory statistics course that treated variation as a core concept. The sequence of teaching activities was designed to give more structure to the link from students' initial 'intuitive' understanding of variation (minute papers, group discussions) to a better understanding of variation (hands-on demonstrations, computer simulations, written tasks).
Reasoning statistically about variation, a necessary component of making this link, requires students to explain what they are doing and why they are doing it, i.e., reason. To assess the level of reasoning about variation, hierarchies of consideration of variation were developed, based on the analysis of student responses to the various learning activities: pre-study and post-study questionnaires (Reid & Reading, 2006), minute papers (Reid & Reading, 2004; Reading & Reid, 2005), and assignments and class tests (Reid & Reading, 2005). Hierarchies to assess the cognitive development of reasoning about variation had previously been developed for coding school student responses (e.g., Torok & Watson, 2000; Watson, Kelly, Callingham & Shaughnessy, 2003), but there was a lack of such hierarchies for the tertiary level.
Important research issues arise with such analysis, including whether researchers are interpreting responses correctly and whether students are responding optimally. The focus of this report is interviews, following the pre- and post-study questionnaires, designed to clarify these issues. The opportunity was also taken in the interviews to ascertain which learning activities the students perceived as assisting the development of their understanding. Some detail about the pre-study and post-study questionnaires is given before the interview analysis is detailed. A questionnaire (Figure 1) was developed focusing on variability (Q1), comparing data sets (Q2), sampling (Q3 and Q4) and probability (Q4). This questionnaire was given to students at the beginning (pre-study) and end (post-study) of the course. A hierarchy (Figure 2) was developed using the responses and other variation-related hierarchies (e.g., Torok & Watson, 2000; Reading & Reid, 2005).
Question 1. What does variability mean to you? Give a verbal explanation and/or an example.
Question 2. Citizens in an outer suburb were concerned about the reliability of their bus service to the centre of the city. They monitored the in-bound and out-bound service of the buses at Bus Stop 33, and recorded the number of minutes late. Zero minutes late indicates the bus was on time, while a negative number of minutes late indicates the bus was early. The data are displayed in the two graphs. Describe and compare the performances of the two bus routes.

(Two graphs of the in-bound and out-bound data, with horizontal axis "Minutes Late", appeared here.)
The hierarchy, with levels of no, weak, developing and strong consideration of variation, was then used to code the post-study responses. The pre- and post-study levels of consideration were compared. This analysis was reported in Reid and Reading (2006). Notably, responses to the post-study questionnaire indicated the use of more sophisticated terminology and more emphasis on measuring and modelling variation (rather than just describing it). Comparison of student performances showed that the development of consideration of variation differed with the context of the question, and there were more marked shifts towards better quality responses for Q1 and Q3.
No consideration of variation
Q1: do not consider any sources of variation
Q2: may refer to a measure of centre, but not to any measure of spread
Q3: do not acknowledge any variation about the expected values
Q4: do not acknowledge any variation about the theoretical or expected outcomes

Weak consideration of variation
Q1: discuss one source of variation but expression is poor
Q2: refer to the range and/or basic description of shape
Q3: acknowledge variation and articulate expectations, but not based on given data; look for extraneous causes of variation
Q4: allow for variation but the amount suggested is low or high; causes given are extraneous

Developing consideration of variation
Q1: describe clearly one source of variation (within-group, between-group, controlling factors, measurement error)
Q2: refer to a measure of location and give a more detailed description of spread
Q3: consider variation between expected and observed values and/or identify the need for a larger sample or more information
Q4: provide a realistic amount of variation, but it may not be centred correctly; reasoning may be based on frequencies rather than proportions

Strong consideration of variation
Q1: describe clearly more than one source of variation
Q2: provide further information about the distribution, such as explicit proportions
Q3: not described since no response was coded at this level
Q4: provide a realistic amount of variation, and proportional reasoning is correctly used

Selected students were interviewed after they had completed the questionnaires to address the research questions: What extra information can be gained about students' reasoning by interviewing students? What teaching and learning activities do students perceive as assisting their development of understanding? Details of the interview process and analysis of the transcripts follow.
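The Q2 hierarchy levels can be made concrete with a small numerical illustration. The following is a minimal sketch using hypothetical minutes-late samples (the questionnaire's actual data are shown only graphically and are not reproduced here): a weak response reports only the range, while a developing response also measures location and spread.

```python
# Hypothetical minutes-late data for the two bus routes (illustrative only;
# not the data from the questionnaire graphs).
inbound = [-2, 0, 1, 1, 3, 5, 8, 0, -1, 2]
outbound = [0, 0, 1, -1, 2, 1, 0, 3, -2, 1]

def summarise(data):
    """Compute the summaries that distinguish hierarchy levels for Q2:
    the range alone marks a weak response; centre plus a spread measure
    (here the sample standard deviation) marks a developing one."""
    n = len(data)
    mean = sum(data) / n
    rng = max(data) - min(data)
    sd = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    return {"mean": round(mean, 2), "range": rng, "sd": round(sd, 2)}

print("in-bound: ", summarise(inbound))
print("out-bound:", summarise(outbound))
```

On these made-up samples the two routes have similar ranges of behaviour but clearly different centres and spreads, which is exactly the information a response coded above "weak" would need to articulate.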

METHODOLOGY
The interview following the pre-study questionnaire ("pre-interview", see protocol in Figure 3) was designed to clarify what students meant by expressions in their written responses, and to prompt and probe students to provide further reasoning. Probing was used in both clarifying and prompting situations to encourage students to extend their responses. As a final general probe for each question, students were asked if they had any last comments to add. Explanations were checked for any change in coding level. Selection of students for pre-interviews was based on questionnaire responses that needed clarification, i.e., where the interpretation of wording needed to be checked with the respondent. The interview following the post-study questionnaire ("post-interview", see protocol in Figure 4) was used to investigate students' reasoning when presented with situations of cognitive conflict based on their own responses to the pre- and post-study questionnaires. These cognitive conflict situations were posed to have students explain their preferred choice between two of their responses that had been coded at different cognitive levels. Students who had responded at different cognitive levels in the pre- and post-study questionnaires for at least two of the four questions were selected for post-interview. Students were asked to explain their awareness of their own understanding and to identify those activities that they considered most effective in progressing, or hindering, the development of their understanding. Selection for post-interviews was based on questions where there had been differences in the level of the pre-study and post-study coding. Due to time constraints, and because not all of every student's responses would be expected to show different cognitive levels, each post-interview referred to only two of the four questions.
Students' views of the aspects of the course that assisted their understanding to develop were analysed according to Petocz and Reid's (2003) frameworks for conceptions of statistics learning and teaching.All interviews were audio-taped and conducted by a non-teaching researcher but analysed by two researchers, one teaching and one non-teaching.

RESULTS FOR PRE-INTERVIEWS
Of the 32 students who completed the pre-study questionnaire, six (pseudonyms Bron, Deb, Cassie, Adam, Bruce, and Colin) were interviewed; the results follow.

Clarifying
The clarifying questions generally took the form "what do you mean by…". Altogether 42 points of clarification were sought. Three of these clarifications involved some aspect of the written response that could not be interpreted by the researchers, and for each the student was able to satisfactorily explain the meaning he/she had intended. For example, Deb explained that her use of "evenly skewed" to describe the outbound bus data (Q2) was meant to describe a symmetrical distribution but she did not know the appropriate term.
For the other 39 clarification points, the researchers lacked confidence in their interpretation of the students' meaning. Of these, students were unable to clarify 11 queries. Examples follow. Cassie was asked, in Q1, about her statement that "variability means differing aspects of characteristics or quality among a group", and Deb was asked to clarify her comment that "variability means differentiation." Although general conclusions can be drawn about their intentions, neither student was able to explain the intended meaning. During the interview Bruce described, in Q2, "most of the data within here, or most of the data between here and here" (pointing to values). He agreed that it was not worded well but was not able to give an alternative, leaving the interviewer assuming that only the range was intended.
For the remaining 28 situations students clarified the interpretation. All but three interpretations agreed with those made by the researchers. Of the three non-agreeing interpretations, two arose in Colin's responses, but neither suggested that the response should have been coded at a different level. In Q3 his observation that the southern regions "having no abnormalities is surprising" had been interpreted as meaning he expected abnormalities in these regions due to factors not given in the question. In fact, he clarified that if the regions have equal proportions of population then some abnormalities should be expected in each region. In Q4 Colin had stated he wanted the mean of the numbers to be five, which put no restrictions on the spread of the numbers, but he clarified that he meant "closer to five", making the spread smaller than would be suitable. The only clarification that led to a suggested change in coding (from weak to developing) was for Adam's Q1 response about variability as "amount of difference between the sets of scores", which was clarified in relation to diagrams he had drawn as meaning the distance between points on one diagram. Previously this had been interpreted by the researchers as comparing high variability in one diagram with low variability in the other.

Prompting
The prompt questions:
• questioned the quality or basis of given responses (e.g., "is that important?", "… was based on the fact that …?", "why did you …?");
• suggested alternatives to consider (e.g., "do you think you could get none?");
• asked for a reason for something (e.g., "… because …");
• suggested a point as a way forward in explaining the response (e.g., give alternate representations, summarize information, draw a picture); or
• asked a specific question leading the student to consider a point involving a deeper level of thinking.
Some students reacted more favourably to prompting than others. Minimal prompting was needed to assist the male students to elaborate on their written responses, but the three female students did not respond as well. Prompting elicited little more from Bron, whose responses were already developing or strong. Deb needed a lot of prompting to encourage her to advance her responses, and Cassie generally did not manage to run with the leads she was given, often just agreeing with a prompt rather than being able to use it as a stimulus to develop more discussion.
The most important prompts to consider are those that produced changes in the level of response. Seven prompts produced increases in level, but two resulted in a weaker level. Prompting appears to have been most effective for Q3, where four of the six students increased the level of their response: Deb (weak to developing), Bruce (developing to strong), Colin (weak to strong) and Cassie (weak to strong). Deb's response was interesting because the change in coding for Q3 occurred only after much prompting. Initially, Deb's written response had been coded as weak because she had looked for extraneous causes of variation and had not made use of the given data. After prompting brought her attention to the given data, Deb identified that more data were needed. When a further prompting question asked how many more years of data were needed, Deb stated ten years, but when asked why ten she simply conceded that 20 would be necessary. When asked, in response to a statement she made, "enough years to do what?", Deb responded "to do a valid comparison." This was sufficient to be coded as a developing response because Deb had identified the need for a larger sample. However, a number of prompting questions had been asked before the better response was given.
On the other hand, with just one prompt, "how many past years do you need?", Bruce improved his response, already coded as developing, by suggesting ten years and then demonstrating possible year-by-year variation using example data. This provided a vision of what a strong response for Q3 would be; no written responses to this question had done this. Then a final prompt, "are you actually saying there is something poor in this information?", led Bruce to state that conclusions could not be drawn because no question had been asked in the first place. He elaborated with consideration of experimental design issues such as the need for a research question, more data, and the factors to be considered in the inferential process.
One student showed improvement in each of the other questions: Deb (Q1, weak to developing), Adam (Q2, developing to strong) and Bruce (Q4, weak to developing). In his explanation for Q2, Adam was able to provide more information about the distribution each time he was prompted. First, the prompt "what have you probably made use of?" helped him to consider the density ("clumpedness") and anchoring ("clumped around zero") of the distribution. Second, in response to the prompt "what is it actually measuring if you were calculating the standard deviation?", Adam first elaborated on distances from one point to another but then, with no further prompting, changed to "how far from the centre", followed by mention of bimodal shape. The depth of information provided by Adam's answers was sufficient to code the response as strong.
The two occasions when prompting decreased the coding, from developing to weak, were Q2 for Cassie and Q4 for Colin. Colin was interesting because not only did prompting reduce the level of his response to Q4 from developing to weak, but further prompting actually undermined his confidence even in that weak response. Colin's written response, coded as developing, indicated that a realistic amount of variation centred around five was expected. When challenged with the prompt that another of the multiple choice options could also be considered correct in the light of the reason he had given for his first choice, Colin selected an option with unreasonably large variation as being more realistic. Then, when prompted with "what other numbers" would be expected for 40 repetitions, Colin initially responded with runs of very low numbers but eventually admitted that he was not able to visualise the results. His focus on runs of numbers was sufficient to assign a weak level of coding to his response. Rather than eliciting a deeper response, the prompting had actually undermined the consideration of variation that his previous response had demonstrated.
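Colin's difficulty in visualising plausible results for 40 repetitions is exactly the kind of difficulty a quick simulation can make concrete. The following is a sketch under the assumption that Q4 resembled a classic sampling task with an expected value of five per repetition, for example counting red items in draws of ten from a half-red mixture; the actual Q4 wording appears in Figure 1 and is not reproduced in this excerpt.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical stand-in for Q4: each repetition draws 10 items, each 'red'
# with probability 0.5, so the expected count per repetition is five.
def one_repetition():
    return sum(1 for _ in range(10) if random.random() < 0.5)

counts = [one_repetition() for _ in range(40)]
print(counts)
print("mean:", sum(counts) / len(counts))
# A realistic set of outcomes clusters around five (mostly 3 to 7). Runs of
# very low numbers, or unreasonably large spread, are the weak-response
# patterns that the hierarchy flags.
```

Seeing forty simulated outcomes at once shows both the variation about the expected value and its limits, which is the visualisation Colin could not produce unaided.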

Probing
The probe questions:
• directly asked students for more information (e.g., "is there anything else you might like to add?");
• implied that more was needed by repeating part of the student statement but trailing off to allow the student to continue (e.g., "you're saying there are other factors that could …");
• brought together points made by the student and repeated them for comment;
• directed the student to a particular part of the response (e.g., "is it easier with your example?"); or
• put the onus back on the student to provide more by redirecting a question that the student asked (e.g., "that's what I'm asking you").
Probing had two purposes: as a stimulus for discussion to continue after a clarification or a prompt, and as a general probe for more information. When used as part of a clarification or prompt, probes helped to keep the student talking, especially when a new student-introduced idea was emerging. For example, a probe in Q3 summarizing Cassie's explanation about the need for more information, such as influential factors other than those given in the question, allowed her to move on to suggesting that samples from other years would also be useful. Some students gave quite detailed information on initial triggers, such as a clarifying question, but for others probing was necessary. As the specific probing was part of clarifying or prompting, it was not relevant to consider its effects on the coding level separately. When a general probe was used, no more information was given by any of the students and hence there were no changes to coding levels. However, some dissatisfaction with responses was expressed: Deb admitted that in Q3 she had taken the wrong approach, and Bruce said he could have given a better answer in Q2, despite claiming that the question was poorly worded. The general probe was usually given after both clarifying and prompting had already allowed the student to share relevant ideas, so the lack of any further information was not unexpected.

RESULTS FOR POST-INTERVIEWS
Four students, Anne (Q1, Q4), Ellie (Q2, Q4), Adam (Q2, Q3), and Bruce (Q1, Q3) (all pseudonyms), were interviewed after the post-study questionnaire. These students were chosen from those who had a post-study response coded at a different level to the corresponding pre-study one.

Cognitive Conflict
Students found it difficult to make a clear decision when presented with a choice between their own pre- and post-study responses to a question, often being influenced by the fact that most post-study responses included evidence of the course's teaching and learning activities. Students often used standard statistical terms (e.g., standard deviation) and referred to numerical and graphical summaries (e.g., a sampling distribution) that had been presented in class. However, evidence that the students understood these inclusions was lacking. In explaining his preference for his post-study Q1 response, which displayed a weaker level of consideration, Bruce said that he "just tried to look at it from a more - I guess - mathematical, statistical view" and provided a numerical summary. There were two other instances where students preferred the weaker response. Ellie's justification for her choice in Q4 was that she had misinterpreted the post-study question. Anne chose the weaker (pre-study) response in the first part of Q4 because it was "really vague." Further discussion led her to comment that a combination of answers that "… doesn't contradict itself on one page, [is] always good." She was uncertain about her responses and worried by apparent contradictions. In two cases, students developed a third response in preference to either of their previous responses. Ellie decided to revise her response to the last part of Q4 after her misinterpretation of the question became apparent. Anne, after prompting, drew on a practical example to help formulate a better response to Q1.

Teaching and Learning Activities that Impacted on Statistical Reasoning
All four students believed they understood a statistical concept if they could apply that concept in a variety of contexts. Each of the teaching and learning activities included in the course was deemed, by at least one student, to have helped develop that understanding. These included: well-structured lectures; step-by-step instructions for applying methods; the use of minute papers to identify knowledge gaps for use in self-directed learning; computing assignments that brought together theory and practice; working on tutorial questions prior to class to maximize the benefit of direct interaction with teaching staff; and instantaneous feedback in a small class setting.
When asked if they understood the concept of variation, Adam, Bruce and Ellie all agreed, although Adam and Bruce could not explain how they knew that they understood. Ellie explained by defining the concept in a way similar to the approach to Q1 in the questionnaire. When asked if she understood, Anne replied "probably not" and explained that "there hasn't been a definition given for it." The students found it difficult to identify specific teaching and learning activities that had helped develop their understanding of variation. Perhaps they felt that they had already discussed these with reference to their understanding of general statistical concepts, or they may have struggled to answer because "variation" is a more abstract concept. Ellie and Bruce mentioned statistical concepts such as sampling distributions and the link between sample size and the standard error, but only Ellie cited a particular activity, a minute paper that had considered an intuitive analysis of variance, as useful in developing her consideration of variation. A common theme that did emerge was the use of graphics and visuals as important tools in helping students develop a better understanding of statistical concepts in general, and of variation in particular.
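The link between sample size and standard error that Ellie and Bruce mentioned is one of the few points above that lends itself to a short illustration. The following is an illustrative simulation, not taken from the study's course materials: repeatedly sampling from a population (here arbitrarily chosen as normal with mean 5 and standard deviation 2) shows the spread of the sample mean shrinking as the sample size grows.

```python
import random

random.seed(2)  # fixed seed for a reproducible illustration

def se_of_mean(sample_size, reps=2000):
    """Estimate the standard error of the sample mean by simulation:
    draw `reps` samples of the given size and take the standard
    deviation of the resulting sample means."""
    means = []
    for _ in range(reps):
        sample = [random.gauss(5, 2) for _ in range(sample_size)]
        means.append(sum(sample) / sample_size)
    m = sum(means) / reps
    return (sum((x - m) ** 2 for x in means) / (reps - 1)) ** 0.5

for n in (4, 16, 64):
    print(n, round(se_of_mean(n), 2))
# Theory predicts sigma / sqrt(n): roughly 1.0, 0.5 and 0.25 for these n.
```

Quadrupling the sample size halves the standard error, a pattern students can discover for themselves from such output before meeting the formula.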

DISCUSSION
A wealth of information about the way individual students express their reasoning about variation was gained by interviewing students using clarifying, prompting, probing, and cognitive conflict situations. Clarifications indicated that the researchers' initial interpretations of written responses were generally consistent with what students intended, but the value of interviewing is demonstrated better by the probing and prompting. Although prompting was more readily utilized by some students than by others to expand on responses, all but one student was assisted to increase the level of response to at least one question. Those prompts that built on information or ideas supplied by the student were more successful than those that suggested a way forward not linked to student-initiated ideas. Probes helped students to continue their thread of discussion and better articulate their explanations, although the general probe as a final query did not produce any relevant information. However, probing alone is not sufficient to improve the quality of a student response: some form of prompting is needed, and even then some students do not necessarily improve. A possible explanation for the higher number of improved responses for Q3 is that the wording of the question was more general than that of the other questions, so with prompting the students were better able to unpack it. The two occurrences of reduction in levels after prompting serve to demonstrate that there may be instances when students are able to produce standard statistical expressions but are not able to explain what they mean.
Students found it difficult to make choices in cognitive conflict situations, and some preferred the post-study responses that used concepts and terms from the course, even though these did not necessarily reflect understanding. Nevertheless, the conflicts produced rich situations that allowed students to think more deeply about their reasoning. The listener needs to be sensitive to the thinking path of the student and to structure prompting and cognitive conflict situations accordingly.
Student explanations of their self-awareness of conceptual understanding provided useful insights into their conception of learning. With reference to Petocz and Reid's (2003) conceptual framework for statistics learning, two students viewed their learning as "applying" and the other two as "linking." None of the students was able to articulate the preferred view of the value of statistical learning in "learning about areas outside statistics" or in "supporting changing one's views." Students attributed the development of their understanding to a wide range of teaching and learning activities and generally explained how each contributed. However, they showed poor conceptions of statistics teaching, according to Petocz and Reid's (2003) framework; that is, the students were not able to appreciate what teaching could, or should, do for them. They viewed teaching as simply "providing materials, motivation, structure" or "explaining material and helping with student work." Only one student conceived of teaching as "linking statistical concepts and guiding learning." Preferred conceptions of teaching would be "anticipating student learning needs" or "being a catalyst for open mindedness." Although no particular strategy was favoured for learning, the desire for visual aids and graphical representations was a recurring theme.
This study was exploratory in nature, limited by the small number of interviews and by the use of audio-tape rather than video-tape, which meant that information about actions, such as students pointing to parts of their responses, was not captured. Despite these limitations, the study has demonstrated the usefulness of clarifying, prompting, probing, and cognitive conflict in encouraging students to talk about their reasoning about variation. It has also shown that students' awareness of the nature of their own understanding, and of the teaching they experienced, is not well developed conceptually. Strategies are needed to assist students to change their view of teaching from the simple provision of essentials to a focus on student learning.

IMPLICATIONS FOR TEACHING AND RESEARCH
The proliferation of research into reasoning about variation over the last decade has advanced knowledge about students' cognitive development of reasoning about variation, as well as its connection to reasoning about other statistical concepts. The initial push saw an increasing number of statistics education researchers focus on variation; a focus on reasoning in particular was then followed by a broadening and deepening of the scope of the research. Various hierarchies have been proposed to explain the cognitive growth of reasoning about, or understanding of, variation, and methodologies have been refined to describe the levels in these hierarchies efficiently. Educators now need to make use of these hierarchies in teaching, for planning and assessing learning, and researchers need to assist educators by further developing the hierarchies and evaluating their effectiveness in supporting learning about variation.
In the reported study, student explanations of their unique expressions, and the additional reasoning offered in prompted and cognitive conflict situations, were shown to provide deeper insight into how individual students reasoned about variation. Such processes could be used to better understand how students reason, and they highlight the importance of listening to students. As students identified a variety of teaching strategies that benefited their developing reasoning, educators need to plan a range of activities, including opportunities for students to talk about their work; facilitation through prompts and cognitive conflict situations would encourage the development of their reasoning. Researchers are challenged to determine whether similar techniques are effective in obtaining deeper insights into student reasoning about other statistical concepts. Finally, the extra insight into reasoning provided during clarification and in response to prompts and probes highlights the value of interviewing, a time-consuming but rewarding data collection process. To understand and support students' reasoning about variation, we need to allow, indeed encourage, students to have a voice.