A prior version of this paper received the 2015 NTLI Fellowship Award from the Association for Mathematics Teacher Education.

The importance of using technology in the preparation of preservice mathematics teachers (PSTs) has been at the forefront of national conversations in mathematics teaching and teacher preparation for over 15 years (e.g., Association of Mathematics Teacher Educators, 2015; Garofalo, Drier, Harper, & Timmerman, 2000; National Council of Teachers of Mathematics, 2000). Mathematics teacher educators (MTEs) are charged with promoting PSTs’ engagement with a variety of technological tools as well as mathematics-specific technologies that deepen understanding of mathematics and students’ thinking with technology.

Such engagement may require MTEs to make changes to their teaching goals to provide learning opportunities that better foster the development of PSTs’ technological pedagogical content knowledge (also referred to as technology, pedagogy, and content knowledge, or TPACK; Mishra & Koehler, 2006; Niess et al., 2009). Therefore, the purpose of this paper is to (a) present one approach to incorporating technology into a mathematics methods course that brings several types of technology together in one lesson, (b) highlight affordances and limitations of different technological choices, and (c) discuss implications for teacher education.

Specifically, our approach situated MTEs as having to draw upon their own TPACK, where the specialized content was mathematics education, to create opportunities for PSTs to develop their TPACK, where the specialized content was a topic in secondary mathematics. To do so, MTEs have often sought to increase PSTs’ TPACK by engaging them in tasks similar to those they will be expected to use in their own classrooms, examining classroom videos, and analyzing students’ work to make sense of students’ reasoning about the task (e.g., Didis, Erbas, Cetinkaya, Cakiroglu, & Alacaci, 2016; Wilson, Lee, & Hollebrands, 2011). The task we chose to use is a data analysis activity we refer to as Mislabeled Variables (Appendix A), one that the instructor had previously taught to sixth-grade students (Lovett & Lee, 2016).

As students engaged in the Mislabeled Variables task, they investigated variable types (i.e., categorical and quantitative) and distributions generated from data gathered from a series of survey questions. Students used data collected from a class survey to make claims about which survey questions produced the data for each variable, where the variables are given letter names (e.g., *A* and *B*) rather than descriptive names (e.g., gender or shoe size).

A data analysis task was chosen in response to the need to increase the preparation of PSTs to teach statistics, as reflected in the increased emphasis on statistics in recent years in the K-12 curriculum (National Council of Teachers of Mathematics, 2000; National Governors Association Center for Best Practice & Council of Chief State School Officers, 2010). Research has shown that many preservice secondary teachers do not possess the statistical knowledge needed to teach statistics effectively and often are not provided enough learning opportunities related to statistics learning and teaching in their mathematics methods courses (Lovett, 2016). In this context we chose to pose the Mislabeled Variables task to our PSTs.

The lesson was designed and enacted in an undergraduate mathematics education course for middle school and high school mathematics teachers taught by the first author at a large southeastern university. This lesson consisted of two parts: (a) PSTs engaged in the Mislabeled Variables task, and (b) PSTs watched and discussed a video case of sixth-grade students’ answers, reasoning, and misconceptions on the same task.

Video cases are a common tool used by MTEs and researchers to help PSTs reflect on their own knowledge and practice, and they allow PSTs opportunities to observe and understand professional practice and students’ mathematical reasoning (Grossman et al., 2009; Lampert & Ball, 1998; van Es & Sherin, 2002, 2008). Short clips that highlight important classroom moments are recommended because they can be viewed repeatedly to focus attention on those moments (Borko, Jacobs, Eiteljorg, & Pittman, 2008; LeFevre, 2004).

We were unable to videotape the classroom when the task was enacted with sixth-grade students, so we chose to create a brief animated video to illustrate the classroom scenario that included samples of students’ work. Animations allow MTEs and researchers to recreate a real classroom environment without introducing cameras and microphones into the classroom. They are also useful when capturing high quality sound of what the teachers and students are saying is not possible (Chazan & Herbst, 2012).

## Literature Review

Research exploring animations in mathematics teacher education is still in its infancy. The majority of the prior research conducted has examined the use of *LessonSketch*, an online environment that enables users to create representations of classroom scenarios (Herbst, Aaron, & Chieu, 2013). While conducting professional development with in-service teachers, Chazan and Herbst (2012) found that participants were able to identify with the fictional teacher and that the animation did not inhibit participants’ discussion of the instruction depicted in the video. Teachers were also able to project their previous classroom experiences onto the characters in the animation so they could have a discussion about their experiences.

In another study, Herbst, Aaron, and Erickson (2013) asked preservice teachers to rate videos and animations on their genuineness. In addition, the researchers explored preservice teachers’ capacity to notice pedagogical and content knowledge features of the animations, asking them to reflect on alternate actions of the teacher while considering their previous experiences. The researchers found no significant differences in ratings in any of the measures except in genuineness, concluding that animations could be just as effective as video case examples in teacher education. In light of these findings, we had evidence that an animation of student engagement with the Mislabeled Variables task might provide PSTs with an authentic glimpse of student strategies.

## Framing the Task Design

In the design of the lesson, we extended Lee and Hollebrands’ (2011) three aspects of teachers’ knowledge related to teaching statistics with technology to characterize four types of teachers’ knowledge: (a) statistical knowledge (SK), (b) technological statistical knowledge (TSK), (c) pedagogical statistical knowledge (PSK), and (d) technological pedagogical statistical knowledge (TPSK; Lee & Nickell, 2014).

Statistical knowledge is foundational for developing statistical knowledge for teaching and technological pedagogical statistical knowledge (Groth, 2013; Lee & Hollebrands, 2011). To assist in the development of statistical knowledge, PSTs should engage with technology-enabled tasks that allow exploratory data analysis (EDA). To develop TSK, PSTs should engage in tasks with dynamic statistical software that encourages the simultaneous development of statistical ideas and technological skills.

In terms of PSK, particular pedagogical decisions arise when teaching statistics that differ from other areas of mathematics, such as (a) planning for group projects and discussions about data, (b) supporting students in making statistical arguments based on appropriate evidence, and (c) considering the contexts used for teaching statistical ideas. Therefore, PSTs should learn to engage students in statistical investigations in a variety of contexts that require students to make decisions and arguments and to consider how to respond to different conclusions among groups during discussions (Shaughnessy, 2007).

The ultimate goal in the preparation of PSTs to teach statistics with technology is to develop the specialized subset of knowledge representing TPSK. This type of complex knowledge is likely to develop with extended experiences with technology and considerations of the impact of such tools on the teaching and learning of statistics. MTEs are tasked with designing technology-enabled learning environments that develop aspects of PSTs’ TPSK. The technologies chosen will impact which aspects of TPSK PSTs have an opportunity to develop.

Madden (2011) suggested that tasks used with teachers can be provocative in the sense that they can excite and stimulate focused conversations and attention to statistics, context, and technology. Tasks can also be pedagogically provocative since they can stimulate a focus on pedagogical issues within statistics. Within SK, this study focused on engaging PSTs in statistical thinking through explorations of real data using TinkerPlots (Konold & Miller, 2011) and analyses of data to draw conclusions. In this way, the statistical knowledge that PSTs developed may have been interwoven with their technological statistical knowledge (TSK), which focused on using technology to explore and analyze data.

Within PSK, the Mislabeled Variables task focuses on providing PSTs with experiences with group work and supporting arguments with evidence that could be used as a foundation for planning their own lessons. Within TPSK, the goal is to provide PSTs with opportunities to reason about students’ learning of statistical ideas with technology.

### The Mislabeled Variables Task for Preservice Teachers

We identified learning objectives related to SK, TSK, PSK, and TPSK to guide our planning. The content objective (SK) was to increase PSTs’ ability to reason about the context of data and measurement units; to engage with a multivariate data set with a dynamic statistical tool, TinkerPlots; and to make claims about reasonable contexts for different data distributions. In terms of PSK, objectives were to encourage PSTs to justify their reasoning with data-based evidence, to critique the reasoning of others, and to consider how to promote this type of reasoning and critique in students.

The TSK objective was for PSTs to learn how dynamic statistical software can support initial data exploration using dot plots, dynamic linking, and coordination of two or three variables. Lastly, for TPSK the objective was for PSTs to reason about students’ statistical understandings and misunderstandings and approaches to the task while using dynamic statistical software.

### Engaging Preservice Teachers in the Task as Learners

It was important to engage PSTs in the Mislabeled Variables task first as learners (to develop SK and TSK) using the task as it was previously taught to sixth-grade students (Lovett & Lee, 2016). The Mislabeled Variables task was adapted from Garfield and Ben-Zvi’s (2008) Variables on Backs, a task designed to assist introductory statistics students in developing an understanding that different statistical questions produce different types of variables. The Variables on Backs task was enhanced through the use of TinkerPlots to explore survey questions, types of variables that questions produce, measurement units, and expected data values.

To begin the Mislabeled Variables task, PSTs completed a personal information survey as a Google Form containing 16 questions (e.g., “What time did you go to bed last night?”). Questions were chosen intentionally to produce different data types (e.g., whole numbers, decimals, and time values). Survey responses were gathered online and used to create a data set in TinkerPlots, in which the 16 attribute names were labeled as A, B, C, and so forth, and randomized so as not to match the order of questions from the survey. See Appendix B for the survey questions and assignment of the corresponding letters. (The Mislabeled Variables task for teacher educators is also available as a resource in the Teaching Statistics Through Data Investigations MOOC-Ed, friday.institute/tsdi.)
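The relabeling step described above, shuffling the survey questions and renaming them with letters so that letter order does not match question order, can be sketched in a few lines. This is a hypothetical illustration of one way to do it, not the authors’ actual workflow; `anonymize_survey` and its fixed seed are assumptions:

```python
import random
import string

def anonymize_survey(rows, seed=42):
    """Shuffle the survey questions and relabel them A, B, C, ...

    `rows` is a list of dicts keyed by question text, as exported from
    a Google Form response spreadsheet. Returns (relabeled_rows, key),
    where `key` maps each letter back to its source question so the
    instructor can check claims against the answer key later.
    """
    questions = list(rows[0].keys())
    rng = random.Random(seed)      # fixed seed keeps the letter key reproducible
    rng.shuffle(questions)         # letters will not match the survey's question order
    letters = string.ascii_uppercase[:len(questions)]
    key = dict(zip(letters, questions))
    relabeled = [{letter: row[question] for letter, question in key.items()}
                 for row in rows]
    return relabeled, key
```

The returned `key` plays the role of Appendix B’s letter-to-question assignment: students see only the relabeled rows, while the instructor retains the mapping.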

In class, PSTs were seated in pairs, each pair sharing a single laptop with access to a practice file and the class survey data in TinkerPlots. The instructor provided a quick tutorial on using TinkerPlots, since PSTs had no prior experience with the software. This tutorial included an introduction to the cards, an explanation of how TinkerPlots uses colors to represent quantitative and categorical variables, and step-by-step instructions for creating a plot, separating data, and stacking data. An identical tutorial was provided to sixth graders. Next, each pair of PSTs was assigned data from two attributes and asked to make a conjecture about the survey questions that most likely generated the data. Attributes were assigned so that each PST pair reasoned about different types of data (e.g., whole numbers, decimals, and time values). The attributes C, F, I, J, O, and P were assigned to at least two groups, since these data were featured in the animation to be used during the second portion of the lesson.

Following small group work, the instructor modeled Common Core Mathematical Practice 3, “Construct viable arguments and critique the reasoning of others” (National Governors Association Center for Best Practice & Council of Chief State School Officers, 2010) by engaging PSTs in a class discussion. PSTs presented claims and evidence about the match between the attribute and the source of the data to their peers and critiqued the reasoning of others. The instructor purposefully chose PSTs to present their group’s approach, encouraging PSTs to examine specific attributes and demonstrate different ways of reasoning about the task.

### Pedagogical Focus of the Lesson

Following their engagement in the task, the PSTs watched a video case of sixth-grade students’ answers, reasoning, and misconceptions – specifically, a 3-minute animation created with GoAnimate to depict examples of how sixth-grade students reasoned as they completed the task (see Video 1). This animation provided the instructor with the opportunity to use authentic artifacts to engage PSTs in sense-making activities with student work (as in, e.g., Didis et al., 2016; Wilson et al., 2011).

The instructor showed the animation once to help PSTs understand its content. The instructor then showed the animation a second time, directing PSTs to focus on the claims provided by the sixth graders and on how these claims were similar to or different from those made by the PSTs. The instructor showed the animation a third time and directed the PSTs to focus on ways in which students reasoned about the data and engaged with TinkerPlots. The PSTs compared and contrasted this engagement with their own use of the software.

**Video 1.** GoAnimate movie of sixth graders’ reasoning (from https://youtu.be/fWI1Ht0NPLE).

### Classroom Implementation: A Focus on Sequencing the Class Discussion

During small group work on the first part of the lesson when PSTs were engaged as learners with the Mislabeled Variables task, the instructor strategically identified PSTs to present their work during the class discussion. The following sequence was purposefully chosen so PSTs’ reasoning and approaches would increase in sophistication throughout the class discussion and align with the animation of the sixth graders’ reasoning discussed in the second part of the lesson.

### Range of Values

The first two groups supported their claims through a discussion of the range of the data values. Amy and Ed claimed that letter *E* represented “What month were you born in?” and justified this hypothesis by noting, “The values went from 2 to 12, and 12 is the maximum.” Amy said that she was born in December, so she felt that this was a reasonable claim.

The class did not find this evidence wholly convincing. To verify the claim, Amy polled the class to determine whether anyone was born in January or August. Since no one was, and the data set contained no values corresponding to either of those months, the class was convinced that the claim was true.

Reagan and Molly used a similar technique to provide evidence for letter *A*. They claimed *A* was the total number of letters in the first and last name. They looked up everyone’s name on our course management system and counted the letters. Noting that one classmate had a total of eight letters in their name and another had 17 and that these values were the same as the minimum and maximum data values for letter *A*, the class was convinced that this claim was true.

### Multiple Representations

Next, PSTs looked for questions on the survey that might match attribute *F*. We chose Morris’ group to present first since their supporting evidence was not conclusive. Specifically, they claimed that the data from *F* were generated from the survey question, “What is your gender?” Morris noted that the data consisted of ones and zeros and, thus, had to correspond to one of two questions that could produce this type of data. He supported his argument with evidence that there were 11 females in class, who were represented as zeros. Another student pointed out that attribute *C* also had data containing 11 zeros and 10 ones. The instructor asked Morris if it bothered him that he did not know which attribute represented *F* and which represented *C*. Although Morris was not bothered by this circumstance, other PSTs were.

The instructor then asked Kadeem’s group to present work utilizing multiple representations to support their claims. Kadeem suggested examining shoe size because men typically have larger shoe sizes than women and, as such, this would help identify gender. Kadeem believed attribute *J* represented shoe size because of the range of the data. Another student in class supported Kadeem’s claim by pointing out the data values of *J* consisted entirely of whole or half values and that only two questions of the survey were capable of producing half values.

At this point, Cam, Kadeem’s partner, went to the board and demonstrated how he used technology to coordinate his reasoning among the attributes *C, F*, and *J* (Figure 1). Cam knew that there was a male student in the class with a size 14 shoe, so he displayed three graphs, *J* vs. *C*, *J* vs. *F*, and *J*. He highlighted the data point of 14 in *J* and showed where that point appeared in the other two graphs (Figure 2).

With this evidence, the class agreed that attribute *C* represented gender, and *F* represented wearing glasses.

### Identifying Their Own Card

To conclude the class discussion, the instructor asked Joelle and Bethany to share their approach to determining letter *O*. Their claim was that *O* was generated by the question “What time did you go to bed last night?” Joelle and Bethany did not create a graph as their evidence, instead presenting Bethany’s card, card 12.

The instructor asked the PSTs to justify how they knew it was Bethany’s card. Pointing to the values on the card, Bethany noted, “I have 13 letters in my first and last name. I have 2 siblings, 8 pets, and a 9.5 shoe size. I woke up at 10 on Saturday, and I have seven letters in my first name.” The class found this argument convincing.

Five of the seven groups used this approach as evidence for one of the attributes they were assigned. However, the instructor was still skeptical and asked Bethany how she knew that the unit associated with 10 was a measure of time. Joelle flipped to a different card and showed that *O* had values written in an hour:minute format. Bethany’s value was not written in that form, and the PSTs decided that one card did not provide sufficient evidence to identify all the attributes.

### Discussion of the Animation

After watching the animation, the instructor asked PSTs to evaluate answers and claims made by sixth graders. As a class, PSTs claimed that the students incorrectly identified *J* as hours slept. The PSTs had already presented evidence and decided that *J* corresponded to shoe size; therefore, they were convinced that the sixth graders’ reasoning was flawed. PSTs agreed with the other answers presented by the sixth graders.

After showing the animation a third time, the instructor asked the PSTs to consider the strategies sixth graders used to reason about the attributes. The instructor asked PSTs first to identify similarities between their approaches and those of the sixth graders. Several PSTs noted that sixth graders used strategies similar to their own. For instance, both groups considered the range of values and type of data while locating their own cards in the data set.

Morris pointed out that the students graphed *C* and *F* simultaneously to compare the two attributes. As a class, the PSTs noted that both they and the students used multiple representations, although students did not use the technology to the same degree of sophistication.

When considering the differences between themselves and sixth graders, the PSTs pointed out immediately that the first pair of students formed an opinion but did not support their claim with evidence. PSTs also recognized that sixth graders reasoned in the aggregate, considering the measures of center to make a claim – an approach that none of the PSTs employed. PSTs were surprised that sixth graders had reasoned in a way that they had not.

### Developing TPSK

Students’ reasoning and approaches to the Mislabeled Variables task fostered PSTs’ further development of SK, TSK, and PSK. For instance, PSTs employed different ways of reasoning, consistent with Konold, Higgins, Russell, and Khalil’s (2015) four perspectives on data: data as a *pointer*, *case-value*, *classifier*, and *aggregate*. In drawing conclusions about the context of the data, PSTs utilized a variety of approaches, including *case-value* and *classifier*. However, PSTs also used a different type of reasoning – the type of number (e.g., rational or whole number) – to reason about how such a number would make sense as a measurement in the context of a source of data. This perspective of reasoning about data as measurement was identified by Lovett and Lee (2016) in their study of sixth graders’ reasoning with the same task.

With no prior experience using TinkerPlots, PSTs were able to develop aspects of TSK as they discovered how to create and link multiple representations, using them to support their work and arguments. Related to TPSK, the PSTs demonstrated evidence that, while engaged in discussing the animated representation of students’ work on the task, they could reason about claims students presented while critiquing the students’ reasoning. PSTs recognized that students took different approaches than they did to solve the task, identified which approaches were common among students, and reasoned about different ways that students utilized or did not use different features in TinkerPlots.

## Affordances and Limitations of Technologies Used

The instructor made a purposeful decision to include five technologies for PSTs in this lesson: (a) Google Forms, (b) TinkerPlots, (c) shared laptops, (d) projector and large display, and (e) a GoAnimate animation. As we considered the design and implementation of these five technologies, different affordances and constraints emerged with respect to PSTs’ growth and development of TPSK. We organize affordances and limitations for each technology in the following paragraphs.

### Google Form

To plan and implement the task, it was most efficient for survey data to be collected prior to class time. To collect these data, we had PSTs complete the 16-question personal information survey using an online Google Form. Their answers populated a spreadsheet that was used to assign a letter representing the attribute for each question and to sort the data alphabetically by attribute letter. The spreadsheet was also used to import the data into TinkerPlots.

A constraint of using a Google Form to collect data was that the data had to be cleaned prior to assigning letters for each attribute. Since the survey questions were posed as “short answers” in a Google Form, PSTs entered their answers in a variety of formats. For instance, PSTs often added units to their answer (e.g., a.m./p.m., hours, or men’s/women’s) and did not follow directions, for instance, reporting the exact time instead of rounding to the nearest half hour or using the name of the month instead of the corresponding number. Even though cleaning the data took time, it was preferable to the instructor entering data manually. Furthermore, the Google Form enabled the instructor to model online data collection practices for PSTs.
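The kinds of cleaning described above, stripping appended units and converting month names to their numbers, are mechanical enough to script rather than fix cell by cell. The following is a minimal sketch under the assumption that responses arrive as free-text strings; `clean_response` is a hypothetical helper, not part of the original lesson materials:

```python
import re

# Month-name lookup: "january" -> 1, ..., "december" -> 12
MONTHS = {name: number for number, name in enumerate(
    ["january", "february", "march", "april", "may", "june", "july",
     "august", "september", "october", "november", "december"], start=1)}

def clean_response(raw):
    """Normalize one free-text survey answer to a number where possible.

    Handles two kinds of variation noted in the text: month names
    entered instead of numbers, and stray units (e.g., "hours" or a
    men's/women's shoe-size qualifier) appended to a numeric answer.
    Returns the original string unchanged when no rule applies, so
    odd cases surface for manual review.
    """
    text = raw.strip().lower()
    if text in MONTHS:                            # "December" -> 12
        return MONTHS[text]
    match = re.match(r"(\d+(?:\.\d+)?)", text)    # "9.5 (men's)" -> 9.5
    if match:
        value = float(match.group(1))
        return int(value) if value.is_integer() else value
    return raw
```

Answers that resist these rules (e.g., exact clock times that should have been rounded) would still need the manual attention the paragraph describes.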

### TinkerPlots

Incorporating TinkerPlots in the task enabled PSTs to analyze the entire data set with drag-and-drop data visualization tools, providing a more open investigation of all attributes from multiple perspectives. For instance, TinkerPlots allowed for quick construction of graphical representations, not only of the attributes a pair was assigned, but also of other attributes PSTs wanted to explore. Such approaches are difficult, if not impossible, without dynamic statistical software. Because of the ease and speed with which PSTs were able to construct graphical representations, they were able to spend the majority of the task reasoning and drawing conclusions about the data. While reasoning about the data and context, PSTs were able to view multiple representations simultaneously and dynamically link representations of different attributes to make claims about the source of the various data they were provided.

A limitation of incorporating TinkerPlots into the task was that PSTs were not able to fully utilize its capabilities for analyzing the data, since this was their first exposure to the software. However, this experience mirrored that of the sixth graders. Moreover, PSTs could easily navigate the basics of data visualization with different representations and did not need advanced features of TinkerPlots to engage with the Mislabeled Variables task.

### Shared Laptops and Projecting a Display

Having pairs of PSTs share a laptop facilitated conversations around the data and encouraged PSTs to verbalize their reasoning to each other, illustrating to PSTs the usefulness of shared laptops for encouraging communication and collaboration. Similarly, PSTs used the classroom computer and projector to facilitate class discussion. Moreover, they used the class computer to recreate graphs as they taught their peers new skills. Such an approach demonstrated to PSTs how technology could be used in their own lessons to support sharing, discussions, and small group work.

### GoAnimate

The GoAnimate animation provided PSTs with an opportunity to examine authentic reasoning strategies of sixth-grade students. The animation enabled the instructor to recreate discussions from a sixth-grade classroom that were not video recorded. Moreover, the animation provided PSTs with access to real student work and reasoning. The actual sixth-grade class discussion took approximately 45 minutes; the most salient aspects of this interaction were choreographed and rendered as a 3-minute animation. The creation of this animation provided the instructor with an opportunity to more carefully consider ways in which PSTs might reason with the task.

Moreover, the animation production helped the instructor orchestrate the class discussion among PSTs to include attributes similar to those highlighted in the sixth-grade discussion. The brevity of the animation made it possible for PSTs to watch the video several times, comparing their own answers and approaches with those of the students in a manner supported by previous research (LeFevre, 2004).

A possible constraint of the GoAnimate animation is that it features characters with unusual appearances and voices, a possible source of distraction. Anticipating that the animation could be distracting, the instructor acclimated the PSTs to it during an initial viewing, then showed the animation a second time while asking PSTs to focus on the answers and reasoning of the students. Furthermore, the animation does not feature real students; PSTs may prefer video of actual children to animated characters. However, the affordances and the amount of class time saved by using an animation seemed to outweigh these constraints.

## Implications for Teacher Education

The work of MTEs requires the design of lessons covering topics that are often difficult to teach and learn, such as statistics, while incorporating technology to develop teachers’ TPACK (Association of Mathematics Teacher Educators, 2015). In the *Statistical Education of Teachers* report, Franklin et al. (2015) presented recommendations for mathematics education programs to develop statistical knowledge and pedagogical statistical knowledge among their PSTs. These recommendations included engaging PSTs with real-world data sets through statistical investigation, developing knowledge of dynamic statistical software as a learner and as a teacher, analyzing student misconceptions, engaging in appropriate teaching strategies, and developing strategies for assessing students’ statistical knowledge.

With the limited amount of time that many teacher preparation programs have to devote to mathematics methods courses, it is worthwhile for MTEs to consider carefully which topics and which technologies should be targeted – namely those that have the greatest impact in the development of PSTs’ TPSK. The Mislabeled Variables task provides guidance to others confronted with challenging design decisions for creating technology-enabled learning environments.

Incorporating several technologies into one task in a methods course can be time-consuming, requiring considerable planning. The technologies highlighted in this lesson offered many affordances. Indeed, the data analysis task chosen, the easy-to-use capabilities of TinkerPlots for graphing, and the use of the animated video of students’ approaches all provided provocative opportunities aligned with Madden’s (2011) task framework. In particular, the use of the MTE-created animated video of students’ work on the task seemed to prove pedagogically provocative for enhancing PSTs’ PSK and TPSK.

Although Herbst et al. (2013) found that animations have the same impact on preservice teachers’ ability to reason about content and pedagogy as do video clips, more research is needed on how animations can be used in mathematics methods courses as an alternative to videos of actual classrooms, how different animation software provide different affordances or constraints for PSTs’ reasoning, and how these animations impact PSTs’ abilities to reason about students’ approaches.

## References

Association of Mathematics Teacher Educators. (2015). *Position of the Association of Mathematics Teacher Educators on technology*. Retrieved from http://www.amte.net/position/amtetechnology

Borko, H., Jacobs, J., Eiteljorg, E., & Pittman, M. E. (2008). Video as a tool for fostering productive discussions in mathematics professional development. *Teaching and Teacher Education, 24*(2), 417-436.

Chazan, D., & Herbst, P. (2012). Animations of classroom interaction: Expanding the boundaries of video records of practice. *Teachers College Record, 114*(3), 1-34.

Didis, M. G., Erbas, A. K., Cetinkaya, B., Cakiroglu, E., & Alacaci, C. (2016). Exploring prospective secondary mathematics teachers’ interpretation of student thinking through analyzing students’ work in modelling. *Mathematics Education Research Journal, 28*(3), 349-378.

Franklin, C., Bargagliotti, A. E., Case, C. A., Kader, G. D., Schaeffer, R. L., & Spangler, D. A. (2015). *The statistical education of teachers*. Alexandria, VA: American Statistical Association.

Garfield, J., & Ben-Zvi, D. (2008). *Developing students’ statistical reasoning: Connecting research and teaching practice*. New York, NY: Springer.

Garofalo, J., Drier, H. S., Harper, S., & Timmerman, M. A. (2000). Promoting appropriate uses of technology in mathematics teacher preparation. *Contemporary Issues in Technology and Teacher Education, 1*(1). Retrieved from http://www.citejournal.org/

Grossman, P., Compton, C., Igra, D., Ronfeldt, M., Shanhan, E., & Williamson, P. (2009). Teaching practice: A cross-professional perspective. *Teachers College Record, 111*(9), 2055-2100.

Groth, R. E. (2013). Characterizing key developmental understandings and pedagogically powerful ideas within a statistical knowledge for teaching framework. *Mathematical Thinking and Learning, 15*(2), 121-145.

Herbst, P., Aaron, W., & Chieu, V.-M. (2013). LessonSketch: An environment for teachers to examine mathematical practice and learn about its standards. In D. Polly (Ed.), *Common core mathematics standards and implementing digital technologies* (pp. 281-294). Hershey, PA: IGI Global.

Herbst, P., Aaron, W., & Erickson, A. (2013). *How preservice teachers respond to representations of practice: A comparison of animations and video*. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA. Retrieved from http://hdl.handle.net/2027.42/

Konold, C., Higgins, T., Russell, S. J., & Khalil, K. (2015). Data seen through different lenses. *Educational Studies in Mathematics, 88*(3), 305-325.

Konold, C., & Miller, C. D. (2011). *TinkerPlots: Dynamic data exploration* [Computer software]. Emeryville, CA: Key Curriculum Press.

Lampert, M., & Ball, D. L. (1998). *Teaching, multimedia, and mathematics: Investigations of real practice*. New York, NY: Teachers College Press.

Lee, H. S., & Hollebrands, K. F. (2011). Characterising and developing teachers’ knowledge for teaching statistics with technology. In C. Batanero, G. Burrill, & C. Reading (Eds.), *Teaching statistics in school mathematics-challenges for teaching and teacher education* (pp. 359-369). The Netherlands: Springer.

Lee, H. S., & Nickell, J. (2014, July). How a curriculum may develop technological statistical knowledge: A case of teachers examining relationships among variables using Fathom. In K. Makar, B. de Sousa, & R. Gould (Eds.), *Sustainability in statistics education: Proceedings of the Ninth International Conference on Teaching Statistics*. Flagstaff, Arizona. Voorburg, The Netherlands: International Statistical Institute.

LeFevre, D. (2004). Designing for teacher learning: Video-based curriculum design. In J. Brophy (Ed.), *Using video in teacher education* (pp. 235-258). Amsterdam, The Netherlands: Elsevier.

Lovett, J. N. (2016). *The preparedness of preservice secondary mathematics teachers to teach statistics: A cross-institutional mixed methods study* (Unpublished doctoral dissertation). North Carolina State University, Raleigh, NC.

Lovett, J. N., & Lee, H. S. (2016). Making sense of data: Context matters. *Mathematics Teaching in the Middle School, 21*(6), 338-346.

Madden, S. (2011). Statistically, technologically, and contextually provocative tasks: Supporting teachers’ informal inferential reasoning. *Mathematical Thinking and Learning, 13*(1), 109-131.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. *Teachers College Record, 108*(6), 1017-1054.

National Council of Teachers of Mathematics. (2000). *Principles and standards for school mathematics*. Reston, VA: Author.

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010). *Common core state standards for mathematics*. Washington, DC: Author.

Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C., . . . Kersaint, G. (2009). Mathematics teacher TPACK standards and development model. *Contemporary Issues in Technology and Teacher Education, 9*(1), 4-24. Retrieved from http://www.citejournal.org/volume-9/issue-1-09/mathematics/mathematics-teacher-tpack-standards-and-development-model

Shaughnessy, J. M. (2007). Research on statistics learning and reasoning. In F. K. Lester (Ed.), *Second handbook of research on mathematics teaching and learning* (pp. 957-1009). Charlotte, NC: IAP.

van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. *Journal of Technology and Teacher Education, 10*(4), 571-597.

van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the context of a video club. *Teaching and Teacher Education, 24*(2), 244-276.

Wilson, P. H., Lee, H. S., & Hollebrands, K. F. (2011). Understanding prospective mathematics teachers’ processes for making sense of students’ work with technology. *Journal for Research in Mathematics Education, 42*(1), 39-64.