Smith, R., Shin, D., Kim, S., & Zawodniak, M. (2018). Novice secondary mathematics teachers’ evaluation of mathematical cognitive technological tools. Contemporary Issues in Technology and Teacher Education, 18(4). https://citejournal.org/volume-18/issue-4-18/mathematics/novice-secondary-mathematics-teachers-evaluation-of-mathematical-cognitive-technological-tools

Novice Secondary Mathematics Teachers’ Evaluation of Mathematical Cognitive Technological Tools

by Ryan Smith, Radford University; Dongjo Shin, University of Georgia; Somin Kim, University of Georgia; & Matthew Zawodniak, Texas State University

Abstract

As technology becomes more prevalent in the mathematics classroom, teachers will need to be able to effectively evaluate technological tools to use with students. In this study, the authors examined secondary mathematics teachers’ evaluation of online dynamic geometry tools. The analysis focused on the teachers’ noticing of technology; specifically, what features within the tools mathematics teachers attended to, how they interpreted these features, and in what ways they responded. Findings indicated that the secondary mathematics teachers attended mostly to mathematical features of the tools and considered a tool’s support for student engagement and student thinking, as well as its ease of implementation, to be very important. The teachers tended to begin their evaluation by determining how the tools worked and attending to their appearance and then moved toward examining the mathematical features and how they related to student thinking.

The National Council of Teachers of Mathematics (NCTM, 2014) and the Association of Mathematics Teacher Educators (AMTE, 2015) stated that technology tools are essential and indispensable resources for teaching and learning mathematics in the 21st century. Mathematics teachers who use technology effectively “maximize the potential of technology to develop students’ understanding, stimulate their interest, and increase their proficiency in mathematics” (NCTM, 2008, p. 1).

Although many technology tools are available, numerous teachers have difficulty taking advantage of these tools and effectively incorporating them into their curricula (Koehler, Mishra, Kereluik, Shin, & Graham, 2014; Niess, 2011). In their recommendations on the uses of educational tools for professional development of mathematics teachers, the authors of the Mathematical Education of Teachers II stated, “Teachers need to develop the ability to critically evaluate the affordances and limitations of a given tool, both for their own learning and to support the learning of their students” (Conference Board of the Mathematical Sciences [CBMS], 2012, p. 34).

Teachers’ decisions about which tools to use, and how to use them in a particular learning situation, can afford or constrain students’ opportunities to develop mathematical understanding (CBMS, 2012). Therefore, investigating the ways teachers make these decisions is important. Yet, relatively few studies have been conducted on the ways mathematics teachers evaluate technology tools specifically designed to teach a particular mathematics concept (e.g., Battey, Kafai, & Franke, 2005; Johnston & Suh, 2009; Smith, Shin, & Kim, 2017a).

In fact, most of the research on mathematics teachers’ evaluation of technology has focused on the criteria teachers use to evaluate technology (e.g., Smith et al., 2017a) or the types of technology they used in their lesson plans (e.g., Johnston & Suh, 2009). These researchers did not, however, examine the processes teachers used to evaluate the technology. The purpose of this study was to examine the ways in which novice secondary mathematics teachers evaluated four online technological tools, each of which included an online dynamic geometry applet designed to have students explore the same geometric concept.

Literature Review

Teachers’ Evaluation of Mathematical Cognitive Technology

As technology has become more prevalent in today’s classroom, teachers are using it in a variety of ways, including having students do research on the internet; assessing students’ skills, knowledge, and understanding using digital devices; having students collaborate and interact with each other and the content using interactive digital displays; communicating with each other and the teacher both in and out of the classroom; and developing students’ understanding of content.

This paper focuses on technology that teachers and students can use to develop students’ understanding of mathematics, specifically mathematical cognitive technologies (MCTs). MCTs are digital technological tools “that [help] transcend the limitations of the mind (e.g., attention to goals, short-term memory span) in thinking, learning, and problem-solving activities” (Pea, 1987, p. 91).

Popular MCTs include dynamic geometry environments (e.g., GeoGebra), graphing utilities (e.g., Desmos), dynamic data analysis and statistical environments (e.g., Fathom), and many online applications designed for teaching and learning of particular mathematics concepts. MCTs have the potential to help students “become more fluent in performing routine mathematical tasks that could be laborious and counterproductive to mathematical thinking” (Pea, 1987, p. 106). Rather than being bogged down by routine tasks and computations, students focus on problem solving and developing mathematical thinking skills.

Moreover, MCTs are environments that afford students the opportunity to recognize patterns and discover mathematical properties by developing and testing conjectures, exploring various mathematical characteristics, and discovering theorems. Not all MCTs are the same, however; each has its own affordances, limitations, and features that influence students’ learning and understanding of mathematics. Thus, teachers’ evaluation and selection of which MCT(s) to use is not trivial.

Previous research on the ways teachers evaluate MCTs has primarily focused on the criteria created and analysis performed by prospective elementary teachers (e.g., Battey et al., 2005; Johnston & Suh, 2009). In both studies, the researchers found, in general, that prospective elementary teachers seem to select and use technology based on student engagement, surface features of the software, and motivation, rather than on student thinking and accurately representing the mathematics.

In Smith et al.’s (2017a) study of secondary mathematics teachers’ criteria to evaluate MCTs, both prospective and in-service teachers created criteria focused on how well an MCT represented the mathematical concepts, whether the MCT included supportive features to aid in developing the appropriate mathematics concept, how students interacted and engaged with the mathematics concepts when using the MCT, and whether the MCT afforded all students the opportunity to learn. The differences between the criteria used by the secondary and elementary teachers were likely related to differences in the teachers’ knowledge bases, consisting of their knowledge of mathematics, teaching, and technology.

Evaluation and Noticing

Evaluation, in general, is often defined as the systematic assessment of the merits and value of objects (e.g., Scriven, 1991). The process of evaluating any object, behavior, or idea requires a person to consider the object, or some aspect of it, and make a judgment based on a set of criteria, which could be either implicit or explicit. In general, some goal guides the evaluation, and the criteria usually stem from meeting this goal. To evaluate MCTs, teachers consider the technological tool and its features and interpret them according to a set of criteria in order to determine whether to use a particular tool to teach a particular concept. Their decision about which technological tool to use is based on this evaluation.

Many mathematics education researchers (e.g., Jacobs, Lamb, & Philipp, 2010; Star & Strickland, 2008; van Es & Sherin, 2002) have examined teachers’ decisions and their decision-making processes to gain a better understanding of what teachers attend to in a classroom situation and how they make sense of what they attend to using their knowledge of teaching, learning, and the context of the situation. As students engage in a particular classroom activity, what teachers attend to is important in order to understand the complicated mechanisms of teachers’ decision making, which influences student learning (Star & Strickland, 2008).

Other researchers (e.g., Jacobs et al., 2010) have suggested that the way teachers respond to what they observe is just as important as what they attend to and how they interpret it. Jacobs et al. argued that teachers’ decisions about what to do next are an integrated move, taking place almost simultaneously with their attention to what has previously occurred and their interpretations of those occurrences. Thus, identifying and examining teachers’ three actions (attending, interpreting, and deciding how to respond) in a classroom activity could provide insight into teachers’ decision-making processes.

These three actions comprise Jacobs et al.’s professional noticing of children’s mathematical thinking framework. Researchers in mathematics education have extended or adapted Jacobs et al.’s (2010) noticing framework to connect to other areas of mathematics education, including teachers’ noticing of children’s participation in classrooms (Wager, 2014) and teachers’ noticing of curricular materials (Males, Earnest, Dietiker, & Amador, 2015).

Even though much of the mathematics education research on teacher noticing is situated in the context of the classroom or artifacts of the classroom, and teachers’ evaluation of technology is done outside the classroom, the processes teachers use to evaluate technology are similar to noticing. When evaluating and noticing, teachers perceive and attend to some specific object or behavior in the specific context, interpret the object or behavior, and possibly respond to their interpretation of the object or behavior if warranted.

Two distinct differences appear between evaluation and teacher noticing, however. First, evaluation uses a set of criteria to assess an object or situation, whereas noticing examines what teachers attend to, free from any guidelines or criteria. When teachers notice, they focus on what seems interesting about the situation.

Second, teacher noticing does not seem to include any sort of judgment, whereas the point of evaluation is to determine the merits and value of the object. Yet, Sherin, Russ, and Colestock (2011) contended that the way a teacher perceives a classroom situation is influenced by the teacher’s expectations, knowledge, and understanding of the situation. When teachers attend to and interpret a student’s mathematical thinking, they are focusing on what the student knows and understands, which will be based on the teacher’s understanding of the mathematical content and how students think about the content. Thus, teachers make comparisons between what they expect the students to know and what the students actually know.

One could consider these expectations a form of criteria. When noticing, the teacher may not be making a merit-based judgment, but the teacher is judging the students’ knowledge and understanding. In some ways, evaluation could be viewed as a specific type of noticing. Thus, using a noticing lens to examine teachers’ evaluation of MCTs seems reasonable.

Mathematical Cognitive Technology Noticing Framework

Based on the works of Jacobs et al. (2010) and Males et al. (2015), we developed the mathematical cognitive technology noticing (MCTN) framework (Smith, Shin, & Kim, 2017b) in order to use a noticing lens to examine how mathematics teachers evaluate MCTs. The MCTN framework consists of the three noticing actions (attending, interpreting, and responding) in which teachers could engage when evaluating MCTs and the types of features or activities within each of these actions (see Figure 1). In the following sections, we provide descriptions of each of the activities within each of the three actions of attending, interpreting, and responding.

Figure 1. Mathematical Cognitive Technology Noticing Framework.

Attending. In this framework, attending refers to the features of the MCT that teachers identify when evaluating the MCT. The teachers demonstrate their attentions by making remarks verbally or by interacting with the features within the MCT. We categorized these features into six codes: instructions and questions, mathematical features, interaction, aesthetics, communication, and supportive feature (see Figure 1).

For example, if teachers discussed or read the MCT’s instructions or questions, we coded these attentions as instructions and questions because the teachers focused on these particular features of the MCT. When teachers focused on the mathematical objects within the MCT (e.g., a geometric figure, a graph of a function, a scatterplot), we coded these attentions as mathematical feature. When teachers interacted with the objects within the MCT (e.g., dragging figures or parts of figures, inputting values), we coded these attentions as interaction. When teachers focused on the layout of the MCT or website, such as color or the organization of the MCT, we coded these attentions as aesthetics. When teachers attended to the ways the MCT allows them to collect or display students’ work, or allows students to communicate with other students and the teacher, we coded these attentions as communication. Last, when teachers focused on particular features of the MCT that assist in operating the MCT, such as help buttons or a video demonstration, we coded these attentions as supportive feature.

Interpreting. When teachers provided their thoughts and ideas about particular features of the MCT or the MCT as a whole, we referred to these actions and statements as interpreting. In this category, teachers reflected on their own uses of the MCT, considered how students may use the MCT, and anticipated how possible uses of the MCT affect students’ learning of mathematical concepts. We categorized these interpretations into six codes: design, differentiation, mathematics, student engagement, student thinking, and value (see Figure 1).

When teachers anticipated how the layout of the MCT (e.g., use of color to represent various mathematical objects) may influence students’ learning, we coded these interpretations as design. When teachers considered the ways in which the MCT may affect the learning of particular populations of students or whether the MCT, or one of its features, provides opportunities for all students to learn, we coded such interpretations as differentiation. We also created the code mathematics to document when a teacher interpreted the display of the mathematical ideas in the MCT or interpreted and anticipated the ways in which students interact with mathematical objects within the MCT (e.g., how the display and use of multiple linked mathematical representations within the MCT influence students’ interaction or learning).

When teachers anticipated how the MCT may enhance students’ learning or distract them from learning, we coded the interpretations as student engagement. Sometimes teachers considered how students’ uses of the MCT may influence their thinking about the mathematical content. We coded these interpretations as student thinking. Last, we created the code value to document when teachers expressed their preference for a particular feature of the MCT or the MCT as a whole.

Responding. Finally, while evaluating MCTs, teachers may make decisions based on their attentions to and interpretations of the MCT’s features. We refer to these actions as responding. In this category, we created four possible types of responses: choose, adapt, incorporate, and redesign (see Figure 1). First, when teachers decided whether they wanted to select the MCT after using and examining it, we coded such responses as choose. Sometimes teachers wanted to change the MCT itself to better fit their teaching. For example, when teachers wanted to add or remove a particular feature from a given MCT, we coded these kinds of responses as redesign. When teachers did not state any modifications to the MCT itself but planned to emphasize particular features or considered having students answer particular questions, we coded the responses as adapt.

Teachers may have also chosen not to modify the MCT but considered what they would need to do to successfully implement the MCT into their classroom (e.g., a demonstration of how to use the MCT or a conversation about a particular feature). We coded these responses as incorporate. In both incorporate and adapt responses, teachers did not change the MCT or a feature of the MCT. Instead, in the incorporate responses, teachers considered how to include the MCT into existing classroom activities, while in the adapt responses, teachers attempted to make minor adjustments by placing greater emphasis on particular features to meet learning goals.
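
As a compact summary, the actions and codes described above can be rendered as a simple data structure. The sketch below is our own Python rendering, reconstructed solely from the prose descriptions in this section rather than from the authors’ Figure 1:

```python
# The MCTN framework's three noticing actions and their codes,
# reconstructed from the descriptions above.
MCTN_FRAMEWORK = {
    "attending": [
        "instructions and questions", "mathematical features", "interaction",
        "aesthetics", "communication", "supportive features",
    ],
    "interpreting": [
        "design", "differentiation", "mathematics",
        "student engagement", "student thinking", "value",
    ],
    "responding": ["choose", "adapt", "incorporate", "redesign"],
}
```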

Use of the MCTN framework can provide insights into how mathematics teachers evaluate MCTs and the types of knowledge they draw upon and use in their evaluations. The framework allows researchers to document and examine the features to which mathematics teachers attend, how they interpret these features, and how and whether they respond based on their attentions and interpretations.

In this study, we examined novice secondary mathematics teachers’ evaluations of MCTs. To examine their evaluations, we used a noticing lens, specifically the MCTN Framework. The research question that guided our study was, “In what ways do novice secondary mathematics teachers evaluate MCTs, specifically online dynamic geometry tools?” (For the purposes of this paper, we considered tools as the online applet and surrounding features on the webpage including, but not limited to, instructions and directions, questions, and supportive resources.)

Methods

Participants

We conducted the study in a mathematics education course at a university in the southeast region of the United States. The course was an elective for graduate and undergraduate students focused on the teaching and learning of mathematics with technology. The lead researcher was the instructor of the course. In the study, we grouped participants in trios to facilitate conversations among group members as they evaluated the online dynamic geometry tools.

The lead researcher conducted a short survey on the first day of class, in which he asked each member of the class to list the mathematics and mathematics education courses they had recently taken, their teaching experience (amount, type of experience, and grade level), their previous degrees and current degree program, and their familiarity with technology (types of technology they had used and courses they had taken). After reviewing the survey data, the lead researcher grouped the 21 class members into seven trios based on (a) the amount of their teaching experience, (b) the level of their prior teaching experience (high school or middle school), (c) their knowledge of and familiarity with technology, and (d) their level of education.

Due to limited resources and participants’ agreement to take part in the study, we captured the work of five groups. This paper describes our analysis of one group of novice secondary mathematics teachers in order to address the research question. All three teachers (Mary, Abby, and Bob; all names are pseudonyms) had earned undergraduate degrees in secondary mathematics education from the same program and university within 2 years of the study. During their undergraduate studies, all had completed at least one mathematics education technology course and a semester-long student teaching experience.

At the time of the study, Mary, Abby, and Bob were completing their first year of full-time teaching at either a middle school or a high school. They were also enrolled in a mathematics education master’s program. We selected this group for a number of reasons. The teachers’ educational backgrounds and teaching experience were very similar within the group, and this was the only group of teachers who were all certified to teach mathematics in secondary schools (other groups were either seeking certification or certified to teach grades 4-8). During the unit that preceded their evaluation, this group of teachers displayed an advanced knowledge of the content and technology, regularly referenced their own students, and considered their own potential actions in their classrooms. Although the group had limited classroom experience, the combination of being comfortable with the content, having teaching experience on which to reflect, and being knowledgeable about various MCTs made them strong participants with whom to begin examining novice secondary mathematics teachers’ evaluation of MCTs.

Context and Data Collection

We collected data during the second day of class. Each class meeting was approximately 3 hours in length. The lead researcher began by asking the class to consider the Triangle Inequality Theorem and had them use a preconstructed dynamic geometry sketch and task sheet to investigate the theorem. Then, he asked the class to analyze the sketch and task sheet to examine some of the benefits and limitations in using the MCT to explore this theorem and anticipate how eighth-grade students would use and think given the same instructional activity.

In addition, the lead researcher showed a sequence of video clips of a pair of eighth-grade students using the same preconstructed dynamic geometry sketch and task sheet to consider how the MCT influenced students’ mathematical thinking and learning (the sequence of activities and questions can be found in Hollebrands & Lee, 2012, pp. 23–26). At the completion of the class activity, the lead researcher gave an assignment in which each group was to develop a list of criteria to analyze online dynamic geometry tools designed to help students learn the Triangle Inequality Theorem.

After the groups of participants had developed their criteria, they analyzed and evaluated the four online tools using their criteria. The lead researcher instructed each group to evaluate each of the four tools and to pay particular attention to the features of each MCT they viewed as beneficial and those they viewed as drawbacks. Afterwards, each group compared the four online tools and selected the one they would use to teach the Triangle Inequality Theorem.

We chose this sequence of activities for two main reasons. First, Hollebrands and Lee (2012) developed the Preparing to Teach Mathematics With Technology curriculum using technology, pedagogy, and content knowledge (TPACK) as their design framework, which we believe is foundational to teachers’ evaluation of MCTs. Second, the Triangle Inequality Theorem is a core geometric concept that secondary school students regularly encounter.

Each group was given two computers to use while evaluating the tools. On one computer, the participants interacted with the online tools. On the other, the participants typed their analysis and evaluation. We video- and audio-recorded the group of novice secondary mathematics teachers in order to capture their words, actions, and gestures as they evaluated the four online tools. In addition, the participants’ uses of both computers were recorded via a screen-capture recording program. We collected and digitized all artifacts, including the group’s criteria and evaluation of the four online tools.

We selected the four tools (described in Appendices A-D) for the teachers to evaluate for a number of reasons. First, we wanted the teachers to evaluate tools that afforded students and teachers the opportunity to interact with the mathematics in various ways. The different types of interaction afforded by the four tools included the use of sliders (Tools 1, 2, and 3), dragging segments to determine whether a triangle could be formed (Tools 1, 3, and 4), and using line segments or blocks (Tool 1). Second, we wanted the teachers to evaluate tools that displayed the mathematics in various ways, including the inclusion of the inequality statements (Tools 1 and 2), the theorem itself (Tool 1), and a visualization of why the theorem is true (Tool 2).

Third, we wanted the teachers to evaluate tools that offered various forms of feedback, including feedback on whether the included inequality statements were true (Tools 1 and 2), feedback on whether a triangle could be formed for a particular set of segment lengths (Tools 1 and 3), or no feedback at all (Tool 4). Finally, we wanted the teachers to evaluate tools that offered various types of support, including questions to answer (Tools 2 and 3) and links to videos and other related materials (Tool 4). By having the teachers evaluate tools with unique sets of features, we believed we would be able to better examine the types of features the teachers attended to and how and whether they interpreted and responded to the tools’ features.
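
The mathematical check underlying the feedback in Tools 1-3 is the triangle-existence test itself. The following is a minimal sketch of that test, written by us in Python for illustration; it is not taken from any of the tools’ source code:

```python
def forms_triangle(a: float, b: float, c: float) -> bool:
    """Triangle Inequality Theorem: segments of lengths a, b, and c form
    a triangle exactly when the sum of every two lengths exceeds the third."""
    return a + b > c and a + c > b and b + c > a

# Slider settings a student might try:
print(forms_triangle(3, 4, 5))   # True  -- a triangle exists
print(forms_triangle(1, 2, 3))   # False -- degenerate case: 1 + 2 = 3
print(forms_triangle(2, 2, 10))  # False -- 2 + 2 < 10
```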

Data Analysis

We created transcripts of the novice secondary mathematics teacher group’s activities and conversations. Members of the research team read through the transcripts, looking for instances in which members of the group were evaluating the MCTs. We paid particular attention to when the participants seemed to be attending to features of an MCT either through their words or their actions using an MCT, when they remarked on particular features or made comments regarding the MCT as a whole, and when they were selecting which MCT they would use with eighth-grade students. In other words, we looked for noticing episodes: portions of the transcript in which the participants were attending, interpreting, or responding.

We used the MCTN framework to code each noticing episode, including the novice secondary mathematics teachers’ actions and discussions while evaluating the MCTs. Two members of the research team independently coded the group’s evaluation of each of the four tools using the MCTN framework. Across the four tools, the two coders had an interrater reliability of 78%. Once we finished coding, we compared codes and reconciled any differences through lengthy discussions. We then looked within and across the tools for trends and commonalities, examining the frequency and chronology of the teachers’ noticing actions.
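
The paper reports the 78% figure without specifying how it was computed; assuming it is simple percent agreement over jointly identified noticing episodes, the calculation looks like the following sketch (the episode labels are hypothetical, invented here for illustration):

```python
def percent_agreement(coder1: list[str], coder2: list[str]) -> float:
    """Fraction of episodes to which two coders assigned the same label."""
    if len(coder1) != len(coder2):
        raise ValueError("Coders must label the same episodes")
    return sum(a == b for a, b in zip(coder1, coder2)) / len(coder1)

# Hypothetical labels for nine noticing episodes from one tool:
coder_a = ["attend:math", "interpret:value", "interpret:student thinking",
           "attend:interaction", "respond:choose", "attend:math",
           "interpret:design", "attend:aesthetics", "interpret:value"]
coder_b = ["attend:math", "interpret:value", "interpret:student engagement",
           "attend:interaction", "respond:choose", "attend:math",
           "interpret:design", "attend:aesthetics", "interpret:mathematics"]
print(f"{percent_agreement(coder_a, coder_b):.0%}")  # 78% (7 of 9 agree)
```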

Findings

Rather than organize this section by the MCTN framework (attending, interpreting, responding), we organized it by tool in order to provide a rich description of Mary’s, Abby’s, and Bob’s evaluations of the tools, along with our analysis of their evaluations. Organizing the section this way highlights the similarities and differences between the teachers’ evaluations of the tools while also demonstrating patterns in their evaluations.

Tool 1

Description. To begin, the teachers opened the first tool (see Appendix A), and Bob immediately attended to the instructions and stated, “… There’s some unnecessary frivolity here. It [the tool] is asking you to do things when it’s already told you the answer [Triangle Inequality Theorem].” Mary disagreed with Bob, stating the tool is relevant to learning the theorem. Bob then reiterated his previous claim stating, “It’s already giving me the answer.” Mary attended to the inequality statements again and said, “The mathematics contained is accurate, that is, inequalities are true.”

Dragging the sides and sliders, Bob said, “I guess it’s fairly user friendly.” Bob and Mary attended to the instructions again and pointed out that they did not provide detailed direction about how to use the tool. Bob clicked on the check boxes “Show Inequality,” “Show Answer,” and “Classification” and said, “I think it tells you too much.”

Next, they discussed whether the tool would enhance student learning or distract from it. Bob claimed it would distract students because it gives too much information, while Mary said it could be beneficial or distracting, depending on whether or not students previously knew the theorem. Later, Mary seemed to agree with Bob’s issues with the tool when she attended to the inequality statements and check boxes, stating, “They tell you the answers. … It doesn’t require you to think.” Bob added, “However, with the support of a lesson, students could use this tool to appropriately navigate through the theorem.”

Next, the teachers attended to the sliders and considered the possible triangles that could be formed. In their written evaluation, the group concluded, “This applet covers all cases, because students can manipulate the side lengths to any combination of values.” In addition, the teachers thought the large amount of information might prevent students from developing misunderstandings related to the theorem. The teachers indicated they could use the check boxes in a differentiated manner by having some students check the boxes and others leave them unchecked. In the end, the teachers thought the tool was better suited for students to verify the Triangle Inequality Theorem and investigate particular cases rather than to develop the theorem on their own.

Analysis. At the beginning of their evaluation of Tool 1, the teachers attended to the instructions and questions. In the middle, they focused more on interaction by dragging sliders and sides and checking boxes to see what happened. After focusing on interaction, the teachers attended to the layout of the tool (i.e., aesthetics) and its benefits and drawbacks. Throughout the evaluation, the teachers attended to mathematical features, making them the features to which they attended most frequently. Their frequent attentions to these features might be simply due to the fact that this tool provided many mathematical features (e.g., side lengths and triangle inequality statements).

We noticed the group’s most common types of interpretations were design and student thinking. Many of these interpretations were related to whether the tool, as Abby said, “gives students too much information in its entirety” and how “it doesn’t require students to think and doesn’t allow students to develop relationships among side lengths independently.” We noticed the teachers interpreted the mathematical features in a variety of ways including mathematics, student thinking, value, design, and differentiation.

The teachers had only one type of response (incorporate), and these responses were paired with two types of interpretations, student thinking and design. For instance, while discussing whether the tool allows for a logical progression of mathematical thought processes, Mary stated, “It doesn’t require thought. But you could provide a lesson. You would have to do extra work to make this applet do that.”

Mary anticipated how students’ potential uses of the tool might influence their thinking or, in this case, the lack thereof. Additionally, she anticipated teachers would need to do “extra work” by building a lesson around students’ uses of the tool that would guide students toward developing the theorem. Similarly, after expressing his concerns that this tool gave students too much information, Bob stated, “If you [teachers] used this the right way in class, you’d be fine.” Thus, the teachers seemed to consider that the success of this tool would depend greatly on the way a teacher incorporated it into the lesson.

Tool 2

Description. Opening Tool 2 (see Appendix B), the teachers initially noticed different features of the tool; Bob noticed the circles and the inequalities in the tool, while Mary began reading the instructions. Then the teachers focused on the side lengths, with Mary stating, “The side lengths are appropriate,” and Bob noting that the tool allowed for a side length of 0. After considering each of the previously mentioned features individually, the teachers began to analyze the tool as a whole.

Mary and Abby thought the tool was clean, easy to read, and visually appealing. However, they worried the tool might be overwhelming or distracting. Specifically, they thought the circles would enhance student learning if the students could persevere beyond the possible distractions they would initially cause. Abby said that “the inclusion of the circles enhance the exploration of the theorem,” though Bob said the tool was not a discovery investigation, “because it tells you the inequalities.”

With the idea of exploration in mind, the teachers discussed their fondness for the questions provided by the tool. The teachers said the second question would afford students the opportunity to develop the theorem. Next, Mary focused again on the sliders’ range of possible values. Together, the teachers decided the tool showed all cases and had a balanced difficulty level. Finally, the teachers returned to the questions at the bottom of the tool and said that the tool provides a platform for developing the theorem because its questions ask students to explain their own reasoning about why the three inequalities must be true for a triangle to exist.

Analysis. We noticed that the most frequent attentions of the teachers were mathematical features, such as circles, inequalities, and sliders of the side lengths. The teachers interpreted these mathematical features from a variety of perspectives (mathematics, student engagement, and student thinking), but we mostly coded their interpretations of mathematical features as value, because the teachers frequently expressed their fondness for the mathematical features of the tool. In addition, student thinking was the second most frequent interpretation related to mathematical features.

When interpreting Tool 2, the teachers focused mostly on student thinking, value, and student engagement. Approximately half of the value interpretations were also associated with student thinking, because the teachers were making both types of interpretations in the same noticing episode. Thus, the teachers seemed to be attending to a particular feature of the tool, interpreting that feature by examining how it influences student thinking and whether the feature and its influence are beneficial to student learning, or expressing their preference. In particular, when the teachers interpreted mathematical features, they interpreted the same features from different perspectives at the same time. For example, the teachers interpreted the circles as distracting features, but they also interpreted them as features that could enhance students’ exploration.

The teachers had only one response (choose). This particular responding action occurred at the end of their evaluation, when they compared the four tools and selected Tool 2 as the one they would use in their classroom. The teachers’ choice was not associated with a particular feature or interpretation but rather was based on their overall analysis of the four tools. Interestingly, during their evaluation the teachers did not consider any changes to the design of the tool, how they could modify the tool’s instructional materials, or how to incorporate it into their classroom.

Tool 3

Description. Upon opening Tool 3 (see Appendix C), Bob said, “I like the smiley face. That’s cool.” After reading the instructions and dragging the sliders, the teachers noticed two forms of feedback when a triangle could be formed: the smiley face and a tan triangle. Although the instructions asked them to drag the segments when a triangle could not be formed, the teachers did not understand why students would need to do that, because the segments would never intersect. The teachers also attended to the angle labels, noting the other tools did not include this feature.

After interacting with the tool and realizing the smiley face only appears when a triangle is formed, Bob said, “It would be better if it gave you a frowny face if it didn’t work.” The teachers said the tool seemed “pretty straightforward”: it shows the smiley face when a triangle is formed, and users simply drag the sliders to adjust the lengths of the segments.

Next, the teachers noted the tool does not present the theorem. Bob said, “It [Tool 3] also has its merits, because it doesn’t tell you the triangle inequality. Like the first one [Tool 1] showed it to you, and this doesn’t show you that relationship.” He also said, “Even though it [Tool 3] created the triangle for you, you could still discover the inequality.” The teachers also considered whether the mathematics was accurately represented, such as whether the side lengths do indeed form a triangle, and noted the use of three different colors to coordinate the segments and their corresponding sliders.

Next, Abby attended to how students interact with the tool, stating the smiley face motivates and encourages students. She also considered the angle labels, saying, “In terms of the mathematics, I think that in this case with the angle measures, since we’re not really concerned with angle measures, it could potentially distract.” Mary suggested teachers should emphasize the question at the bottom. She said, “This question down here at the bottom, does make them [students] think about it [the theorem], but unless I require that they, like, submit some kind of answer for that, they’re just going to ignore that.”

Next, the teachers realized the sliders can be manipulated to any length between 0 and 10 and discussed possible issues with having segment lengths of 0. Abby anticipated students’ thinking saying, “Students could potentially find a problem with side length of zero. However, a discussion could be had about this issue.” Finally, the teachers thought the tool would allow all types of triangles.

Analysis. In our analysis of the teachers’ evaluation of Tool 3, we noticed they mostly attended to mathematical features. They seemed to initially focus on the features that were most unique and distinct (e.g., the smiley face, a tan triangle, or angle labels) but progressed to less obvious features, such as the range of the segment lengths. They interpreted these mathematical features in a variety of ways (student engagement, student thinking, and value), but the most frequent was mathematics. The teachers also attended to instructions and questions at various points of their evaluation and interpreted them in terms of student thinking and value.

Examining the teachers’ interpretations, we noticed the teachers seemed to focus on student thinking and value. As the teachers’ evaluation progressed, we noticed their interpretations focused more on anticipating and inferring how students would think and engage with the tool. For example, when the teachers attended to the smiley face or the tan triangle, they initially made sense of it as a fun feature of the tool (i.e., value), then interpreted it as a means to engage students in thinking mathematically (i.e., student engagement), and finished by considering whether it could prevent students from developing the Triangle Inequality Theorem because the feedback was very direct (i.e., student thinking).

Based on their attentions and interpretations, the teachers made all four types of responses (incorporate, redesign, adapt, and choose). The teachers’ incorporate responses were associated with how the tool represents mathematical features and how it would influence student thinking. For example, when discussing how the sliders’ range included 0, Abby considered how she could incorporate the tool into her teaching. She recognized the need to have a discussion about this particular mathematical feature to help students understand or resolve their possible confusions. When discussing whether to choose this tool as the best one to use with students, the teachers also considered student thinking, that is, how the tool helps students develop the inequality theorem. Bob considered redesigning the tool to include a frowny face when a triangle was not formed, a response based on his interaction with the tool. Mary considered how she would adapt the tool by emphasizing the question at the bottom (instructions and questions) to “facilitate students’ deeper understanding of the theorem” (student thinking).

Tool 4

Description. Opening Tool 4 (see Appendix D), the teachers immediately attended to the text on the website and said they liked that it included the mathematical objectives. They located two sets of instructions, which they found odd. Next, the teachers focused on the segments, counting them and attempting to drag them to form a triangle. They attended to the behavior of the segments as they dragged them and became frustrated. Mary said, “I can just drag them all over the place like a worm. I don’t understand [what] to do.”

Abby tried to focus on whether the tool accurately portrayed mathematical concepts, but she remained unsure. Bob attempted to change the size of the segments, which were fixed, by using the + and – keys as indicated by the instructions. Bob quickly realized pressing these keys just zoomed in and out within the applet and did not actually change the segments’ lengths. The teachers found the zooming feature to be unnecessary and thought that drawing attention to it might distract and confuse students. Mary noted that the tool would not allow students to examine all cases because there is a fixed number of segments of specific lengths, but said the zoom feature could be beneficial for students who have vision difficulties. Bob indicated that if one could resize the segments, the tool could cover more cases.

The teachers returned to the instructions and said they were unclear and difficult to follow. They found the number of segments to be overwhelming and distracting, which they thought would likely frustrate students and potentially cause them to give up. They focused their attention again on the behavior of the segments as they dragged the endpoints. Mary noticed the behavior of a segment differed depending on which endpoint was being dragged. She became exasperated when she could not tell which endpoint determined which dragging behavior. She also questioned why there were multiple segments of the same length and why the segments were different colors.

Abby attempted to redirect Mary’s and Bob’s attention by focusing on the questions within the tool, which Abby thought were good. Mary disagreed, saying the questions focus on forming particular triangles but do not directly assist students in developing the Triangle Inequality Theorem. Mary and Abby agreed the tool was not hard to understand, but the limited number of segments would not provide an accurate portrayal of the number of possibilities. Finally, the trio noted the inequality statements were never suggested on the website.

Analysis. In our analysis of the teachers’ evaluation, we noticed the teachers immediately began attending to the instructions and questions and continued to return to these features throughout their evaluation. In fact, the teachers attended to the instructions and questions more than any other attention category. They also focused their attention on their interaction with the tool, primarily their difficulty dragging the segments and the mathematical features of the tool, including the number of segments on the screen and the absence of the triangle inequality statements. Surprisingly, the teachers did not attend to the supportive features, such as the video lecture and links to other related websites.

The most common interpretation was value. In fact, most of their interpretations of the instructions and questions were coded as value, with the teachers consistently expressing their dislike for the instructions and questions. The trio had only one noticing episode coded as student thinking (also coded as value): Mary and Abby’s discussion of the quality of the questions. The teachers often interpreted their interactions in terms of design because they considered how the design of the tool influenced student interaction and learning rather than considering whether a feature was beneficial.

The mathematics interpretations were based on their interactions or the mathematical features, or were not tied to particular features. Across their evaluation of this tool, the teachers mainly interpreted the features in a negative manner, because they thought their students were unlikely to develop the Triangle Inequality Theorem using this tool. The teachers had a single response in their evaluation of this tool. We coded Bob’s remark about the possibility of resizing the segments as redesign because he was considering technical features of the tool that he would change to benefit students’ learning.

Across Tools

During our examination of the teachers’ evaluations of the four tools, we noticed particular trends. First, there seemed to be a consistent temporal pattern to the teachers’ evaluations. Generally, they began by figuring out how to use the technology (instructions and questions and interaction), attending to its appearance (aesthetics), and making initial judgments about the MCT and its features (value). Then, they generally moved toward examining the mathematical features and how they related to student thinking.

This behavior may explain the outlier that is Tool 4. As noted previously, the teachers disliked this tool due to their difficulty in figuring out how to use it. It seems the teachers never moved past the how-to-use and appearance stage of their evaluation to attend to the mathematical features and interpret how those features affect student thinking.

Moreover, with the knowledge that Tool 4 is likely an outlier, it is evident the teachers focused predominantly on the mathematical features of the tools. In fact, approximately half of the teachers’ attentions in Tools 1, 2, and 3 were directed toward mathematical features. In addition, these mathematical features were viewed in a variety of ways. The teachers attended to mathematical features alone, but also in terms of aesthetics, the interactions they afforded, and how the instructions and questions connected with the mathematical features. At no point during their evaluation did the teachers consider communication features or supportive features of the tools, even though Tool 4 did include many supportive features.

The teachers also interpreted the mathematical features in every way possible under the MCTN framework. That is, the teachers interpreted how the design of the tool related to the mathematical features, how the mathematical features could be used in a differentiated manner, and how the mathematical features would influence the ways in which students interacted with the mathematical objects in the tool. Additionally, the teachers considered whether certain mathematical features would be distracting or could influence student thinking, and judged whether the features were beneficial.

Discussion

We view the novice secondary mathematics teachers’ significant focus on the mathematical features of the tools as a positive sign of the potential for teachers to make accurate and well-informed decisions when considering technologies to implement in their classrooms. However, the teachers in this study were also asked to select a tool at the end of their evaluations, and this selection can also inform what teachers find important when considering technologies.

This group chose Tool 2 at the end of their evaluations, with Tool 3 as the runner-up. The teachers’ evaluations of these tools both focused on mathematical features and student thinking but also showed some stark differences. For example, when evaluating Tool 2 the teachers often considered how the tool would affect student engagement, which they rarely did while evaluating Tool 3. This finding suggests that student engagement remained an important factor for these novice secondary mathematics teachers, a similarity shared with prospective elementary teachers (e.g., Battey et al., 2005; Johnston & Suh, 2009).

The most significant contrast between their evaluations of the two tools was how the teachers responded. During their initial analysis of Tool 2, the teachers only responded by choosing this tool as the one they would use in their classroom. However, the teachers responded eight times to Tool 3 during their analysis. The teachers considered redesigning this tool to include a frowny face when a triangle was not formed, incorporating the tool by providing instructions to the students, and adapting the tool by emphasizing the end questions and requiring students to answer them. They also considered adding a discussion about the possibility of a side length of 0 and developing a supporting guided lesson on the inequalities present in the theorem.

Overall, the teachers felt Tool 3 provided freedom for students to develop the theorem, but perhaps too much freedom, requiring a great deal of teacher support to make the tool a success. Perhaps this is why, in the end, the teachers decided Tool 2 would be their choice, even after robust conversation about how to successfully modify and use Tool 3. The amount of work necessary to successfully implement the technology seems to have been a significant influence on the teachers’ decision.

The mathematical features and potential for student engagement, coupled with the ease of implementation, led the teachers to choose Tool 2. However, the criteria the teachers developed also influenced their evaluations. For example, the teachers interpreted how each tool allowed for differentiation. For three of the four tools, the teachers considered differentiation only once and did so independently of the features of the tool.

Furthermore, for these three instances, it was always in response to their criterion “Does it have a balanced difficulty level?” We cannot be certain whether the teachers would have made this kind of interpretation without the use of this criterion. In fact, during their evaluation, they consistently referenced their criteria to ensure their evaluations were thorough.

Implications

Although researchers and policy makers have addressed the importance of evaluating and selecting an appropriate technological tool for instruction (e.g., CBMS, 2012; Niess, 2011), little research has examined secondary mathematics teachers’ evaluation of MCTs. This study provides insights into what aspects secondary mathematics teachers consider when evaluating and selecting an appropriate MCT and what aspects mathematics teacher educators should consider to enhance mathematics teachers’ ability to evaluate and select MCTs.

According to Dreher and Kuntze (2015), a lack of mathematics content knowledge tends to prevent teachers from noticing students’ understanding, and thus, specific content knowledge is a prerequisite for teacher noticing. Similarly, during our analysis of Tool 4 we found that the ability to quickly and easily learn how to use a technological tool may be paramount when teachers evaluate it.

The teachers’ analysis of Tool 4 differed greatly from their analyses of the other three tools both in content and in breadth due to their inability to fully comprehend the tool. Our finding may be consistent with Dreher and Kuntze’s (2015) finding, in that teachers’ specific technological knowledge (e.g., how to use a tool) is a prerequisite for focusing their evaluation on mathematical features and student thinking. Thus, it seems that in order for teachers to perform thorough and effective evaluations of technology, mathematics teacher educators must help teachers develop their abilities to determine how a tool works.

Although the novice teachers in our study focused on the mathematical features and student thinking in their evaluation, their selection of a tool was significantly influenced by the tool’s potential for student engagement and the ease of implementation. Battey et al. (2005) were concerned with prospective teachers’ tendency to see the use of technological tools as a stand-alone, fun activity. Even though our participants did not necessarily view technology as a supplemental activity, engagement with the tool was still a high priority for them. The roles technological tools play in student engagement and motivation are important, but the use of technological tools should be considered an integral part of teaching and learning mathematics (CBMS, 2012; Niess, 2011).

The findings in our study indicated that mathematics teacher educators must provide teachers with an opportunity to evaluate MCTs. Moreover, mathematics teacher educators should provide teachers appropriate guidance so that teachers can focus more on students’ mathematics and mathematical thinking than merely on whether a tool motivates and engages students.

Limitations and Future Direction

The focus of this study on noticing episodes and trends of a single group of teachers has illuminated possible patterns in teachers’ evaluation of MCTs. However, by examining only what one group of teachers noticed when evaluating one familiar mathematical topic (the Triangle Inequality Theorem) using one particular kind of tool (online dynamic geometry applets) in a limited amount of time, we cannot generalize to the entire population of secondary mathematics teachers. Nevertheless, our findings indicate that novice teachers are able to perform thoughtful evaluations of technology and do so in a particular manner.

Moreover, the use of criteria, which was part of the sequence of instructional activities, seemed to influence the teachers’ evaluations. Because the teachers created their own criteria to evaluate the MCTs, they did not consider all of the possible features. Perhaps having teachers use the MCTN framework as a guide to evaluate MCTs would develop teachers’ knowledge and skills to thoroughly evaluate MCTs. Future research should investigate the relationships between the sequence of activities, including the development of criteria, and teachers’ evaluations of MCTs, as well as teachers’ uses of the MCTN framework as a guide for evaluating MCTs.

The results of this study suggest further research should be conducted to examine the many patterns that arose during the data collection and analyses of this study. Besides further study applying the ideas of this paper to a larger sample, we have identified other studies that, together with the current study, would build a robust perspective on how teachers evaluate technology tools for potential use in their classrooms (e.g., Battey et al., 2005; Johnston & Suh, 2009; Smith et al., 2017a). One such study would compare secondary mathematics teachers with varying levels of teaching experience to see how teaching experience affects secondary mathematics teachers’ evaluations of technology. Smith et al. (2017a) found teaching experience, even only 1 year, seemed to greatly influence the criteria teachers create to evaluate MCTs.

Another study would include more attention to teachers’ beliefs about teaching and learning mathematics and how teachers with various belief structures evaluate MCTs. In addition, examining how expert technology-using teachers evaluate MCTs could provide a clearer view of how mathematics teachers evaluate MCTs. Finally, a study in which researchers examine how teachers evaluate MCTs, coupled with the ways in which they notice students’ thinking as they implement the selected MCT, would provide a more holistic picture of teachers’ decision making when teaching mathematics with technology.

References

Association of Mathematics Teacher Educators. (2015). AMTE technology position statement. Retrieved from https://amte.net/publications

Battey, D., Kafai, Y., & Franke, M. (2005). Evaluation of mathematical inquiry in commercial rational number software. In C. Vrasidas & G. Glass (Eds.), Preparing teachers to teach with technology (pp. 241–256). Greenwich, CT: Information Age.

Conference Board of the Mathematical Sciences. (2012). The mathematical education of teachers II. Providence, RI: American Mathematical Society.

Dreher, A., & Kuntze, S. (2015). Teachers’ professional knowledge and noticing: The case of multiple representations in the mathematics classroom. Educational Studies in Mathematics, 88(1), 89–114.

Hollebrands, K., & Lee, H. (2012). Preparing to teach mathematics with technology: An integrated approach to geometry. Dubuque, IA: Kendall Hunt.

Jacobs, V. R., Lamb, L. L. C., & Philipp, R. A. (2010). Professional noticing of children’s mathematical thinking. Journal for Research in Mathematics Education, 41(2), 169–202.

Johnston, C., & Suh, J. (2009). Pre-service elementary teachers planning for math instruction: Use of technology tools. In I. Gibson, R. Weber, K. McFerrin, R. Carlsen, & D. A. Willis (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2009 (pp. 3561–3566). Chesapeake, VA: Association for the Advancement of Computing in Education.

Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The technological pedagogical content knowledge framework. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 101–111). New York, NY: Springer.

Males, L. M., Earnest, D., Dietiker, L. C., & Amador, J. M. (2015). Examining K-12 prospective teachers’ curricular noticing. In T. G. Bartell, K. N. Bieda, R. T. Putnam, K. Bradfield, & H. Dominguez (Eds.), Proceedings of the 37th annual meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education (pp. 88–95). East Lansing, MI: Michigan State University.

National Council of Teachers of Mathematics. (2008). The role of technology in the teaching and learning of mathematics. Retrieved from http://www.nctm.org/about/content.aspx?id=14233

National Council of Teachers of Mathematics. (2014). Principles to actions: Ensuring mathematics success for all. Reston, VA: Author.

Niess, M. L. (2011). Investigating TPACK: Knowledge growth in teaching with technology. Journal of Educational Computing Research, 44(3), 299–317.

Pea, R. D. (1987). Cognitive technologies for mathematics education. In A. Schoenfeld (Ed.), Cognitive science and mathematics education (pp. 89–122). Hillsdale, NJ: Erlbaum.

Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage Publications.

Sherin, M. G., Russ, R. S., & Colestock, A. A. (2011). Assessing mathematics teachers’ in-the-moment noticing. In M. G. Sherin, V. R. Jacobs, & R. A. Philipp (Eds.), Mathematics teacher noticing: Seeing through teachers’ eyes (pp. 79–94). New York, NY: Routledge.

Smith, R. C., Shin, D., & Kim, S. (2017a). Prospective and current secondary mathematics teachers’ criteria for evaluating mathematical cognitive technologies. International Journal of Mathematical Education in Science and Technology, 48(5), 659–681.

Smith, R., Shin, D., & Kim, S. (2017b). A framework for examining teachers’ noticing of mathematical cognitive technologies. Journal of Computers in Mathematics and Science Teaching, 36(1), 41–63.

Star, J. R., & Strickland, S. K. (2008). Learning to observe: Using video to improve preservice mathematics teachers’ ability to notice. Journal of Mathematics Teacher Education, 11(2), 107–125.

van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.

Wager, A. A. (2014). Noticing children’s participation: Insights into teacher positionality toward equitable mathematics pedagogy. Journal for Research in Mathematics Education, 45(3), 312–350.


Appendix A
Tool 1 Description

Figure A1. The first tool evaluated by the teachers. The applet is available at http://www.geogebra.org/en/upload/files/english/dtravis/triangle_inequality.html


This tool provides a wealth of features to assist students in exploring the Triangle Inequality Theorem. The instructions that accompany the tool state its goal, which is to allow users “to see why the sum of two sides must be greater than the third in order for a triangle to exist,” and tell users to “Adjust the side lengths accordingly, then move the blue points to try and create a triangle.” Three inequality statements are displayed within the tool: a + b > c, a + c > b, and b + c > a. The sides are adjusted using sliders that range from 1 to 10 in increments of 1; the color of each side corresponds to the color of its slider. Users can view the segments in two ways: as box units (the default), in which they must drag the blue point on the first box (see Figure A1a), or as line segments, in which they must drag the endpoints of the green and red segments (see Figure A1b). To change to the segments, the user must uncheck the box next to Box Units (in the upper left-hand corner) and then check the box next to Line Segments. When the box next to Show Inequality is checked, the tool displays the three inequality statements with the values from the sliders and shows “YES” or “NO” depending on whether each inequality holds (see Figure A1b). When users check the box next to Show Answer, one of two things is displayed. If a triangle can be formed, Show Answer displays “YES!!!” and another check box appears that displays the classification of the triangle (e.g., an obtuse triangle in Figure A1b). If a triangle cannot be formed, “NO!!!” appears without the Classification check box.
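Readers who wish to see the logic behind the tool’s Show Inequality and Show Answer feedback may find a short sketch helpful. The following Python code is our own illustration under the assumption that the applet simply tests the three displayed inequalities; it is not the applet’s implementation, and the function name triangle_checks is hypothetical.

    def triangle_checks(a, b, c):
        """Evaluate the three inequality statements the tool displays and
        report whether the side lengths can form a triangle."""
        checks = {
            "a + b > c": a + b > c,
            "a + c > b": a + c > b,
            "b + c > a": b + c > a,
        }
        return checks, all(checks.values())

    # Example: sides 2, 3, and 6 cannot form a triangle because 2 + 3 > 6 fails.
    checks, forms_triangle = triangle_checks(2, 3, 6)
    for statement, holds in checks.items():
        print(statement, "YES" if holds else "NO")
    print("YES!!!" if forms_triangle else "NO!!!")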


Appendix B
Tool 2 Description

Figure B1. The second tool evaluated by the teachers. The applet is available at http://geogebracentral.blogspot.com/2010/09/triangle-inequality.html

The tool uses circles to show whether a triangle can be formed for a given set of segment lengths. Two of the segments (orange [a] and blue [b]) determine the radii of two independent circles, with the green segment (c) connecting the centers of the circles. If the circles intersect twice, a triangle is automatically formed (see Figure B1a). If the circles do not intersect or intersect only once, a triangle is not formed (see Figure B1b). The user creates segments of given lengths using color-coded sliders that correspond with the colors of the segments. The sliders range from 0 to 5 units in increments of 0.1. Three inequality statements are displayed at the top of the left side of the tool: a + b > c, b + c > a, and c + a > b. The tool displays True or False depending on whether each inequality holds for a given set of segment lengths. There are three types of interaction in this tool: (1) dragging the sliders, (2) dragging the circles, and (3) the feedback provided by the inequality statements and the visual representation of the segments and circles. Users cannot drag the actual segments because the construction of the triangle depends on the intersection of the circles. One can drag one of the circles when a triangle is formed, but doing so only reorients the triangle. The tool’s instructions ask users to move the sliders, observe what happens, and explain why the inequalities must be true for a triangle to exist. In addition to the tool, the website provides information about GeoGebra, including a collection of GeoGebra tools, tutorials on how to use the software, and suggestions on how to use GeoGebra in the teaching and learning of mathematics.
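The geometric condition behind this tool can be stated compactly: two circles of radii a and b whose centers are c apart intersect in two points exactly when |a − b| < c < a + b, which for positive lengths is equivalent to the three triangle inequalities. The following Python sketch is our own illustration of this equivalence, not the applet’s code; both function names are hypothetical.

    def circles_intersect_twice(a, b, c):
        """Circles of radii a and b whose centers are c apart meet in
        two points exactly when |a - b| < c < a + b."""
        return abs(a - b) < c < a + b

    def triangle_inequalities(a, b, c):
        """The three statements the tool displays as True or False."""
        return a + b > c and b + c > a and c + a > b

    # For positive side lengths, the two conditions always agree.
    assert circles_intersect_twice(3.0, 4.0, 5.0) and triangle_inequalities(3.0, 4.0, 5.0)
    assert not circles_intersect_twice(1.0, 2.0, 5.0) and not triangle_inequalities(1.0, 2.0, 5.0)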


Appendix C
Tool 3 Description

Figure C1. The third tool evaluated by the teachers. The applet is available at http://highaimsggb.pbworks.com/f/Triangle_Inequalities_Garrison.html


This tool relies on various forms of feedback for users to develop the Triangle Inequality Theorem. The website provides brief instructions on how to use the tool and, underneath the tool, a question asking users to generalize a theorem based on their experiences using it (see Figure C1). The tool’s display differs considerably depending on whether a triangle can be formed. Two sets of objects are displayed at all times: a set of sliders (pink, green, and yellow) and a set of segments whose lengths and colors are linked to the corresponding sliders. Users are able to change the lengths of the segments using sliders that range from 0 to 10 in increments of 0.5. When the sliders are dragged such that the segments can form a triangle, two additional objects appear: a smiley face and a tan triangle (see Figure C1a). The tan triangle shows the user where the pink and green segments would need to be dragged in order for the triangle to be created, and the user can drag each endpoint of these two segments to make the triangle. The tan triangle also has markings for its interior angles, which are named α, β, and γ; however, the measures of these angles are not displayed, and the user cannot measure them. If the conditions for a triangle are not satisfied, the tool displays neither the tan triangle nor the smiley face (see Figure C1b). Users can drag each endpoint of the two segments to verify that a triangle cannot be formed. The tool’s website does not provide any further assistance or instructions beyond the name of the tool’s creator and a link to GeoGebra.
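The tan hint triangle amounts to computing where the free endpoints must meet. One way to find that apex, given a base of length c along the x-axis and the other two side lengths a and b, is shown in the following Python sketch; this is our own illustration of the standard construction via the law of cosines, not the applet’s implementation, and the function name triangle_apex is hypothetical.

    import math

    def triangle_apex(a, b, c):
        """Place the base of length c from (0, 0) to (c, 0) and return the apex
        that lies at distance a from (0, 0) and distance b from (c, 0),
        or None when the side lengths cannot form a triangle."""
        if not (abs(a - b) < c < a + b):
            return None
        x = (a * a + c * c - b * b) / (2 * c)   # from the law of cosines
        y = math.sqrt(a * a - x * x)            # nonnegative when a triangle exists
        return (x, y)

    print(triangle_apex(3, 4, 5))   # (1.8, 2.4): a valid right triangle
    print(triangle_apex(1, 2, 5))   # None: no triangle exists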


Appendix D
Tool 4 Description

Figure D1. The fourth tool evaluated by the teachers. The applet is available at http://teachers.henrico.k12.va.us/math/IGO/03TrianglesPolygons/3_3.html


The fourth tool provides a number of supports for teachers and students (see Figure D1). On the left side of the tool are links to warm-up problems, activities and notes, the video (which is also displayed below the tool), and practice problems. On the right side of the screen, the objectives are provided, along with links to two sets of instructions that must be clicked to open and links to other activities: an online quiz, a webpage on how to construct a triangle from three given segments, a hands-on activity, and a contextual problem about the number of different triangles that can be created from a rope with 13 knots. One set of instructions tells the user how to zoom in and out within the tool and how to move the segments to create a triangle. The second set asks students to find the segments that create the smallest triangle and the largest triangle, to determine whether segments of lengths 3, 4, and 7 will form a triangle, and to describe the relationship among the three sides of a triangle. There are seven segments with fixed whole-number lengths. The segments are colored green and red, but the tool does not indicate why the segments are different colors. To move a segment, one must drag one of its endpoints (the segment itself is not draggable). Both endpoints are draggable but do not behave the same way, as described below.

  1. One of the endpoints (A) acts as the center of a circle such that, when the other endpoint (B) is dragged, B revolves around A.
  2. When A is dragged, the segment acts as if it were a rigid object. One can pull the segment by dragging A, and the segment follows; or one can push A toward the segment, and the segment moves ahead of A in the direction A is being dragged. The movement is similar to that of a wagon: one can pull the handle (A) and the wagon follows, or push the handle and the wagon moves according to the direction of the force applied to the handle.

The endpoints of the segment are not labeled, so one has to ascertain the behavior of each endpoint by dragging it; the instructions do not discuss the behavior of the segments. One plausible model of this dragging behavior is sketched below.
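The following Python sketch is our own reconstruction of the two drag modes described above, under the assumption that the segment keeps a fixed length throughout; it is not the applet’s code, and the function names drag_B and drag_A are hypothetical.

    import math

    def drag_B(A, B, target):
        """Dragging B rotates the segment about A: B snaps to the point on the
        circle of radius |AB| around A that lies toward the drag target."""
        L = math.dist(A, B)
        dx, dy = target[0] - A[0], target[1] - A[1]
        d = math.hypot(dx, dy) or 1.0   # guard against a zero-length drag
        return (A[0] + L * dx / d, A[1] + L * dy / d)

    def drag_A(A, B, target):
        """Dragging A moves it freely while B trails like a wagon, staying at
        the fixed length along the line from the new A toward the old B."""
        L = math.dist(A, B)
        dx, dy = B[0] - target[0], B[1] - target[1]
        d = math.hypot(dx, dy) or 1.0
        return target, (target[0] + L * dx / d, target[1] + L * dy / d)

    A, B = (0.0, 0.0), (3.0, 0.0)
    print(drag_B(A, B, (0.0, 5.0)))    # B rotates to (0.0, 3.0); the length stays 3
    print(drag_A(A, B, (-2.0, 0.0)))   # pulling A left drags B along to (1.0, 0.0)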
