Borowczak, M., & Burrows, A. C. (2016). Enabling collaboration and video assessment: Exposing trends in science preservice teachers’ assessments. Contemporary Issues in Technology & Teacher Education, 16(2). https://citejournal.org/volume-16/issue-2-16/science/enabling-collaboration-and-video-assessment-exposing-trends-in-science-preservice-teachers-assessments

Enabling Collaboration and Video Assessment: Exposing Trends in Science Preservice Teachers’ Assessments

by Mike Borowczak & Andrea C. Burrows

Abstract

This article details a new, free resource for continuous video assessment named YouDemo. The tool enables real-time rating of uploaded YouTube videos for use in science, technology, engineering, and mathematics (STEM) education and beyond. The authors discuss trends in preservice science teachers' assessments of self- and peer-created videos using the tool. The trends were identified from over 900 assessments of 170 videos by over 131 unique users. Included in this data set is a 2-year study focusing on 27 preservice science teachers (from a 5-year study of 76 science preservice teachers in total) and their use of the tool. The authors collected both quantitative data (numerical scores) and qualitative data (open-ended questions) from the 27 participants. Findings show that (a) simultaneously rating two metrics introduced a non-zero bias between them; (b) preservice teachers found continuous video rating beneficial in enabling video assessment, promoting critical thinking, and increasing engagement; and (c) preservice teachers' self-assessments were uncorrelated with their peers' assessments. Additionally, the elements needed to enable skill improvement were met, including (a) a well-defined task, (b) a challenging task, (c) immediate feedback, (d) error correction, and (e) practice. Implications include improvement in preservice teacher reflection and discussions, especially related to STEM content and pedagogy.

The important concepts of teacher reflection, practice, and improvement sit within the broader context of improving education in the science, technology, engineering, and mathematics (STEM) fields, engaging groups underrepresented in STEM, building K-20 and industry partnerships, using social media in education, and improving assessment. As such, Rich and Hannafin (2009) called for "evidence of impact" of using video and reflection with preservice teachers (p. 64). Preservice and newer in-service teachers often struggle with reflection and, inherently, self-assessment. Interestingly, a significant discrepancy also arises between the peer and self-assessment of end products. In developing culminating products such as videos and self-reflection documents, which many teacher licensure programs require, preservice teachers could benefit from additional peer support in order to improve their self-assessment and reflection skills.

Currently, online video feedback systems consist of a binary like or dislike judgment, accompanied by disjointed and unfocused open-response comments. These systems of peer assessment and feedback offer little constructive benefit to video creators. Viewers also face a challenge when providing summative assessment of videos. While arriving at a binary judgment of a video is quick, defining, describing, and justifying the reasons behind that judgment is challenging because of the number of variables and the multiple points of reference under consideration throughout the video's duration.

Aggregated binary assessments, the type typically available on video sharing sites, are analogous to students' receiving a set of pass/fail grades from all of their teachers, each of whom uses a private and unique scoring rubric. While popular social media sites such as YouTube and Facebook, as well as online learning sites like Coursera and Khan Academy, all allow for discussions of video content, their decoupled free-response structures do not allow for a continuous formative assessment of the original content but, rather, a highly variable summative assessment based on the final opinion of the content viewer.

A different approach or tool that provides continuous feedback could promote positive attitudes toward technology use, which Cullen and Greene (2011) have shown to predict intrinsic and extrinsic motivation. With this in mind, we created YouDemo, an online tool, and used it with preservice teachers to examine several aspects of assessment. This study focuses on the discrepancies between peer and self-assessment, the relationship and bias between formative and summative assessment abilities, and the impact of assessing the work of peers and comparing it to one's self-assessment of similar work.

Purpose/Problem/Gap in Literature

Within the context of educational assessment, a binary rating system provides a weak summative and nonconstructive evaluation of the overall product. The evaluation becomes a function of an individual viewer's personal lens and is not based on a precisely defined metric (characteristic or quality) or metrics over the course of the entire work. Currently, video annotation is predominantly composed of tools that allow for nonaggregating, text-based markup of videos. These tools include standalone PC applications such as VCode (http://social.cs.uiuc.edu/projects/vcode.html) and ANVIL (http://www.anvil-software.org/), as well as online applications that are not typically freely accessible to teachers, such as VideoPaper (https://vpb.concord.org/) and MediaNotes (http://www.cali.org/content/medianotes/). Presently, the only known free annotation tool is VideoANT (https://ant.umn.edu/), which allows text-based annotations to YouTube videos (Hosack, 2010).

In an educational setting, in the age of traditional online courses and massive open online courses (MOOCs), online video-based critiques and assessments by peers and mentors can lack the depth and richness of in-person critiques and debates (Rich & Hannafin, 2009). Practice with video assessment and self-reflection is critical, because many preservice teachers are now subject to edTPA requirements (Barron, 2015) and must submit teaching videos and showcase their ability to reflect and self-assess. Their video submissions are critical to their final edTPA scores.

The tool presented here, YouDemo.org, targets preservice teachers, their K-12 mentor teachers, and university professors who are interested in critiquing peer videos and receiving aggregated evaluation feedback on their own videos. The tool links to existing YouTube videos, allows continuous critique of two metrics (or qualities), and provides a user access to the aggregated assessment.

YouDemo enables the continuous assessment of two video-creator-defined metrics. For the remainder of this article, “video creators” include users who create or upload videos, while “video evaluators” or “assessors” are those who provide feedback for the videos. Creators can view the results of aggregated quantitative metric assessment as well as qualitative feedback provided by evaluators. Creators can then evaluate, reflect, and compare their own self-assessment with an aggregate of their peers’ anonymous assessment of their work. This process allows video creators to gain authentic summative and formative feedback on their videos, which promotes reflection and pedagogical questioning.

YouDemo provides a teaching mechanism for both formative and summative assessment that can support and enable learning at all levels of education. Additionally, the validity and reliability of tools or assignments used in the classroom are important assessment aspects, and YouDemo underwent this scrutiny. As stated by Mertler (2003),

Evidence must be continually gathered and examined in order to determine the degree of validity possessed by decisions. Three formal sources of evidence that support the existence of validity include content, criterion, and construct evidence. Content evidence relies on professional judgment; whereas, criterion and construct evidence rely on statistical analyses. Content evidence of validity is the most important source of evidence for classroom assessments. As with validity, reliability addresses assessment scores and their ensuing use. (p. 66)

Over the course of 5 years, we trialed the continuous evaluation and video data aggregation at three universities in North America. To assess the impact of the tool, we conducted a mixed methods study in which a subset of the trial participants' feedback on their own videos and on those of their peers was captured before and after the tool's use.

Although continuous rating and evaluation of a target source is not a new concept, having been used in election debates (Yang & Park, 2014), behavior coding practices (Messinger, Mattson, Mahoor, & Cohn, 2012), and even emotional responses to music videos (Soleymani, Pantic, & Pun, 2012), we found no connection to teaching. Thus, the new technology used during our study provided preservice teachers with a means of collecting peer assessment of any two instructor-selected video content qualities (such as content clarity, sound level, humor, evidence of data collection, and evidence of data analysis, among others).

Other potential use cases include K-20 teachers collecting critique feedback on student work from a class of students, K-20 students collecting critique feedback on their work or a group’s work from a class or panel of teachers, or administrators collecting feedback on their own work, teacher work, or student work.

To the best of our knowledge, no other online or free tool exists that allows continuous assessment of videos. Furthermore, no tools exist that allow users to specify and enforce the metric, or criteria, that they wish to have evaluated. The tool presented in this study, YouDemo, is a free tool for continuous, metric-focused evaluation of videos, enabling formative, anonymous peer assessment as well as experience in self-reflective practice.

Theoretical Framework and Literature Review

In using the video assessment technology, we embraced a social constructivist view (Vygotsky, 1978) as a theoretical framework. Focusing on the social process of learning while the preservice teachers critiqued the videos of themselves and their peers, rather than only on the final product produced (the video itself), was paramount. Since STEM education is currently in the US national spotlight (Air Force Studies Board National Research Council, 2010; Bush, Karp, Popelka, & Bennett, 2012; National Governors Association Center for Best Practices and the Council of Chief State School Officers, 2010; National Science Board, 2012; National Council of Teachers of Mathematics, 2012; NGSS Lead States, 2013; National Science Teachers Association, 2012), gaining insights into STEM education video production and critique from a social constructivist perspective is important while emphasizing critical content. Additionally, it is important to include the perspectives and assessments of groups currently underrepresented in STEM fields, such as minorities, students from low socioeconomic backgrounds, and women (Lehming, Gawalt, Cohen, & Bell, 2013), and to consider how a technology implementation (like the tool presented here) might engage these groups with STEM content and purpose.

Partnership building and sustained collaboration are extremely important for mutually beneficial interaction between STEM and educational partners (Borowczak, 2015; Burrows, 2011, 2015). Through the use of video technology that provides explicit feedback, teacher-to-student and student-to-student dyads can strengthen their collaboration efforts and partnerships through directed and focused reflection. Over the years, video has been used to assess pre- and in-service teachers (Hannafin, Shepherd, & Polly, 2009), enhance learning (Clarke, Flaherty, & Mottner, 2001; Williams, Farmer, & Manwaring, 2008), build technical skills for careers (Clarke et al., 2001; Hunt, Eagle, & Kitchen, 2004), promote more efficient teaching and better learning (Hunt et al., 2004; Kpanja, 2001), increase student understanding (Dillon & Gabbard, 1998), and increase student participation and teamwork (Sweeney & Ingram, 2001; Ueltschy, 2001), amongst other outcomes.

Thus, the whole scene of learning, or the process that leads to the product as expressed in sociocultural theory, is embraced. The individual parts in isolation do not create the scene. Using the whole scene within context will sharpen the understanding of how STEM education videos and their peer and instructor critiques can affect learning and understanding for the K-20 student audience.

Building partnerships and collaborations through interactions is not limited to face-to-face meetings, as technology interactions can build partnerships and learning as well. McCabe and Meuter (2011) examined the seven principles for good practice, which include (a) encouraging contact between faculty and students, (b) encouraging reciprocity and cooperation among students, (c) encouraging active learning, (d) giving prompt feedback, (e) emphasizing time on task, (f) communicating high expectations, and (g) respecting diverse talents and ways of learning (Chickering & Gamson, 1987).

Determining whether a tool enhances one or more of the seven principles is vital, as technology is one method to augment learning (McCabe & Meuter, 2011). Examining available technologies and choosing the right one enables instructors to differentiate student instruction (Jones & Cuthrell, 2011).

With 83% of young adults using social networking sites (McCabe & Meuter, 2011; Taylor & Keeter, 2010; Zickuhr, 2010), video is already a part of the daily life of most in-service teachers. “Video adds a new dimension to the ways in which teaching and learning can be viewed, described, and interpreted. In particular, the literature emphasizes that video footage enables data collection and analysis to be an ongoing and iterative process” (Fitzgerald, Hackling, & Dawson, 2013, p. 61). Web 2.0 technologies are infiltrating schools of every level (Jones & Cuthrell, 2011). “The 21st century science classroom now contains nontraditional teaching tools, including laptops, personal digital assistants, and digital measuring devices” (Bang & Luft, 2013, p. 118).

University faculty members are utilizing YouTube and other social networking sites to distribute details of events and ideas (Haase, 2009). “YouTube can be used as a tool to inform and display and as a forum for critical analysis and commentary” (Jones & Cuthrell, 2011, p. 76). K-20 students are producing YouTube videos and displaying their own work in various settings, such as art and science classrooms (Sweeney & Ingram, 2001).

As Liberatore (2010) stated, “It is clear that the tech-savvy students of the net generation enjoy finding and sharing the videos” (p. 215). Acknowledging, then, that students would also like sharing self-produced videos is not a huge leap, and those self-produced projects allow for an authentic learning experience (Kearney & Schuck, 2006).

Preservice teachers can benefit from recording and analyzing their own lessons (Friend & Millitello, 2014; Star, Lynch, & Perova, 2011; Van Es & Sherin, 2008). However, preservice teachers who are new to video self-observation tend to hyperfocus on their teaching methods (Fadde & Sullivan, 2013b). While coding videos can be daunting (Bueno de Mesquita, Dean, & Young, 2010), peer critique with classroom partners using video sharing and Web 2.0 technologies can generate discussion and learning with preservice teachers (Fadde & Sullivan, 2013b; Heintz, Borsheim, Caughlan, & Juzwik, 2010; Star et al., 2011).

Providing preservice teachers with opportunities to practice analyzing videos of other peer preservice teachers may help the video creators eventually to evaluate video recordings of themselves (Fadde & Sullivan, 2013b). Research shows that well-defined and challenging but achievable tasks with immediate feedback are critical for skill improvement. The opportunity to correct errors and repeat the process until skills become more routine is also vital (Williams et al., 2008).

There are limitations to technology use such as video assessment, and some tools will work better than others in different situations (McCabe & Meuter, 2011).  Preservice teachers who are new to video self-observation tend to notice only their teaching delivery (Fadde & Sullivan, 2013a; Kagan & Tippins, 1991; Wang & Hartley, 2003). Using peer critique first is beneficial, since the focus is on peer teaching and the delivery is only one piece to assess (Kagan & Tippins, 1991).

Methods

To determine the usefulness of YouDemo in real-world preservice teacher applications, we tracked responses and solicited feedback on the tool itself. YouDemo, used in the study presented here as well as in prior studies (Borowczak & Burrows, 2011; Burrows & Borowczak, 2014), enabled the assessment of online videos. To date, YouDemo has been used in over 900 assessments of 170 videos by over 131 unique users. The tool does not edit, store, or manipulate videos in any way; rather, it links to videos already hosted on the Internet (e.g., YouTube).

YouDemo targets three main users—a video creator (e.g., a preservice student), peer evaluators (e.g., a student’s peers), and an expert assessor (e.g., a student’s instructor). Each of these users plays a different role in any assessment cycle. Figure 1 shows the five main stages within the continuous video assessment cycle, as well as the user associated with that stage: video linkage (video creator), assessment requests (video creator and expert assessor), peer-assessment (peer evaluators), aggregate assessment review (video creator and expert assessor), and sharing of results (video creator).

Figure 1. The continuous cycle of video assessment: Creation, sharing, assessment, assessment aggregation, and sharing of assessment results.

Stage A: Video Linkage for Assessment

YouDemo does not store any videos; rather, it relies on an existing video sharing site such as YouTube to store and play back videos. Users wishing to add a video to YouDemo simply link to an existing online video. During this process, the user has the opportunity to provide additional details about the video, the class it pertains to, summary details, and, most importantly, the two metrics to be continuously assessed throughout video playback. Figure 2 shows the linking process.

Figure 2. The linking process consists of five required fields including a YouTube address, a video name, two metrics, and a video summary. Optionally, the video creator can select a course and enter a course PIN (personal identification number) as defined by the course instructor.
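
The linking form in Figure 2 maps naturally onto a small record. As a minimal sketch (in Python), the fields below mirror the five required entries and the two optional course fields; the class and field names are illustrative assumptions, not the actual YouDemo schema.

```python
# Illustrative sketch of a Stage A linkage record; names are assumptions,
# not YouDemo's actual data model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VideoLink:
    youtube_url: str                  # address of the already-hosted video
    video_name: str
    metric_1: str                     # first quality to rate continuously
    metric_2: str                     # second quality to rate continuously
    summary: str
    course: Optional[str] = None      # optional course selection
    course_pin: Optional[str] = None  # optional instructor-defined PIN

link = VideoLink(
    youtube_url="https://youtu.be/Jk5E-CrE1zg",
    video_name="Newton's Universal Law of Gravitation",
    metric_1="content clarity",
    metric_2="engagement",
    summary="A one-minute STEM demonstration video.",
)
```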


Stage B: Disseminating Assessment Request Using Social Media

Recognizing that today’s students enjoy sharing online videos (Liberatore, 2010), the YouDemo implementation connects to several popular social media platforms, including Facebook, Twitter, and Google+ (see Figure 3). This feature allows both creators and evaluators the ability to share, promote, and comment on the videos that they have added or the videos they have previously assessed. This type of propagation allows for an increased assessment population sample beyond the traditional confines of the typical classroom.

Figure 3. An example of the social media integration available to video creators.

Stage C: Video Assessment

The video assessment portion of YouDemo consists of three main areas: the video playback panel, a live assessment stream panel, and an information panel with video details and statistics. The video playback occurs in an interface similar to that of other online video sites, with a play/pause button. As seen in Figure 4, the assessment stream (and the data collected from it) is controlled by the evaluator throughout the entire video using either the four directional keyboard arrows or, on mobile devices, the four onscreen arrows. Evaluators see both a historical summary of their ratings and an instantaneous qualitative mapping of the current rating using the mappings in Table 1. Since the primary objective of the tool is to gather information in real time, an evaluator can pause the video without restarting the entire video and rating process. Upon completion of the video playback, the collected live assessment scores are stored by the YouDemo tool.

Figure 4. The assessment page containing three separate panels: the video panel, the assessment streams, and the information panel.

Table 1
The Likert-Scale to Qualitative Text Mapping in the Current Implementation

Likert Value | Qualitative Text
0-1 | Non-Existent
2-3 | Lacking
3-6 | Average
7-8 | Good
9-10 | Excellent
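
As a minimal sketch of how an evaluator's keypresses and the labels in Table 1 fit together, the following illustrative code (not YouDemo's actual implementation) keeps a metric's rating on a 0-10 scale, nudges it with an arrow-key press, and reports the qualitative label. Because the table's 2-3 and 3-6 ranges overlap at 3, this sketch treats a rating of 3 as "Lacking."

```python
# Illustrative mapping of a 0-10 rating to the qualitative labels in Table 1.
def to_label(value: int) -> str:
    if value <= 1:
        return "Non-Existent"
    if value <= 3:
        return "Lacking"
    if value <= 6:
        return "Average"
    if value <= 8:
        return "Good"
    return "Excellent"

def nudge(value: int, delta: int) -> int:
    """Apply one arrow-key press, keeping the rating within 0-10."""
    return max(0, min(10, value + delta))

rating = 5                          # starting rating for one metric
rating = nudge(rating, +1)          # e.g., an "up" arrow press
print(rating, to_label(rating))     # 6 Average
```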

Stage D: Aggregated Video Assessment Results

To collect meaningful and useful peer-assessment data, a process should guarantee assessor anonymity while providing aggregated summative assessment. This process is at the core of YouDemo. YouDemo enables the collection of assessment data on any two video metrics, as defined by the video creator. The tool allows video creators to view assessment results only in aggregate across all assessors. Figure 5 shows an example of the aggregation process, where the average of all the individual evaluator ratings forms an aggregate rating, which is ultimately shown to the video creator. This convention fundamentally handles the key hurdles of anonymity and aggregation.

Figure 5. Aggregation of evaluators’ assessment scores anonymizes individual assessment scores.
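
A minimal sketch of the aggregation idea in Figure 5 follows: several evaluators' rating streams (one value per time step for a given metric) are averaged point by point, and only the aggregate stream is shown to the video creator. The data below are illustrative, not study data.

```python
# Average equal-length rating streams into one anonymous aggregate stream.
def aggregate(streams):
    return [sum(values) / len(values) for values in zip(*streams)]

evaluator_streams = [
    [5, 6, 7, 7, 8],   # evaluator A's ratings over time for one metric
    [4, 5, 6, 6, 7],   # evaluator B
    [6, 6, 6, 7, 9],   # evaluator C
]
print(aggregate(evaluator_streams))  # [5.0, 5.67, 6.33, 6.67, 8.0] (rounded)
```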

Figure 6 shows the aggregated results as presented to video creators in YouDemo. In the current implementation of the technology, the video creator has access to a graphical representation of the metric score over time, as well as what was "liked" and what needs "potential change." Once a video is assessed, the collected data are stored, processed, and used to derive a new aggregated assessment summary, which includes both the two continuous quantitative metrics and several qualitative open-response questions that follow the video. These mixed (quantitative and qualitative) data provide the creator insight into how the video is perceived by others, both in the context of the metrics selected and through the evaluator's personal lens.

Stage E: Sharing Video Assessment Results

The ability to share aggregated evaluations allows video creators such as preservice teachers to disseminate results to an instructor, an interviewer, a mentor teacher, or even their peers in order to understand more global trends. While the ability to disseminate results is not central to the scope of this work, it may be of particular interest in the context of classroom and online instruction when the number of students makes individual assessment infeasible.

The continuous video rating technology allows for peer critique of teaching videos. While the focus of this discussion is on its use in university level secondary science methods courses, implementation of this technology in other K-20 classrooms might require modification[a] of the ways video creators add videos and metrics.

While the tool has been presented as a peer-to-peer assessment tool, another expected use is as an instructor-to-student tool in which a classroom instructor could upload a video for a flipped classroom and have metrics of “Does this make sense?” and “Are you learning?” Students would be required not only to watch the video before class but to engage actively in rating the video. A teacher could easily view the students’ overall self-assessment of the material as well as how engaging the material was in its presentation, before meeting in class with the students.

Study

While we have been using YouDemo for 5 years with 76 preservice science teachers, this study focused on 27 preservice science teachers’ use of YouDemo over 2 years as they provided feedback to us in written and electronic forms.  The preservice science teachers were a mixed group, with undergraduates and graduates obtaining degrees in both a STEM subject and science education. As part of their degree requirements, they took a course on how to teach science within the context of STEM integration. The course required them to create two videos per class and post them to YouTube. The videos took the form of STEM demonstrations directed at a K-12 student audience, STEM hot topic commercials, and practice teaching sessions (micro-teaches). The instructor (second author Burrows) provided guidelines that the videos should run between 2 and 10 minutes in length, highlight specific STEM content, and relate to real world STEM applications in an engaging manner.

The study relies on three datasets: (a) participant self-assessments before and after their use of YouDemo (pre/post self-assessment), (b) written peer assessments of participant videos, and (c) YouDemo assessment data of participant videos. The three datasets contained both quantitative and qualitative data.

The participant self-assessment consisted of a summative assessment of the participant's own video with respect to two metrics. The self-assessment also contained open-response questions asking why the participant chose that self-assessment score. The written peer assessment asked the same questions (summative assessment of two metrics per video and open response) as the self-assessment; each video was peer assessed by two fellow students. Finally, the YouDemo assessment data contained formative assessment data (tracking two assessment metrics throughout the entirety of the video) and open-response data concerning the specific qualities of the video. The questions were as follows:

  • What specifically do you remember about the video you just watched?
  • How did the tool affect your viewing of this video?
  • How do you think the tool could or should be used?
  • What did you like best about this video?
  • What would you change about this video?

With YouDemo explained from our perspectives and grounded in the literature of video use, we explored the usefulness, interactions, and peer-to-tool assessments with preservice science teachers. The research questions we investigated were as follows:

  • How do preservice teachers assess themselves and their peers—and how do these assessments compare using the tool?
  • How do preservice teachers interact with the tool and what video characteristics are most important to them?

Analysis

In addition to traditional statistical analysis, the quantitative data were subjected to validity and reliability testing. Validity of the YouDemo assessment tool was established by performing a correlation analysis between the per-metric and average scores computed by YouDemo and those reported by peers. The overall Pearson correlation coefficient between the YouDemo scores and the peer-reported scores was 0.71. Additionally, a similar analysis was conducted between the pre- and postself-assessments of the same metrics, and the correlation between the two samples was 0.61.
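
A minimal sketch of the validity check described above, computing Pearson's r between tool-computed scores and written peer scores, is shown below; the arrays are illustrative placeholders rather than the study's data.

```python
import numpy as np

# Placeholder per-video scores; not the study's data.
tool_scores = np.array([6.2, 7.1, 5.0, 8.3, 4.5, 6.8])
peer_scores = np.array([6.0, 7.5, 5.5, 8.0, 4.0, 7.0])

r = np.corrcoef(tool_scores, peer_scores)[0, 1]  # Pearson correlation
print(f"Pearson's r = {r:.2f}")
```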

Reliability of the quantitative data was assessed using both McDonald's omega (ω) and Cronbach's alpha (α) (Zinbarg, Revelle, Yovel, & Li, 2005); the values were ω = 0.96 and α = 0.89, respectively. The qualitative data analysis was completed by coding for themes following Tesch's (1990) eight steps. Ultimately, the data collected during the study showed that the YouDemo tool is a valid and reliable method to collect data and answer the research questions.
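
For the reliability estimate, Cronbach's alpha can be computed directly from an items-by-respondents score matrix; McDonald's omega requires a factor model and is omitted here. The sketch below uses illustrative data, not the study's.

```python
import numpy as np

# Rows = items (e.g., rated metrics), columns = respondents; placeholder data.
scores = np.array([
    [7, 5, 8, 6, 9],
    [6, 5, 7, 6, 8],
    [7, 4, 8, 5, 9],
])

k = scores.shape[0]                           # number of items
item_vars = scores.var(axis=1, ddof=1)        # variance of each item
total_var = scores.sum(axis=0).var(ddof=1)    # variance of respondents' totals
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```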

Findings

Overall, the data sets indicated four main trends regarding participant interactions and comparability of the peer and self-assessments: (a) simultaneously rating two metrics had a non-zero bias or relationship between the two metrics; (b) the preservice teachers found continuous video rating beneficial in enabling video assessment, promoting critical thinking and increasing engagement; (c) the preservice teachers’ self-assessments were uncorrelated with their peers’ assessments; and (d) students with lower self-assessment were rated higher by their peers.

The quantitative data, derived from the actual video assessment tool as well as pre- and postsurveys, revealed interesting patterns and relationships between the preservice teachers' self-assessments and their peers' assessments. Additionally, the relationships between the written summative peer assessments and the formative peer assessments computed from the continuous assessment provided by the YouDemo technology showed trends in the data. These data enabled us to answer the first research question, "How do preservice teachers assess themselves and their peers, and how do these assessments compare using the tool?"

Figure 7 links to a Plotly graph [b] online showing the relationship between the two assessment metrics for all videos. A slight bias existed between the metrics. Over a large data set, an unbiased relationship between metrics would produce a linear regression through the line Y = X. In our data, the relationship between the metrics was skewed such that Y = 0.78*X + 1.25. When the first metric’s average was below 5, the second metric’s average tended to be less than the first, and when the first metric’s average was above 5 the second metric’s average tended to be greater than the first.
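
The bias check behind Figure 7 amounts to fitting a line through the per-video (Metric 1 average, Metric 2 average) points and comparing it to the unbiased line Y = X. A minimal sketch, with placeholder averages rather than the study's data, follows.

```python
import numpy as np

# Placeholder per-video metric averages; not the study's data.
metric1_avg = np.array([3.0, 4.5, 5.0, 6.5, 8.0, 9.0])
metric2_avg = np.array([3.4, 4.7, 5.2, 6.4, 7.6, 8.3])

slope, intercept = np.polyfit(metric1_avg, metric2_avg, 1)  # least-squares fit
print(f"Metric 2 ~ {slope:.2f} * Metric 1 + {intercept:.2f}")
```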


Figure 7. Metric 2 assessment scores as a function of Metric 1 scores for each video. A slight bias between the metrics is evident. Rating Metric 1 low leads to an even lower Metric 2. Rating Metric 1 high leads to an even higher Metric 2.


Figure 8 shows the relationship between the preservice teachers' pre- and postassessments, with the color of each data point representing the average score given by their peers. From the total data set, the alignment between pre- and postself-assessment, as well as peer assessment, was computed for the last 2 years of the study; this subset consisted of 27 fully matched data sets. First, about half of the preservice teachers experienced no shift in self-assessment after watching and assessing their peers (14 of 27 instances). Additionally, the majority of peer assessments tended to be "average" (a score of 5 or 6), highlighting the inability of preservice teachers to differentiate strengths and weaknesses during assessment.

Next, those with low peer-assessment scores actually tended to self-assess at roughly double the average, irrespective of their own assessment of their peers. Fewer than a quarter of all preservice teachers rated themselves lower after assessing their peers. Finally, preservice teachers performing above average based on peer assessment (scores of 7 to 9) rarely rated themselves lower after assessing their peers' work (1 of 27 instances).


Figure 8. Preassessment and postassessment score, grouped by the average peer assessment score for 27 preservice teachers.

The preservice teachers' formative assessment scores from the tool were averaged and compared to the summative assessment scores collected via the written post peer and self-assessments. The objective was to determine how precise preservice teachers were in assigning summative assessment scores based on formative observations. Figure 9 shows, for the two individual metrics, the relationship between the average peer formative assessment score, computed by averaging the assessed values from the continuous rating tool, and the summative assessment score collected from the written assessments.
Figure 9 panels: (a) Average Formative Peer-Assessment Versus Summative Peer-Assessment; (b) Relative Difference in Formative and Summative Scores.

Figure 9. Comparison of assessors’ summative assessment scores for metrics versus their average formative assessment score while using the continuous rating technology.


The emerging patterns showed a weak linear relationship between the average formative assessment scores and the summative scores. While a correlation exists between the two, the precision of forming a summative assessment based on the formative observations is weak. A similar relationship existed in the preservice teachers' ability to assign relative scores: the difference in the average formative assessment scores is closely related to the difference in the summative scores.

Using the coded qualitative data collected after the assessment of each video, in conjunction with the peer and self-assessment data, we found two main themes from the preservice teachers’ responses. One theme revolved around the use of YouDemo, or using the computer as an interface (e.g., using arrow keys), and the second theme involved the video properties (e.g., sound quality).

Based on the data from the use of YouDemo, the preservice teachers exhibited three characteristics in using the tool. The first characteristic was that they were more likely to push an arrow key for a metric when the video changed scenes. The second characteristic was that the arrows for increasing a metric were used more frequently than the arrows for decreasing a metric.

The third characteristic was that the preservice teachers said they paid more attention throughout the videos when they had to critique them using the YouDemo tool. For example, the responses to the question, “How did YouDemo affect your viewing of this video?” included the following:

  • “[It] made me pay attention.”
  • “[It] helped me be more objective.”
  • “I was thinking about aspects of the video that I would not have thought about.”

Table 2 shows the breakdown of the most significant responses from the 27 preservice science teacher participants.

Table 2
Top 20% of Responses From Preservice Teachers When Asked How the Tool Affected Their Viewing of the Video

Encoded Concept | Count | Frequency
Allowed Rating/Accessibility | 37 | 62%
Easier to Rate Video | 22 | 37%
Forced Critical Thinking | 19 | 32%
Increased Engagement | 13 | 22%


The preservice teachers reported that when used in moderation, the tool could focus their attention and allow them to recall more details of the videos. For example, we asked students who watched the American Association for the Advancement of Science (AAAS, 2011) Science-in-a-Minute video focusing on Newton’s Second Law (Video 1), “What specifically do you remember about the video you just watched?”

Video 1. Newton’s Universal Law of Gravitation – Science in a Minute (https://youtu.be/Jk5E-CrE1zg)


Among the responses were the following:

  • “[The video] discussed Newton’s law of gravity, showed a moon/earth tug of war.”
  • “That it dealt with Newton’s law of gravity”
  • “The earth and moon are 4 billion years old.”

The preservice teachers who peer critiqued videos showed that they were able to recall more specific details from the videos than did those who watched the videos without using the tool. When showing the same video without using YouDemo, typical student responses included shorter, less specific answers, such as,

  • “It was about gravity”
  • “Some one [sic] had laser eyes.”
  • “There were cool graphics.”

Regarding the second theme of video properties, the preservice teachers stressed the importance of entertainment, moderate pacing or flow, sound quality with appropriate speech patterns, and visual appeal with changing scenes. These findings are in line with the observations of Bueno de Mesquita, Dean, and Young (2010). When using YouDemo, participants focused on the bigger components of the video instead of the smaller details.

The limitations of the educational study fall into several components, including setting, experience, and preservice teacher diversity. First, the educational trial study highlighted in this work was conducted in a university setting within the context of two required courses. This homogeneity of the participant pool's prior pedagogical content knowledge could lead to unreproducible changes between peer and self-assessment, as well as pre- to postself-assessment, in other preservice teacher populations going through other teacher preparation programs.

Second, several participants suggested that the coordination required to rate a video in real time is unintuitive until the user gains experience doing so. While the impact of the assessment collection interface has not been studied, participants who found the interface unintuitive could have skewed the continuously collected peer assessment data reported earlier as the average formative assessment score.

Finally, all of the participants in the educational study were preservice science teachers, whose cultural backgrounds were similar. Further extending the homogeneity of the participant pool, the cultural background and experience of the participants may have skewed the peer assessment and self-assessment results as a whole. Given these limitations, the results from this study are not yet generalizable. A broader participant pool in conjunction with the collection of demographic, cultural, and teacher preparation markers would be required to form any generalizations of the entire preservice STEM teacher population.

Conclusion and Implications

Overall, the use of YouDemo engaged the preservice teachers in reflection and discussion on a deeper level than traditional means of pedagogical skill building in the classroom, based on the discrepancies we found between peer and self-evaluations. Based on the results of this study of preservice teachers and their peers' assessments of videos, encompassing both continuous formative assessment using YouDemo and summative final assessments, the fluid and varied ratings of dynamic media capture the discrepancy in preservice teachers' ability to assign summative scores to formative experiences (Figure 9).

From the implementation and educational study of continuous video assessment, we can draw several distinct conclusions. First, free, continuous, focused video assessment broadened educational peer critique and self-reflection in STEM preservice teachers in this study. Second, video evaluators focused on and equated noncontent-based properties (e.g., sound and image quality) to higher quality videos, and these properties differed depending on viewing context. In particular, without constraint, video evaluators focused on microproperties, such as individual speech patterns or a particular visual element. With YouDemo they focused on macro, holistic properties such as flow and idea clarity. Furthermore, the tool promoted two key behaviors: (a) reflection from the evaluator and the video creator, and (b) the social-propagation of self-created and peer-assessed media.

Importantly, in line with recommendations by Chickering and Gamson (1987), the tool encourages communication between faculty and students, reciprocity and cooperation among students, active learning, prompt feedback, time on task, communication of expectations, and learning with diverse talents. Focusing on Williams, Farmer, and Manwaring's (2008) five elements that are key to skill improvement, YouDemo enabled preservice teachers to complete tasks (captured using video evidence) that were well defined and challenging but achievable, while offering immediate aggregated assessment feedback from peers, as seen in the formative and summative assessment scores presented. As seen in the shift in pre- to postself-assessment scores, peer assessment data can enhance preservice teachers' reflection on their own work, enabling them to recreate videos or other deliverables that address errors. Based on the open responses concerning the use of the continuous video assessment tool, it appears that the process of peer assessment of videos engages and forces critical thinking in both video creators and video assessors.

The implications of the continuous video assessment technology and the educational study presented in this work are intertwined. This study confirms that broadening peer critique using YouDemo enables more reflection on self and others. Second, video properties, like presentation skills and mastery of content, overlap with traditional classroom skills, and attention to these details in media shared on YouDemo is as important as the skills seen in K-20 classrooms and in standards such as the Next Generation Science Standards and Common Core (Bush et al., 2012; National Governors Association Center for Best Practices and the Council of Chief State School Officers, 2010; NGSS Lead States, 2013). Instructors now have a new, free tool to encourage peer critique and collaboration, which enables continuous formative assessment and aggregation—a useful addition to pedagogy and STEM content courses alike.

End Note

[a] YouDemo is built with openness, community, collaboration, and partnerships in mind (Burrows & Borowczak, 2014). The requirements to successfully host this technology are lightweight and accessible to many K-12 school districts and collegiate departments. Requirements include a PHP-enabled webserver with MySQL. The storage requirements are minimal, with the website requiring less than 5 megabytes (MB), while the database requires about 2 MB per 1,000 ratings. Anyone interested in setting up their own instance of YouDemo is urged to explore YouDemo.org and contact the first author for assistance as needed. Importantly, although we created YouDemo, it is a free, open-source platform, and we do not benefit from its use in any way.
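
As a rough, illustrative calculation of the storage figures quoted above (about 5 MB for the site plus roughly 2 MB per 1,000 ratings):

```python
# Rough storage estimate based on the figures in this end note.
def estimated_storage_mb(num_ratings, site_mb=5, mb_per_1000_ratings=2):
    return site_mb + (num_ratings / 1000) * mb_per_1000_ratings

print(estimated_storage_mb(900))  # about 6.8 MB for ~900 assessments
```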

[b]In efforts to promote open data, we have included nonidentifiable aggregated data using the online collaborative data analysis and plotting tool Plotly (www.plot.ly). The use of the interactive plotting tool allows anyone with an Internet connection to view precreated charts included within the paper, as well as manipulate the visualizations (e.g., turning specific data series on or off, changing the view of the 3-D visualization, adding custom fit-lines, etc.). Viewers who sign up on plot.ly can download and manipulate raw data.

References

American Association for the Advancement of Science. (2011). AAAS video contest finalists teach “Science in a minute.” Retrieved from http://www.aaas.org/news/releases/2011/0316am_film_fest.shtml

Air Force Studies Board, National Research Council. (2010). Examination of the U.S. Air Force’s science, technology, engineering, and mathematics (STEM) workforce needs in the future and its strategy to meet those needs. Washington, DC: National Academies Press.

Bang, E., & Luft, J. A. (2013). Secondary science teachers’ use of technology in the classroom during their first 5 years. Journal of Digital Learning in Teacher Education, 29(4), 118-126.

Barron, L. (2015). Preparing pre-service teachers for performance assessments. Journal of Interdisciplinary Studies in Education, 3(2), 68-75.

Borowczak, M. (2015). Communication in STEM education: A non-intrusive method for assessment & K20 educator feedback. Problems of Education in the 21st Century, 65, 18-27.

Borowczak, M., & Burrows, A. (2011). YouDemo: Capturing live data from videos. In N. Callaos, N. Savoie, M. Siddique, & D. Zinn (Eds.), Proceedings of the International Conference on Information and Communication Technologies and Applications. Gammarth, Tunisia.

Bueno de Mesquita, P., Dean, R., & Young, B. (2010). Making sure what you see is what you get: Digital video technology and the preservice preparation of teachers of elementary science. Contemporary Issues in Technology and Teacher Education, 10(3), 275-293. Retrieved from https://citejournal.org/vol10/iss3/science/article1.cfm

Burrows, A. C. (2011). Secondary teacher and university partnerships: Does being in a partnership create teacher partners? (Unpublished doctoral dissertation). University of Cincinnati, Ohio.

Burrows, A. C. (2015). Partnerships: A systemic study of two professional developments with university faculty and K-12 teachers of science, technology, engineering, and mathematics. Problems of Education in the 21st Century, 65, 28-38.

Burrows, A., & Borowczak, M. (2014, October). Online STEM integration: Preservice science teachers in the director’s chair. In T. Bastiaens (Ed.), Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2014 (pp. 269-277). Chesapeake, VA: Association for the Advancement of Computing in Education.

Bush, S. B., Karp, K. S., Popelka, L., & Bennett, V. M. (2012). What’s on your plate? Thinking proportionally. Mathematics Teaching in the Middle School, 18(2), 100-109.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 2-6.

Clarke, I., Flaherty, T. B., & Mottner, S. (2001). Student perceptions of educational technology tools. Journal of Marketing Education, 23(3), 169-177.

Cullen, T., & Greene, B. (2011). Preservice teachers’ beliefs, attitudes, and motivation about technology integration. Journal of Educational Computing Research, 45(1), 29-47.

Dillon, A., & Gabbard, R. (1998). Hypermedia as an educational technology: A review of the quantitative research literature on learner comprehension, control, and style. Review of Educational Research, 68(3), 322-349.

Fadde, P. J., & Sullivan, P. (2013a). Designing communication for collaboration across engineering cultures. International Professional Communication Journal, 1(2), 135-158.

Fadde, P., & Sullivan, P. (2013b). Using interactive video to develop preservice teachers’ classroom awareness. Contemporary Issues in Technology and Teacher Education, 13(2), 156-174. Retrieved from https://citejournal.org/vol13/iss2/general/article1.cfm

Fitzgerald, A., Hackling, M., & Dawson, V. (2013). Through the viewfinder: Reflecting on the collection and analysis of classroom video data. International Journal of Qualitative Methods, 12, 52-64.

Friend, J., & Millitello, M. (2015). Lights, camera, action: Advancing learning, research, and program evaluation through video production in educational leadership preparation. Journal of Research on Leadership Education, 10, 81-103.

Haase, D. G. (2009). The YouTube makeup class. The Physics Teacher, 47(5), 272-273.

Hannafin, M. J., Shepherd, C. E., & Polly, D. (2010). Video assessment of classroom teaching practices: Lessons learned, problems and issues. Educational Technology, 50(1), 32-37.

Heintz, A., Borsheim, C., Caughlan, S., Juzwik, M. M., & Sherry, M. B. (2010). Video-based response and revision: Dialogic instruction using video and Web 2.0 technologies. Contemporary Issues in Technology and Teacher Education, 10(2), 175-196. Retrieved from https://citejournal.org/vol10/iss2/languagearts/article2.cfm

Hosack, B. (2010). VideoANT: Extending online video annotation beyond content delivery. TechTrends, 54(3), 45-49.

Hunt, L., Eagle, L., & Kitchen, P. J. (2004). Balancing marketing education and information technology: Matching needs or needing a better match? Journal of Marketing Education, 26(1), 75-88.

Jones, T., & Cuthrell, K. (2011). YouTube: Educational potentials and pitfalls. Computers in the Schools, 28(1), 75-85.

Kagan, D. M., & Tippins, D. J. (1991). How student teachers describe their pupils. Teaching and Teacher Education, 7(5), 455-466.

Kearney, M., & Schuck, S. (2006). Spotlight on authentic learning: Student developed digital video projects. Australasian Journal of Educational Technology, 22(2), 189-208.

Kpanja, E. (2001). A study of the effects of video tape recording in microteaching training. British Journal of Educational Technology, 32(4), 483-486.

Lehming, R., Gawalt, J., Cohen, S., & Bell, R. (2013). Women, minorities, and persons with disabilities in science and engineering: 2013 (Report No. 13-304). Arlington, VA: National Science Foundation.

Liberatore, M. W. (2010). YouTube Fridays: Engaging the net generation in 5 minutes a week. Chemical Engineering Education, 44(3), 215-221.

McCabe, D. B., & Meuter, M. L. (2011). A student view of technology in the classroom: Does it enhance the seven principles of good practice in undergraduate education? Journal of Marketing Education, 33(2), 149-159.

Mertler, C. A. (2003). Classroom assessment: A practical guide for educators. Los Angeles, CA: Pyrczak Pub.

Messinger, D. S., Mattson, W. I., Mahoor, M. H., & Cohn, J. F. (2012). The eyes have it: Making positive expressions more positive and negative expressions more negative. Emotion, 12(3), 430.

National Governors Association Center for Best Practices and the Council of Chief State School Officers. (2010). Common core state standards initiative. Retrieved from http://www.corestandards.org/

National Science Board. (2012). Science and engineering indicators 2012. Retrieved from http://www.nsf.gov/statistics/seind12/

National Council of Teachers of Mathematics. (2012). NCATE mathematics program standards. Retrieved from http://www.nctm.org/Standards-and-Positions/CAEP-Standards/

NGSS Lead States. (2013). Next generation science standards: For states, by states. Retrieved from http://www.nextgenscience.org/

National Science Teachers Association. (2012). NSTA standards for science teacher preparation. Retrieved from http://www.nsta.org/preservice

Rich, P. J., & Hannafin, M. (2009). Video annotation tools: Technologies to scaffold, structure, and transform teacher reflection. Journal of Teacher Education, 60(1), 52-67.

Soleymani, M., Pantic, M., & Pun, T. (2012). Multimodal emotion recognition in response to videos. IEEE Transactions on Affective Computing, 3(2), 211-223.

Star, J. R., Lynch, K. H., & Perova, N. (2011). Using video to improve mathematics teachers’ abilities to attend to classroom features: A replication study. In M.G. Sherin, V. R. Jacobs, & R. A. Philipp (Eds.), Mathematics teachers’ noticing: Seeing through teachers’ eyes (pp. 117-133). New York, NY: Routledge.

Sweeney, J. C., & Ingram, D. (2001). A comparison of traditional and web-based tutorials in marketing education: An exploratory study. Journal of Marketing Education, 23(1), 55-62.

Taylor, P., & Keeter, S. (2010). Millennials-a portrait of generation next: Confident. Connected. Open to change. Washington, DC: Pew Research Center.

Tesch, R. (1990). Qualitative research analysis types and software. London, UK: Falmer Press.

Ueltschy, L. C. (2001). An exploratory study of integrating interactive technology into the marketing curriculum. Journal of Marketing Education, 23(1), 63-72.

Van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the context of a video club. Teaching and Teacher Education, 24(2), 244-276.

Vygotsky, L. S. (1978). Mind and society: The development of higher mental processes. Cambridge, MA: Harvard University Press.

Wang, J., & Hartley, K. (2003). Video technology as a support for teacher education reform. Journal of Technology and Teacher Education, 11(1), 105-138.

Williams, G. R., Farmer, L. C., & Manwaring, M. (2008). New technology meets an old teaching challenge: Using digital video recordings, annotation software, and deliberate practice techniques to improve student negotiation skills. Negotiation Journal, 24(1), 71-87.

Yang, K., & Park, T. (2014). K-motion: Visualizing election information for live television broadcasts. Multimedia Tools and Applications, 74, 11631-11651. doi:10.1007/s11042-014-2253-2

Zickuhr, K. (2010). Generations online in 2010. Retrieved from the Pew Research Center website: http://www.pewinternet.org/2010/12/16/generations-2010/

Zinbarg, R. E., Revelle, W., Yovel, I., & Li, W. (2005). Cronbach’s α, Revelle’s β, and McDonald’s ωH: Their relations with each other and two alternative conceptualizations of reliability. Psychometrika, 70(1), 123-133.

 

Author Note

Mike Borowczak
Erebus Labs
Email:
[email protected]

Andrea C. Burrows
College of Education: Secondary Science
University Of Wyoming
Email:
[email protected]
