Dede, C. (2005). Commentary: The growing utilization of design-based research. Contemporary Issues in Technology and Teacher Education [Online serial], 5(3/4). Available: http://www.citejournal.org/vol5/iss3/seminal/article1.cfm
Commentary: The Growing Utilization of Design-Based Research
Chris Dede
Harvard Graduate School of Education
Dr. Roblyer’s (2005) excellent overview of educational technology research
makes a compelling case for improving the manner in which research on educational
technology is conducted. I commend CITE Journal for its initiative in publishing
and deconstructing a series of exemplary studies that illustrate best practices
in education research. This commentary is intended to extend and deepen Roblyer’s
very brief discussion of one particular type of research—studies of technology-based
instructional designs—that she describes as “almost non-existent.”
In fact, scholars are publishing a growing body of high quality, design-based
research studies that address many of the weaknesses Roblyer highlighted
in typical educational technology scholarship.
What Is Design-Based Research?
Design-based research (DBR) is a relatively new methodological strategy for
studying a wide range of designs, including technology-based instructional designs.
Collins, Joseph, and Bielaczyc (2004) defined DBR thus:
Design experiments bring together two critical pieces in order to guide us
to better educational refinement: a design focus and assessment of critical
design elements. Ethnography provides qualitative methods for looking carefully
at how a design plays out in practice, and how social and contextual variables
interact with cognitive variables. Large-scale studies provide quantitative
methods for evaluating the effects of independent variables on the dependent
variables. Design experiments are contextualized in educational settings,
but with a focus on generalizing from those settings to guide the design process.
They fill a niche in the array of experimental methods that is needed to improve
educational practices.
Recently, both the special issue on DBR of The Journal of the Learning Sciences
(Vol. 13, No. 1, 2004) and the special DBR issue of Educational Researcher
(Vol. 32, No. 1, 2003) provided detailed, research-oriented expositions of this
methodology’s theoretical, conceptual, and analytic foundations. In contrast,
a special issue of Educational Technology (Vol. 45, No. 1, 2005) focused on
more applied perspectives about DBR, illustrating these with case studies of
exemplary work using this method. The reader is referred to those sources for
more detailed definitions of DBR and examples of exemplary DBR studies in educational
technology.
How Does DBR Address the Issues Roblyer Raises?
Design-based research methodology intrinsically incorporates many of the features
Roblyer asserts are lacking in typical educational technology scholarship.
Significance for Practice and Implications for Theory
Numerous researchers, practitioners, and policy makers have criticized many
of the findings from educational research as having little impact on practice,
or even on the evolution of theory (Haertel & Means, 2003; Lagemann, 2002).
In contrast, as Stokes (1997) described, DBR resembles the scholarly strategy
chosen by the scientist Pasteur, in which investigation of difficult, applied,
practice-driven questions demands and fosters studies of fundamental theoretical
issues. As one illustration, the research my colleagues and I are conducting
on multi-user virtual environments (Nelson, Ketelhut, Clarke, Bowman, &
Dede, 2005) tests the efficacy of three alternative pedagogical strategies based
on different theories about learning: guided social constructivism, expert mentoring
and coaching, and legitimate peripheral participation in communities of practice.
We are examining which of these pedagogies works best for various types of content
and skills, as well as for different kinds of learners.
Beyond a Gulf Between Quantitative and Qualitative Methods
Roblyer depicts quantitative methods as appropriate for the generalizability
of an experimental intervention across sites, while qualitative methods are
portrayed as useful for studying its impact at a single site. Design-based research
takes a more nuanced, mixed-methods view of quantitative and qualitative analytics.
Many DBR studies utilize a form of “interventionist ethnography,”
in which research studies perturb a range of typical learning settings by introducing
evocative, theory-influenced designs, then use both qualitative and quantitative
analytics to draw out implications for new theories of teaching, learning, and
schooling. For example, Yasmin Kafai (2005) is using DBR to evaluate and evolve
a pedagogical approach called “classroom as living laboratory” by
keeping several variables constant (such as the teacher, pedagogy, and students)
while varying key aspects, such as collaborative arrangements. This study involves
a rich mixture of quantitative and qualitative analytics to elucidate implications
for design, theory, practice, and policy.
Improving Scalability and Sustainability Via Sophisticated Implementation
Another key way in which DBR differs from both conventional design and traditional
research is its emphasis on adapting a design to its local context, a vital
attribute for scaling up an innovation successful in one place to many other
venues with dissimilar characteristics (Dede, in press). In making judgments
about the promise of an intervention, differentiating its design from its “conditions
for success” is important. The effective use of antibiotics illustrates
the concept of “conditions for success”: Antibiotics are a powerful
“design,” but worshiping the vial that holds them or rubbing the
ground-up pills all over one’s body or taking all the pills at once are
ineffective strategies for usage – only administering pills at specified
intervals works as an implementation strategy. A huge challenge we face in education,
and one of the reasons our field makes slower progress than venues like medicine,
is the complexity of conditions for success required in effective interventions;
nothing powerful in facilitating learning is as simple as an inoculation in
medicine.
Design-based research findings typically show substantial influence of contextual
variables in shaping the desirability, practicality, and effectiveness of designs.
For example, studies of educational technology frequently depict “conditions
for success” challenges related to teacher professional development, a
common issue in many types of educational interventions. Resolving implementation
problems such as this presents choices about alternative approaches to the iterative
evolution of a design. In this particular case, alternative strategies include
changing the design so that the intervention is more “teacher-proof,”
expanding the design so that extensive teacher professional development is now
part of the “treatment,” or abandoning the design as unpromising
because its effective use requires a level of knowledge and skill likely unattainable
in the typical teaching population for the foreseeable future.
This is not an easy dilemma to resolve and illustrates the ways that DBR,
in contrast to many types of conventional research, intrinsically confronts
scalability issues of great interest to practitioners and policymakers. For
example, with NSF funding my colleagues and I are currently studying the feasibility
and potential value of a “scalability index” that would provide
a quantitative measure, along different contextual dimensions, of the extent to
which the effectiveness of a technology-based intervention would be eroded by
shortfalls from ideal conditions in a particular implementation context.
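Dede describes this index only conceptually, and the NSF-funded work was still exploring its feasibility; purely as an illustration of the idea, one hypothetical formalization would weight each contextual dimension (teacher preparation, infrastructure, and so on) by how strongly the intervention depends on it, score how fully a given site meets the ideal condition on that dimension, and combine the two. The dimension names, weights, and formula below are assumptions for the sketch, not the instrument under study:

```python
# Hypothetical sketch of a "scalability index" -- an illustration of the
# concept, not the actual measure being developed in the NSF-funded study.
# Each contextual dimension carries a weight (how much the intervention's
# effectiveness depends on it) and each site gets a score in [0, 1] for
# how fully it meets the ideal condition on that dimension.

def scalability_index(weights: dict[str, float],
                      site_scores: dict[str, float]) -> float:
    """Return a 0-1 index: 1.0 means ideal conditions are fully met;
    lower values mean effectiveness is eroded by contextual shortfalls."""
    total_weight = sum(weights.values())
    if total_weight == 0:
        raise ValueError("At least one dimension must carry weight")
    # Weighted average of how fully each condition for success is attained;
    # a missing dimension is treated as a complete shortfall (score 0.0).
    attained = sum(w * site_scores.get(dim, 0.0) for dim, w in weights.items())
    return attained / total_weight

# Example: a site strong on infrastructure but weak on teacher preparation.
weights = {"teacher_preparation": 0.5, "infrastructure": 0.3, "admin_support": 0.2}
site = {"teacher_preparation": 0.4, "infrastructure": 0.9, "admin_support": 0.7}
print(round(scalability_index(weights, site), 2))  # 0.61
```

A weighted average is only one of many possible aggregation choices; a real index might instead treat some dimensions as hard thresholds below which the intervention fails outright.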
I applaud CITE Journal’s initiative for seeking to improve typical research
on educational technology and Roblyer’s thoughtful conceptual framework
for accomplishing this. Extending that framework, and the series of deconstructed
exemplary studies, to include the growing body of scholarship using design-based
research methodologies would strengthen this effort.
References
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical
and methodological issues. Journal of the Learning Sciences, 13(1), 15-42.
Dede, C. (2005). Why design-based research is both important and difficult.
Educational Technology, 45(1), 5-8.
Dede, C. (in press). Scaling up: Evolving innovations beyond ideal settings
to challenging contexts of practice. In R. K. Sawyer (Ed.), Cambridge handbook
of the learning sciences. Cambridge, England: Cambridge University Press.
Haertel, G. D., & Means, B. (2003). Evaluating educational technology:
Effective research designs for improving learning. New York: Teachers College
Press.
Kafai, Y. (2005). The classroom as “living laboratory”: Design-based
research for understanding, comparing, and evaluating learning. Educational
Technology, 45(1), 28-33.
Lagemann, E. C. (2002). Usable knowledge in education. Retrieved November
29, 2005, from the Spencer Foundation Web site: www.spencer.org/publications/index.htm
Nelson, B., Ketelhut, D., Clarke, J., Bowman, C., & Dede, C. (2005). Design-based
research strategies for developing a scientific inquiry curriculum in a multi-user
virtual environment. Educational Technology, 45(1), 21-28.
Roblyer, M. D. (2005). Educational technology research that makes a difference:
Series introduction. Contemporary Issues in Technology and Teacher Education
[Online serial], 5(2). Retrieved November 29, 2005, from http://www.citejournal.org/vol5/iss2/seminal/article1.cfm
Stokes, D. E. (1997). Pasteur’s quadrant: Basic science and technological
innovation. Washington, DC: Brookings Institution Press.