Full Terms & Conditions of access and use can be found at
https://www.tandfonline.com/action/journalInformation?journalCode=imte20
Medical Teacher
ISSN: 0142-159X (Print) 1466-187X (Online) Journal homepage: https://www.tandfonline.com/loi/imte20
Twelve tips for conducting educational design
research in medical education
Weichao Chen & Thomas C. Reeves
To cite this article: Weichao Chen & Thomas C. Reeves (2019): Twelve tips for
conducting educational design research in medical education, Medical Teacher, DOI:
10.1080/0142159X.2019.1657231
To link to this article: https://doi.org/10.1080/0142159X.2019.1657231
Published online: 09 Sep 2019.
Submit your article to this journal
Article views: 394
View related articles
View Crossmark data
TWELVE TIPS
Twelve tips for conducting educational design research in medical education
Weichao Chen (Office of Medical Education, School of Medicine, University of Virginia, Charlottesville, VA, USA) and Thomas C. Reeves (College of Education, The University of Georgia, Athens, GA, USA)
ABSTRACT
Despite a steady growth in educational innovations and studies investigating the acceptance and effectiveness of these
innovations, medical education has not realized sufficient improvement in practice and outcomes from these investments.
In light of this lack of impact, there has been a growing call for studies that more effectively bridge the gap between
research and practice. This paper introduces Educational Design Research (EDR) as a promising approach to address this
challenge. Twelve tips are provided to inspire and guide medical educators to conduct EDR to achieve the dual goals of
tackling a significant educational problem in a specific context while at the same time advancing the theoretical knowledge
that may be used to improve practice elsewhere.
Introduction
The field of medical education research has witnessed
steady growth since the 1950s (Traynor and Eva 2010).
Today's healthcare professionals are increasingly invested
in conducting educational research of various kinds (Eva
and Lingard 2008), with efforts often supported by their
institutions (Ahmed et al. 2016). Unfortunately, the impact
of educational research on medical education practice is
weak, just as it is in other educational contexts (Kaestle
1993; Kennedy 1997; Albert et al. 2007; Dolmans and
Tigelaar 2012; van Enk and Regehr 2018). As a result, there
has been a growing call for studies that more effectively
bridge the gap between research and practice.
In addition to the insufficient impact on practice, few
medical education research studies contribute adequately
to the definition and refinement of robust theory related
to teaching and learning. According to Cook et al.'s (2007) systematic review of the experimental medical education research published in leading journals between 2003 and 2004, 45% of the studies lacked a conceptual framework. The inadequacy of theoretical underpinnings and contributions in medical education research continues, with Meyer et al.'s (2018) text analysis listing 'failure to contribute to the body of knowledge' as Academic Medicine editors' top (44%) rationale for denying submissions consideration for external peer review.
Responding to the demand for medical education research that is both theory-oriented and application-relevant, 'using each to inform the other' (Eva 2010, p. 4), this paper introduces Educational Design Research (EDR) as a promising approach to accomplishing this twofold objective. Figure 1 illustrates the three major phases of EDR projects (McKenney and Reeves 2019):
Analysis and Exploration involves working closely
with collaborators to acquire an understanding of a
significant educational problem and investigate how
others have addressed it;
Design and Construction focuses on identifying or cre-
ating appropriate design principles and using these
principles to develop your prototype intervention; and
Evaluation and Reflection consists of multiple iterations
of data collection and analysis to test your prototype
intervention and review the implications of the findings.
While only recently introduced to medical education researchers (Dolmans and Tigelaar 2012; Wolcott et al. 2019), EDR emerged in other educational settings in the early 1990s, and its use has been steadily increasing as a way of enhancing the impact of educational studies on practice while at the same time extending and refining theory (McKenney and Reeves 2019). Encompassing similar approaches such
as design-based research and design-based implementation
research, EDR has been featured in multiple special issues
of leading educational research journals between 2003 and
the present. Figure 1 illustrates the process of EDR as
delineated by McKenney and Reeves (2019). The following
practical tips describe how EDR can be conducted by med-
ical educators to increase the relevance and impact of
scholarly research whilst developing more effective learning
environments.
Tip 1
Ensure your EDR initiative is focused on a significant
problem related to learning and teaching
First and foremost, your EDR initiative should focus on a
meaningful problem relevant to the practice of teaching
and learning and aim to meet this identified challenge.
This sharp focus on addressing a significant educational
challenge distinguishes EDR from many other approaches
to educational research (McKenney and Reeves 2019).
As illustrated in Figure 1, your understanding of the prob-
lem will be informed by intensive Analysis and Exploration
during the first phase of your EDR initiative and subse-
quently enable the development of a prototype solution to
the problem during the second phase, Design and
Construction. The solution can be characterized as an
intervention in the form of newly planned or enhanced
educational processes (e.g. innovative pedagogical strat-
egies), programs (e.g. enhanced faculty development
efforts), or policies (e.g. changes in training sites for
residents) (Gruppen et al. 2018).
What constitutes serious problems for healthcare educa-
tors today? Many educational challenges are not unlike
those experienced by teachers at all levels (e.g., increasing
the integration of active learning principles or the achieve-
ment of higher-order learning outcomes), but others are
more specific to medical education, including:
Developing capacities to work effectively in increasingly fluid healthcare teams.
Cultivating skills to communicate in a culturally competent manner with patients and other healthcare professionals.
Preparing healthcare professionals for practice in a world increasingly infused with machine learning algorithms and robots.
Improving assessment protocols and feedback practices to promote competency-based education.
Enhancing healthcare professionals' clinical reasoning (CR).
To clarify how EDR can be conducted in medical educa-
tion, a hypothetical study focused on the problem of
enhancing medical students' clinical reasoning (CR) will be
used in the description of the following tips. Despite the
fact that CR is a core skill in medical practice with major
implications for patient safety, it is rarely explicitly taught
because it is assumed that these skills will be developed
by observing expert practitioners during clinical rotations
(Gay et al. 2013; Amey et al. 2017). Enhancing the CR of
their students is clearly a serious challenge for medical
teachers (Higgs et al. 2019).
Tip 2
Strive for both practical and theoretical outcomes
As delineated in McKenney and Reeves (2019), two primary
outcomes are sought simultaneously in EDR projects,
specifically a maturing intervention and enhanced theoret-
ical understanding. As illustrated in Figure 1, the practical
outcome of EDR is conceptualized as a maturing interven-
tion in recognition that no problem solution is ever final
because as new conditions evolve, interventions must
evolve as well. Figure 1 designates the theoretical contribu-
tion of EDR as theoretical understanding. In EDR, theoret-
ical understanding is usually represented in the form of
design principles. Design principles are the best practices
derived from prior research that you will use to inform the
creation of your prototype solution and that subsequently,
you will refine through your data collection, analysis, and
reflection activities.
In applying EDR to the challenge of enhancing students' CR skills, the practical outcome would be an educational intervention that improves CR skills, while the theoretical outcome would be enhanced design principles for developing interventions that address other higher-order outcomes. An example of a maturing intervention might be a computer-based simulation that engages students in applying CR to actual cases with interactive feedback and coaching. An example of a design principle might be: 'A case-based simulation is most effective when the formative assessment of clinical reasoning is embedded in authentic tasks and feedback is provided immediately rather than delayed.'
Tip 3
Seek theoretical understanding throughout the
research and development process
Throughout an EDR initiative, various research activities
such as needs assessment, literature review, and quasi-
experiments are integrated into the cyclical development
of a solution to the significant problem. These research
and development activities are informed by and in turn
yield different types of theoretical understanding
(McKenney and Reeves 2019):
Descriptive understanding is typically pursued during the
Analysis and Exploration stage, before starting the
actual development of your innovative intervention.
Through literature reviews or empirical exploration, you
derive a better understanding of the current situation
related to the target educational challenge, which sub-
sequently informs the Design and Construction stage.
Figure 1. The process of conducting educational design research (McKenney and Reeves 2019, p. 83, used with permission).
At the Evaluation and Reflection stage, through the
testing and refinement of your intervention, you can
seek predictive understanding of the desired outcomes,
addressing whether, and to what extent, the designed
intervention addresses the target issue.
While it is useful to assess if your innovation 'works', it is also important to simultaneously study 'how', 'when', and 'what aspects' of it work and 'why' that happens (Regehr 2010). Explanatory understanding, therefore, explicates how and why the intervention succeeds or fails, revealing important causal relationships or influential contextual factors.
Through multiple iterations of investigating the efficacy
of your intervention during the Evaluation and
Reflection stage, you can derive prescriptive understand-
ing, usually in the form of refined design principles.
Ideally, these refined design principles will enable the
design and implementation of similar interventions in
new educational contexts by yourself or others.
Numerous innovations have been developed to enhance
the CR of healthcare learners, such as virtual patients, inter-
active simulations, responsive manikins, illness scripts work-
shops, service learning, problem-based learning, concept
mapping, blended learning, and 3-D games (among
others), but none of these interventions has adequately
met this challenge. At least part of this failure may be
attributed to the all-too-common practice of developing
interventions without a sound theoretical understanding of
CR and the complexities of teaching these skills (Rencic
2011). Koivisto et al. (2016, 2017, 2018) conducted EDR to
enhance nursing students' CR skills and eventually devel-
oped a 3-D simulation game to provide their students with
opportunities to apply and refine CR skills. In the process
of evaluating and reflecting upon the outcomes of their
game design, they sought both predictive and explanatory
understanding. They were not only interested in whether
students learned CR through playing the game, but also
how that actually happened.
Tip 4
Identify and apply conceptual frameworks
The Analysis and Exploration phase (see Figure 1) includes
identifying one or more conceptual frameworks as early as
possible to guide your planning of both research and
design activities (Yin 2014). Conceptual frameworks include
theories, models, and principles of practice that are devel-
oped based on observations, validated by empirical
research, or derived from other well-grounded theories.
Each framework provides a lens for examining and inter-
preting complex educational phenomena (Bordage 2009).
In other words, different frameworks bring attention to
interrelated and often complex aspects of the target prob-
lem and influence the types of educational strategies that
might be integrated into an innovative intervention.
If you identify a framework from learning theories (Zackoff et al. 2019), you are likely to focus on the process of how learning occurs and how to improve instructional efforts to support successful learning; on the other hand, if you derive your framework from system theories (Barr 2013), you will probably investigate factors that influence the diffusion of your innovation so as to enhance your organization's capacity to sustain it. Researchers and practitioners conducting EDR have adopted different learning theories to enhance healthcare learners' CR. For example, Ramaekers et al. (2012) adopted cognitive theories and built a curriculum with authentic examples and practice to support learners' construction of cognitive understanding, whereas Leggett (2016) emphasized the importance of self-regulated learning for CR performance and sought to enhance feedback practice to foster students' self-regulated learning skills. On the other hand, Koivisto et al. (2016, 2017, 2018) focused on the central role that experience plays in learning. In developing a 3-D game to enhance learners' CR skills, they adopted experiential learning theory to investigate and enhance students' learning process. Within the context of medical education, additional examples of learning theories are available in Zackoff et al. (2019, p. 137-138), and Barr (2013) provided examples of system theories, such as organizational theory and activity theory. How do you select appropriate conceptual frameworks? Start by reviewing the literature on similar challenges and innovations, and look for theories that have been consistently applied.
Tip 5
Identify practitioners and researchers willing and able
to collaborate on the EDR initiative
A hallmark of EDR is that it involves the close collaboration
of researchers and practitioners, or more likely in medical
education a group of practitioners willing to engage collab-
oratively in long-term efforts to improve practice and the-
oretical understanding. The success of any intervention in
an educational ecological system is determined by the
innovation's interaction with other components of the system (Zhao and Frank 2003). The following stakeholders
may play an important role in the ecosystem in which your
EDR project is conducted, and therefore you should seek to
collaborate with them (e.g. Chen et al. 2015):
professionals involved in teaching the core and periph-
eral skills related to the problem your project addresses,
professionals with expertise in program evaluation and
learning assessment,
professionals with expertise in faculty and staff develop-
ment and/or instructional design,
administrative, academic, student, and technology sup-
port personnel, and
representatives of the target learner population.
While you should play key roles in both instructional
development and research, consider including one or more
experienced educational researchers or evaluators on your
team. It has become increasingly challenging for healthcare professionals to keep up with the evolution of educational research methods and tools alongside their clinical duties.
Including an educational researcher on the team mitigates
this burden. With successful project management (Huggett
et al. 2011), multidisciplinary collaboration can significantly
advance the quality of scholarship (Traynor and Eva 2010).
In our hypothetical study addressing the problem of
enhancing students' CR, you should seek to assemble a
team of other medical teachers plus other professionals
such as creative instructional designers, highly skilled eval-
uators, and any relevant content experts. In addition, you
might want to engage medical students themselves in the
EDR initiative as they can serve as participants in small-
scale evaluations and as sounding boards for design and
implementation ideas.
Tip 6
Analyze the target problem and learners
To inform your construction of an intervention, you must
acquire a rich descriptive understanding of the target prob-
lem and the existing innovations that others have
employed to address your target problem, and you must
also garner detailed information about the target popula-
tion and the learning environment (Bass and Chen 2016;
Hughes 2016). This kind of information is typically sought
through reviewing the literature, conducting site visits and
observations, and/or surveying learners and their educators.
For instance, any efforts to enhance learners' CR will benefit from an understanding of the status quo of CR teaching, including the impact of healthcare professionals' CR skills on the outcomes of patient care and the effectiveness of
current CR teaching methods. Higgs et al. (2019) provide a
valuable and current overview of the state-of-the-art of CR
innovations in health professions education.
Additionally, you should seek to clarify the following
information about your target learners and their learning
environment (Fink 2013):
learner and instructor characteristics;
the setting of the learning environment, technology,
and other resources;
external expectations from society, professional associations, and your institution; and
nature of the subject matter and inherent peda-
gogical challenges.
Tip 7
Design and construct an aligned educational design
The Design and Construction phase of EDR is when you
begin constructing your design plan based on the descrip-
tive understanding acquired during the previous Analysis
and Exploration phase (see Figure 1). First, continue to
review literature related to your conceptual framework(s)
and similar solutions to derive design principles for your
prototype intervention. These principles are turned into
your design plan, which must include three key elements
at a minimum: educational objectives, instructional activities
and materials, and assessment and feedback procedures
(Fink 2013). While you might be eager to start developing
your innovation by identifying the content or exploring a
novel teaching method or technology, instead begin by
composing or reviewing your educational objectives, clearly
describing what learners are expected to do at the end of
the instructional intervention. When you write clear educa-
tional objectives to guide the development of your innov-
ation, you build the foundation for the alignment of the
other two major design elements (instructional activities
and materials and assessment and feedback procedures)
with these objectives. This alignment enhances your students' opportunities to engage in meaningful learning
(Fink 2013). The International Training and Education
Center for Health (2010) provides useful tools to guide the
writing of specific, measurable, and appropriate objectives.
The process of specifying your educational objectives
naturally prompts you to consider other key design ele-
ments, crafting instructional activities and materials to sup-
port learners achievement of these objectives and
developing relevant assessment and feedback procedures
to demonstrate that the desired learning has occurred.
MedBiquitous Curriculum Inventory Working Group
Standardized Vocabulary Subcommittee (2016) identifies
common instructional and assessment methods in med-
ical education.
Suppose that after the Analysis and Exploration phase, you and your colleagues have sketched out a design of an instructional intervention to enhance students' CR, specifically a flipped classroom whereby 'didactic materials are provided to the learners prior to the scheduled lecture time [and the] face-to-face time is used to fill in knowledge gaps and further solidify understanding of the key concepts' (Morrissey and Heilbrun 2017). During the iterative
design process, you will revise and revisit each of these
design components to make sure that they are in align-
ment with each other and that the intervention is congru-
ent with the proposed design principles. Ask questions
such as:
1. To what extent does the flipped classroom model
effectively address all the educational objectives and
prepare students for learning assessments?
2. To what extent do the assessment activities cover all
the objectives?
3. How well are your design principles reflected in your
prototype flipped classroom approach?
4. Based on your understanding of the target learners,
learning environment, and resources, how feasible is it
to actually implement this innovation?
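If your team keeps the design plan in a structured, machine-readable form, even a short script can help answer questions 1 and 2 above by flagging objectives that no activity or assessment yet addresses. The following Python sketch is purely illustrative: the objectives, activities, and data structure are hypothetical examples invented for this tip, not something prescribed by EDR or by the flipped classroom literature.

# Illustrative sketch: flag educational objectives in a hypothetical design
# plan that are not yet covered by at least one activity and one assessment.
design_plan = {
    "Generate a prioritized differential diagnosis": {
        "activities": ["pre-class video cases", "in-class case discussion"],
        "assessments": ["post-encounter form"],
    },
    "Justify the selection of diagnostic tests": {
        "activities": ["in-class case discussion"],
        "assessments": [],  # alignment gap: no assessment yet
    },
}

def alignment_gaps(plan):
    """Return messages describing objectives missing activities or assessments."""
    gaps = []
    for objective, elements in plan.items():
        if not elements.get("activities"):
            gaps.append("No activity addresses: " + objective)
        if not elements.get("assessments"):
            gaps.append("No assessment covers: " + objective)
    return gaps

for gap in alignment_gaps(design_plan):
    print(gap)

Such a check is only a convenience; the substantive judgment about whether an activity or assessment genuinely addresses an objective still rests with your team.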
Tip 8
Evaluate the implementation of your prototype
intervention
The third phase of EDR involves Evaluation and Reflection
to refine the innovative intervention and extend or
enhance theoretical understanding (see Figure 1). Conduct
periodic assessments to investigate the intervention's functioning in both the local and the broader institutional eco-
logical systems and identify factors impacting its success.
Recommended questions include:
1. To what extent have both teachers and learners per-
ceived your intervention as relevant to their needs?
2. How engaged are learners in the intervention?
3. How well does your intervention address its
intended goals?
4. If any unintended outcomes emerge, are they positive
or negative and why have they occurred?
5. What modifications have been made to your original plan, and why were these changes made?
How can you address these types of questions? There
are many different data collection methods and tools that
can be used in your evaluative work, including observa-
tions, interviews, and questionnaires (Reeves and Hedberg
2003). Linderman and Lipsett (2016, p. 135-137) compared
the strengths and limitations of the methods commonly
used to evaluate medical education initiatives. For instance,
during their EDR project involving a simulation game,
Koivisto et al. (2017, 2018) conducted focus group inter-
views and observations of learners playing the game, look-
ing for possible areas to improve the user interface. They
also asked students about their learning experience with
the game and opinions about the various features. Imagine
beginning to implement an instructional innovation such
as a flipped classroom approach to enhance your students'
CR. Plan to conduct several iterations of testing and refine-
ment to see whether the innovation is being implemented
as designed and determine how to make any needed
adjustments to its design.
Tip 9
Evaluate the outcomes of your intervention
Although there is no set rule for the number of iterations
of Evaluation and Reflection you carry out to enhance
both your practical and theoretical outcomes, most EDR
studies encompass at least three iterations (McKenney and
Reeves 2019). For each iteration, in addition to evaluating
the implementation of the intervention, analyze the
achievement of both the immediate educational objectives
and the long-term ultimate goals of the EDR initiative,
including unintended effects. A range of outcomes that
can be evaluated are divided into four levels by Kirkpatrick
and Kirkpatrick (2016):
1. learner satisfaction with their learning experience and
perception of the usefulness of the intervention;
2. learning acquired from the intervention as reflected in
the learning assessment outcomes;
3. behavior changes, or the transfer of new learning to a performance context; and
4. organizational results following the intervention such
as improved patient safety outcomes.
Determine the appropriate level(s) of outcomes to evalu-
ate based on feasibility, available resources, and your
objectives, and use appropriate evaluation method(s) and
instrument(s) (Cook 2010). In the case of a flipped class-
room approach to improving students' CR, you will at least
want to carry out evaluations of the first three levels
described by Kirkpatrick and Kirkpatrick (2016). For Level 1,
you could use questionnaires and observations to deter-
mine how students are reacting to the flipped classroom
approach. For Level 2, seek to assess to what extent learners' CR knowledge and skills are actually improved by this
innovation (Durning et al. 2012). For Level 3, use observa-
tions, interviews, and focus groups to assess the degree to
which the students are using their newly acquired CR skills
in their interactions with real or virtual patients.
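If you collect paired pre- and post-intervention scores on a CR assessment for Level 2, a simple comparison of score gains can form part of your analysis. The fragment below is a schematic illustration only: the scores are invented placeholders, the paired t-test is just one of many possible analyses, and the evaluation design and instruments should be chosen for your own context (Cook 2010).

# Illustrative sketch of one Kirkpatrick Level 2 analysis: compare paired
# pre- and post-intervention scores on a clinical reasoning assessment.
# The scores below are invented placeholders, not real data.
from statistics import mean
from scipy import stats  # used only for a paired t-test; another analysis could be substituted

pre_scores = [52, 61, 58, 47, 66, 59, 55, 63]
post_scores = [60, 68, 57, 55, 72, 64, 61, 70]

# Average within-learner change from pre- to post-intervention.
mean_gain = mean(post - pre for pre, post in zip(pre_scores, post_scores))
result = stats.ttest_rel(post_scores, pre_scores)

print("Mean score gain:", round(mean_gain, 1))
print("Paired t-test: t =", round(float(result.statistic), 2), "p =", round(float(result.pvalue), 3))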
Make sure to relate learning outcomes back to your
findings about the implementation process, since most
innovations are subject to differences between the
intended and actual implementation (Reeves and Hedberg
2003). By conducting iterative evaluations of your educa-
tional intervention, you ensure that the information neces-
sary to demonstrate the practical and theoretical impact of
your intervention is captured. These evaluation activities
should lead to the generation of predictive, explanatory, or
prescriptive understanding.
Tip 10
Examine usability when adopting technologies
With the increasing use of technologies in medical educa-
tion, it is necessary to incorporate usability considerations
to ensure that students and teachers can easily use the
technology to effectively and efficiently accomplish their
instructional tasks and truly engage with their learning
experience (Asarbakhsh and Sandars 2013). Usability evalu-
ation can be conducted even before you begin construct-
ing any technology-based intervention (Reeves and
Hedberg 2003). For example, you can use PowerPoint or
other graphics software to illustrate a few prototype
screens and to represent the target learning system's main
functionalities. You can use this prototype to interview tar-
get learners, observing their interaction with the design
plan and asking for their opinion about the functions and
look and feel of the system. Similarly, usability evaluation
can be conducted using your completed or nearly com-
pleted product, with collected feedback guiding the
enhancement of the intervention.
In addition to eliciting perspectives from target users
(Sandars and Lafferty 2010), you can invite others to review
your learning system against usability principles, a method
called heuristic evaluation (Reeves and Hedberg 2003).
Compared with other usability evaluation methods, heuristic evaluation is relatively resource-efficient and yet effective. For instance, Chen et al. (2017) conducted a heur-
istic evaluation to improve the usability of an online
Geriatrics Clerkship module. Usability evaluations are even
relevant with instructional innovations such as the flipped
classroom if the learning materials provided to students
before lectures are computer-based.
Tip 11
Unleash the power of reflection in your iterations
The possession of a reflective mindset is a key characteris-
tic of those who conduct EDR. Intentional reflection plays
an important role in enhancing both educational research
and instructional practice (Visscher-Voerman and Procee
2007; Hong and Choi 2011; Yin 2014): Thoughtful reflection
enables you to bridge the gap between practice and
research, facilitating the adoption of research findings to
inform your practice and the transformation of acquired
experience into theoretical insights. Meaningful reflection is
empowering. Although reflection involves a critical examin-
ation of past experience, its focus is not on self-criticism or simply the detection of errors. Rather, it leads you to
discover new perspectives on your experiences that will
help you identify opportunities to improve practice and
enhance theory.
MEDICAL TEACHER 5
One idea is to approach your experience from different
viewpoints (Visscher-Voerman and Procee 2007; Hong and
Choi 2011), yielding unexpected insights. For instance, after
observing learners engaging in flipped classroom activities,
recall a moment when you felt surprised, confused, frustrated, or excited. Where did such feelings come from?
Your understanding of the situation can also be enriched
through considering the interpretations of key stakeholders or of members of different research or practitioner communities. Further, you might compare the current
situation with an ideal one, asking questions such as: What,
if any, discrepancies are observed between the ideal and
actual implementation of the flipped classroom design
plan? What would I do differently during the next iteration?
How might I best foster the spread of my intervention
to others?
Tip 12
Strategically disseminate and diffuse outcomes
through presentations or papers at different
EDR stages
Some of you might be eager to build up your research pro-
file and therefore worry that conducting EDR might mean
that you have to wait to publish your work until after the
innovation has achieved its goals. However, you are
encouraged to seek opportunities to share your research
findings from the earliest stages of your EDR project. For
example, during the Analysis and Exploration phase, you
might write a paper describing the theoretical framework
for your EDR initiative, or you might seek to publish a lit-
erature review concerning the problem the EDR is address-
ing. The Evaluation and Reflection phase may yield several
publishable papers, e.g., one for each iteration of data col-
lection, analysis, and refinement of the intervention.
Successful dissemination of research activities and
results not only helps maintain the project momentum,
providing continuous motivation to your team members,
but also contributes to the continuous improvement of
both practical and research products as you collect feed-
back from like-minded colleagues, and engage yourself in
meaningful reflection. These efforts of dissemination subse-
quently encourage the diffusion of your innovation into
novel settings, extending the impact of your intervention
(e.g. Chen et al. 2015). The ultimate goal of an EDR project
is, therefore, not only the successful implementation of
your intervention in your local context, but also successful
spread of your practical ideas and theoretical insights, as
you aspire to introduce your intervention to new contexts
and share your theoretical understanding to inform the
work of other practitioners and researchers. Ideally, other
educators will build on your effort and conduct EDR to study the outcomes of implementing your intervention or theoretical findings in their own institutions.
Although the concept of flipping the classroom in med-
ical education has existed for several years (Prober and
Khan 2013), there is still much scholarly work that can be
reported, especially with respect to its use to improve CR.
To get the publication process started during an EDR initia-
tive, consider submitting an abstract of preliminary findings
to present at departmental, institutional, regional, national, or international conferences. Many journals accept
shorter papers that focus on rapid dissemination of innova-
tions. Online platforms, such as MedEdPORTAL, that dis-
seminate peer-reviewed instructional materials can also be
considered. You are encouraged to also explore other pub-
lication possibilities outside of medical education (see
Table 1).
Conclusions
We present 12 tips to healthcare professionals who are
interested in tackling significant instructional problems and
simultaneously advancing their scholarship in teaching and
learning. The 12 tips demonstrate the process of conduct-
ing EDR to address this meaningful twofold challenge (see
Figure 1). We believe that with careful planning, a reflective
and responsive mindset, and teamwork during the imple-
mentation, healthcare professionals can successfully con-
duct EDR initiatives that contribute to closing the gap
between research and practice in the medical educa-
tion community.
Acknowledgment
We are deeply thankful for the time and constructive comments provided by Drs. Carla M. Allen and Eli S. Williams, and by the reviewers and editor of Medical Teacher, on earlier drafts of this paper.
Disclosure statement
The authors report no conflicts of interest. The authors alone are
responsible for the content and writing of this article.
Notes on contributors
Weichao Chen, PhD, is an instructional designer at the University of
Virginia School of Medicine.
Thomas C. Reeves, PhD, is Professor Emeritus of Learning, Design,
and Technology in the College of Education at The University
of Georgia.
ORCID
Weichao Chen http://orcid.org/0000-0001-8964-2568
Table 1. Possible outlets.

Medical education specific: short papers summarizing early implementation outcomes, such as Medical Education's 'really good stuff', Medical Teacher's short communication, Medical Science Educator's Innovation and Short Communication, and Perspectives on Medical Education's 'show and tell'. Additional suggestions are available in Yarris and Deiorio (2011, p. S32) and Gottlieb et al. (2018, p. 3-4).

Educational technology, instructional design, and online teaching related: journal suggestions curated by Dr. Curtis Bonk (http://www.trainingshare.com/resources/distance_ed_journals_and_online_learning_books.php) and conference suggestions curated by Dr. Clayton Wright (https://teachonline.ca/training-opportunities/upcoming-conferences).
References
Ahmed R, Farooq A, Storie D, Hartling L, Oswald A. 2016. Building capacity for education research among clinical educators in the health professions: A BEME (Best Evidence Medical Education) systematic review of the outcomes of interventions: BEME Guide No. 34. Med Teach. 38(2):123-136.
Albert M, Hodges B, Regehr G. 2007. Research in medical education: balancing service and science. Adv Health Sci Educ Theory Pract. 12(1):103-115.
Amey L, Donald KJ, Teodorczuk A. 2017. Teaching clinical reasoning to medical students. Br J Hosp Med (Lond). 78(7):399-401.
Asarbakhsh M, Sandars J. 2013. E-learning: the essential usability perspective. Clin Teach. 10(1):47-50.
Barr H. 2013. Toward a theoretical framework for interprofessional education. J Interprof Care. 27(1):4-9.
Bass EB, Chen BY. 2016. Step 1: problem identification and general needs assessment. In: Thomas PA, Kern DE, Hughes MT, Chen BY, editors. Curriculum development for medical education: a six-step approach. 3rd ed. Baltimore: Johns Hopkins University.
Bordage G. 2009. Conceptual frameworks to illuminate and magnify. Med Educ. 43(4):312-319.
Chen W, Cheng HY, Bradley E. 2017. Improving online teaching in a required geriatrics clerkship using heuristic evaluation. Med Sci Educ. 27:871-875.
Chen W, Worden MK, Bradley E. 2015. Flipping, engaging, and teaming, oh my! Lessons learned from a large scale curriculum reform at a US medical school. In: 2015 IEEE 15th International Conference on Advanced Learning Technologies. Hualien, Taiwan. p. 488-492. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7265239.
Cook DA, Beckman TJ, Bordage G. 2007. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ. 41(8):737-745.
Cook DA. 2010. Twelve tips for evaluating educational programs. Med Teach. 32(4):296-301.
Dolmans D, Tigelaar D. 2012. Building bridges between theory and practice in medical education using a design-based research approach: AMEE Guide No. 60. Med Teach. 34(1):1-10.
Durning SJ, Artino A, Boulet J, La Rochelle J, Van Der Vleuten C, Arze B, Schuwirth L. 2012. The feasibility, reliability, and validity of a post-encounter form for evaluating clinical reasoning. Med Teach. 34(1):30-37.
Eva KW, Lingard L. 2008. What's next? A guiding question for educators engaged in educational research. Med Educ. 42(8):752-754.
Eva KW. 2010. The value of paradoxical tensions in medical education research. Med Educ. 44(1):3-4.
Fink L. 2013. Creating significant learning experiences: an integrated approach to designing college courses. 2nd ed. San Francisco, CA: John Wiley & Sons.
Gay S, Bartlett M, McKinley R. 2013. Teaching clinical reasoning to medical students. Clin Teach. 10(5):308-312.
Gottlieb M, Dehon E, Jordan J, Bentley S, Ranney ML, Lee S, Khandelwal S, Santen SA. 2018. Getting published in medical education: overcoming barriers to scholarly production. WestJEM. 19:1-6.
Gruppen L, Irby DM, Durning SJ, Maggio LA. 2018. Interventions designed to improve the learning environment in the health professions: a scoping review. MedEdPublish. 7(3):73. DOI: 10.15694/mep.2018.0000211.1.
Higgs J, Jones MA, Loftus S, Christensen N, editors. 2019. Clinical reasoning in the health professions e-book. 4th ed. London: Elsevier Health Sciences.
Hong Y-C, Choi I. 2011. Three dimensions of reflective thinking in solving design problems: a conceptual model. Education Tech Research Dev. 59(5):687-710.
Huggett KN, Gusic ME, Greenberg R, Ketterer JM. 2011. Twelve tips for conducting collaborative research in medical education. Med Teach. 33(9):713-718.
Hughes MT. 2016. Step 2: targeted needs assessment. In: Thomas PA, Kern DE, Hughes MT, Chen BY, editors. Curriculum development for medical education: a six-step approach. 3rd ed. Baltimore: Johns Hopkins University.
International Training and Education Center for Health. 2010. Writing good learning objectives: a technical implementation guide. https://targethiv.org/sites/default/files/file-upload/resources/TIG%204%20Learning%20Objectives%202010.pdf.
Kaestle CF. 1993. The awful reputation of education research. Educ Res. 22:23-31.
Kennedy MM. 1997. The connection between research and practice. Educ Res. 26(7):4-12.
Kirkpatrick J, Kirkpatrick W. 2016. Kirkpatrick's four levels of training evaluation. Alexandria (VA): ATD.
Koivisto J-M, Haavisto E, Niemi H, Haho P, Nylund S, Multisilta J. 2018. Design principles for simulation games for learning clinical reasoning: a design-based research approach. Nurse Educ Today. 60:114-120.
Koivisto J-M, Multisilta J, Niemi H, Katajisto J, Eriksson E. 2016. Learning by playing: a cross-sectional descriptive study of nursing students' experiences of learning clinical reasoning. Nurse Educ Today. 45:22-28.
Koivisto J-M, Niemi H, Multisilta J, Eriksson E. 2017. Nursing students' experiential learning processes using an online 3D simulation game. Educ Inf Technol. 22(1):383-398.
Leggett H. 2016. Helping clinical educators provide effective feedback to medical trainees on their diagnostic decision making: an educational design research approach [PhD thesis]. Leeds (UK): University of Leeds.
Linderman B, Lipsett P. 2016. Step 6: evaluation and feedback. In: Thomas PA, Kern DE, Hughes MT, Chen BY, editors. Curriculum development for medical education: a six-step approach. 3rd ed. Baltimore: Johns Hopkins University.
McKenney S, Reeves TC. 2019. Conducting educational design research. London: Routledge.
MedBiquitous Curriculum Inventory Working Group Standardized Vocabulary Subcommittee. 2016. Curriculum inventory: standardized instructional and assessment methods and resource types. Washington, DC: AAMC.
Meyer HS, Durning SJ, Sklar DP, Maggio LA. 2018. Making the first cut: an analysis of Academic Medicine editors' reasons for not sending manuscripts out for external peer review. Acad Med. 93(3):464-470.
Morrissey B, Heilbrun ME. 2017. Teaching critical thinking in graduate medical education: lessons learned in diagnostic radiology. J Med Educ Curric Dev. 4:2382120517696498.
Prober CG, Khan S. 2013. Medical education reimagined: a call to action. Acad Med. 88(10):1407-1410.
Ramaekers S, Van Keulen H, Van Beukelen P, Kremer W, Pilot A. 2012. Effectiveness of a programme design for the development of competence in solving clinical problems. Med Teach. 34(5):e309-e316.
Reeves TC, Hedberg JG. 2003. Interactive learning systems evaluation. Englewood Cliffs: Educational Technology.
Regehr G. 2010. It's NOT rocket science: rethinking our metaphors for research in health professions education. Med Educ. 44(1):31-39.
Rencic J. 2011. Twelve tips for teaching expertise in clinical reasoning. Med Teach. 33(11):887-892.
Sandars J, Lafferty N. 2010. Twelve tips on usability testing to develop effective e-learning in medical education. Med Teach. 32(12):956-960.
Traynor R, Eva KW. 2010. The evolving field of medical education research. Biochem Mol Biol Educ. 38(4):211-215.
van Enk A, Regehr G. 2018. HPE as a field: implications for the production of compelling knowledge. Teach Learn Med. 30(3):337-344.
Visscher-Voerman I, Procee H. 2007. Teaching systematic reflection to novice educational designers. Presented at the 2007 Association for Educational Communications and Technology Convention. https://research.utwente.nl/files/18427861/Visscher2007teaching.pdf.
Wolcott MD, Lobczowski NG, Lyons K, McLaughlin JE. 2019. Design-based research: connecting theory and practice in pharmacy educational intervention research. Curr Pharm Teach Learn. 11(3):309-318.
Yarris LM, Deiorio NM. 2011. Education research: a primer for educators in emergency medicine. Acad Emerg Med. 18(Suppl 2):S27-S35.
Yin RK. 2014. Case study research: design and methods. 5th ed. London (UK): Sage.
Zackoff MW, Real FJ, Abramson EL, Li S-T, Klein MD, Gusic ME. 2019. Enhancing educational scholarship through conceptual frameworks: a challenge and roadmap for medical educators. Acad Pediatr. 19(2):135-141.
Zhao Y, Frank KA. 2003. Factors affecting technology uses in schools: an ecological perspective. Am Educ Res J. 40(4):807-840.