gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Research in medical education: practical impact on medical training and future challenges

Commentary/Kommentar


  • corresponding author Diana H. J. M. Dolmans - Maastricht University, Faculty of Health, Medicine and Life Sciences, Department of Educational Development and Research, Maastricht, Netherlands
  • corresponding author Cees P. M. van der Vleuten - Maastricht University, Faculty of Health, Medicine and Life Sciences, Scientific Director of the School of Health Professions Education, Maastricht, Netherlands; Maastricht University, Faculty of Health, Medicine and Life Sciences, Chair of the Department of Educational Development and Research, Maastricht, Netherlands

GMS Z Med Ausbild 2010;27(2):Doc34

doi: 10.3205/zma000671, urn:nbn:de:0183-zma0006719

Received: October 9, 2009
Revised: October 24, 2009
Accepted: January 21, 2010
Published: April 22, 2010

© 2010 Dolmans et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.en). You are free: to Share – to copy, distribute and transmit the work, provided the original author and source are credited.


Abstract

Medical education research has changed over the years: from merely descriptive studies, towards justification or curriculum comparison studies, and, nowadays, towards a slow introduction of more clarification studies. In clarification studies, quantitative and qualitative methods are used to explain why or how educational interventions work or do not work. This shift is described in this paper. In addition, it is explained how research into workplace learning and assessment has impacted developments in educational practice. Finally, it is argued that the participation of teachers within the medical domain in conducting and disseminating research should be cherished, because they play a crucial role in ensuring that medical education research is applied in educational practice.


Introduction

Worldwide, medical education research has grown enormously over the last twenty years. There have been huge increases in the number of scientific journals and the number of issues published per journal, in the number of participants at national and international conferences on medical education, and in the number of people pursuing careers as medical education researchers [1]. But, apart from this growth, which developments have we seen in medical education research? Has medical education research had a positive impact on medical training? What future challenges will medical education research have to meet in order to further enhance evidence-based innovation in our medical training programmes? These questions are addressed below.


Changes in medical education research

Within the field of medical education research, there has been a shift in the type of studies that are conducted: a shift from merely descriptive studies, which describe the kinds of innovations implemented in practice, towards justification studies. Justification studies often focus on comparisons of curricula, e.g. does a traditional curriculum result in different outcomes compared to an innovative curriculum [2]? Slowly, more clarification studies are being reported, investigating how different variables influence each other and paying attention not only to outcomes but also to the underlying processes that could explain why and how an intervention does or does not work.

There has been much debate in the literature about justification or curriculum comparison studies. To (bio)medically trained researchers, controlled experimentation is the hallmark of good research. However, controlling the circumstances of an educational intervention is very hard and often impossible, and trying to do so may actually lead to a rather reductionist and trivial exercise [3]. We do not argue that controlled experiments should never be done, because their appropriateness depends on the research question formulated. We are currently involved in testing the hypothesis that elaboration in a group leads to better knowledge retention [4]. The randomized experimental and control groups are completely standardized (through the use of video) except for the elaboration intervention, and the experiment is conducted in a laboratory setting. Naturally, the price paid is ecological validity and generalizability to authentic contexts.
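
As a toy illustration of how the data from such a tightly controlled two-group experiment can be analysed, consider the following sketch. This is our own hypothetical example, not the design or data of the study cited in [4]: recall scores from a randomized elaboration group and a control group (all numbers invented) are compared with an independent-samples t-test and a standardized effect size.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical recall scores for two randomized groups of 30 participants;
# group means and spread are invented for illustration only, not data from [4].
elaboration = rng.normal(loc=24.0, scale=5.0, size=30)  # discussed the material in a group
control = rng.normal(loc=20.0, scale=5.0, size=30)      # studied individually

# Independent-samples t-test on the recall scores.
t, p = stats.ttest_ind(elaboration, control)

# Cohen's d as a standardized effect size, using the pooled standard deviation.
pooled_sd = np.sqrt((elaboration.var(ddof=1) + control.var(ddof=1)) / 2)
d = (elaboration.mean() - control.mean()) / pooled_sd

print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```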

Currently, more clarification studies are being reported in the literature. This shift is highly valuable. Education is a complex domain in which many different variables interact with each other: the student, the teacher, the learning materials, the assessment, etcetera. Because of this complexity, it is not easy to conduct research in this area [5], [6]. The complex interactions between different variables make it difficult to compare curricula and to detect the real cause of better outcomes [7], [8]. Clarification studies try to unravel the processes underlying the observed effects and address the question ‘Why or how did it work?’ [2]. These studies are highly valuable because they clarify what works under which circumstances. Both quantitative and qualitative methods can be used to conduct clarification studies. Nowadays, an increasing number of qualitative studies are being published relative to quantitative studies. Many qualitative studies focus on answering the questions why (explanation) and how, leading to a deeper understanding of differing perspectives [9]. Not only qualitative studies but also design-based studies are on the increase. In design-based studies, an educational design is developed on the basis of current theoretical insights and evaluated by multiple methods, with the dual goal of refining theory and improving practice [10]. Design-based studies are often conducted in real-life settings in which multiple aspects and interactions are evaluated and in which researchers and practitioners interact closely with each other [10]. The ecological validity of this research avenue is high, but the proof will be of quite a different nature than we are used to in the conventional RCT approach.

An example of this type of research from our own experience is the development of teaching portfolios to stimulate the professional development of teachers [11], [12]. Constructivist theories of learning emphasize that learners actively construct their own knowledge by interpreting events and information on the basis of what they already know. From this perspective, the professional development of teachers can be encouraged by stimulating them to reflect critically on their teaching practice, e.g. by means of a teaching portfolio: an authentic assessment tool that combines different instruments to measure different competencies and in which feedback plays a crucial role. Modern theories of assessment, teachers’ professional development and teaching portfolios were used to develop a teaching portfolio prototype [13].

In summary, the field of medical education research has not only grown rapidly, it has also changed over the past years. More and more studies are reported that deepen our understanding of how and why education works. Mixed methods and mixed research avenues that complement each other are needed, inspired by theoretical notions that illuminate in some way how and why things work in educational practice.


Impact of research on educational practice

The ultimate question is whether and how medical education research changes educational practice. Before answering this question, it is important to keep in mind that the relationship between research and practice is not always straightforward. Research often leads to contradictory findings, findings that merely state the obvious, or findings that are highly context-specific, and this can make it difficult to apply research findings to medical training programmes. Despite these difficulties, medical education research has definitely contributed to improvements in training programmes over the years. Workplace learning and assessment are described below as two examples that illustrate the relationship between medical education research and educational practice.


Workplace learning

Workplace learning is considered by medical experts to be the optimal way of learning a profession. In medical curricula, workplace learning has played a dominant role for a long time. In many traditional curricula, students start with theoretical courses during the first years of the training programme and later move on to clinical training in different disciplines in the hospital, during which they apply, under the guidance of experts, what they have learnt in their theoretical training. Workplace learning is potentially a very rich learning environment, offering students many opportunities to interact with patients and medical experts and to participate in clinical practice [14], [15]. Although workplace learning offers many opportunities for student learning, research has demonstrated that students also experience difficulties. Prince et al. [16], [17] found that students experienced difficulties when they had to apply in practice what they had learnt during their theoretical courses. In order to diminish the gap between theory and practice and to create a more gradual transition from school-based learning to workplace learning, workplace learning is nowadays introduced earlier in many medical curricula.

Research has also demonstrated that there are considerable variations between students in the skills they perform and the patients they encounter during workplace learning [18]. Learning in the workplace takes place rather haphazardly, depending on the patients or problems presenting in daily practice. Another major problem, reported in several studies, is that students often receive only limited supervision and feedback [19], [20], [21]. This is a serious problem, since it is known from the literature that direct supervision in the workplace is key to effective student learning [22], and the quality of supervision has been demonstrated to have a direct impact on students’ clinical competencies [23]. Insights into these shortcomings of workplace learning have led to the development of several interventions to optimize student learning, such as in-training assessment to provide learners with more feedback, the structuring of workplace learning experiences, deepening of the reflective component of learning on the basis of (rich) information from others, etcetera. In addition, there has been increased awareness of the importance of training clinical staff members and providing them with new knowledge and skills for effective teaching and learning in the workplace [24]. Compared to a few years ago, much more time is now devoted to faculty development activities during which faculty learn about effective workplace learning and about tools they can use to optimize it.

The attention given in the literature to the problems of workplace learning has also led to the development and implementation of instruments for evaluating the quality of the clinical learning environment [25] and the performance of clinical supervisors [26]. Finally, only recently, concerns about the quality of student learning in the workplace have led to the implementation of longitudinal attachments in undergraduate medical training programmes to increase student continuity with patients and supervisors [27]. Not only undergraduate medical training programmes have changed over the years; postgraduate medical education has also seen rapid changes since 2000 [28].

In sum, research within the domain of workplace learning has contributed to various initiatives aimed at optimizing it. Apparently, medical education research can lead to changes in educational practice. But it is also important to keep in mind that it is not easy to implement findings from research in daily practice. For example, although it is known that high-quality supervision is the key factor in the success of workplace learning, it remains difficult to stimulate clinical staff to spend more time supervising students, because of competing values and responsibilities between patient care, research and education [29]. Improving education requires not only the introduction of new tools in educational practice but also a cultural change and the commitment and involvement of all participants in the workplace, and this requires long-term effort.


Assessment

The area of student assessment is definitely one that is led by research. We will present a few instances and refer the reader to other literature for the broader developments [30], [31], [32]. In the sixties it was found that performance on one assessment exercise (item, station, oral, patient encounter, etc.) was hardly predictive of performance on another exercise. The phenomenon has been termed the ‘content specificity problem of clinical competence’ and was later found to occur with virtually all assessment methods, regardless of what was being measured. It resonated with findings in cognitive psychology and stimulated a great deal of cognitive expertise research (which in itself had quite some impact on educational practice). The impact on educational practice was that short, single-shot assessments (e.g. the long case) were abandoned and that efficient sampling strategies across content were introduced in every method of assessment. It was also found that contextualizing assessment by presenting authentic tasks did not require extensive, complex and resource-intensive simulations, but could be achieved with short scenarios or vignettes, and that the stimulus format, the task presented to the assessee, was more important than the response format (open, closed, oral, performance-based, etc.). This has had a tremendous impact on assessment strategies all over the world. For example, licensing examinations across the world have completely changed their practice of written assessment: written test items have been changed to small but authentic simulations of professional tasks, requiring higher cognitive abilities and the application of knowledge. Later this was followed by performance-based assessment strategies using the same approach: efficient, frequent and authentic sampling across a number of clinical encounters using multiple assessors. There are probably very few medical schools around the world that do not use the Objective Structured Clinical Examination (OSCE) in one way or another. It is a very clear example of how educational practice is influenced by research. In the meantime, research has considerably professionalized the OSCE approach in general (scoring, standard setting, role playing, equating, etcetera); a whole ‘OSCE-ology’ has emerged from it.
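
The practical force of this sampling argument can be made concrete with the classical Spearman–Brown prophecy formula from test theory. The minimal sketch below uses a hypothetical single-exercise reliability of 0.15, chosen only to mimic strong content specificity (it is not a value from the studies cited above), to show why broad sampling across exercises, rather than longer single exercises, became the norm.

```python
def spearman_brown(r_single: float, n: int) -> float:
    """Reliability of the mean score over n independently sampled exercises,
    given the reliability r_single of a single exercise (Spearman-Brown)."""
    return (n * r_single) / (1 + (n - 1) * r_single)

# Hypothetical single-exercise reliability of 0.15, mimicking strong
# content specificity; purely illustrative.
for n in (1, 5, 10, 20, 40):
    print(f"{n:2d} exercises -> reliability {spearman_brown(0.15, n):.2f}")

# Output: 1 -> 0.15, 5 -> 0.47, 10 -> 0.64, 20 -> 0.78, 40 -> 0.88
# A single long case tells us little; sampling broadly across content does.
```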

A more recent and interesting insight is that objectification is not a required, and sometimes not even a desired, goal in assessment. Subjective measures can be reliable and objective ones can be unreliable, depending entirely on how the sampling is performed. The key is sampling across the elements that influence the measurement; the key is NOT standardizing, structuring or objectifying the measurement. This is a tremendous insight with dazzling practical implications. The OSCE was invented as a reaction to subjective clinical examinations; it was therefore called ‘objective and structured’. However, reliability and validity depend on how sampling is done across content, patients and examiners much more than on how structured or objective the measurement itself is. This insight is the basis for moving back to the unstandardized, ‘noisy’ but authentic clinical context and for conducting appropriate sampling there. All work-based assessment as it is currently developing is based on these premises [33].
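
To illustrate the claim that sampling matters more than standardization, here is a small simulation sketch of our own (all standard deviations are hypothetical, chosen only for illustration): each encounter score is modelled as true ability plus case-specific noise (content specificity) plus assessor noise, and the reliability of the averaged score is estimated as the squared correlation between true ability and the mean observed score.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimated_reliability(n_encounters: int, assessor_sd: float,
                          case_sd: float = 2.0, n_students: int = 5000) -> float:
    """Squared correlation between true ability and the mean score over
    n_encounters; this estimates the reliability of the averaged score.
    All standard deviations are hypothetical, for illustration only."""
    ability = rng.normal(0.0, 1.0, n_students)
    noise = (rng.normal(0.0, case_sd, (n_students, n_encounters))         # content specificity
             + rng.normal(0.0, assessor_sd, (n_students, n_encounters)))  # judge disagreement
    observed_mean = (ability[:, None] + noise).mean(axis=1)
    return float(np.corrcoef(ability, observed_mean)[0, 1] ** 2)

# Highly structured ('objective') scoring, but a single encounter:
print(estimated_reliability(n_encounters=1, assessor_sd=0.5))   # ~0.19
# Global ('subjective') judgements, but sampled over 15 encounters:
print(estimated_reliability(n_encounters=15, assessor_sd=1.0))  # ~0.75
```

In this sketch, the broadly sampled but unstandardized judgement wins comfortably, which is exactly the premise underlying current work-based assessment.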

In all, assessment provides an excellent example of how research is able to impact educational practice.


Future directions for medical education research

The professionals involved in medical education research are growing not only in number but also in the diversity of their scientific backgrounds. At the same time, medical education research is accused of a lack of scientific rigour and of insufficient quality [34]. According to some leaders in the field, progress in medical education research has been too slow: they argue that many of the studies reported in the journals have been done before, lack a theoretical background, or fail to test theories [34], [35]. Furthermore, there is a lack of understanding of social science research and qualitative methodologies, probably due to the dominance of the biomedical model [34], [35]. These factors hinder the growth of the body of knowledge in the field of medical education research.

Of course, the quality of research should be increased by conducting studies that test theories [35] and by conducting more rigorous qualitative and mixed-methods studies. Theories need to be used, too: they give researchers different ‘lenses’ through which to look at complicated problems and social issues, broaden our understanding of situations, and can be applied in practice [36]. And of course, medical education research should lead to the creation of new knowledge for academics [37] and contribute to our understanding of the problems encountered in education [1].

But there is one very fundamental aspect of research in medical education that is quite unique and which holds promise for research impacting educational practice: the participation of medical teachers – the practitioners of medical education – in conducting the research and in disseminating it. In general education, there is much discussion about the gap that separates educational research from educational practice [38]. Education research is accused of being too theory-oriented and of failing to address the problems of educational practice. On top of that, the users of general education research, the teachers, are disengaged from participating in the research. We daresay that this does not apply to medical education and that in fact the opposite is true. It is characteristic, indeed unique, that teachers within the medical domain participate in conducting the research and in disseminating it. There is no other domain that has so many international journals dedicated to education (we counted 15 but lost track), some of which are specifically dedicated to the translation of research into educational practice (i.e. Medical Teacher, The Clinical Teacher). The international meetings on medical education have grown huge in attendance, which is explained in part by the mix of what is offered at these meetings (workshops, symposia, hands-on experiences, practical experiences and research).

This thriving community of education specialists and representatives from the domain itself is, we believe, the agent of the impact of research on educational practice. This community is slowly but clearly professionalizing in terms of educational research standards and the use of theory. It is crucial, however, that we professionalize at the right pace. We need to strike a careful balance between research that has practical relevance and research that is of high scientific quality and clarifies what works well under which conditions and why. We should never risk becoming disengaged from the medical teacher or any other person with a direct responsibility in educational practice [39]. We believe in this participative community in medical education and are determined to continue to cherish it. The impact will follow almost automatically.


The authors

1. Diana H.J.M. Dolmans, PhD, is an educational scientist and associate professor in the Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.
2. Cees P.M. van der Vleuten, PhD, is Professor of Education, Chair of the Department of Educational Development and Research, and Scientific Director of the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.

References

1. Eva K. Broadening the debate about quality in medical education research. Med Educ. 2009;43(4):294-296. DOI:10.1111/j.1365-2923.2009.03342.x
2. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ. 2008;42(2):128-133.
3. Gruppen LD. Is medical education research "hard" or "soft" research? Adv Health Sci Educ Theory Pract. 2008;13(1):1-2.
4. Van Blankenstein FM, Dolmans DH, van der Vleuten CP, Schmidt HG. Which cognitive processes benefit learning during small group discussion? Explaining to others facilitates long-term recall. Paper presented at the ICO Toogdag. Utrecht: ICO Toogdag; 2008.
5. Berliner DC. Educational research: The hardest science of all. Educ Res. 2002;31(8):18-20. DOI:10.3102/0013189X031008018
6. Kember D. To control or not to control: the question of whether experimental designs are appropriate for evaluating teaching innovations in higher education. Ass Eval High Educ. 2003;28(1):89-101. DOI:10.1080/02602930301684
7. Albanese M. Life is tough for curriculum researchers. Med Educ. 2009;43(2):199-201. DOI:10.1111/j.1365-2923.2008.03289.x
8. Cook DA. Avoiding confounded comparisons in education research. Med Educ. 2009;43(2):102-104. DOI:10.1111/j.1365-2923.2008.03263.x
9. Kuper A, Reeves S, Levinson W. Qualitative research. An introduction to reading and appraising qualitative research. BMJ. 2008;337:a288. DOI:10.1136/bmj.a288
10. Barab S, Squire K. Design-based research: Putting a stake in the ground. J Learn Sci. 2004;13(1):1-14. DOI:10.1207/s15327809jls1301_1
11. Tigelaar DE, Dolmans DH, de Grave WS, Wolfhagen IH, van der Vleuten CP. Participants' opinions on the usefulness of a teaching portfolio. Med Educ. 2006;40(4):371-378. DOI:10.1111/j.1365-2929.2006.02414.x
12. Tigelaar DE, Dolmans DH, Meijer PC, de Grave WS, van der Vleuten CP. Teachers' interactions and their collaborative reflection processes during peer meetings. Adv Health Sci Educ Theory Pract. 2008;13(3):289-308. DOI:10.1007/s10459-006-9040-4
13. Tigelaar DE, Dolmans DH, Wolfhagen IH, van der Vleuten CP. Using a conceptual framework and the opinions of portfolio experts to develop a teaching portfolio prototype. Stud Educ Eval. 2004;30(3):305-321.
14. Dornan T, Boshuizen H, King N, Scherpbier A. Experience-based learning: a model linking the processes and outcomes of medical students’ workplace learning. Med Educ. 2007;41(1):84-91. DOI:10.1111/j.1365-2929.2006.02652.x
15. Teunissen PW, Scheele F, Scherpbier AJ, van der Vleuten CP, Boor K, van Luijk SJ, van Diemen-Steenvoorde JA. How residents learn: Qualitative evidence for the pivotal role of clinical activities. Med Educ. 2007;41(8):763-770. DOI:10.1111/j.1365-2923.2007.02778.x
16. Prince KJ, van de Wiel M, Scherpbier AJ, van der Vleuten CP, Boshuizen HP. A Qualitative Analysis of the Transition from Theory to Practice in Undergraduate Training in a PBL-Medical School. Adv Health Sci Educ Theory Pract. 2000;5(2):105-116. DOI:10.1023/A:1009873003677
17. Prince KJ, Boshuizen HP, van der Vleuten CP, Scherpbier AJ. Students’ opinions about their preparation for clinical practice. Med Educ. 2005;39(7):704-712. DOI:10.1111/j.1365-2929.2005.02207.x
18. Van der Hem-Stokroos HH, Scherpbier AJ, van der Vleuten CP, de Vries H, Haarman HJ. How effective is a clerkship as a learning environment? Med Teach. 2001;23(6):599-604. DOI:10.1080/01421590127200
19. Daelmans HE, Hoogenboom RJ, Donker AJ, Scherpbier AJ, Stehouwer CD, van der Vleuten C. Effectiveness of clinical rotations as a learning environment for achieving competence. Med Teach. 2004;26(4):305-312. DOI:10.1080/01421590410001683195
20. Dolmans DH, Wolfhagen IH, Heineman E, Scherpbier AJ. Factors adversely affecting student learning in the clinical learning environment: A student perspective. Educ Health. 2008;20(3):e1-e10.
21. Grant J, Kilminster S, Jolly B, Cottrell D. Clinical supervision of SpRs: where does it happen, when does it happen and is it effective? Med Educ. 2003;37(2):140-148. DOI:10.1046/j.1365-2923.2003.01415.x
22. Kilminster S, Cottrell D, Grant J, Jolly B. AMEE Guide No. 27: Effective educational and clinical supervision. Med Teach. 2007;29(1):2-19. DOI:10.1080/01421590701210907
23. Wimmers PF, Schmidt HG, Splinter TA. Influence of clerkship experiences on clinical competence. Med Educ. 2006;40(5):450-458. DOI:10.1111/j.1365-2929.2006.02447.x
24. Steinert Y. Staff development for clinical teachers. Clin Teach. 2005;2(2):104-110. DOI:10.1111/j.1743-498X.2005.00062.x
25. Boor K, Scheele F, van der Vleuten CP, Scherpbier AJ, Teunissen PW, Sijtsma K. Psychometric properties of an instrument to measure the clinical learning environment. Med Educ. 2007;41(1):92-99. DOI:10.1111/j.1365-2929.2006.02651.x
26. Stalmeijer RE, Dolmans DH, Wolfhagen IH, Muijtjens AM, Scherpbier AJ. The development of an instrument for evaluating clinical teachers: involving stakeholders to determine content validity. Med Teach. 2008;30(8):e272-e277. DOI:10.1080/01421590802258904
27. Wamsley MA, Dubowitz N, Kohli P, Cooke M, O’Brien BC. Continuity in a longitudinal out-patient attachment for year 3 medical students. Med Educ. 2009;43(9):895-906. DOI:10.1111/j.1365-2923.2009.03424.x
28. Ten Cate O. Medical Education in the Netherlands. Med Teach. 2007;29(8):752-757. DOI:10.1080/01421590701724741
29. Hoffman KG, Donaldson JE. Contextual tensions of the clinical environment and their influence on teaching and learning. Med Educ. 2004;38(4):448-454. DOI:10.1046/j.1365-2923.2004.01799.x
30. Van der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996;1(1):41-67. DOI:10.1007/BF00596229
31. Schuwirth LW, van der Vleuten CP. Changing education, changing assessment, changing research? Med Educ. 2004;38(8):805-812. DOI:10.1111/j.1365-2929.2004.01851.x
32. Van der Vleuten CP, Schuwirth LW. Assessment of professional competence: from methods to programmes. Med Educ. 2005;39(3):309-317. DOI:10.1111/j.1365-2929.2005.02094.x
33. Norcini JJ. Work based assessment. BMJ. 2003;326(7392):753-755. DOI:10.1136/bmj.326.7392.753
34. Albert M, Hodges B, Regehr G. Research in Medical Education: Balancing Service and Science. Adv Health Sci Educ Theory Pract. 2007;12(1):103-115. DOI:10.1007/s10459-006-9026-2
35. Norman G. Editorial – How Bad is Medical Education Research Anyway? Adv Health Sci Educ Theory Pract. 2007;12(1):1-5. DOI:10.1007/s10459-006-9047-x
36. Reeves S, Albert M, Kuper A, Hodges BD. Qualitative research. Why use theories in qualitative research? BMJ. 2008;337:a949. DOI:10.1136/bmj.a949
37. Monrouxe LV, Rees CE. Picking up the gauntlet: constructing medical education as a social science. Med Educ. 2009;43(3):196-198. DOI:10.1111/j.1365-2923.2008.03272.x
38. Badley G. The crisis in educational research: a pragmatic approach. Europ Educ Res J. 2003;2(2):296-308. DOI:10.2304/eerj.2003.2.2.7
39. Van der Vleuten CP, Dolmans DH, de Grave WS, van Luijk SJ, Muijtjens A, Scherpbier AJ, Schuwirth L, Wolfhagen HA. Education research at the Faculty of Medicine, University of Maastricht: Fostering the interrelationship between professional and education practice. Acad Med. 2004;79(10):990-996. DOI:10.1097/00001888-200410000-00021