Article

Validation of the EFFECT questionnaire for competence-based clinical teaching in residency training in Lithuania

by Eglė Vaižgėlienė 1,*, Žilvinas Padaiga 1, Daiva Rastenytė 2, Algimantas Tamelis 3, Kęstutis Petrikonis 2, Rima Kregždytė 1 and Cornelia Fluit 4

1 Department of Preventive Medicine, Faculty of Public Health, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
2 Department of Neurology, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
3 Department of Surgery, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
4 Radboudumc Health Academy, Nijmegen, The Netherlands
* Author to whom correspondence should be addressed.
Medicina 2017, 53(3), 173-178; https://doi.org/10.1016/j.medici.2017.05.001
Submission received: 20 October 2016 / Revised: 6 February 2017 / Accepted: 8 May 2017 / Published: 26 May 2017

Abstract

Background and aim: In 2013, all residency programs at the Lithuanian University of Health Sciences were restructured into a competency-based medical education (CBME) curriculum. To assess the quality of clinical teaching in residency training, we chose the EFFECT (evaluation and feedback for effective clinical teaching) questionnaire, designed and validated at the Radboud University Medical Centre in the Netherlands. The aim of this study was to validate the EFFECT questionnaire for quality assessment of clinical teaching in residency training. Materials and methods: The research was conducted as an online survey using the questionnaire containing 58 items in 7 domains. The questionnaire was double-translated into Lithuanian. It was sent to 182 residents of 7 residency programs (anesthesiology-reanimatology, cardiology, dermatovenerology, emergency medicine, neurology, obstetrics and gynecology, and physical medicine and rehabilitation). Overall, 333 questionnaires evaluating 146 clinical teachers were filled in. Item and reliability analyses were performed to determine item characteristics and internal consistency (Cronbach’s α). Furthermore, confirmatory factor analysis (CFA) was performed using the original factorial structure as the model for maximum-likelihood estimation. Results: Cronbach’s α within the different domains ranged between 0.91 and 0.97 and was comparable with the original version of the questionnaire. Confirmatory factor analysis demonstrated a satisfactory model fit, with a comparative fit index (CFI) of 0.841 and an absolute model fit (RMSEA) of 0.098. Conclusions: The results suggest that the Lithuanian version of EFFECT maintains the validity of the original and may serve as a valid instrument for quality assessment of clinical teaching in competency-based residency training in Lithuania.

1. Introduction

For the delivery of high-quality patient care, high-quality clinical teaching of residents is essential [1,2]. Clinical teaching takes place in real-life situations, through the provision of health care services to patients under the close supervision of a resident’s clinical teacher. The quality of this process is crucial for training young physicians who can provide evidence-based health care services and who acquire the necessary clinical skills, knowledge, and competencies [3,4,5].
Following the Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG 2015), universities must review their programs on a regular basis to ensure that they meet international aims as well as learners’ and societal needs, especially with respect to quality assurance [6]. After the Lithuanian University of Health Sciences renewed its residency programs according to competency-based medical education (CBME) methodology built on intended learning outcomes and competencies, an urgent need emerged to implement a scientifically grounded system for evaluating the quality of clinical teaching [7]. Many instruments are available for the evaluation of clinical teaching [5,8,9]. One of them is the EFFECT (evaluation and feedback for effective clinical teaching) questionnaire, a theory-based, reliable, and valid instrument designed and validated by Fluit et al. at the Radboudumc Health Academy in the Netherlands [9].
This study aimed to validate the EFFECT questionnaire for quality assessment of clinical teaching in residency training.

2. Materials and Methods

The EFFECT questionnaire is based on theories of workplace learning and clinical teaching and incorporates the Canadian Medical Education Directives for Specialists (CanMEDS) competencies [9]. The authors validated the questionnaire following the five sources of validity evidence described by Downing [10,11]. Although EFFECT rests on an international literature review and on theory that is internationally recognized as highly relevant to medical education, the authors caution that extrapolating their findings to other countries with different residency training programs and different feedback cultures is a possible limitation [12]. Our study therefore assessed the validity of the Lithuanian version of EFFECT.
The EFFECT questionnaire consists of 58 items in 7 domains of clinical teaching: role modeling, task allocation, planning, providing feedback, teaching methodology, assessment, and personal support. The role modeling domain contains 4 subdomains: clinical skills, scholarship, general competencies, and professionalism. The items are scored on a 6-point Likert scale (1, very poor; 2, poor; 3, intermediate; 4, satisfactory; 5, good; 6, excellent), with the additional option “not (yet) able to evaluate,” chosen when a specific item had not (yet) occurred during clinical teaching. Having obtained the authors’ agreement to use the questionnaire, we had it double-translated from Dutch into Lithuanian by 2 professional translators. In addition to the original items, information on gender, residency program, and year of training was collected.
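As an illustration of this scoring scheme, the sketch below (not the authors’ code; the column names and toy answers are hypothetical) shows how Likert answers could be kept numeric while “not (yet) able to evaluate” (NAE) answers are treated as missing data:

```python
import pandas as pd

# Toy raw answers for two hypothetical items; "NAE" marks the
# "not (yet) able to evaluate" option from the EFFECT scale.
raw = {"item_01": ["5", "6", "NAE", "4"],
       "item_02": ["4", "NAE", "NAE", "5"]}

df = pd.DataFrame(raw)
likert = df.apply(pd.to_numeric, errors="coerce")  # "NAE" becomes NaN

# Per-item characteristics of the kind reported in Table 2:
# mean, standard deviation, and NAE count and percentage.
stats = pd.DataFrame({
    "mean": likert.mean(),
    "sd": likert.std(),
    "nae_n": likert.isna().sum(),
    "nae_pct": likert.isna().mean() * 100,
})
print(stats.round(2))
```

Run on the full data set, these same four per-item statistics would populate Table 2.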
To determine item characteristics, item means and standard deviations were calculated. Internal consistency and reliability were assessed with Cronbach’s alpha. Finally, structural equation modeling was applied to determine the degree of interdependency between items and constructs, using the existing factorial solution as the model for maximum-likelihood estimation. In addition, common incremental measures of scale fit in structural equation modeling – the comparative fit index (CFI) and the root mean square error of approximation (RMSEA) – were calculated [13,14]. Correlations between the dimensions were determined by correlation coefficients derived from the estimated covariance matrix; coefficients with a magnitude of 0.7–1.0 indicated interdependency of the factors. All calculations were run with SPSS 20 and AMOS 20.
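The analyses themselves were run in SPSS and AMOS; purely as a reading aid, here is a hedged Python sketch of the quantities named above, using the standard textbook formulas rather than any SPSS/AMOS internals:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric Likert scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

def cfi(chi2_m: float, df_m: float, chi2_b: float, df_b: float) -> float:
    """Comparative fit index from model (m) and baseline (b) chi-square."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, 0.0)
    return 1.0 - d_m / max(d_m, d_b, 1e-12)

def rmsea(chi2_m: float, df_m: float, n: int) -> float:
    """Root mean square error of approximation for n respondents."""
    return float(np.sqrt(max(chi2_m - df_m, 0.0) / (df_m * (n - 1))))

def cov_to_corr(cov: np.ndarray) -> np.ndarray:
    """Correlation matrix from an estimated covariance matrix."""
    sd = np.sqrt(np.diag(cov))
    return cov / np.outer(sd, sd)

# Example: alpha for three simulated items rated by 10 respondents.
rng = np.random.default_rng(0)
scores = rng.integers(1, 7, size=(10, 3)).astype(float)
print(round(cronbach_alpha(scores), 2))
```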
The study was approved by the Bioethics Centre of the Lithuanian University of Health Sciences and was performed as an online survey.

3. Results

The survey data were collected during 2015–2016. A total of 182 residents (48 men and 134 women) were asked to fill in the EFFECT questionnaire about the teachers who supervised them within their residency program. Residents could decide how many teachers to evaluate and did not have to fill in a questionnaire for every teacher they worked with. We received a total of 333 questionnaires: 67.9% (n = 226) were completed by women and 32.1% (n = 107) by men. A description of the study population and the number of questionnaires filled in per residency program are presented in Table 1.
The largest proportion (36.9%) of the questionnaires was filled in by first-year residents, followed by third-year (25.2%), second-year (24.3%), and fourth-year (13.5%) residents.
The results of the item characteristics are provided in Table 2. The items were rated on a 6-point Likert scale. The mean scores ranged from 4.58 (item 29, “reminds me of previously given feedback”, and item 50, “helps and advises me on how to maintain a good work-home balance”) up to 5.40 (item 8, “apply guidelines and protocols”). More than 20% of the answers for item 11 (“have a bad news conversation”), item 40 (“reviews my reports”), and item 50 (“helps and advises me on how to maintain a good work-home balance”) were scored as “not (yet) able to evaluate”, and this proportion exceeded 70% for all the assessment domain items (51–58). Factor loadings varied from 0.788 (item 30) to 0.957 (item 47).
The Cronbach’s alpha coefficients ranged from 0.91 to 0.97, indicating high internal consistency of all subdomains (Table 3).
The “role modeling scholarship” subdomain was not included in the confirmatory factor analysis because it has only one item, “apply academic research results.” The items of the “assessment” domain were not included due to the high proportion of “not (yet) able to evaluate” answers. Therefore, only 9 subdomains of the questionnaire were used in the analysis.
Examination of the factorial structure of the questionnaire using confirmatory factor analysis resulted in a satisfactory model fit. The comparative fit index (CFI) of 0.841 reached the range of a permissible model fit, and the absolute model fit (RMSEA) of 0.098 indicated a moderate match between the postulated factorial structure and the empirical data. The correlations between the factors varied from 0.699 to 0.916 (Table 4).
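As a purely illustrative check of the interdependency criterion from Section 2 (factor correlations with a magnitude of 0.7 or above), the short sketch below applies it to the top-left corner of Table 4; the code and the “interdependent/distinct” labels are ours, not part of the original analysis:

```python
import numpy as np

# Subset of Table 4: correlations among the first three factors
# (1 clinical skills, 2 general CanMEDS roles, 3 professionalism).
corr = np.array([
    [1.000, 0.894, 0.886],
    [0.894, 1.000, 0.916],
    [0.886, 0.916, 1.000],
])

rows, cols = np.triu_indices_from(corr, k=1)  # unique factor pairs
for a, b in zip(rows, cols):
    flag = "interdependent" if abs(corr[a, b]) >= 0.7 else "distinct"
    print(f"factors {a + 1} and {b + 1}: r = {corr[a, b]:.3f} ({flag})")
```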
Confirmatory factor analysis thus demonstrated that the 9 subscales of the Lithuanian version of EFFECT correspond to 9 distinct factors, which correlated strongly with one another.

4. Discussion

The results of this study indicate that the Lithuanian version of the EFFECT questionnaire has acceptable psychometric properties and can be used for the evaluation of clinical teaching within residency training.
All the domains demonstrated satisfactory reliability coefficients. However, residents indicated that some of the items could not be judged. This especially holds true for the “assessment” domain and some items of the role modeling domain (items 10 and 11). Because portfolio assessment has not yet been implemented, residents could not fill in the assessment items.
For “role modeling” items 10 and 11, it is possible that residents do not have many opportunities to observe their supervisors, for instance, when they bring bad news to patients or handle complaints and incidents. The same results were found in the original studies conducted in the Netherlands; this could mean that residents simply perform these complex tasks without having good examples in mind [9].
Compared with the original research, our survey has some differences that could have influenced the results [9].
First, the present replication study received fewer questionnaires (333 vs. 407). Nevertheless, the sample size exceeded the lower bound of 5 residents per item (58 items × 5 = 290 < 333) and was therefore regarded as large enough for a confirmatory factor analysis [15,16].
The original sample recruited its participants based on the department where they worked at the moment of the survey. In total, 6 departments in 4 different hospitals, representing 3 specialties (pediatrics, pulmonary diseases, and surgery), were involved. This approach allowed the residents to evaluate all teachers of a specific department, thereby minimizing memory bias. The residents in our study were recruited based on their residency program, i.e., anesthesiology-reanimatology, cardiology, dermatovenerology, emergency medicine, etc. They may have had to evaluate clinical teachers from different clinical departments whom they met at different times; this could have introduced memory bias.
Another difference was the smaller number of domains included in the confirmatory factor analysis. As mentioned in Section 3, the “role modeling scholarship” domain was not included because it has only 1 item. However, we decided to keep it within the questionnaire, as the application of research results in training is one of the most important requirements [17].
A limitation of our study is the exclusion of the “assessment” domain from the confirmatory factor analysis (due to the high proportion of “not (yet) able to evaluate” answers, ranging from 71.9% to 75.7% across its items), as this did not allow us to validate the full factorial structure of the EFFECT questionnaire. However, we need to keep this domain within the questionnaire because assessment is the weakest part of current clinical teaching and has to be improved. It should be taken into consideration that CBME requires multifaceted assessment that embraces formative and summative approaches [18], processes that are continuous and frequent [19]. It should also be noted that systematic training of clinical teachers on how to supervise residents in a competency-based curriculum began at the Lithuanian University of Health Sciences only recently. Therefore, in order to assess the quality of clinical teaching using EFFECT, we will need to revalidate the questionnaire in its full structure once the “assessment” domain becomes daily practice in residency training.

5. Conclusions

The results of this study indicate that the Lithuanian version of EFFECT has acceptable psychometric properties for evaluation of clinical teachers. Further research should be undertaken to examine the full factorial structure of EFFECT.

Conflicts of interest

The authors state no conflict of interest.

References

1. Leach, D.C. Changing education to improve patient care. Qual Saf Health Care 2001, 10(Suppl. 2), ii54–ii58.
2. Engbers, R.; de Caluwé, L.I.A.; Stuyt, P.M.J.; Fluit, C.R.M.G.; Bolhuis, S. Towards organizational development for sustainable high-quality medical teaching. Perspect Med Educ 2013, 2(1), 28–40.
3. Da Dalt, L.; Callegaro, S.; Mazzi, A.; Scipioni, A.; Lago, P.; Chiozza, M.L.; et al. A model of quality assurance and quality improvement for post-graduate medical education in Europe. Med Teach 2010, 32(2), e57–e64.
4. Beckman, T.J.; Cook, D.A.; Mandrekar, J.N. What is the validity evidence for assessments of clinical teaching? J Gen Intern Med 2005, 20(12), 1159–1164.
5. Fluit, C.R.M.G.; Bolhuis, S.; Grol, R.; Laan, R.; Wensing, M. Assessing the quality of clinical teachers. J Gen Intern Med 2010, 25(12), 1337–1345.
6. Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG 2015). Available from: http://www.enqa.eu/wp-content/uploads/2015/11/ESG_2015.pdf [cited 30.06.16].
7. Scheele, F.; Teunissen, P.; van Luijk, S.; Heineman, E.; Fluit, L.; Mulder, H.; et al. Introducing competency-based postgraduate medical education in the Netherlands. Med Teach 2008, 30(3), 248–253.
8. Beckman, T.J. Factor instability of clinical teaching assessment scores among general internists and cardiologists. Med Educ 2006. Available from: http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2006.02632.x/full [cited 06.10.16].
9. Fluit, C.; Bolhuis, S.; Grol, R.; Ham, M.; Feskens, R.; Laan, R.; et al. Evaluation and feedback for effective clinical teaching in postgraduate medical education: validation of an assessment instrument incorporating the CanMEDS roles. Med Teach 2012, 34(11), 893–901.
10. Downing, S.M. Validity: on the meaningful interpretation of assessment data. Med Educ 2003, 37(9), 830–837.
11. Downing, S.M. Reliability: on the reproducibility of assessment data. Med Educ 2004, 38(9), 1006–1012.
12. Fluit, C.R.M.G.; Feskens, R.; Bolhuis, S.; Grol, R.; Wensing, M.; Laan, R. Understanding resident ratings of teaching in the workplace: a multi-centre study. Adv Health Sci Educ 2014, 20(3), 691–707.
13. Tyrimo ir įvertinimo priemonių patikimumo ir validumo nustatymas [Determining the reliability and validity of research and assessment instruments]. Available from: http://www.vu.lt/site_files/LD/Tyrimo_ir_%C4%AFvertinimo_priemoni%C5%B3_patikimumo_ir_validumo_nustatymas.pdf [cited 20.08.16].
14. Brown, T.A. Confirmatory Factor Analysis for Applied Research. Available from: https://books.google.lt/books?hl=lt&lrid=tTL2BQAAQBAJ&oi=fnd&pg=PP1&dq=Evaluating+the+use+of+confirmatory+factor+analysis&ots=ajVssP_Q5C&sig=ib_e56X4At8-fXNqqf8QOjSP3G4&redir_esc=y#v=onepage&q=RMSEA&f=false [cited 20.08.16].
15. Terwee, C.B.; Bot, S.D.M.; de Boer, M.R.; van der Windt, D.A.W.M.; Knol, D.L.; Dekker, J.; et al. Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol 2007, 60(1), 34–42.
16. Iblher, P.; Zupanic, M.; Ostermann, T. The questionnaire “D-RECT” German: adaptation and test-theoretical properties of an instrument for evaluation of the learning climate in medical specialist training. GMS Z Med Ausbild 2015, 32(5), Doc55. http://dx.doi.org/10.3205/zma000997.
17. Bourgeois, J.A.; Hategan, A.; Azzam, A. Competency-based medical education and scholarship: creating an active academic culture during residency. Perspect Med Educ 2015, 4(5), 254–258.
18. Hawkins, R.E.; Welcher, C.M.; Holmboe, E.S.; Kirk, L.M.; Norcini, J.J.; Simons, K.B.; et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ 2015, 49(11), 1086–1102.
19. Holmboe, E.S.; Sherbino, J.; Long, D.M.; Swing, S.R.; Frank, J.R.; International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach 2010, 32(8), 676–682.
Table 1. Description of the study population and number of questionnaires filled in per residency program.

Residency program | No. of residents in the program | No. of teachers in the program | No. of teachers evaluated | No. of questionnaires filled in
Anesthesiology-reanimatology | 53 | 70 | 21 | 57
Cardiology | 30 | 53 | 22 | 45
Dermatovenerology | 19 | 10 | 8 | 37
Emergency medicine | 17 | 74 | 56 | 115
Neurology | 11 | 22 | 12 | 20
Obstetrics and gynecology | 34 | 37 | 16 | 30
Physical medicine and rehabilitation | 18 | 18 | 11 | 29
Total | 182 | 284 | 146 | 333
Table 2. Item characteristics.

Item | Mean | SD | Factor loading | NAE, n | NAE, %
Role modeling: clinical skills | | | | |
1. Ask for a patient history | 5.21 | 1.029 | 0.900 | 31 | 9.3
2. Examine a patient | 5.23 | 1.027 | 0.932 | 31 | 9.3
3. Perform clinical actions | 5.31 | 0.981 | 0.878 | 21 | 6.3
Role modeling: general CanMEDS roles | | | | |
4. Cooperate with other health professionals while providing care to patients and relatives | 5.25 | 1.047 | 0.855 | 9 | 2.7
5. Communicate with patients | 5.20 | 1.110 | 0.923 | 5 | 1.5
6. Cooperate with colleagues | 5.02 | 1.249 | 0.850 | 1 | 0.3
7. Organize his or her own work adequately | 5.20 | 1.079 | 0.795 | 5 | 1.5
8. Apply guidelines and protocols | 5.40 | 0.990 | 0.804 | 6 | 1.8
9. Treat patients respectfully | 5.31 | 1.080 | 0.869 | 5 | 1.5
10. Handle complaints and incidents | 5.24 | 1.185 | 0.850 | 63 | 18.9
11. Have a bad news conversation | 5.26 | 1.071 | 0.860 | 77 | 23.1
Role modeling: scholarship | | | | |
12. Apply academic research results | 5.28 | 1.054 | – | 12 | 3.6
Role modeling: professionalism | | | | |
13. Indicates when he/she does not know something | 4.92 | 1.287 | 0.834 | 18 | 5.4
14. Reflects on his/her own actions | 4.99 | 1.273 | 0.887 | 6 | 1.8
15. Is a leading example of how I want to perform as a specialist | 5.13 | 1.291 | 0.937 | 4 | 1.2
Task allocation | | | | |
16. Gives me enough freedom to perform tasks on my own that suit my current knowledge/skills | 5.25 | 1.056 | 0.856 | 4 | 1.2
17. Gives me tasks that suit my current level of training | 5.20 | 1.110 | 0.856 | 3 | 0.9
18. Stimulates me to take responsibility | 5.22 | 1.021 | 0.805 | 7 | 2.1
19. Gives me the opportunity to discuss mistakes and incidents | 4.99 | 1.372 | 0.884 | 18 | 5.4
20. Teaches me how to organize and plan my work | 4.82 | 1.321 | 0.861 | 26 | 7.8
21. Prevents me from having to perform too many tasks irrelevant to my learning | 4.73 | 1.483 | 0.835 | 28 | 8.4
Planning | | | | |
22. Reserves time to supervise/counsel me | 4.94 | 1.309 | 0.932 | 5 | 1.5
23. Is available when I need him/her during my shift | 5.08 | 1.353 | 0.910 | 59 | 17.7
24. Sets aside time when I need him/her | 5.11 | 1.177 | 0.942 | 13 | 3.9
Giving feedback: quality of the feedback | | | | |
25. Bases feedback on concrete observations of my actions | 4.91 | 1.233 | 0.883 | 33 | 9.9
26. Indicates what I am doing correctly | 4.92 | 1.305 | 0.904 | 14 | 4.2
27. Discusses what I can improve | 4.76 | 1.255 | 0.915 | 22 | 6.6
28. Allows me to think about strengths and weaknesses | 4.72 | 1.307 | 0.914 | 35 | 10.5
29. Reminds me of previously given feedback | 4.58 | 1.368 | 0.881 | 65 | 19.5
30. Formulates feedback in a way that is not condescending or insulting | 4.95 | 1.402 | 0.788 | 28 | 8.4
Giving feedback: content of the feedback | | | | |
31. My clinical and technical skills | 5.06 | 1.131 | 0.916 | 42 | 12.6
32. How I monitor the boundaries of my clinical work | 4.97 | 1.204 | 0.942 | 53 | 15.9
33. How I collaborate with my colleagues in patient care | 4.98 | 1.121 | 0.948 | 56 | 16.8
34. How I apply evidence-based medicine to my daily work | 4.99 | 1.151 | 0.922 | 46 | 13.8
35. How I make ethical considerations explicit | 4.98 | 1.183 | 0.942 | 55 | 16.5
36. How I communicate with patients | 5.00 | 1.167 | 0.954 | 54 | 16.2
Teaching abilities (methodology) | | | | |
37. Reviews the learning objectives | 4.69 | 1.331 | 0.802 | 49 | 14.7
38. Asks me to explain my choice for a particular approach (status, resign form, etc.) | 4.99 | 1.111 | 0.806 | 43 | 12.9
39. Discusses the possible clinical courses and/or complications | 5.06 | 1.060 | 0.855 | 26 | 7.8
40. Reviews my reports | 4.93 | 1.141 | 0.795 | 75 | 22.5
41. Stimulates me to find out things for myself | 5.09 | 1.081 | 0.810 | 17 | 5.1
42. Stimulates me to ask questions | 5.24 | 1.201 | 0.856 | 3 | 0.9
43. Stimulates me to actively participate in discussions | 5.02 | 1.282 | 0.908 | 22 | 6.6
44. Explains complex medical issues clearly | 5.10 | 1.205 | 0.890 | 17 | 5.1
Personal support | | | | |
45. Treats me respectfully | 5.20 | 1.243 | 0.890 | 1 | 0.3
46. Is an enthusiastic instructor/supervisor | 5.16 | 1.147 | 0.838 | 1 | 0.3
47. Lets me know I can count on him/her | 5.05 | 1.347 | 0.957 | 2 | 0.6
48. Supports me in difficult situations (e.g., during morning report) | 5.03 | 1.412 | 0.945 | 24 | 7.2
49. Is open to personal questions/problems | 4.95 | 1.402 | 0.917 | 47 | 14.1
50. Helps and advises me on how to maintain a good work-home balance | 4.58 | 1.617 | 0.889 | 96 | 28.7
Assessment | | | | |
51. Prepares progress reviews | 4.89 | 1.036 | – | 251 | 75.1
52. Makes a clear link with previously set learning objectives during these reviews | 5.06 | 1.021 | – | 246 | 73.7
53. Gives me the opportunity to raise issues of my own | 5.35 | 1.124 | – | 240 | 71.9
54. Formulates next-term learning objectives during these reviews with me | 5.13 | 1.124 | – | 243 | 72.8
55. Explains how staff was involved in the assessment | 4.89 | 1.255 | – | 253 | 75.7
56. Reviews my portfolio during the assessment | 5.07 | 1.153 | – | 246 | 73.3
57. Pays attention to my self-reflection | 4.97 | 1.276 | – | 244 | 73.1
58. Gives a clear and exhaustive assessment | 5.16 | 1.209 | – | 241 | 72.2
Mean scores (scale: 1 = very poor, 6 = excellent) with corresponding standard deviations and factor loadings per item. The last two columns give the number and percentage of questionnaires per item for which residents indicated “not (yet) able to evaluate” (NAE). No factor loadings were calculated for items excluded from the factor analysis.
Table 3. Cronbach’s alpha of the domains.

Domain | No. of items | Cronbach’s alpha
Role modeling clinical skills | 3 | 0.93
Role modeling CanMEDS roles | 8 | 0.95
Role modeling scholarship | 1 | –
Role modeling professionalism | 3 | 0.91
Task allocation | 6 | 0.94
Planning | 3 | 0.94
Quality of the feedback | 6 | 0.95
Content of the feedback | 6 | 0.97
Teaching methodology | 8 | 0.95
Personal support | 6 | 0.96
Assessment | 8 | 0.97
Cronbach’s alpha was not calculated for the single-item “role modeling scholarship” subdomain.
Table 4. Correlations between factors of the questionnaire.

Factor | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
1 | 1.0 | | | | | | | |
2 | 0.894 | 1.0 | | | | | | |
3 | 0.886 | 0.916 | 1.0 | | | | | |
4 | 0.715 | 0.813 | 0.870 | 1.0 | | | | |
5 | 0.750 | 0.829 | 0.902 | 0.909 | 1.0 | | | |
6 | 0.677 | 0.764 | 0.795 | 0.856 | 0.819 | 1.0 | | |
7 | 0.724 | 0.791 | 0.774 | 0.813 | 0.760 | 0.904 | 1.0 | |
8 | 0.798 | 0.850 | 0.908 | 0.900 | 0.892 | 0.882 | 0.857 | 1.0 |
9 | 0.708 | 0.818 | 0.881 | 0.877 | 0.915 | 0.789 | 0.755 | 0.890 | 1.0
1, role modeling clinical skills; 2, role modeling general CanMEDS roles; 3, role modeling professionalism; 4, task allocation; 5, planning; 6, feedback (quality); 7, feedback (content); 8, teaching methodology; 9, personal support.
