Special Issue on "Research in Medical Education"
Background: Planning, organization, and delivery of an educational program that culminates in purposeful learning require a strong grounding in the principles of adult learning, along with sound knowledge and requisite skills in both the psychology and the technology of medical education. Assessing the effectiveness of a CME program is as important as organizing the learning activities and delivering the academic program, because such assessment can provide further direction for enhancing the efficacy of the CME delivery system.
Objectives: (i) To investigate the effectiveness of a well-planned and well-conducted CME program in enhancing the knowledge and competence of the participants; (ii) to explore whether the gain in knowledge and competence, if any, can be attributed to the interactive design of the educational process.
Methods: The study was conducted during the NAMS-AIIMS Regional Symposium on Sleep Medicine at AIIMS, Jodhpur, held as part of NAMSCON 2013. After the objectives of the study were explained to the participants and confidentiality was assured, a validated and pre-tested questionnaire consisting of 30 multiple-choice, single-response questions was administered to 103 participants. Following the intervention, which consisted of didactic lectures by experts on different aspects of sleep medicine, interactive sessions, and problem-triggered sessions built around clinical data, participants were administered a post-test whose questions differed from those of the pre-test but were of similar difficulty.
Results: The response rate was 89%. The mean pre-intervention score was 11.76 ± 4.4, with only 26% of participants achieving the arbitrary pass score of 50%. Comparison of the paired scores of participants who attempted both the pre-test and the post-test (n = 59) showed a significant improvement from 12.1 ± 4.6 to 18.3 ± 3.8 (p < 0.05), and 84.7% of participants scored above the pre-decided 50% mark. The mean increase in score was 6.2 (95% CI: 4.8 to 7.5; p < 0.001). The higher gain in knowledge and competence is attributed to the intense interactive involvement of participants during the problem-triggered sessions, the feedback provided during the interaction, and the system of rewards and incentives introduced during the sessions.
Conclusion: The study concludes that a well-designed educational intervention based on the principles of adult learning produces a positive gain in knowledge and enhances the competence of the participants.
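The paired pre-post comparison reported above (mean gain, 95% confidence interval, and p-value) corresponds to a standard paired t-test on matched scores. The sketch below is illustrative only: it assumes the paired scores are available as two equal-length arrays (the values shown are hypothetical placeholders, not study data) and that NumPy and SciPy are available.

```python
# Minimal sketch of a paired pre/post analysis of the kind reported above.
# The `pre` and `post` arrays are hypothetical placeholders; the study used
# the paired scores of the 59 participants who attempted both tests.
import numpy as np
from scipy import stats

pre = np.array([10, 14, 9, 12, 15, 11, 13, 8, 16, 12])    # placeholder pre-test scores (max 30)
post = np.array([17, 20, 15, 18, 22, 16, 19, 14, 23, 18])  # placeholder post-test scores (max 30)

diff = post - pre
n = diff.size
mean_diff = diff.mean()
sem = diff.std(ddof=1) / np.sqrt(n)

# 95% confidence interval for the mean paired difference (t distribution)
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean_diff, scale=sem)

# Paired t-test for the null hypothesis of no change in score
t_stat, p_value = stats.ttest_rel(post, pre)

# Proportion of participants reaching the arbitrary 50% pass mark post-test
pass_mark = 0.5 * 30
pass_rate_post = np.mean(post >= pass_mark) * 100

print(f"mean gain = {mean_diff:.1f}, 95% CI ({ci_low:.1f}, {ci_high:.1f}), p = {p_value:.3g}")
print(f"post-test pass rate = {pass_rate_post:.1f}%")
```

With the study's actual 59 pairs of scores in place of the placeholders, the same calculation would yield the summary statistics quoted in the Results.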
Keywords: pre-post test, retrospective post-pre test, program evaluation, evaluation of educational intervention.