As in previous years, the European Orthodontic Teachers’ Forum will take place in connection with the Annual Congress of the EOS. Registration is handled through the congress registration system.
You must be registered for the EOS Annual Congress to attend the Teachers’ Forum.
The lecture title is:
“Assessing competence with high stakes exams”.
Speaker: Magnus Hultin, Umeå University (Sweden)
Date and time: Tuesday 31 May, 12:30-16:15
Carob Mill Complex
Vasilissis Street, 3602
12:30 – arrival and check-in
12:45 – light lunch
13:15 – meeting
16:15 – end
Magnus Hultin, MD PhD, Associate Dean of Clinical Education, Associate Professor of Anesthesiology and Critical Care Medicine, Umeå University, Sweden.
From 2014 to 2021, he chaired the Medical Program at Umeå University, with 1100 students. During this period, computer-based testing was introduced for written exams, full-class OSCEs were introduced, and a new curriculum was developed and launched. In 2015, Magnus Hultin was assigned to establish and chair a new proficiency test for medical doctors educated outside the EU/EEA who want to become licensed physicians in Sweden. A similar setup is used for testing dental practitioners educated outside the EU/EEA.
The workshop focuses on state-of-the-art principles for developing and running a proficiency test, with one part testing theoretical knowledge and one part testing practical skills. Benchmarking and quality assurance are essential for buy-in from stakeholders. The journey from designing a blueprint to deciding pass scores is covered, with theoretical lectures intermixed with practical work and discussions.
The purpose of a test has vast implications for how it should be designed. Theoretical knowledge can be measured with multiple-choice questions, which allow for objective scoring. However, this requires well-written questions with plausible distractors. Ideally, the questions have been piloted on an appropriate test group; if this is not possible, quality must be assured in other ways.
Practical skills may be measured with an OSCE (Objective Structured Clinical Examination). Objectifying and standardizing what is tested, and how it is tested, increases the validity of the measurement. Blueprinting the skills to be tested, using different stations for different skills, and using different assessors reduces the risk of measurement error, i.e., passing a candidate who should fail or failing a candidate who should pass.
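To illustrate the kind of computation behind deciding pass scores (not part of the workshop materials), here is a minimal sketch of Angoff-style standard setting, a widely used method in which each judge estimates, per item, the probability that a borderline candidate answers correctly; the function name and data are hypothetical:

```python
def angoff_pass_score(judge_ratings):
    """judge_ratings: one inner list per judge, each containing a
    probability (0-1) per test item that a borderline candidate
    answers that item correctly."""
    n_items = len(judge_ratings[0])
    # Mean rating per item across judges, then summed over items
    # to give the pass mark in raw score points.
    item_means = [
        sum(judge[i] for judge in judge_ratings) / len(judge_ratings)
        for i in range(n_items)
    ]
    return sum(item_means)

# Example: three judges rating a four-item test.
ratings = [
    [0.8, 0.6, 0.9, 0.5],
    [0.7, 0.5, 0.8, 0.6],
    [0.9, 0.7, 0.8, 0.4],
]
print(round(angoff_pass_score(ratings), 2))  # pass mark out of 4 points
```

In practice, such a raw figure would be reviewed alongside benchmarking data before a final pass score is adopted.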
The lecture focuses on high-stakes exams and covers:
Test design and blueprinting
Writing effective single-best-answer multiple-choice questions (MCQs)
Ensuring question quality, pre- and post-test
Setting the pass score for theoretical tests
Measuring practical skills
Delivering an OSCE test
Scoring systems for practical tests
Writing an OSCE station
Quality assurance of an OSCE station
Post-test – setting the passing score