Application of OSCE using a Virtual Assessment Platform


Introduction
In comparison to a traditional in-person learning experience, virtual learning can offer increased accessibility and flexibility (Ryan & Poole, 2019). Relevant even in clinical training, virtual learning platforms have improved dental and medical school education through the incorporation of multimedia learning modules, convenience of immediate assessment feedback, and potential for synchronous and asynchronous learning (Broudo & Walsh, 2002; Uddin, Rice, & Reynolds, 2007). While virtual learning environments in dentistry were once seen as elective, the perception towards virtual learning has generally improved due to increased technological familiarity and the necessity of technological literacy in addressing the increasingly digital healthcare environment (Hendricson et al., 2004; Schönwetter, Reynolds, Eaton, & DeVries, 2010; Mattheos et al., 2008; Guze, 2015). Virtual learning can additionally play a role in expanding educational access in dentistry (France, Hangorsky, Wu, Sollecito, & Stoopler, 2021). One study suggests that virtual learning could help alleviate the global shortage of clinical academics needed to train future dental and health care clinicians (Schönwetter et al., 2010).
The objective structured clinical examination (OSCE) is recognized as an effective method of assessing clinical competency in health professions training by simulating clinical encounters in a deliberate teaching environment (Davidson, Duerson, Rathe, Pauly, & Watson, 2001; May, Park, & Lee, 2009; Harden & Gleeson, 1979). First developed for use in medical education, the OSCE has since been incorporated into dentistry and other health professions training (Brown, Manogue, & Martin, 1999; Schoonheim-Klein et al., 2006; Eberhard et al., 2011; Clark, 2015; Park, Anderson, & Karimbux, 2016). The movement towards more objective measures of clinical assessment coincides with the increasing digitalization of clinical education, especially in the realm of online learning. Early implementations of the virtual OSCE were generally well-received (Novack et al., 2002; Triola et al., 2006; Palmer et al., 2015; Sartori, Olsen, Weinshel, & Zabar, 2019). Despite their practical benefits, virtual OSCEs have not been widely adopted in mainstream clinical education, with some citing development and implementation costs (Palmer et al., 2015). However, the COVID-19 pandemic accelerated the trend of incorporating virtual learning into clinical education, especially the virtual OSCE. The disruption of in-person educational activities and the continued need for competency-based assessment in graduation requirements have created a unique opportunity for wider adoption of virtual OSCEs as a potentially disruptive technology (Donn, Scott, Binnie, & Bell, 2020; Hytönen et al., 2020; Lim, Lee, Karunaratne, & Caliph, 2020; Hannon et al., 2020).
While virtual OSCEs have since been implemented in clinical dental education as a result of the pandemic, whether the new virtual format impacts performance and satisfaction remains understudied (Donn et al., 2020; Hytönen et al., 2020; Iyer, Aziz, & Ojcius, 2020). The aim of this study was to evaluate the implementation of a virtual OSCE and to report preliminary outcome measures of student performance and of student and faculty satisfaction.

Methods
This study received approval from the Institutional Review Board at Harvard Medical School and the Harvard School of Dental Medicine (IRB21-0212).

Virtual OSCE Operational Logistics
The OSCE is given to D2 students at the end of Year 2, D3 students at the end of Year 3, and D4 students in the winter of Year 4. Each OSCE is centered around a patient case of appropriate complexity for the students' expected level of competency, and students rotate through 10 different discipline stations, each asking 2-3 questions related to the case. The D2 case substitutes a biomedical science station for the prosthodontics station in the D3 and D4 cases.
For each OSCE, the Discipline Directors identified two faculty evaluators per discipline in the areas of operative dentistry, endodontics, periodontics, pediatric dentistry, prosthodontics, oral radiology and pathology, oral and maxillofacial surgery, oral health policy and epidemiology, orthodontics, and treatment planning. A case of appropriate complexity level was selected and a PDF was created containing the following elements: Case Write-Up and History, Diagnostic Charting, Odontogram, Perio Charting, Radiographs, Clinical Photos, and Digital Study Cast Images.
A faculty calibration meeting was held via Zoom (Zoom Video Communications, Inc., San Jose, CA) to review the case and all materials for accuracy. A secure shared Google Doc (Alphabet, Inc., Mountain View, CA) was created for faculty to upload their discipline-specific questions. An orientation for the new virtual format was provided to students and faculty to review the logistical details and to provide technical support resources. Additionally, a mock virtual OSCE was conducted prior to the actual exam date to familiarize the students, faculty, and staff with the new assessment platform.
The Zoom session for the OSCE consisted of Breakout Rooms, two per discipline, and one additional Breakout Room, which served as the Rest Station. All participants were asked to enable the Waiting Room function, and all participants were able to Screen Share.
A PDF of the case and all materials was transferred into a secure Google Doc that was shared and made available to each student group for one hour prior to the exam for the Case Review portion. The students were emailed the Google Doc link at the beginning of the Case Review time, and the Zoom link for the exam itself was posted on the class Canvas (Instructure, Inc., Salt Lake City, UT) site. A Qualtrics (Qualtrics International Inc., Provo, UT) form was created for the faculty evaluation, with the student names pre-populated in a dropdown menu. This link was emailed to the faculty evaluators prior to the exam.
Faculty were asked to log in to the Zoom session early on exam day, and the Host moved each faculty evaluator to their designated Breakout Room, where they were asked to remain for the entirety of the exam. Students rotated through each station virtually, with six minutes for each station evaluation and one minute in between sessions to allow for virtual travel time. In each discipline, students were evaluated out of eight points for their knowledge and two points for their presentation ability, for a total of 10 possible points per discipline. The sum of all 10 discipline scores was calculated to yield the student's overall OSCE score (maximum 100).
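The scoring scheme above is simple arithmetic; a minimal sketch of it follows. The station names are taken from the Methods, but all individual scores here are hypothetical.

```python
# Hypothetical scores for one student: (knowledge out of 8, presentation out of 2)
# for each of the 10 discipline stations described in the Methods.
station_scores = {
    "operative dentistry": (7, 2),
    "endodontics": (6, 1),
    "periodontics": (8, 2),
    "pediatric dentistry": (7, 2),
    "prosthodontics": (5, 1),
    "oral radiology and pathology": (6, 2),
    "oral and maxillofacial surgery": (7, 1),
    "oral health policy and epidemiology": (8, 2),
    "orthodontics": (6, 2),
    "treatment planning": (7, 2),
}

def overall_osce_score(scores):
    """Each station is scored out of 10 (8 knowledge + 2 presentation);
    the overall score is the sum across all 10 stations (maximum 100)."""
    for knowledge, presentation in scores.values():
        assert 0 <= knowledge <= 8 and 0 <= presentation <= 2
    return sum(k + p for k, p in scores.values())

print(overall_osce_score(station_scores))  # → 84 for these hypothetical scores
```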

Student and Faculty Satisfaction Survey
Surveys were sent through Qualtrics to students in the classes of 2021 and 2022, and to all current faculty and postdoctoral residents who had participated in at least one virtual OSCE. Two reminders were sent for each survey, one week apart. The student survey consisted of three questions, one of which was a series of seven agree or disagree statements on a five-point Likert scale about the virtual OSCE experience, and two of which were free-response questions about what students liked best and liked least about the virtual OSCE.
The evaluator survey consisted of four questions, one of which was a series of eight aspects of the OSCE for which evaluators indicated their preference on a five-point Likert scale for virtual or in-person experiences; one of which was a series of eight agree or disagree statements on a five-point Likert scale about the virtual OSCE experience; and two of which were free-response questions about what evaluators liked best and liked least about the virtual OSCE.

Statistical Analysis
Student OSCE scores and dental school application data were obtained from the Office of Dental Education. All identifiers were separated from the raw data to create an analytical dataset, and all scores over the years were merged using random unique identifiers.
First, a descriptive analysis was performed to report the preferences of students and evaluators regarding their OSCE experience. Then, univariate and bivariate analyses were performed to describe the students' overall gender distribution and average admission scores, both overall and stratified by the method of exam delivery. After that, a linear regression analysis was used to estimate the crude and adjusted average differences in OSCE scores with the change in the method of exam delivery (in-person vs. virtual). The expected change in the overall average students' performance across the three OSCEs with the 95% confidence interval (95%CI) was reported, accounting for individuals' repeated measurements, in addition to the average change in each discipline. In the full model, the analysis was adjusted for gender, overall grade point average (GPA), Dental Admission Test (DAT) score, and Perceptual Ability Test (PAT) score. Estimates were deemed statistically significant at alpha < 0.05. All statistical analyses were done using Stata/MP 16.1 (StataCorp, College Station, TX).
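The full adjusted model can be illustrated as an ordinary least squares regression on simulated data. This is only a sketch: all variable values and the true effect size below are hypothetical, the actual analysis was run in Stata, and this simplified version omits the adjustment for repeated measurements per student described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of OSCE results

# Simulated covariates: exam delivery (0 = in-person, 1 = virtual),
# gender, and admission scores (GPA, DAT, PAT) -- all values hypothetical.
virtual = rng.integers(0, 2, n)
gender = rng.integers(0, 2, n)
gpa = rng.uniform(3.0, 4.0, n)
dat = rng.uniform(17.0, 25.0, n)
pat = rng.uniform(17.0, 25.0, n)

# Simulate overall OSCE scores with a known delivery effect of +1.0 points.
score = (70 + 1.0 * virtual + 2.0 * gpa + 0.3 * dat + 0.2 * pat
         + rng.normal(0.0, 1.0, n))

# Full adjusted model: score ~ delivery + gender + GPA + DAT + PAT
X = np.column_stack([np.ones(n), virtual, gender, gpa, dat, pat])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_difference = beta[1]  # average virtual-vs-in-person score difference
```

With the simulated effect of +1.0 points, the recovered coefficient on exam delivery lands close to 1.0, mirroring how the adjusted average difference in the paper is read off the delivery-method coefficient.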

Outcome Data Comparing Virtual and In-Person Student Performance
A total of 584 OSCE results were assessed for 241 students from the DMD graduating classes of 2016 through 2022. One hundred seventy-one students had all their OSCEs in-person (classes of 2016-2020), for a total of 445 results. The class of 2021 had their first OSCE in-person (34 results), but transitioned to a virtual format for their remaining two OSCEs (68 results), along with the class of 2022 for their first OSCE (37 results). Linear regression analysis of the average difference in overall OSCE score detected a borderline significantly higher score among those who took the exam virtually, by 1.04 points out of 100, compared to the in-person exam (95%CI=0.00, 2.09) (Table 2). However, this difference disappeared after adjusting for students' gender and admission scores (average difference=0.66; 95%CI=-0.45, 1.78). When the discipline-specific scores were examined, virtual scores were higher than in-person scores in oral and maxillofacial surgery (adjusted average difference=2.46; 95%CI=0.51, 4.41) and periodontics (adjusted average difference=2.61; 95%CI=0.29, 4.92).
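The "borderline significant" crude difference and the attenuated adjusted difference can be sanity-checked by backing approximate two-sided p-values out of the reported 95% confidence intervals, assuming a normal sampling distribution:

```python
import math

def p_from_ci(estimate, lo, hi, z_crit=1.96):
    """Approximate two-sided p-value recovered from a 95% CI,
    assuming a normal sampling distribution for the estimate."""
    se = (hi - lo) / (2 * z_crit)                       # SE from CI half-width
    z = abs(estimate) / se
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))    # standard normal CDF
    return 2.0 * (1.0 - cdf)

# Reported average differences (virtual minus in-person), out of 100 points
p_crude = p_from_ci(1.04, 0.00, 2.09)      # borderline, ~0.05
p_adjusted = p_from_ci(0.66, -0.45, 1.78)  # clearly non-significant, ~0.25
```

The crude CI barely touching zero corresponds to a p-value of about 0.05, while the adjusted CI spanning zero corresponds to a p-value of roughly 0.25, consistent with the difference "disappearing" after adjustment.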

Survey Data
The student survey was given to the classes of 2021 and 2022, a total of 72 students. Fifteen students from the class of 2021 (44%) and 20 students from the class of 2022 (53%) completed the survey.
Overall, the vast majority of students (94%) strongly or somewhat agreed that the virtual OSCE was effective, and 97% strongly or somewhat agreed that the virtual OSCE experience had been easy to navigate (Figure 2). When asked what they liked most about the virtual OSCE, many students cited the increased efficiency (no physical movement between stations) and the ability to access the OSCE from home (Table 3). When asked what they liked least, common responses included not knowing how much time had elapsed for each station, and a desire for a feedback session to go over the correct answers for all stations.
The evaluator survey was given to all current faculty and postdoctoral residents who have participated in at least one virtual OSCE. A total of 14 evaluators, comprising seven full-time faculty members, one part-time faculty member, and six residents, completed the survey.
Overall, the evaluators had no particular preference for virtual over in-person OSCEs. Responses to a series of questions asking whether evaluators preferred the virtual or the in-person format were overwhelmingly neutral (Fig. 3).

Figure 3. Evaluator preferences between in-person and virtual OSCEs
Evaluators were also asked to agree or disagree with a series of statements about the virtual OSCE (Fig. 4). Where there was a majority opinion, it was usually neutral. For the three statements that had significant non-neutral opinions, 64% strongly or somewhat agreed that they would have preferred an in-person OSCE; 23% strongly agreed and 23% somewhat disagreed that it was easier for them to participate in a virtual OSCE; and 29% somewhat agreed, 36% neither agreed nor disagreed, and 21% somewhat disagreed that a virtual OSCE offers a comparable experience to an in-person OSCE.
When asked what they liked about the virtual OSCE, most evaluators mentioned ease and convenience, including lack of travel to the dental school and lack of waiting for students to travel between exam stations. When asked what they disliked, common themes were timekeeping in the sessions and internet issues.

Discussion
The purpose of this study was to determine whether the implementation of a virtual OSCE in a predoctoral dental curriculum correlated with changes in student performance and in student and faculty satisfaction with the OSCE experience. While the in-person OSCE is a well-established and reliable method of assessing clinical competency in dentistry and other health professions, the virtual OSCE has been less studied and implemented (Brown et al., 1999; Schoonheim-Klein et al., 2006; Clark, 2015). However, the virtual OSCE shares the core strength of the traditional OSCE in its objective assessment of clinically relevant skills (Pell, Fuller, Homer, & Roberts, 2010; Graham, Bitzer, Mensah, & Anderson, 2014). Virtual OSCE outcomes, like those of their traditional counterpart, can also be used to revise curricula to ensure that desired student competencies, as assessed against curricular objectives, are met (Kumar & Gadbury-Amyot, 2019). Ultimately, it was necessary to evaluate whether the virtual OSCE could continue to play an effective role in the predoctoral dental curriculum after the COVID-19 pandemic.
The results of the linear regression analysis support the feasibility and reliability of future virtual OSCE examinations. The lack of a difference in overall scores between the in-person and virtual OSCE suggests that the method of exam delivery does not play a significant contributory role in student performance. Consequently, from an assessment standpoint, the virtual OSCE can be a reasonable alternative to the in-person OSCE, especially when accounting for additional considerations such as assessing students on externship, navigating limited clinic availability, or overcoming limited on-site faculty availability.
Many of the qualitative results from the current study complement prior studies regarding the barriers and benefits of online learning (Dyrbye, Cumyn, Day, & Heflin, 2009; O'Doherty et al., 2018). An integrated review found repeated themes of time constraints, lack of technical skills, infrastructural barriers (i.e., internet accessibility), and poor institutional support as challenges facing online learning in medical education (O'Doherty et al., 2018). Qualitative feedback from the survey similarly mentioned concerns regarding 'timekeeping' and 'internet disconnection' (Table 1). One faculty participant also commented on the difficulties of keeping 'energy levels up over long periods of time' in a virtual learning environment, colloquially known as Zoom fatigue. Difficulties building relationships online may serve as another infrastructural barrier to online learning (Dyrbye et al., 2009).
Conversely, online learning offers convenience and efficiency (O'Doherty et al., 2018; Maloney et al., 2012). The same review that identified time constraints and limited technical skills as barriers to implementing online learning acknowledged that, given technical competency, online learning could save time (O'Doherty et al., 2018). The qualitative results from the survey support this finding, as many respondents cited the convenience of 'stay[ing] home' and the lack of a commute as their favorite aspects of the virtual OSCE experience (Table 1).
Additionally, the online experience may be a more economically efficient way to learn. In a break-even and cost-benefit analysis comparing the cost of web-based versus traditional in-person education for health professionals, the online experience produced a lower break-even point than its in-person counterpart (Maloney et al., 2012). Incidentally, it was discovered that hosting the OSCE virtually allowed the Harvard School of Dental Medicine (HSDM) student teaching practice and clinic sessions to remain operational on exam days, which was previously impossible due to limited clinic space; this further supports the economic efficiency of the virtual OSCE experience.
A deeper look at the quantitative results shows some differences between student and evaluator responses to the virtual OSCE. Only 14% of faculty agreed that virtual OSCEs should be offered in the future, compared to 74% of students (Fig. 2). However, the majority of faculty evaluators neither agreed nor disagreed that the virtual OSCE was a 'comparable learning experience' to the in-person OSCE, suggesting that the difference in attitudes toward future offerings of the virtual OSCE is due to aspects of the virtual experience that are less about clinical assessment and more about infrastructure. It is possible that differing levels of technological proficiency between the student and evaluator groups played a role in the differing attitudes regarding future virtual OSCE offerings.
Comparing the differences in responses regarding technological accessibility supports this hypothesis. While the vast majority (97%) of the students agreed or strongly agreed that the online experience was 'easy to navigate', a majority of the faculty (71%) were neutral on the accessibility of the experience (Fig. 2). It is well-documented that the technological proficiency of educators plays a role in the successful implementation of online learning (O'Doherty et al., 2018). Institutional efforts to help educators achieve technical mastery of relevant technologies may positively impact perceptions of the feasibility of future virtual learning offerings.
The fact that most of our students and evaluators on average had no preference for online versus in-person OSCEs suggests that there are advantages and disadvantages to both formats. While most of the benefits associated with the virtual OSCE to date have spoken only to its convenience and efficiency, virtual forms of assessment may become increasingly necessary as the role of technology in healthcare delivery grows and more health professional schools integrate telehealth training into their curricula (Donn et al., 2020; Hannon et al., 2020; Jumreornvong, Yang, Race, & Appel, 2020; Prettyman, Knight, & Allison, 2018). Importantly, after the COVID-19 pandemic, the virtual OSCE can be a standardized avenue through which future healthcare providers assess and develop the skills needed for teledentistry, telemedicine, and remote care (Palmer et al., 2015). Consequently, the continued role and evolution of the virtual OSCE, and of other standardized assessments delivered through virtual and online mediums, will depend in part on the growing role of telehealth in healthcare delivery and the increasing need to train clinicians competent in providing care both in person and online.
The implementation of the virtual OSCE was facilitated in several ways. Faculty and student orientation to the technology was understood as paramount, and a mock virtual OSCE was provided prior to the exam date. The mock exam was especially helpful in setting realistic expectations for the actual exam: participants gained familiarity with the exam flow through Zoom Breakout Rooms and a chance to assess their internet connectivity. Faculty were provided with an additional calibration meeting to ensure the accuracy of test materials and responsibilities, allowing the mock exam session to pinpoint challenges related to the technology itself. Additionally, an instructional technologist was available to field all technical issues during the exam and streamline the resolution of any problems that arose. Consequently, successful implementation of the virtual OSCE was made possible by efforts both before and during the exam, through the mock virtual exam and information technology support, respectively.
Limitations of the survey study included a small sample size, which precluded statistical analysis of the survey responses. Further studies with larger samples will be necessary to generalize these findings. Additionally, the unavailability of the in-person OSCE experience during the pandemic may have influenced participants' perceptions of the effectiveness of the virtual OSCE. Additional research conducted when both in-person and virtual OSCEs are available will be necessary to further establish the effectiveness of the virtual OSCE experience.

Conclusion
This study found no statistically significant difference in overall student OSCE scores based on the method of exam delivery, whether in-person or virtual. This study also found that the majority of students found the virtual OSCE to be an effective and accessible method of assessment, but faculty were more neutral regarding the virtual OSCE experience and its continued implementation in the future. These results suggest that the virtual OSCE could be considered an alternative to the in-person OSCE from an assessment standpoint, and that developing a culture of institutional technological proficiency could aid future implementation.