The Objective Structured Clinical Examination (OSCE) is an important component of nurse practitioner (NP) education programs that provides a controlled venue for evaluation of practice-based competencies. In the OSCE, students demonstrate professional skills and diagnostic reasoning in clearly defined clinical scenarios while their performance is observed and scored by an examiner using a predetermined marking guide (Harden & Gleeson, 1979; Snodgrass, Ashby, Rivett, & Russell, 2014). Given that all students in an OSCE are evaluated based on identical predetermined guidelines, OSCEs can provide educational programs with objective, reliable, and valid evidence that students have the required professional knowledge, skills, and attitudes to progress in their studies (Meskell et al., 2015).
Although the value of OSCEs in the assessment of clinical competencies is well established, there are several barriers to their optimal integration in educational programs. The considerable resource investment required to administer an OSCE, including time, human resources, and materials, is the primary drawback cited in the literature (Meskell et al., 2015; Snodgrass et al., 2014). The validity of traditional checklists used for OSCE evaluation has also been critiqued as subjective and prone to marking errors and omissions (Patricio, Julião, Fareleira, & Carneiro, 2013). Recognizing these potential shortfalls of OSCE evaluation, the College of Nursing at the University of Saskatchewan decided to explore technology solutions to support evaluative efficiency and objectivity for OSCEs. This educational innovation describes the process and outcomes of implementing an electronic OSCE (eOSCE) management program in a Master of Nursing Primary Health Care Nurse Practitioner program to improve OSCE evaluation.
Situated in the middle of the Canadian prairies, in a province with a vast land mass (588,000 square kilometers) and low population density (1.8 persons per square kilometer), the University of Saskatchewan College of Nursing focuses on the education of NPs to address the primary care needs of rural, remote, and vulnerable populations. With part-time and full-time study options, the Master of Nursing Primary Health Care Nurse Practitioner program admits 20 to 25 NP students each year. The NP curriculum includes 33 credit units of graduate-level theory and clinical courses with a minimum of 720 preceptor clinical hours.
To support evaluation of student clinical competencies, all NP students participate in a six-station OSCE on completion of their core theory courses in advanced assessment and diagnostic reasoning. This OSCE is positioned immediately prior to the first clinical practicum course and is designed to assess entry-level competencies required for safe and effective participation in the clinical setting. OSCE stations consist of four 15-minute focused assessments and two 30-minute integrated assessments of common medical conditions encountered in primary care. The 15-minute OSCE stations require students to conduct a focused history or physical examination, or to provide health promotion counseling. The 30-minute integrated OSCE stations require students to conduct a comprehensive assessment, determine a differential diagnosis, and communicate a management plan. Student competence at OSCE stations is assessed by experienced NP examiners using a standardized checklist.
In preparation for the OSCE, NP students are provided with an OSCE manual detailing the purpose, process, expectations, and study resources for OSCE. Opportunities for students to practice simulated OSCE scenarios are also integrated into core NP courses. Self-directed learning for OSCE is encouraged, and students often form small study groups to practice skills in the clinical laboratory facilities available on campus, with NP faculty available to provide direction, give feedback, and answer questions on request.
Recognizing the potential of technology innovations to enhance OSCE efficiency, objectivity, and organization, in the spring of 2016 NP faculty at the College of Nursing consulted with the Information Technology (IT) department to review the growing body of evidence and options related to eOSCE management systems. Based on that review, a list of desirable features for an eOSCE management system was established (Table). The benefits and drawbacks of available options were then weighed to select a management system that best fit the available resources and program needs. The program selected used touch screen iPad® checklists for OSCE grading and included features to support organization (i.e., scheduling), as well as result analysis.
Two NP faculty members with experience developing and implementing traditional paper-based OSCEs and one IT specialist collaborated as the implementation team for this project. The NP faculty took responsibility for the clinical and organizational requirements of OSCE administration, developing the OSCE stations, marking guides, and schedule. The IT specialist secured required IT resources and supported technology implementation, inputting the OSCE marking guides and schedule into the eOSCE program. Faculty then reviewed the data for accuracy and consulted with IT to make any necessary changes.
Pilot testing is often beneficial when trialing a new innovation, providing a smaller scale venue for assessing and addressing implementation difficulties. To pilot test eOSCE, the implementation team decided to use electronic marking with 15-minute focused OSCE stations and continue use of traditional paper-based checklist marking guides for 30-minute integrated OSCE stations. The pilot involved a cohort of 15 NP students and eight NP examiners.
Four examiners were assigned to use eOSCE at the 15-minute focused OSCE stations, and the other four examiners were assigned to use paper-based checklists at the 30-minute integrated OSCE stations. Ten days prior to the examination, each examiner was provided with information on the station they would be responsible for evaluating. A few days prior to the examination, each examiner was also provided with in-person training on their specific station. The four examiners assigned to the eOSCE stations were provided with 1 hour of hands-on training, in which they had the opportunity to practice using the eOSCE iPad application. All examiners felt this was ample time to become familiar and comfortable with the iPad marking. To provide consistency, the same examiner evaluated the same station for all NP students. Students were also informed in advance that eOSCE, rather than the traditional paper-based checklists, would be used for marking the focused OSCE stations, so that they would not be surprised by the different marking system and their performance would not be affected by it.
Additional contingency planning was implemented to address potential technical problems that could occur. The iPads were tested in advance to ensure the battery life would last the duration of the examination, each iPad was fully charged, with a battery life registering 100%, and iPad charging stations were placed at each OSCE station. Two backup iPads were loaded with examination materials, and paper copies of the marking checklists were placed at each eOSCE station for use in the event of technical problems. In addition, both NP faculty involved in implementing eOSCE were available throughout the examination to troubleshoot questions or concerns.
When the OSCE was completed, feedback was elicited from eOSCE users and implementers to evaluate the pilot. Criteria used to evaluate eOSCE included examiner and administrator perceived ease of use; perceived examiner objectivity in evaluating student performance; examiner and administrator preference for eOSCE compared with traditional paper-based methods; number and type of technical problems encountered; accuracy (completed grading sheets; tabulation); and timeliness in set-up, implementation, and results tabulation.
In general, the implementation team found eOSCE to be user friendly and an asset in running efficient, objective OSCE examinations. Although some time was required to input schedules and paper-based checklists into the eOSCE program, the availability of video and written guidelines, as well as online support, facilitated the process. As the team gained experience using eOSCE, less time was required to input data. Since the initial implementation of the eOSCE management system in the spring of 2016, it has been used four additional times for NP OSCEs with a total of 65 students. Each successive use of eOSCE has required less set-up time.
No technical problems with use or function of the iPads were encountered during the OSCE. Examiners did not need to use the backup paper checklists or backup iPads, and their consultations with NP OSCE coordinators related only to minor technical questions. After 3 hours of use, the battery on one of the iPads was at 48%, so that iPad was connected to a charging station for the remainder of the OSCE. The other iPads retained charges between 72% and 85% at the end of the 5-hour examination.
Examiners reported that eOSCE was easy to use, provided objective assessment, and allowed them to pay more attention to student performance because they were not busy writing comments on paper. All users reported a preference for eOSCE over traditional paper-based checklists. OSCE coordinators were able to tabulate and verify final OSCE scores quickly and efficiently, providing timely feedback to students. Results had to be exported to another program (Excel®) for cumulative tabulation because the eOSCE program did not have this functional capacity; however, exporting, review, and tabulation of cumulative OSCE results within Excel was accomplished within a few hours of examination completion.
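For readers considering a similar workflow, the cumulative tabulation step can be automated with a short script rather than done by hand in a spreadsheet. The sketch below is a hypothetical illustration only: it assumes the eOSCE export produces one row per student per station with a station score and a maximum possible score, and all column names are invented for the example rather than taken from any particular eOSCE product.

```python
# Hypothetical sketch of cumulative OSCE tabulation from an eOSCE export.
# Assumes one row per (student, station); column names are illustrative only.
import pandas as pd

# In practice this would come from the exported file, e.g.:
# scores = pd.read_csv("eosce_export.csv")
scores = pd.DataFrame({
    "student": ["A", "A", "B", "B"],
    "station": ["Focused 1", "Focused 2", "Focused 1", "Focused 2"],
    "score": [12, 14, 11, 15],
    "max_score": [15, 15, 15, 15],
})

# Cumulative result per student: total score and percentage across stations.
totals = scores.groupby("student").agg(
    total=("score", "sum"),
    possible=("max_score", "sum"),
)
totals["percent"] = (100 * totals["total"] / totals["possible"]).round(1)
print(totals)
```

A script like this also makes the tabulation reproducible and auditable, which supports the verification step described above.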
Another advantage of the eOSCE management system was that assessment data were complete for all students. No checklist items were left unmarked because eOSCE prompted and required examiners to complete all marking areas before they could submit a student's score. This is an advantage over traditional paper-based OSCE marking, in which items are frequently left unmarked, leaving uncertainty regarding student performance on those items and producing inaccurate results. Given the success of eOSCE implementation in the pilot, subsequent NP OSCEs have used iPad marking for all OSCE stations, both 15-minute focused stations and 30-minute integrated stations. Concern was initially expressed about use of eOSCE in the 30-minute stations because of the need to scroll through a longer checklist to grade student performance and the effect this might have on ease and accuracy of grading. However, these concerns proved unfounded; examiners found it easier to scroll through the iPad checklists than to search through pages of the paper-based checklists.
When considering implementing innovative assessment strategies, such as eOSCE management systems, it is important to identify and evaluate potential advantages and disadvantages associated with the innovation. Several advantages of eOSCE management systems have been identified in the literature and are consistent with the experiences of this pilot. These advantages include improved efficiency of marking, high examiner satisfaction, improved accuracy and objectivity of marking, decreased perceived mental effort of examiners, and overall preference of examiners for electronic OSCE over traditional checklists (Hochlehnert et al., 2015; Meskell et al., 2015; Snodgrass et al., 2014).
The ability of eOSCE management systems to provide prompt feedback to students on OSCE performance is another reported advantage (Meskell et al., 2015; Snodgrass et al., 2014). Students value receiving their OSCE grades promptly, as well as the opportunity to review strengths and areas for improvement in skills performance, thus facilitating the learning process (Taylor & Green, 2013; Wardman, Yorke, & Hallam, 2017). Researchers have found that using eOSCE management systems to provide specific, timely feedback oriented to ways of improving student performance of clinical competencies is valued by students and staff (Ashby, Snodgrass, Rivett, & Russell, 2016; Wardman et al., 2017). Future plans at the College of Nursing include exploring opportunities to enhance OSCE feedback through eOSCE and the effect of such feedback on student development of clinical competencies.
Drawbacks of eOSCE management systems that have been acknowledged in the literature include financial costs, resistance to change, and the additional work and support required to initially learn and implement the system (Meskell et al., 2015). The team in this pilot project was fortunate to have several supports in place to address these drawbacks and facilitate successful implementation. The involvement of faculty, staff, and administration who were innovators and early adopters, willing and eager to trial new ideas to enhance student teaching and evaluation, was key to implementation. Other important supports included a dedicated, actively engaged IT specialist assigned to assist with the project, as well as availability of physical and financial resources. Given the many advantages of eOSCE management systems found in the literature and supported through this pilot, it is anticipated that the initial additional resource investment in eOSCE will be recouped over the long term through increased examination efficiency and objectivity (Hochlehnert et al., 2015; Meskell et al., 2015).
Implementation of the eOSCE program in the Master of Nursing Primary Health Care NP program at the University of Saskatchewan has proved to be advantageous in improving OSCE evaluation. Future opportunities for scholarship in teaching and learning related to eOSCE management systems have also emerged from this project, including more rigorous economic analysis of eOSCE, exploring automated electronic feedback to students on OSCE performance, video recording of OSCE, and the reliability and validity of OSCE in evaluating clinical competencies (Ashby et al., 2016).
OSCE is a resource-intensive yet essential evaluative component of NP education. By objectively assessing student knowledge and skills in providing clinical care, OSCEs help ensure the safety and clinical competence of students. Application of an eOSCE management system has the potential to enhance the efficiency, objectivity, and accuracy of OSCE evaluation, promoting high-quality assessment of clinical performance.
- Ashby, S.E., Snodgrass, S.H., Rivett, D.A. & Russell, T. (2016). Factors shaping e-feedback utilization following electronic objective structured clinical examinations. Nursing and Health Sciences, 18, 362–369. doi:10.1111/nhs.12279 [CrossRef]
- Harden, R.M. & Gleeson, F.A. (1979). Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13, 41–54. doi:10.1111/j.1365-2923.1979.tb00918.x [CrossRef]
- Hochlehnert, A., Schultz, J.H., Möltner, A., Tımbıl, S., Brass, K. & Jünger, J. (2015). Electronic acquisition of OSCE performance using tablets. GMS Journal for Medical Education, 32(4), Doc41. http://doi.org/10.3205/zma000983
- Meskell, P., Burke, E., Kropmans, T.J., Byrne, E., Setyonugroho, W. & Kennedy, K.M. (2015). Back to the future: An online OSCE management information system for nursing OSCEs. Nurse Education Today, 35, 1091–1096. doi:10.1016/j.nedt.2015.06.010 [CrossRef]
- Patricio, M.F., Julião, M., Fareleira, F. & Carneiro, A.V. (2013). Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Medical Teacher, 35, 503–514. doi:10.3109/0142159X.2013.774330 [CrossRef]
- Snodgrass, S.J., Ashby, S.E., Rivett, D.A. & Russell, T. (2014). Implementation of an electronic objective structured clinical exam for assessing practical skills in pre-professional physiotherapy and occupational therapy programs: Examiner and course coordinator perspectives. Australasian Journal of Educational Technology, 30, 152–166. doi:10.14742/ajet.348 [CrossRef]
- Taylor, C.A. & Green, K.E. (2013). OSCE feedback: A randomized trial of effectiveness, cost-effectiveness and student satisfaction. Creative Education, 4(6A), 9–14. doi:10.4236/ce.2013.46A002 [CrossRef]
- Wardman, M.J., Yorke, V.C. & Hallam, J.L. (2017). Evaluation of a multi-methods approach to the collection and dissemination of feedback on OSCE performance in dental education. European Journal of Dental Education, 22, e203–e211. doi:10.1111/eje.12273 [CrossRef]
Desirable Features for eOSCE System

- Evidence of past successes
- Touchscreen tablet or iPad® checklists
- Compatibility with university IT infrastructure
- Good consumer support (i.e., easily accessible and helpful support staff)
- Exam security (encryption, local storage of data)
- Reasonable cost, free trial
- Psychometric analysis capabilities
- Ability to add free-text comments