Journal of Nursing Education

EDUCATIONAL INNOVATION 

Web-Based Testing Procedure for Nursing Students

Mary Jo Gilmer, PhD, RN; Jerry Murley, MEd; Emily Kyzer, BS

ABSTRACT

Web-based testing has been used to assess achievement of course objectives in a nursing school. Initial logistical challenges were met with collaboration and problem solving. Students and faculty reported satisfaction with Web-based testing.

Advances in technology have enabled the World Wide Web to be used as an alternative conduit for teaching and learning activities that are less bound by time and space than traditional methods. However, developing Web-based methods for assessing student learning that equal or surpass traditional techniques remains an unmet challenge for educators. This article describes one nursing program's creative approach to a Web-based testing procedure that may be applicable to other institutions.

In its 1999 report to the National Research Council, the Committee on Developments in the Science of Learning concluded, "information to be tested has the greatest influence on guiding students' learning" (Bransford, Brown, & Cocking, 1999, p. 233). For education to be continuous and interdependent, assessment must indicate that teaching strategies have been effective and that student learning has occurred. Therefore, the emergence of new technologies and teaching strategies challenges educators to determine the best means to assess student learning.

To measure whether nursing graduates are competent to practice at the entry level, the National Council Licensure Examination (NCLEX) moved to a computerized testing format in 1994 (Bosma, 1994). Graduates of practitioner programs also use a computerized testing format for certification examinations. Nursing faculty members have responded by helping students attain the computer knowledge and skills needed to reduce anxiety related to computer-based testing, thereby ensuring graduates are prepared for this format (Bloom & Trice, 1997; Forker & McDonald, 1996). The use of computers has gained prominence in instruction, but Web-based testing for assessment of learning in course work has been slower to gain acceptance. Web-based testing in nursing curricula warrants further investigation because it has the potential for significant educational benefits related to flexibility of time and place. Just as students have a variety of learning styles, they may differ in their needs regarding testing procedures.

Literature Review

Web-based testing has benefits for both nursing faculty and students. Advantages include flexibility in scheduling and time use, timely feedback of test results, flexibility in testing location, and immediate storage of student records in a central database.

The ability to schedule Web-based tests outside of the lecture period frees up time usually allotted for test administration and review (Bloom & Trice, 1997). The result is a net gain of time that can be dedicated to teaching and learning (Anna, 1998; Bloom & Trice, 1997). Faculty time also is saved due to a decrease in the amount of time spent scoring tests and recording grades (Anna, 1998).

Although subject to the availability of facilities and proctors (if desired), testing can be scheduled for a range of dates and times, allowing students to select a time compatible with their academic and personal commitments (Anna, 1998; Bloom & Trice, 1997; Tomey, 2000). In one study, 99% of students surveyed indicated they appreciated the scheduling flexibility computerized testing affords (Anna, 1998).

Bloom and Trice (1997) evaluated differences in test scores of both junior-level and senior-level nursing students taking either paper-and-pencil or computer-based tests. Results showed that overall computer-based test performance was equal to or better than paper-and-pencil format. The authors concluded computer-based testing did not produce any detrimental effects to students' grades or learning level.

Computer-based testing also provides the option for immediate score feedback and display of rationales for correct answers, which provides immediate reinforcement of the material (Anna, 1998; Bloom & Trice, 1997). Other benefits include uniform and unbiased grading methods, and standardization of the testing environment, instructions, and time allotted (Bloom & Trice, 1997; Tomey, 2000). In many cases, reports that identify students' areas of strengths and weaknesses can be printed. Difficulty and discrimination indexes also are readily available, and faculty members can use this feedback to modify the course content and teaching methods (Tomey, 2000).
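The article does not describe how these item-analysis statistics are computed by the testing software. As a minimal sketch only, the two classical indexes can be calculated as follows, assuming dichotomously scored (correct/incorrect) items; the function names, the 27% upper/lower grouping convention, and the data are illustrative assumptions, not any vendor's implementation.

```python
from typing import List

def difficulty_index(item_scores: List[int]) -> float:
    """Proportion of examinees who answered the item correctly
    (values near 1 indicate an easy item, near 0 a hard one)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores: List[int],
                         total_scores: List[float],
                         group_frac: float = 0.27) -> float:
    """Classical upper-lower discrimination index: item difficulty in the
    top-scoring group minus item difficulty in the bottom-scoring group.
    Values near +1 mean the item separates strong and weak examinees."""
    n = len(item_scores)
    k = max(1, int(n * group_frac))
    # Rank examinees by total test score, highest first.
    ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper = [item_scores[i] for i in ranked[:k]]
    lower = [item_scores[i] for i in ranked[-k:]]
    return sum(upper) / k - sum(lower) / k

# Illustrative data: 1 = correct, 0 = incorrect on one item, plus totals.
item = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0]
totals = [92, 88, 85, 81, 79, 74, 70, 66, 60, 55]
print(difficulty_index(item))              # 0.5
print(discrimination_index(item, totals))  # 1.0: top scorers got it right
```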

No studies were found that described the advantages of offering several testing locations while still storing student records immediately in a central database. Depending on the need for proctor supervision, Web-based tests can be taken from homes, offices, libraries, or other distant sites. Although few studies documented advantages of Web-based testing for course evaluation purposes, no studies identifying disadvantages of Web-based testing were found. However, more information is needed before the strategy is widely implemented.

Procedure

At Vanderbilt University School of Nursing, students can enter the nursing curriculum with at least 72 hours of undergraduate coursework. Students enrolled in the program take a three-semester sequence of generalist nursing courses, followed by a three-semester sequence of specialty nursing courses, culminating in a Master of Science in Nursing degree. During planning meetings, faculty expressed interest in developing strategies that would afford students flexibility in the assessment of their learning during coursework. An innovative testing program was developed through a grant from Healthstream, Inc.

The testing program was piloted in a course focusing on the theoretical basis of practice, which provides the scientific knowledge base needed to respond to actual or potential health problems. The course content was closely linked with content tested on the NCLEX, and faculty members suggested students might be better prepared for the NCLEX through use of similarly formatted questions.

On the first day of the course, the 99 students enrolled in the course were informed of the computer-based format and provided with instructions for scheduling their own examination times. The entire process was included in the course syllabus. Students who expressed apprehension about accessing and navigating Web-based information were informed of resources available in the school's instructional media center (IMC). Testing announcements and information (e.g., case studies) were posted on the course Web page, and students were encouraged to discuss the case studies with their peers before the examination.

Testing

Facilities. The IMC computer laboratory in which most of the tests were conducted contains 15 desktop computers with Internet connections. Nursing students use the computer laboratory throughout the day to complete class assignments.

Students have different learning styles and testing capabilities. One student with an identified learning disability stated she needed more than the 2-hour time period and had difficulty with the close proximity of other students in the IMC. Arrangements were made for this student to access the same Web-based examinations on a computer located in another campus facility. The testing program was designed so students could take the examination from remote locations, provided identified proctors supervised the examination.

Security. Because this 5-credit-hour course is part of a challenging curriculum, faculty members were concerned students might feel pressured to excel in their testing performance and be tempted to deviate from the university's honor code for test taking. Several strategies were used to maximize test security. To protect the integrity of the test and ensure student accountability, course instructors required that the test be taken under proctor supervision. During testing hours, staff or faculty members affiliated with the IMC proctored students by verifying identification, issuing verbal and written instructions, and logging into the examination with a user name and password. Each student then used a login procedure. Use of proctors had the added benefits of orienting students to the process and of identifying and quickly correcting impediments in the testing environment. The university's honor code statement for test taking appeared on the first screen, and students were required to affirm they would adhere to university standards of intellectual integrity before continuing with the examination.

Test Format. Questions relating to the same topics were coded and grouped to enable both faculty members and students to track performance. The test questions and answers within each group were presented in random order to minimize the risk of students either intentionally or inadvertently sharing information during tests. For example, questions focusing on stress were grouped, and the order of the questions and their answer choices was randomized. This format and the honor code affirmation minimized students' sharing of answers during the testing period. Printing of the test was not allowed.
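As an illustration of the grouping and randomization just described, a minimal sketch follows. The data structures and function are hypothetical; the article does not describe the IMC's actual implementation.

```python
import random
from dataclasses import dataclass
from typing import List

@dataclass
class Question:
    stem: str
    choices: List[str]   # answer options, including the keyed choice
    answer: str          # text of the keyed (correct) choice
    topic: str           # coded topic group, e.g., "stress"

def randomized_form(bank: List[Question], seed: int) -> List[Question]:
    """Build one student's form: topic groups stay together, but question
    order within each group and answer order within each question are
    shuffled. Seeding per student keeps each form reproducible."""
    rng = random.Random(seed)
    form: List[Question] = []
    for topic in sorted({q.topic for q in bank}):
        group = [q for q in bank if q.topic == topic]
        rng.shuffle(group)                    # randomize question order
        for q in group:
            choices = q.choices[:]
            # Shuffling answer order is why options such as "all of the
            # above" had to be reworded (see Interdisciplinary
            # Collaboration); the key is tracked by text, so it survives.
            rng.shuffle(choices)
            form.append(Question(q.stem, choices, q.answer, q.topic))
    return form
```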

Because the examination was Web based, information about individual scores in content areas was automatically e-mailed to students on submission of their responses. Students were allowed to move freely among the questions and could change their answers during that process. Students were informed they would be restricted to answering single, stand-alone questions on the NCLEX and would be unable to navigate among the questions. Currently, faculty are considering limiting students' ability to change answers to better prepare them for the NCLEX.

Scheduling. Faculty and IMC staff determined that students could self-schedule the 2-hour examination at any time within a 7-day period. Instructional media center staff organized a sign-up procedure. Two hours were allotted for each testing period because the number of test items on the three examinations ranged from 74 to 89, but most students completed the examinations within 70 minutes.

To accommodate the 99 students enrolled in the course, seven 2-hour "quiet" blocks were scheduled, and sign-up sheets were posted in the IMC. Care was taken to schedule blocks of time when students were not in other classes or clinical experiences. Students were informed when sign-up sheets were posted and were advised to schedule their testing period early to secure a time compatible with their individual needs. Students were allowed to change their scheduled block, as long as space was available. For example, this flexibility allowed two students to reschedule their examinations at the end of the testing week because of illness during their originally scheduled times. An alternative approach that permits scheduling via the Web has been developed and currently is used.
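The article gives no details of the Web-based scheduling system that replaced the paper sign-up sheets, so the following is only a hypothetical sketch of the booking rules described above: capacity-limited blocks (the laboratory holds 15 computers), one seat per student, and rescheduling while space remains.

```python
class SignupSheet:
    """Minimal sketch of a capacity-limited testing sign-up."""

    def __init__(self, blocks, capacity=15):   # 15 = computers in the lab
        self.capacity = capacity
        self.rosters = {b: set() for b in blocks}

    def enroll(self, student, block):
        roster = self.rosters[block]
        if len(roster) >= self.capacity:
            raise ValueError(f"{block} is full; choose another block")
        for r in self.rosters.values():        # release any earlier seat
            r.discard(student)
        roster.add(student)

sheet = SignupSheet(["Mon 9-11 a.m.", "Fri 1-3 p.m."])
sheet.enroll("student_01", "Mon 9-11 a.m.")
sheet.enroll("student_01", "Fri 1-3 p.m.")     # reschedule frees Monday seat
```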

Test Results. After completion of each examination, students were provided with their test scores, including a breakdown of performance percentages by content categories, time expended, and average class score to date. No other information was provided until all students had completed the examination. Students then could return individually after the 7-day block of time reserved for testing to electronically review the specific questions they had answered incorrectly. The review included each question, the possible answers, the student's answer, and the correct answer with rationale. Testing data were available to course faculty throughout the testing period, so they were able to monitor the performance of students who had completed the examination and obtain up-to-date item analyses of examination questions.
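A sketch of how the feedback described above might be assembled follows; the function, argument names, and category labels are illustrative assumptions, not the system's actual code.

```python
from collections import defaultdict

def score_report(responses, key, category_of, minutes_used, class_mean):
    """Assemble the feedback described above: overall score, percentage
    correct by content category, time expended, and class average to date."""
    correct, total = defaultdict(int), defaultdict(int)
    for qid, answer in responses.items():
        cat = category_of[qid]
        total[cat] += 1
        correct[cat] += int(answer == key[qid])
    return {
        "overall_pct": 100 * sum(correct.values()) / sum(total.values()),
        "by_category_pct": {c: 100 * correct[c] / total[c] for c in total},
        "minutes_used": minutes_used,
        "class_mean_to_date": class_mean,
    }

report = score_report(
    responses={"q1": "b", "q2": "c", "q3": "a"},
    key={"q1": "b", "q2": "d", "q3": "a"},
    category_of={"q1": "stress", "q2": "stress", "q3": "pain"},
    minutes_used=64, class_mean=86.0)
print(report["by_category_pct"])   # {'stress': 50.0, 'pain': 100.0}
```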

Evaluation

After completion of the first Web-based examination, a survey was distributed to students at the end of a class period. Ninety of the 99 students enrolled in the course completed the survey anonymously. Approval to report the survey results was obtained from the behavioral sciences institutional review board. Four items assessed students' satisfaction with computer-based testing compared to their satisfaction with paper-and-pencil testing. Questions addressed alternative testing times, testing instructions, testing environment, and testing review. A 5-point Likert scale with responses ranging from strongly disagree to strongly agree was used.

Results

Alternative Testing Times. Students reported strong agreement that having alternative times for testing was helpful to them (range = 1 to 5, mean = 4.81, SD = .52).

Testing Instructions. Students strongly agreed the testing instructions were easy to understand and follow (range = 1 to 5, mean = 4.63, SD = .57). No student comments indicated difficulty understanding the instructions. The availability of the proctor to answer any questions precluded students' misunderstanding of the instructions.

Testing Environment. Students agreed that, compared to a large classroom, the computer room was a suitable environment for testing (range = 1 to 5, mean = 3.96, SD = .98). They did report noise from a printer in the area as a distraction. Therefore, the printer was moved to an adjoining room to eliminate the noise and traffic problems, signs stating that students were taking tests nearby and requesting quiet in the area were posted in the halls and elevators, and quiet blocks of testing time were instituted.

Test Review. Students agreed it was helpful to have the opportunity to return to the testing center to review their answers (range = 1 to 5, mean = 3.97, SD = 1.07). When students returned after the testing period, they could only review the questions and possible answers for items they had answered incorrectly and receive a description of the concept being tested. Initially, faculty members thought it would be a better learning experience for students if only an explanation of the underlying concept for each question was provided, leaving students to determine the best answer. However, students requested the correct answer be stated explicitly, and this suggestion was incorporated into future test reviews. Thereafter, students could view their answer and the correct answer to each question they had answered incorrectly or omitted.

Twenty-one percent of students (n = 21) took advantage of the review opportunity on the first Web-based examination, compared to 18% of students who used the test review after taking the paper-and-pencil test. An advantage of the computer-based test review was that faculty did not have to be present to conduct the review or explain correct answers.

Test Scores. Student scores on the Web-based examination did not differ significantly from those of the students who had taken the paper-and-pencil version the previous year. The paper-and-pencil students' mean score of 86.25 closely resembled the computer-based students' mean score of 86.04 (p > .10).
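The article reports only the two means and p > .10; the statistical test used is not stated. Assuming an independent two-sample comparison, the check could be reproduced as below, with purely simulated score distributions standing in for the unpublished raw data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated, illustrative data only: the article reports just the means
# (86.25 paper-and-pencil vs. 86.04 Web-based) and p > .10.
paper = rng.normal(86.25, 5.0, size=99)   # prior year's cohort
web = rng.normal(86.04, 5.0, size=99)     # Web-based cohort

t, p = stats.ttest_ind(paper, web, equal_var=False)   # Welch's t test
print(f"t = {t:.2f}, p = {p:.3f}")   # p > .10 -> no significant difference
```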

Discussion

Student responses to the survey were helpful in further development of the computer-based testing procedure, and improvements were made in response to many student suggestions. Scheduling was revised based on student input to include a wider range of times. Students could schedule time to take the examination either during periods designated as "quiet" or during periods when the computer laboratory was open for general use. Faculty members and IMC staff also maintained flexibility and responded rapidly as concerns and challenges arose (e.g., printer noise in the computer room, students who needed to take the examination in a different location). A separate computer laboratory nearby currently is designated as the testing center.

Case Studies

For higher education faculty, one frustrating aspect of computer-based, randomized, multiple-choice examinations is the inability to use long case studies for groups of questions. The IMC programmers overcame this obstacle by assigning questions to cluster groups. Whenever a case study appeared on the examination, the scenario appeared at the top of the group of questions, the group of related questions remained together, and a link after each question returned students to the scenario. In addition, because copies of the case studies were posted on the course Web page before each examination, students were able to analyze them in preparation for the examination.
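One plausible reading of this cluster-group mechanism is sketched below; the article does not describe the actual programming, so the structures are hypothetical.

```python
import random
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Cluster:
    scenario: Optional[str]   # case-study text, or None for stand-alone items
    questions: List[str]

def assemble_exam(clusters: List[Cluster], seed: int) -> List[str]:
    """Render an exam in which clusters move as whole units: a case-study
    scenario prints above its questions, each of which links back to it."""
    rng = random.Random(seed)
    order = clusters[:]
    rng.shuffle(order)                 # clusters are shuffled, never split
    lines: List[str] = []
    for c in order:
        qs = c.questions[:]
        rng.shuffle(qs)                # question order varies within a cluster
        if c.scenario:
            lines.append(f"SCENARIO: {c.scenario}")
        for q in qs:
            lines.append(q + (" [return to scenario]" if c.scenario else ""))
    return lines
```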

Interdisciplinary Collaboration

Communication between the course coordinator and the IMC director was essential to minimize difficulties in initiating the computer-based testing process. The interdisciplinary team was proactive in problem solving to ensure students remained well informed. While pilot testing the first examination, the team noted that because the answer choices were randomized, the response "all of the above" would no longer be appropriate. This problem was discussed quickly, and the response was changed to "all of these choices are correct."

When IMC staff posted the testing sign-up sheets requested by the course coordinator, the staff informed the course coordinator so she could make an announcement to the students. Staff also added examination opportunities as soon as the course coordinator communicated the need for alternative testing times.

Communication

Advance communication between course faculty and IMC staff, the ability to make timely modifications to the examination programming, availability of appropriate laboratory resources, and consideration of the demands placed on the IMC by others were essential to ensure a smooth and efficient process. Although the testing format had been well piloted in the 3-month period before implementation, course faculty requested additional features after reviewing the format. Because IMC programmers had instructional design experience, the format of the examination was developed to ensure navigational ease. Instructions and expectations were presented to the students the week before the examination, enabling students to prepare for the entire experience of using the new examination format and process.

Faculty Demands and Challenges

A time commitment was required from faculty to coordinate this testing procedure. However, frequent planning meetings, coupled with administrative and staff support, contributed to fairly smooth implementation of the procedure. Faculty who created test items for this format had the additional responsibility of providing a rationale for each correct answer. Although this task involved additional time, it ensured careful construction of questions and their justification.

Challenges remain in planning Web-based testing for distance education students. To ensure honor code compliance, proctors may need to be designated at distant sites.

Conclusion

Faculty and students were supportive of this Web-based testing approach. The approach includes controlled randomization of questions and answers to ensure that each student's test, while different in order of presentation, covers equivalent material and honors instructor preferences regarding content organization and wording. Web-based testing facilitates development of carefully sequenced test questions as a means to cluster types of question content (e.g., a series of questions related to a case study).

In this method, tracking, an essential ingredient in the assessment of learning and the evaluation of learning programs, is more detailed and consistent than in traditional, in-class, multiple-choice testing methods because results are entered automatically into a central database. Students access the Web-based testing process outside of the formal classroom, creating additional class time for clarification and answering questions. This extra class time is dedicated not to new content but to reinforcement of previous content. Because instructors can provide learners with information about their strengths, challenges, and additional resources, self-directed learning is reinforced. As the survey results indicate, students were positive about the testing procedure and even provided unsolicited comments and accolades for the process.

References

  • Anna, D.J. (1998). Computerized testing in a nursing curriculum. Nurse Educator, 23(4), 22-26.
  • Bloom, K.C., & Trice, L.B. (1997). The efficacy of individualized computerized testing in nursing education. Computers in Nursing, 15, 82-88.
  • Bosma, J. (1994). New approach to NCLEX. Nursing & Health Care, 15(3), 115.
  • Bransford, J.D., Brown, A.L., & Cocking, R.R. (Eds.). (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press, National Research Council.
  • Forker, J.E., & McDonald, M.E. (1996). Methodologic trends in the healthcare professions: Computer adaptive and computer simulation testing. Nurse Educator, 21(5), 13-14.
  • Tomey, A.M. (2000). Computer-based testing system. Nurse Educator, 25(3), 108-109.

doi:10.3928/0148-4834-20030801-11
