Journal of Nursing Education

Improving Examination Performance Using Exam Analysis

Vaneta Mabley Condon, RN, MS; David Eli Drew, PhD

Abstract

The Exam Analysis is a diagnostic and prescriptive program in which a student and an instructor identify why the student failed to answer specific questions correctly on an examination. Problems related to incorrect answers are then categorized into five problem areas. This analysis is used to develop a plan which will help the student utilize specific skills in order to improve examination performance. This article describes the Exam Analysis program, gives specific information which will enable an instructor to assist a student in completing an Exam Analysis, and discusses the results of a survey in which 105 nursing students evaluated the impact of participating in the program on subsequent examination performance. A large majority of the students surveyed agreed that participation in the Exam Analysis program resulted in improved examination performance. Improved exam-taking skills, better study skills, decreased test anxiety and increased self-confidence were identified as Exam Analysis program outcomes by the students included in the study.

The Problem

The Exam Analysis program grew out of an attempt to solve a long-standing problem: how to help the student who studies hard yet scores poorly on examinations. Educators have recognized this problem for many years. Phoebe Helm of Triton College expressed the concern in 1977 when she wrote:

Many students seem to have good study habits, acquire information rather easily, and perform well in class discussions and clinical situations, and yet score lower on multiple choice tests than their general performance as students would indicate. Students in health sciences and professional schools who may be facing for the first time situational types of questions, designed to measure application of information to practice, seem to be particularly plagued by the problem. In discussing this problem with students as well as their instructors, one often hears evidence of feelings of frustration, lack of control and despair (Helm, 1977).

The purpose of this article is to describe the Exam Analysis program, to discuss how it can be used to improve examination performance, and to report the results of an opinion survey administered to 105 nursing students following participation in the program. The objective of the survey was to collect opinions concerning the effectiveness of the Exam Analysis program in improving examination performance.

Literature Review

Literature in education and nursing for the past 15 years was reviewed in order to identify methods used to diagnose and treat problems causing individual college students to perform poorly on examinations. Failure to pass examinations is often attributed to insufficient knowledge of the subject matter. Ineffective study skills and information processing skills such as note taking, textbook reading, memory development and time management contribute to this knowledge deficit (Campbell & Davis, 1990; Christ & Adams, 1979; Curley, Estrin, Rohwer & Thomas, 1987; Demery, 1988; Derry & Murphy, 1986; Foster, Zimmerman & Condon, 1991; Gadzella, Ginther & Williamson, 1987; Kulik, Kulik & Shwalb, 1983; Sherman, 1985; Smith, 1990).

A student's English language skills may be inadequate to understand the reading assignments and to correctly interpret and answer the examination questions. Students from English as a Second Language (ESL) backgrounds, minority groups, or educationally underprepared backgrounds may have problems with reading comprehension and speed, vocabulary, and taking notes during classroom lectures that result in poor grades on examinations (Astin, 1982; Corlett & Schendel, 1987; Gerace & Mestre, 1983; Heikinheimo & Shute, 1986; Ivie, 1982; Keane, 1993; Memmer & Worth, 1991; Nettles, Thoeny & Gosman, 1986; Noel, Levitz, Saluri & Assoc., 1987; Rosberg, 1983; Royer, Marchant, Sinatra & Lovejoy, 1990; Tinto, 1987; Wood, 1988). Students from minority backgrounds may find that cultural differences and lack of familiarity with multiple choice questions and problem solving types of questions have an adverse effect on examination performance (Aquino, 1979; Frierson, 1986a; Frierson, Malone & Shelton, 1993; Gerace & Mestre, 1983; Mine, 1987).

Poor scores on examinations may be related to extreme anxiety or "test panic." Inability to focus on what the question asks, to remember what was learned, to think logically while taking the examination, and to remember to use effective exam techniques are symptoms of disabling anxiety that may result in poor exam performance (Brown & Nelson, 1983; Bruch, Pearl & Giordano, 1986; Chandler & Mako, 1987; Frierson & Hoban, 1987; Helm, 1977; Hembree, 1988; Hudesman, Loveday & Woods, 1984; Kirkland & Hollandsworth, 1980; Sarason, 1980; Tryon, 1980).

The use of poor exam-taking techniques lowers a student's examination scores. Brozo, Schmelzer, & Spires (1984), Bruch, Pearl & Giordano (1986) and Frierson (1986a) found that students who use effective exam-taking methods score higher than other students on examinations. Studies involving nursing students and medical students showed that when students were taught good test-taking techniques their performance improved on both standardized and teacher-made examinations (Condon, 1994; Frierson, 1984b; Frierson, 1986b; Frierson, Malone & Shelton, 1993; Reed & Hudepohl, 1985; Sarnacki, 1981).

"Test wiseness" is a cognitive ability, or a set of skills, which a test-taker can use to improve a test score regardless of his or her knowledge of the content area (Millman, Bishop & Ebel, 1965; Sarnacki, 1979). Research involving undergraduate college students as well as graduate students has shown that test-wise cognitive guessing strategies can be taught and that the use of these strategies improves examination scores (Brozo, Schmelzer, & Spires, 1984; Dolly & Williams, 1986; Frierson, 1984a; Frierson, 1984b; Sarnacki, 1979; Sarnacki, 1981).

The review of literature identified a number of reasons for poor examination performance by students and discussed measures which have been successful in helping students improve their examination performance. Frierson stressed the importance of "helping students diagnose and correct problems of test taking" so that their performance on examinations will "accurately reflect their acquired knowledge" (Frierson, 1984a). Several reports of methods used to assist students with preparation for examinations were found in the educational literature. A method to analyze and improve the accuracy of a student's perceived readiness for examination performance (PREP) is described by Pressley and colleagues (Pressley & Ghatala, 1988; Pressley, Snyder, Levin, Murray & Ghatala, 1987). An accurate assessment of readiness for taking an examination assists the student in appropriate preparation for the examination. Foos (1992) reports a study which demonstrated that students work harder when they expect an exam to be difficult.

Nist and Simpson (1984, 1989) have developed a metacognitive study strategy to improve independent learning effectiveness. Their PLAE method involves preplanning, listing, activating, and evaluating study strategies for a specific examination. The evaluation step includes assessing the adequacy of time spent in preparation, the appropriateness of study strategies utilized, knowledge of information needed to correctly answer the questions, care in reading and processing questions, patterns of errors in answering questions and plans for strategies to be used in preparation for future examinations. The authors report that use of this method resulted in improved examination preparation and performance for 73 college freshmen (Nist & Simpson, 1989).

Takahiro Sato, a Japanese engineer, has developed a student-problem (S-P) Chart which can be used to analyze both the performance of students and the difficulty of the questions on an examination (Dinero & Blixt, 1988). A "Modified Caution Index" can be calculated to evaluate a student's pattern of correct and incorrect responses and to advise the student of general problems which appear to be interfering with examination performance.
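
The Modified Caution Index itself is defined in Dinero and Blixt (1988) and is not reproduced here. As a rough, hypothetical illustration of the underlying idea only, the Python sketch below flags an aberrant response pattern by counting how often a student misses items that most of the class answers correctly while answering harder items correctly. It is not Sato's formula, and the function name is invented for this example.

    # Illustrative only: NOT Sato's Modified Caution Index. The sketch flags
    # response patterns that are unusual given item difficulty (missing items
    # most students answer correctly while getting harder items right).
    from typing import List

    def aberrance_score(student: List[int], class_responses: List[List[int]]) -> float:
        """student: one examinee's 0/1 item responses; class_responses: 0/1 rows for the whole class."""
        n_students = len(class_responses)
        n_items = len(student)
        # Item "easiness" = proportion of the class answering the item correctly.
        p_correct = [sum(row[j] for row in class_responses) / n_students for j in range(n_items)]
        inversions = possible = 0
        for j in range(n_items):
            for k in range(n_items):
                if p_correct[j] > p_correct[k]:  # item j is easier than item k
                    possible += 1
                    if student[j] == 0 and student[k] == 1:
                        inversions += 1          # missed the easier item, answered the harder one
        return inversions / possible if possible else 0.0

A score near zero means the student's errors follow the difficulty ordering; a higher score suggests a response pattern worth discussing with the student, which is the kind of general feedback the S-P chart analysis is intended to provide.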

Nursing education literature has reported several instruments or strategies for identifying problems causing poor performance on State Board review questions (Bininger, Mason, Potts & Wilson, 1990; Eddy, 1986; Helm, 1981; Lagerquist, 1986; Rollant, 1988). However, the only method/program found with specifically developed tools for both identifying and treating individual students' problems related to poor performance on exams was the Exam Analysis program (Condon, 1986; Condon, 1987; Condon, 1994).

Exam Analysis Program

The Exam Analysis was first developed in 1982 by the staff of the Learning Assistance Program (LAP) at Loma Linda University School of Nursing. It is a diagnostic and prescriptive program used to assist a student in identifying problems related to poor examination performance.

A checklist (Figure 1) is used which categorizes items into five problem areas identified in the review of literature as having negative effects on examination performance. A step-by-step description of the Exam Analysis process is shown in Figure 2. Recommended exam-taking techniques and interventions for improving examination performance, based on specific problems identified during the Exam Analysis, are developed by the student and the instructor as a part of the Exam Analysis process. Suggested exam-taking techniques and suggested interventions for identified problems are discussed in previous papers and are available upon request (Condon, 1986; Condon, 1987).

Over 500 Exam Analyses have been done since 1982 at Loma Linda University. The program has been presented at six national educational conferences. As a result, the Exam Analysis program is being used by a growing number of educators across North America.

Frank Christ and the Learning Assistance Center practitioners at California State University, Long Beach have used the Exam Analysis program since 1986. Christ states, "The LAC staff, faculty and students at CSULB have found the Exam Analysis program to be helpful in recognizing test-taking problems and in providing specific recommendations for remediation and improvement" (Christ, 1990).

Research Objectives

Although the Exam Analysis has been used since 1982, its effectiveness in improving examination performance has not been reported in the literature. The present study surveyed students' opinions regarding the effectiveness of participation in the Exam Analysis program in improving examination performance. Specific objectives included surveying students' opinions regarding:

1. The effectiveness of participation in the Exam Analysis program toward improving: a) Examination performance; b) Management of study time; c) Study skills and methods; d) Test-taking skills and methods; and e) Self-confidence and decreasing exam anxiety.

2. The relative importance of specific exam-taking techniques recommended in the Exam Analysis program.

3. The relative importance of specific possible outcomes of Exam Analysis program participation.

4. How to improve the Exam Analysis program.

5. Other aspects of the Exam Analysis program that students wish to comment upon.

Research Methods

The research sample included all current or former students who had completed an Exam Analysis at Loma Linda University School of Nursing. LAP records identified 197 students who had completed 358 Exam Analyses. Questionnaires were mailed to the 142 students for whom a correct current address could be obtained. Of this number, 106 questionnaires (74%) were returned. The study included the 105 students who returned both pages of the questionnaire in time to be included in the survey.

The two-page questionnaire included 10 items about exam performance that used a Likert scale format, two items that asked the student to rank the importance of specific exam-taking techniques or possible Exam Analysis outcomes, two open-ended questions, and a short section focusing on demographic data. Before mailing, the questionnaire was evaluated by ten nursing faculty members and six students. The students also filled out the survey as a pilot study. Several minor changes were made as a result of this evaluation in order to improve the clarity of the questionnaire items and the directions.

A coding guide was written and the completed questionnaires were coded for computer input. Descriptive statistics included frequencies, means, and standard deviations for individual items and main categories. T-tests for independent groups and a one-way analysis of variance were performed to identify significant differences related to demographic variables. The SPSS program was used for computer analysis of the data. The findings were then discussed, conclusions drawn, and implications for further study identified.
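
The original analysis was run in SPSS; the authors' program code is not available. As a minimal sketch of an equivalent analysis, assuming hypothetical column names such as "exam_improved" and "primary_language" and a hypothetical data file for the coded questionnaire items, the Python code below produces the same kinds of statistics: frequencies, means, standard deviations, t-tests for independent groups, and a one-way analysis of variance.

    # A minimal sketch of an equivalent analysis in Python; not the authors' SPSS run.
    # Column names and the file name are hypothetical stand-ins for the coded data.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("exam_analysis_survey.csv")  # hypothetical coded questionnaire data

    # Likert items were coded 1 = strong agreement ... 4 = strong disagreement.
    likert_items = ["exam_improved", "study_skills_improved", "anxiety_decreased"]
    for item in likert_items:
        print(item, df[item].value_counts().sort_index().to_dict())    # response frequencies
        print(f"  mean={df[item].mean():.2f}  sd={df[item].std():.2f}")  # descriptive statistics

    # t-test for independent groups, e.g., ESL versus primarily English-speaking students.
    esl = df.loc[df["primary_language"] != "English", "exam_improved"].dropna()
    eng = df.loc[df["primary_language"] == "English", "exam_improved"].dropna()
    t, p = stats.ttest_ind(esl, eng)
    print(f"ESL vs. primarily English-speaking: t={t:.2f}, p={p:.3f}")

    # One-way analysis of variance across racial background groups.
    groups = [g["exam_improved"].dropna() for _, g in df.groupby("racial_background")]
    f, p = stats.f_oneway(*groups)
    print(f"One-way ANOVA by racial background: F={f:.2f}, p={p:.3f}")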

Figure 1. Checklist categorizes items into five problem areas.

Research Results

The research sample included 105 students who had previously participated in the Exam Analysis program. Analysis of pertinent data showed that the students ranged from 20 to 54 years of age with students over 25 making up 49% of the study sample. Since the sample was made up of nursing students, only 11% of the sample was male.

The racial background of the students included: Asian (33%), White/Anglo (30%), African-American/Black (19%), Hispanic (13%), and other/missing data (5%). The primary language used by 65% of the students was English but 51% of them had at least one parent whose primary language was not English.

The Table shows the percentage of students expressing strong agreement (1), agreement (2), disagreement (3) and strong disagreement (4) with statements about the effectiveness of the exam analysis in improving exam performance, study skills/methods, management of study/exam time, exam-taking skills and related attitudes. Means, standard deviations and the number of students responding to items within each of these categories are also shown in the Table.

The average values for the combined means in each category were: improved exam skills (1.56), improved exam performance (1.65), improved study methods/skills (1.87), improved management of study/exam time (1.89), and decreased exam anxiety/increased self-confidence (2.03). Because a rating of 1 indicated strong agreement and 4 indicated strong disagreement, all of these category means support the hypothesis that, in the opinion of students who have participated in the program, the Exam Analysis is effective in improving examination performance.
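
As a small illustration of the arithmetic, assuming hypothetical item means (the article's item-level values appear in the Table), a category mean is the average of the item means within that category, and a value below the scale midpoint of 2.5 indicates overall agreement:

    # Hypothetical item means on the 1 (strong agreement) to 4 (strong disagreement) scale;
    # the values below are invented for illustration and are not taken from the Table.
    improved_exam_skills_items = [1.50, 1.56, 1.62]
    category_mean = sum(improved_exam_skills_items) / len(improved_exam_skills_items)
    print(f"category mean = {category_mean:.2f}")  # below the 2.5 midpoint, i.e., overall agreement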

Figure 2. Exam Analysis Procedure.

Analysis of the data related to specific items reveals the strongest agreement with the following: recommendation of the Exam Analysis to a friend who wished to improve exam performance (1.44), improved exam-taking skills (1.56), use of Exam Analysis techniques on State Board Exams (1.59), improved exam performance following Exam Analysis (1.65) and not spending too long on one question as an exam time management technique (1.65). The least agreement was expressed by students for the following items: improved time management (2.10), decreased exam anxiety (2.06), increased self-confidence (2.0), and increased prediction of exam questions (2.0) following an Exam Analysis. However, it is important to note that the means of all of these items still show agreement that the Exam Analysis was helpful in each of these areas. Time management techniques were not taught as part of the Exam Analysis program but were discussed with students who missed a number of questions due to inadequate knowledge of the subject matter.

Students ranked the exam-taking techniques recommended in the Exam Analysis program from most important (1) to least important (4) as follows:

1. Identifying key words in the questions.

2. Formulating an answer to a multiple choice question before reading the answer options given.

3. Considering each answer option carefully/marking each option as true or false.

4. Using "test-wise" guessing strategies.

The importance of specific Exam Analysis outcomes was also ranked by participants in the study from most important (1) to least important (5):

1. Improved test-taking skills.

2. Improved study skills/methods.

3. Decreased exam anxiety.

4. Increased understanding of important concepts/ English vocabulary.

5. Improved self-confidence.

Responses to two open-ended questions, one asking for recommendations for improving the Exam Analysis program and one asking for comments about the Exam Analysis, were categorized by common themes. Over 80% of the remarks were positive, for example: the Exam Analysis needs no improvement, it should be shared with other students, and it increased academic success. There were no negative comments. Suggestions for improvement included the need for more LAP staff/services and the need to do the Exam Analysis earlier (before the student is actually failing a course).

T-tests used to analyze differences between groups by age, gender and primary language, and a one-way analysis of variance by racial background, showed some statistically significant differences among groups. ESL (English as a second language) students expressed more agreement than primarily English-speaking students with: the importance of using "test-wise" guessing strategies, answering easy questions first, recommending an Exam Analysis to a friend who wished to improve exam performance, and increased self-confidence as an important Exam Analysis outcome. Primarily English-speaking students rated generating your own answer on multiple choice questions as more important than ESL students did. Older students, more than younger students, felt that generating your own answer was an important exam-taking skill. These findings are of particular interest when compared with the results of a study reported by Crocker and Schmitt (1987). These researchers found that generation of an answer before selecting a response on multiple-choice questions led to higher examination performance for low-test-anxiety examinees but not for highly anxious examinees. It may be that ESL students and younger students have more test anxiety than primarily English-speaking students and older, more experienced students do. If so, generating an answer before selecting a multiple choice response may not have been found helpful for these students.

Table. Exam Analysis Outcomes: Students' Opinions Regarding the Effectiveness of the Exam Analysis in Improving Examination Performance, Study Skills/Methods, Exam Skills and Related Attitudes.

Females and older students rated the importance of understanding important concepts and improved English language/vocabulary higher than males and younger students did. African-American/Black students expressed significantly more agreement than did Anglo/White students that participation in the Exam Analysis program increased their use of study groups or tutoring.

Summary and Conclusions

The data indicate that of the 105 students surveyed, a large majority had a positive opinion of the program. The students expressed much more agreement than disagreement with the idea that participation in the Exam Analysis program improved their examination performance. The students in the study also tended to agree that the Exam Analysis improved their study skills, use of recommended exam-taking skills/methods, knowledge of subject matter, self-confidence, understanding of important concepts, and English language/vocabulary, and that it decreased their exam anxiety. These findings agree with the literature review, in which all of these factors were cited as problems interfering with performance on examinations. However, this study was limited in scope, addressing only the students' perceptions that the Exam Analysis was helpful in improving examination performance in these specific ways. It did not investigate differences in examination scores before and after Exam Analyses. Therefore, we do not know whether there was a significant change in examination performance following participation in the Exam Analysis program or whether some of the findings may represent a halo effect.

Improved exam-taking skills, followed by improved study skills, were the outcomes ranked as most important by students who participated in the Exam Analysis program. Exam-taking skills that were seen as most important included identifying key words in the question, generating your own answer, and marking each answer option as either true or false on multiple choice questions. Test-wise guessing strategies were ranked as least important.

There were a few significant differences related to age, gender, primary language, and racial background. Students with ESL backgrounds, older students, and African-American/Black students differed significantly from traditional students on several items. These students appear to especially benefit from participation in the Exam Analysis program in the specific ways identified.

The Exam Analysis program is a practical technique that can be used by nursing faculty, faculty in other disciplines, and learning assistance practitioners in a variety of settings to assist individuals who wish to improve exam scores. Of the 105 students surveyed in this study, 97% agreed that participation in the Exam Analysis program resulted in improved examination performance. Nursing faculty, learning assistance practitioners, and other educators who have used the Exam Analysis have stated that they find it useful in helping students do better on examinations. Empirical research is needed to validate the Exam Analysis as an accepted method for improving exam performance. However, the available evidence indicates that the students and faculty who have participated in the program feel that it has been helpful.

Suggestions for Further Study

The present survey showed that students who have used the Exam Analysis believe that it is effective as a method to improve examination performance. However, this study was limited in scope.

Suggestions for further study include:

1. Repeating the study using a larger population of students from a variety of majors, disciplines and ethnic backgrounds.

2. Repeating the study using a qualitative research approach including open-ended interview questions.

3. Comparing the opinions of students with high GPAs with those having low GPAs.

4. Studying completed Exam Analysis results to determine the frequency of problems related to poor examination performance.

5. Identifying the changes in examination performance and Exam Analysis results following participation in the Exam Analysis program.

References

  • Aquino, N.S. (1979). Factors related to foreign nurse graduates' test taking performance. Nursing Research, 28, 111-114.
  • Astin, A. W. (1982). Minorities in higher education. San Francisco: Jossey-Bass Inc.
  • Bininger, C.J., Mason, E.J., Potts, N.L., & Wilson, D.E. (1990). Preparing for state boards. Nursing '90, 20, 129-136.
  • Brown, S.D., & Nelson, T.L. (1983). Beyond the uniformity myth: A comparison of academically successful and unsuccessful test-anxious college students. Journal of Counseling Psychology, 30(3), 367-374.
  • Brozo, W.G., Schmelzer, R.V., & Spires, H.A. (1984). A study of test-wiseness clues in college and university teacher-made tests with implications for academic assistance centers. (College Reading and Learning Assistance Technical Report 84-01.) East Lansing, MI: National Center for Research on Teacher Learning. (ERIC Document Reproduction Service No. ED 240928)
  • Bruch, M.A., Pearl, L., & Giordano, S. (1986). Differences in the cognitive processes of academically successful and unsuccessful test-anxious students. Journal of Counseling Psychology, 33(2), 217-219.
  • Campbell, A., & Davis, S. (1990). Enrichment for academic success: Helping at-risk students. Nurse Educator, 25(6), 33-37.
  • Chandler, T.A., & Mako, T.J. (1987). Exam performance as a function of exam completion time, state anxiety, and ability. Paper presented at the 95th Annual Convention of the American Psychological Association, New York, NY. (ERIC Document Reproduction Service No. ED 290117)
  • Christ, F.L. (1990). Unpublished statement written at the College Reading and Learning Assistance Conference, Irvine, California.
  • Christ, F.L., & Adams, W.R. (1979). You can learn to learn. Englewood Cliffs: Prentice Hall, Inc.
  • Condon, V.M. (1986, June). Exam analysis procedure. Paper presented at 2nd Annual Conference for College Learning Assistance Professionals, Long Beach, CA. (ERIC Document Reproduction Service No. ED 273196)
  • Condon, V.M. (1987). The exam analysis. Journal of College Reading, 20, 147-154.
  • Condon, V.M. (1994). Factors related to self-reported improvement in examination performance following participation in the Exam Analysis program. Paper presented at the 12th Annual Conference on Research in Nursing Education, National League for Nursing, Orlando, FL.
  • Corlett, D., & Schendel, R. (1987). Basic skills across the disciplines at the university level: Reading, language arts and reference skills. Portland, OR: University of Portland. (ERIC Reproduction Service No. ED 282479)
  • Crocker, L., & Schmitt, A. (1987). Improving multiple-choice test performance for examinees with different levels of test anxiety. Journal of Experimental Education, 55(4), 201-205.
  • Curley, R.G., Estrin, E.T., Rohwer, W.D., & Thomas, J.W. (1987). Relationships between study activities and achievement as a function of grade level and course characteristics. Contemporary Educational Psychology, 12, 324-343.
  • Demery, M. (1988). Academic skills module. Natchitoches, LA: Northwestern State University.
  • Derry, S.J., & Murphy, D.A. (1986). Designing systems that train learning ability. Review of Educational Research, 56(1), 1-39.
  • Dinero, T., & Blixt, S. (1988). Information about tests from Sato's S-P chart. College Teaching, 36(3), 123-128.
  • Dolly, J.P., & Williams, K.S. (1986). Using test-taking strategies to maximize multiple-choice test scores. Educational and Psychological Measurement, 46(3), 619-625.
  • Eddy, M. (1986). Success packet (Video cassette and instruction book). Mountain View, CA: Med-Ed Program Planners, Inc.
  • Foos, P. (1992). Test performance as a function of expected form and difficulty. Journal of Experimental Education, 60(3), 205-211.
  • Foster, P., Zimmerman, G., & Condon, V. (1991). Assessing student outcomes in a nursing learning assistance program. Journal of Nursing Education, 30, 352-359.
  • Frierson, H.T. Jr. (1984a). Enhancing success in a test: Performance oriented meritocracy. Chapel Hill, NC: University of North Carolina. (ERIC Document Reproduction Service No. ED 255140)
  • Frierson, H. (1984b). Impact of an intervention program on minority medical students' National Board Part I performances. Journal of the National Medical Association, 76, 1185-1190.
  • Frierson, H.T. Jr. (1986a). Enhancing minority college students' performance on educational tests. Journal of Negro Education, 55(1), 38-45.
  • Frierson, H.T. Jr. (1986b). Two intervention methods: Effects on groups of predominantly black nursing students' board scores. Journal of Research and Development in Education, 19, 18-23.
  • Frierson, H.T. Jr., & Hoban, D. (1987). Effects of test anxiety on performance on the NBME Part I Examination. Journal of Medical Education, 62, 431-433.
  • Frierson, H.T. Jr., Malone, B., & Shelton, P. (1993). Enhancing NCLEX-RN performance: Assessing a three-pronged intervention approach. Journal of Nursing Education, 32, 222-224.
  • Gadzella, B.M., Ginther, D.W., & Williamson, J.D. (1987). Study skills, learning processes and academic achievement. Psychological Reports, 61(1), 167-172.
  • Gerace, W.J., & Mestre, J.P. (1983). Identifying learning handicaps of college age Spanish-speaking bilingual students majoring in technical subjects. Amherst, MA: Massachusetts University. (Bilingual Research Project ERIC Document Reproduction Service No. ED 241278)
  • Heikinheimo, P.S., & Shute, J.C.M. (1986). The adaptation of foreign students: Students' views and institutional implications. Journal of College Student Personnel, 27(5), 399-406.
  • Helm, P. (1977). Recall vs recognition: A strategy for reading and responding to multiple choice test items. Proceedings of the Tenth Annual Western College Reading Association Conference, 10, 143-147.
  • Helm, P. (1981). Strategies for success on nursing exams (Cassette Recording). San Francisco, CA: Review for Nurses Tapes Inc.
  • Hembree, R. (1988). Correlates, causes, effects and treatment of test anxiety. Review of Educational Research, 58(1), 47-77.
  • Hudesman, J., Loveday, C., & Woods, N. (1984). Desensitization of test anxious urban community-college students and resulting changes in grade point average: A replication. Journal of Clinical Psychology, 40(1), 65-67.
  • Ivie, S.D. (1982). Why black students score poorly on the NTE. High School Journal, 65, 169-175.
  • Keane, M. (1993). Preferred learning styles and study strategies in a linguistically diverse baccalaureate nursing student population. Journal of Nursing Education, 32, 214-221.
  • Kirkland, K., & Hollandsworth, J.G., Jr. (1980). Effective test taking: Skills acquisition versus anxiety reduction techniques. Journal of Consulting and Clinical Psychology, 48, 431-438.
  • Kulik, C.L., Kulik, J.A., & Shwalb, B.J. (1983). College programs for high risk and disadvantaged students: A meta-analysis of findings. Review of Educational Research, 53(3), 397-414.
  • Lagerquist, S. (1986). Practice questions and answers for NCLEX-RN. San Jose: Review for Nurses Tapes Inc.
  • Memmer, M., & Worth, C. (1991). Retention of English-as-a-second-language (ESL) students: Approaches used by California's 21 generic baccalaureate nursing programs. Journal of Nursing Education, 30, 389-396.
  • Millman, J., Bishop, C.H., & Ebel, R. (1965). An analysis of test-wiseness. Educational and Psychological Measurement, 25, 707-726.
  • Mine, H. (1987, July). A study of the effects of child rearing patterns on test anxiety in late adolescence. Paper presented at the 9th Biennial meeting of the International Society for the Study of Behavioral Development, Tokyo, Japan.
  • Nettles, M.T., Thoeny, A.R., & Gosman, E.J. (1986). Comparative and predictive analyses of Black and White students' college achievement and experience. Journal of Higher Education, 57, 289-318.
  • Nist, S., & Simpson, M. (1984). PLAE: A model for planning successful independent learning. Journal of Reading, 28(3), 218-223.
  • Nist, S., & Simpson, M. (1989). PLAE: A validated study strategy. Journal of Reading, 33(3), 182-186.
  • Noel, L., Levitz, R., Saluri, D., & Associates. (1987). Increasing student retention. San Francisco: Jossey-Bass Publishers.
  • Pressley, M., & Ghatala, E. (1988). Delusions about performance on multiple-choice comprehension tests. Reading Research Quarterly, 23(4), 454-464.
  • Pressley, M., Snyder, B., Levin, J., Murray, H., & Ghatala, E. (1987). Perceived readiness for examination performance (PREP) produced by initial readings of text and text containing adjunct questions. Reading Research Quarterly, 22(2), 219-236.
  • Reed, S., & Hudepohl, N. (1985). High-risk students: Evaluating a student retention program. Nurse Educator, 10(5), 32-38.
  • Rollant, P. (1988). Test-taking strategies for the RN-NCLEX exam. Imprint, 35(2), 117-131.
  • Rosberg, W.H. (1983). Students in English as a Second Language classes: A community college experience. Cedar Rapids, IA: Kirkwood Community College. (ERIC Document Reproduction Service No. ED 234843)
  • Royer, J., Marchant, H., Sinatra, G., & Lovejoy, D. (1990). The prediction of college course performance from reading comprehension performance: Evidence for general and specific prediction factors. American Educational Research Journal, 27(1), 158-179.
  • Sarason, I.G. (Ed.). (1980). Test anxiety: Theory, research, and applications. Hillsdale, N.J.: Lawrence Erlbaum Associates, Publishers.
  • Sarnacki, R.E. (1979). An examination of test-wiseness in the cognitive test domain. Review of Educational Research, 49(2), 252-279.
  • Sarnacki, R.E. (1981). The effects of test-wiseness in medical education. Evaluation and the Health Professions, 4, 207-221.
  • Sherman, T.M. (1985). Learning improvement programs: A review of controllable influences. Journal of Higher Education, 56(1).
  • Smith, V. (1990). Nursing student attrition and implications for pre-admission advisement. Journal of Nursing Education, 215-218.
  • Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Chicago: The University of Chicago Press.
  • Tryon, G.S. (1980). The measurement and treatment of test anxiety. Review of Educational Research, 50(2), 343-372.
  • Wood, P.H. (1988). Predicting college grades and helping colleagues to assist poor readers to succeed in college courses. Paper presented at the National Reading and Language Arts Educator's Conference, Kansas City, MO. (ERIC Document Reproduction Service No. ED 309699)

10.3928/0148-4834-19950901-05
