Journal of Nursing Education

Major Article 

The HLKES-2: Revision and Evaluation of the Health Literacy Knowledge and Experiences Survey

Danielle Walker, PhD, RN, CNE; Carol Howe, PhD, RN, CDE; Marion Dunkerley, EdD, RN; Joy Deupree, PhD, MSN, RN, WHNP-BC; Catherine Cormier, PhD, CNE, RN



Background: Low health literacy impacts individual health and the health care system. The Health Literacy Knowledge and Experience Survey (HLKES) was created to evaluate the preparedness of nurses to provide health literate care. However, the instrument was developed a decade ago and needs revision. The purpose of this study was to update and shorten the HLKES into a feasible, valid, and reliable instrument.


Method: The HLKES was refined into a 14-item instrument (10 knowledge questions and four experience questions). Expert review was obtained. Face validity was assessed, and pilot and field testing with students was conducted.


Results: Scale content validity index was 0.95, and individual questions demonstrated appropriate item difficulty and discrimination. Cronbach's alpha coefficient was .565 for the 10 multiple-choice questions and .843 for the four Likert-type questions, indicating acceptable reliability for the brief knowledge scale and good reliability for the experience scale.


Conclusion: A reliable and valid HLKES-2 was developed to evaluate health literacy knowledge and experiences in a contemporary setting. [J Nurs Educ. 2019;58(2):86–92.]




The Patient Protection and Affordable Care Act of 2010 addressed the need for health literacy initiatives to promote patient-centered prevention, assure equity, and improve health outcomes. Increasingly, health literacy is recognized as a necessary asset for health (Rikard, Thompson, McKinney, & Beauchamp, 2016). The World Health Organization (WHO, 1998) defined health literacy as “the cognitive and social skills which determine the motivation and ability of individuals to gain access to, understand and use information in ways which promote and maintain good health” (p. 10). Currently, more than 36% of American adults struggle to understand basic health information (Sentell & Braun, 2012).

Health literacy levels of U.S. adults have a significant impact on health care cost and health outcomes. Vernon, Trujillo, Rosenbaum, and DeBuono (2007) estimated that low literacy levels may cost the health care system as much as $238 billion annually. A cost analysis of low literacy within the Veterans Health Administration indicated a significant relationship between low health literacy and higher medical costs (Haun et al., 2015). Individuals with low health literacy have poorer health outcomes, including increased mortality rates among senior citizens, fewer self-management skills for chronic disease, increased medication errors, and decreased participation in health screenings (Berkman, Sheridan, Donahue, Halpern, & Crotty, 2011; DeWalt et al., 2011; Sudore et al., 2006).

Health care providers should be prepared to effectively communicate with all patients using health literate practices based on providers' previous knowledge, experience, and education. For new graduates, this can be a daunting task. Nevertheless, health literate communication is at the heart of quality patient-centered care. To ensure patient understanding, health care providers are expected to provide clear, concise, and easy-to-understand information to all patient populations using Health Literacy Universal Precautions (U.S. Department of Health and Human Services, 2015). Despite efforts of national and international health care organizations (Centers for Disease Control and Prevention, 2017) to provide health literacy training, knowledge gaps regarding health literacy best practices continue to exist among health professionals (Rajah, Ahmad Hassali, Jou, & Murugiah, 2018).

Because of the need for new graduates and all RNs to understand the complexities of health literacy, a number of studies in recent years have examined the educational preparation for nurses. Data support positive associations between formal education in health literacy and clinician competency (Howe, Walker, & Watts, 2017; Koo, Horowitz, Radice, Wang, & Kleinman, 2016). Multiple studies have found that health literacy concept deficiencies exist in undergraduate nursing curricula (Cornett, 2009; Jukkala, Deupree, & Graham, 2009; Smith & Zsohar, 2011). Cormier and Kotrlik (2009) found significant knowledge gaps of senior baccalaureate nursing students in identifying high-risk groups and assessing guidelines for written health care information. The American Academy of Nursing (Loan et al., 2018) recently issued a policy statement encouraging nurse educators to incorporate health literacy models of care into nursing school curricula to prepare future nurses to practice Health Literacy Universal Precautions with all patients. Rather than screening patients for low health literacy, nurses should assume that all patients may have difficulty understanding health information (DeWalt et al., 2011). In addition, the American Association of Colleges of Nursing (2008) identified core competencies for baccalaureate-prepared nurses, including the use of “evidence-based practices to guide health teaching” and “a commitment to health of vulnerable groups.”

As nursing schools incorporate health literacy into curricula, reliable and valid measures for health literacy competency are needed. The Health Literacy Knowledge and Experience Survey (HLKES), developed in 2006, was designed to assess the health literacy knowledge and experiences of baccalaureate nursing students enrolled in Louisiana State Universities (Cormier, 2006). The instrument included three sections: health literacy knowledge, health literacy experiences, and demographic data. Because health literacy remains a global health issue (WHO, 2013) that directly impacts patient safety (U.S. Department of Health and Human Services, 2015), it is vital for nurses to possess the knowledge and skills required to assist patients with their health care needs. Developing a contemporary knowledge and experience survey that addresses current foundational concepts and statistics of health literacy can help identify knowledge gaps and direct health literacy training. The purpose of this study was to establish the reliability and validity of HLKES-2, the second version of the HLKES.


Method

Five nurse researchers with health literacy expertise from four states convened to adapt and shorten the existing 38-question HLKES, which included 29 multiple-choice knowledge questions and nine Likert-scale experience questions (Cormier, 2006). During the initial meeting, the team reviewed the key concepts of the original instrument for relevancy, including health literacy screening, foundational knowledge of health literacy principles, appropriate health literate communication techniques, evaluation of written materials, and expected behaviors of people with low health literacy. Since the development of the original HLKES, health literacy screening no longer represents best practice and therefore was considered not relevant, justifying the elimination of six health literacy screening questions.

With the intention of minimizing the number of questions, the team used the enduring four original HLKES content areas to guide evaluation of the remaining questions for congruence with current health literacy best practice guidelines and item analysis standards. Questions were removed that repeated the same concept or did not meet minimum item analysis standards during original testing of the HLKES. Modifications were made to some questions to potentially improve item difficulty or discrimination. Finally, the team revised questions to emphasize application or synthesis of knowledge within the context of the nursing process. The knowledge section of the HLKES was decreased to 14 questions as an outcome of this review process. The four original concepts are represented in the 14 knowledge questions.

The experience section of the HLKES originally contained nine 4-point Likert-type scale questions. Questions asked about the frequency of health literacy-focused practices, such as “How often do you evaluate the cultural appropriateness of health care materials?” Each question was assessed for relevance, emphasizing key nursing actions related to health literate communication. Four questions from the original experience section were retained, and five questions were eliminated due to irrelevance to current practice. After an initial draft of the HLKES-2 was complete, institutional review board exempt status was obtained, a content validity index (CVI) was performed, cognitive interviews were conducted with a small group of students, and a pilot and field study were conducted.

Content Validity

To establish content validity, eight health professionals with expertise in health literacy were asked to review the HLKES-2. The process was identical to that used in the original study by Cormier and Kotrlik (2009). Six of the eight experts agreed to participate; four provided a complete review, rating all 18 of the HLKES-2 questions. Comments provided by the two other experts were considered during final revisions of the HLKES-2 but were not used to calculate the CVI. The four reviewers with completed ratings were doctorally prepared nurses with research experience and publications related to health literacy. Each rated survey questions using a 4-point Likert scale, with 1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, and 4 = highly relevant. Scores were dichotomized into relevant (3, 4) and not relevant (1, 2) to calculate both the question-level and overall scale content validity. The CVI for each question was calculated as the proportion of experts rating the question as relevant (Rubio, Berg-Weger, Tebb, Lee, & Rauch, 2003). The question CVI ranged from 0.5 to 1.0, including one question with a CVI of 0.5, two questions with a CVI of 0.75, and 17 questions with a CVI of 1.0. Scale CVI was calculated by averaging the item-level CVIs (summing all item-level CVIs and dividing by the number of questions). Scale CVI was calculated as 0.95, which was above the 0.80 considered acceptable (Polit & Beck, 2006).
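The CVI arithmetic described above can be sketched as follows; the expert ratings here are hypothetical, not the study's data. A rating of 3 or 4 counts as "relevant" (Rubio et al., 2003).

```python
# A minimal sketch of item-level and scale-level CVI; the expert
# ratings are hypothetical, not the study's data.
def item_cvi(ratings):
    """Proportion of experts rating the item 3 (quite) or 4 (highly) relevant."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(all_ratings):
    """Scale CVI: the average of the item-level CVIs."""
    item_cvis = [item_cvi(r) for r in all_ratings]
    return sum(item_cvis) / len(item_cvis)

# Four hypothetical experts rating three questions on the 1-4 scale:
ratings = [
    [4, 4, 3, 4],  # item CVI = 1.0
    [3, 4, 2, 4],  # item CVI = 0.75
    [4, 3, 4, 4],  # item CVI = 1.0
]
print(round(scale_cvi(ratings), 2))  # 0.92
```

With four raters, each item-level CVI can only take the values 0, .25, .5, .75, or 1.0, which is why the question CVIs reported above fall on those steps.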

Face Validity

Cognitive interviews were conducted to establish face validity of the instrument. Three groups of students in their last semester of an associate/baccalaureate nursing degree program participated in cognitive interviews conducted at three nursing schools. The research team engaged students in a think-aloud process to identify any misinterpretation (Desimone & Le Floch, 2004). Each HLKES-2 question was read aloud by the researcher as students read along with a hard copy. The researcher then asked students, “What were you thinking as I read the question?” The researcher used more specific probes such as: “Tell me how you were thinking as I read the question aloud. Can you repeat the question in your own words?” “Were the multiple choice responses clear?” “What were you thinking as I went through each answer choice?” Unscripted probes were used at the discretion of the research team to allow for exploration of unexpected discussions. The research team also observed students as they responded to questions to ensure no problems existed with the survey format and structure.

Tape recordings of the focus groups confirmed written notes taken during the cognitive interviews, facilitating the use of this information to revise HLKES-2 question stems, answers, and distractors. Question stems were shortened because focus group participants reported that fellow students may skip lengthy questions. In addition, students identified jargon within the questions. For example, students were confused by the term “target population” in the answer choice worded “obtain feedback from the target population” when asked about best ways to create culturally and linguistically appropriate educational materials. The students perceived this as jargon, and in discussion, students proposed that using the term “cultural group” was clearer.

Pilot Study

The research team conducted a pilot study of the 18-question HLKES-2 (14 knowledge questions and four experience questions) with 117 students enrolled in three nursing programs from three states. Using convenience sampling, students were notified of the HLKES-2 pilot survey via e-mail and a class announcement. The research team assured students that completion of the HLKES-2 was voluntary and had no bearing on their course grade, and that participation was anonymous and confidential. In addition, information regarding the purpose of the study and its procedures was explained. Participation in the survey implied consent. Students who wished to participate provided demographic data such as age, gender, ethnicity, level in the nursing program, and how frequently health literacy was emphasized in their program. Students completed the HLKES-2 online, using either their laptop computer or cell phone, typically at the beginning of a class.

Data were entered into IBM® SPSS® version 23 to conduct test item analysis. Initial question difficulty and question discrimination of the knowledge portion were at less than acceptable levels for select questions, suggesting the need for further revisions of the survey questions before field testing. The mean score on the 14-question knowledge portion of the HLKES-2 was 8.96 (SD = 1.94). Initial internal consistency of the HLKES-2 knowledge scale was low (Cronbach's α = .289). These results led to additional revisions to improve the question stems and distractors. At the same time, four questions were added to the knowledge section to provide a more robust assessment of health literacy knowledge. The mean score on the four-question experience portion was 5.6 (SD = 2.41). Because the four questions in the HLKES-2 experience scale had acceptable reliability (Cronbach's α = .743), no changes were made to this section.
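The internal-consistency statistic reported above can be sketched as follows; the response matrix is hypothetical (rows are students, columns are items scored 1 = correct, 0 = incorrect), not the study's data.

```python
# A minimal sketch of the Cronbach's alpha computation used in the
# item analysis; the response matrix is hypothetical.
from statistics import pvariance

def cronbach_alpha(rows):
    k = len(rows[0])                        # number of items
    columns = list(zip(*rows))              # item-wise score columns
    item_var = sum(pvariance(col) for col in columns)
    total_var = pvariance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_var / total_var)

responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(responses), 3))  # 0.533
```

Alpha rises as items covary and as the number of items grows, which is one reason a short 10- or 14-item knowledge scale tends toward the modest values reported here.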

Field Testing

Following the same procedures, the revised 22-question version of the HLKES-2 (18 knowledge and four experience questions) was administered to 345 HLKES-2-naïve students either at the end of their first semester of nursing coursework or in their last year of nursing courses. Data were entered into IBM® SPSS® version 23 to describe student demographics and calculate item analysis and scale psychometrics.


Results

A convenience sample of 345 nursing students enrolled in a baccalaureate nursing program, either in their first semester (n = 95) or last year of nursing school (n = 250), completed the HLKES-2. The majority (90%) of students were women. Sixty-nine percent of the students were between the ages of 18 and 22 years, and 23% were between the ages of 23 and 27 years. The majority of the participants were White (69%), followed by Hispanic (15.6%), Asian (7%), and Black (3.2%); the remaining 5.2% of students did not indicate race/ethnicity. The majority of students were from Texas (n = 186, 53.9%); the remaining students were from California (n = 114, 33%), Louisiana (n = 14, 4.1%), and Alabama (n = 29, 8.4%). Although a small portion of students (n = 16, 4.6%) reported that health literacy was never emphasized in their curriculum, the majority of students (n = 278, 81%) reported health literacy was emphasized sometimes or frequently.

HLKES-2 Knowledge Scale

Evaluation of the HLKES-2 knowledge questions was conducted through statistical analysis and not by concept area. The research team evaluated the item difficulty and item discrimination of all 18 HLKES-2 knowledge questions. Item difficulty, defined as the proportion of students who answered the question correctly, was considered ideal if above 0.3; elimination was considered for any question with an item difficulty between 0.2 and 0.3. Item discrimination, another indicator of question quality and defined as the ability of a question to differentiate between high and low performers on the test, was calculated by a point-biserial correlation. An item discrimination between 0.4 and 0.7 indicated a very good question (McDonald, 2008). Questions with an item difficulty below 0.3 or an item discrimination below 0.2 were eliminated, resulting in the removal of five questions from the instrument.
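The two item statistics described above can be sketched with hypothetical data; the `difficulty` and `point_biserial` helpers are illustrative, not the study's code.

```python
# A minimal sketch of item difficulty and point-biserial item
# discrimination; the response matrix is hypothetical
# (rows = students, columns = items, 1 = correct).
from statistics import mean, pstdev

def difficulty(item_scores):
    """Item difficulty: proportion of students answering correctly."""
    return mean(item_scores)

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between a 0/1 item and total score."""
    correct = [t for i, t in zip(item_scores, total_scores) if i == 1]
    incorrect = [t for i, t in zip(item_scores, total_scores) if i == 0]
    p = len(correct) / len(item_scores)
    sd = pstdev(total_scores)
    return (mean(correct) - mean(incorrect)) / sd * (p * (1 - p)) ** 0.5

responses = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 1],
]
totals = [sum(row) for row in responses]
item1 = [row[0] for row in responses]
print(difficulty(item1))                        # proportion correct
print(round(point_biserial(item1, totals), 3))
# Per the criteria above, a question would be flagged for elimination
# if difficulty < 0.3 or discrimination < 0.2.
```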

The remaining 13 questions were evaluated using correlations and the Cronbach's alpha if the questions were deleted. Three questions were eliminated because the corrected item-total correlation for these questions was below 0.12, and the reliability of the instrument improved when these questions were deleted.
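The corrected item-total correlation used here correlates each item with the total score excluding that item, so the item does not inflate its own correlation. A minimal sketch with hypothetical responses:

```python
# A sketch of the corrected item-total correlation; the response
# matrix is hypothetical (rows = students, columns = 0/1 items).
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def corrected_item_total(rows, idx):
    item = [row[idx] for row in rows]
    rest = [sum(row) - row[idx] for row in rows]  # total minus this item
    return pearson(item, rest)

responses = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 1],
]
print(round(corrected_item_total(responses, 0), 3))
```

An item whose corrected correlation falls near zero (below 0.12 in this study) adds noise rather than signal, which is why deleting such items raises the scale's alpha.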

After statistical analysis was conducted and 10 items remained, only three of the four concept areas were represented: foundational knowledge of health literacy principles, appropriate health literate communication techniques, and expected behaviors of people with low health literacy. Questions about written materials did not correlate well with other questions on the knowledge scale and did not perform at the desired item difficulty or discrimination levels, eliminating that concept area from the knowledge portion of the instrument. The reliability of the remaining 10-item knowledge scale (Table 1) was Cronbach's α = .57. The mean total score on the HLKES-2 knowledge scale was 6.8 (SD = 1.89), indicating a minimally proficient knowledge of health literacy. Differences between groups based on demographic characteristics were not assessed.


Table 1:

HLKES-2: Knowledge Scale

To assess validity, the research team evaluated whether the HLKES-2 could distinguish between nursing student groups that it theoretically should be able to distinguish between. More specifically, an independent-samples t test was performed to compare HLKES-2 scores between students with more knowledge (senior nursing students, n = 250) and students with less knowledge (first-semester students, n = 95). Senior students had a significantly higher average HLKES-2 score compared with first-semester nursing students (7.2 [SD = 1.84] versus 5.9 [SD = 1.89], t(343) = 5.926, p < .001).
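The test can be recomputed from the rounded summary statistics above using the pooled-variance formula; the result lands close to the reported 5.926, with the small gap explained by rounding of the published means and standard deviations.

```python
# Recomputation of the independent-samples t test from the rounded
# summary statistics reported above (pooled-variance formula).
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Pooled-variance t statistic and its degrees of freedom."""
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, n1 + n2 - 2

# Seniors: M = 7.2, SD = 1.84, n = 250; first semester: M = 5.9, SD = 1.89, n = 95
t, df = pooled_t(7.2, 1.84, 250, 5.9, 1.89, 95)
print(round(t, 2), df)  # ~5.82 on 343 df; rounding of the means accounts
                        # for the difference from the reported 5.926
```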

HLKES-2 Experience Scale

Student responses for each question ranged from 0 (never) to 3 (always); possible scale scores ranged from 0 to 12. The mean total score was 5.31 (SD = 2.94). A series of Mann-Whitney U tests were conducted to assess differences in individual HLKES-2 experience questions. Students in their first semester of nursing school reported significantly less experience using written material when providing health care information compared with senior students (M = 1.27 [SD = .972] versus M = 1.58 [SD = .784], U = 9513.5, p = .002). The four-question experience scale (Table 2) demonstrated good internal reliability (Cronbach's α = .843).
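The U statistic used above can be sketched as follows; the two samples are hypothetical 0-to-3 Likert responses, not the study's data.

```python
# A minimal sketch of the Mann-Whitney U statistic (pairwise-comparison
# form); the two samples are hypothetical Likert ratings.
def mann_whitney_u(a, b):
    """U for sample a: count of (a_i, b_j) pairs a wins; ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

first_semester = [0, 1, 1, 2]  # hypothetical lower ratings
seniors = [1, 2, 2, 3]         # hypothetical higher ratings
print(mann_whitney_u(first_semester, seniors))
```

A U far below the pairwise maximum (n1 × n2) indicates the first group's ratings tend to be lower, matching the pattern reported for first-semester students.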


Table 2:

HLKES-2: Experience Scale

Discussion and Recommendations

Since the HLKES was first published in 2009, numerous researchers around the world have used or adapted the instrument to target specific health care professions, including advanced practice nurses, respiratory therapists, and occupational therapists (Cafiero, 2013; Dunkerley, 2016; Hartman, 2014; Knight, 2011; Mullan et al., 2017). At the onset of this project, the research team was keenly aware of the importance of modifying the HLKES to maintain its relevance in the changing field of health literacy. Thus, the purpose of this study was to revise the HLKES to ensure congruence with best practices in health literacy and to establish initial reliability and validity statistics of the revised instrument, the HLKES-2. The research team also recognized the importance of developing a shorter version of the HLKES to capture health literacy knowledge and experiences, improving its feasibility and usability in busy nursing education programs.

The HLKES-2 retained three of the five original concepts (foundational knowledge of health literacy principles, appropriate health literate communication techniques, and expected behaviors of people with low health literacy). Health literacy screening was removed as it no longer represents best practice. Rather than screening individual patients, health literacy experts now advocate that health care providers should use Health Literacy Universal Precautions, assuming that all patients may have difficulty comprehending health care information (Brega et al., 2015). The concept of evaluation of written materials was removed after statistical analysis demonstrated it did not perform well on item analysis and did not correlate with other questions on the knowledge portion of the scale. After much discussion, the research team decided removal of this concept was acceptable. The questions from the original and revised instrument focused on the effective creation of health literate educational materials. The knowledge required for evaluation and development of written materials is a subspecialty of health literacy knowledge and is not the role of the bedside RN. Evaluation of written materials remains in the experience scale of the HLKES-2, as this is an important function of the RN.

The results of this study demonstrate that the revised, shortened HLKES-2 is a valid instrument for assessment of health literacy knowledge and experiences among nursing students. Initial face and content validity suggested that the HLKES-2 adequately measured the construct of health literacy. In addition, initial known-groups validity was supported, as scores on the HLKES-2 demonstrated the ability to discriminate between first-semester and last-year nursing students in an expected theoretical fashion. Question analysis statistics provided additional confirmation regarding the quality of HLKES-2 questions and the instrument as a whole.

The research team prioritized developing a practical tool to measure the health literacy knowledge and experiences of nursing students to increase the ease of administration and decrease the time required for participants to complete the survey. The team shortened the 38-question HLKES to the 14-question HLKES-2, improving the usability of the instrument. Cafiero (2013) reported a Cronbach's alpha of .57 for the HLKES knowledge scale. The HLKES-2 knowledge scale Cronbach's alpha was .56 after removal of 19 questions and revision. Although the HLKES-2 Cronbach's alpha value is less than the commonly accepted value of .7, it may yet be an acceptable value considering the limited number of items and the multiple-choice format. Reliability above .5 generally is considered appropriate for a 50-item multiple-choice examination. The HLKES-2 experience scale demonstrated improved reliability, from Cafiero's (2013) report of moderate reliability (α = .69) to high reliability (α = .843).

Significant limitations of this study include a small sample size (n = 345), convenience sampling, and homogeneity. These characteristics limit generalizability to all nursing programs. Broader testing of the HLKES-2 is necessary to determine reliability and validity among other groups. Future research must include factor analysis to explore and confirm dimensionality of the instrument.

Although the purpose of this study was the design and evaluation of a health literacy assessment tool, the results provide insight into current nursing curricula. Although senior students participating in the study demonstrated basic proficiency in health literacy knowledge, self-reported experiences were surprisingly low. Participants reported minimal health literacy experiences, with most students replying “sometimes.” The only significant difference in experiences between novice and more experienced nursing students occurred in one category, using written material to provide education. These findings suggest nurse educators need to develop intentional learning experiences designed to equip entry-level nurses with health literacy best practice required for the delivery of safe, quality patient-centered care.


Conclusion

Efforts of nursing faculty to promote student knowledge and experience with health literacy support the position statement of the American Association of Colleges of Nursing (2008) outlined in The Essentials of Baccalaureate Education. Nurse educators struggle with many competing demands on time and effort to cover topics in the nursing curricula that are critical for preparing students for entry into practice. Ideally, health literacy is woven into all aspects of nursing curricula. The HLKES-2 provides a brief assessment of nursing students' health literacy knowledge and experience acquisition throughout a nursing program, assessing both student achievements and program outcomes.


References

  • American Association of Colleges of Nursing. (2008). The essentials of baccalaureate education for professional nursing practice. Washington, DC: Author.
  • Berkman, N.D., Sheridan, S.L., Donahue, K.E., Halpern, D.J. & Crotty, K. (2011). Low health literacy and health outcomes: An updated systematic review. Annals of Internal Medicine, 155, 97–107. doi:10.7326/0003-4819-155-2-201107190-00005 [CrossRef]
  • Brega, A.G., Freedman, M.A., LeBlanc, W.G., Barnard, J., Mabachi, N.M., Cifuentes, M. & West, D.R. (2015). Using the health literacy universal precautions toolkit to improve the quality of patient materials. Journal of Health Communication, 20(Suppl. 2), 69–76. doi:10.1080/10810730.2015.1081997 [CrossRef]
  • Cafiero, M. (2013). Nurse practitioners' knowledge, experience, and intention to use health literacy strategies in clinical practice. Journal of Health Communication, 18, 70–81. doi:10.1080/10810730.2013.825665 [CrossRef]
  • Centers for Disease Control and Prevention. (2017). Health literacy. Retrieved from
  • Cormier, C.M. (2006). Health literacy: The knowledge and experiences of senior level baccalaureate nursing students (Doctoral dissertation, Louisiana State University). Retrieved from
  • Cormier, C.M. & Kotrlik, J.W. (2009). Health literacy knowledge and experiences of senior baccalaureate nursing students. Journal of Nursing Education, 48, 237–248. doi:10.3928/01484834-20090416-02 [CrossRef]
  • Cornett, S. (2009). Assessing and addressing health literacy. Online Journal of Issues in Nursing, 14, 1–13.
  • Desimone, L.M. & Le Floch, K.C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26, 1–22. doi:10.3102/01623737026001001 [CrossRef]
  • DeWalt, D.A., Broucksou, K.A., Hawk, V., Brach, C., Hink, A., Rudd, R. & Callahan, L. (2011). Developing and testing the health literacy universal precautions toolkit. Nursing Outlook, 59, 85–94. doi:10.1016/j.outlook.2010.12.002 [CrossRef]
  • DeWalt, D.A., Callahan, L.F., Hawk, V.H., Broucksou, K.A., Hink, A., Rudd, R. & Brach, C. (2010). Health literacy universal precautions toolkit. Rockville, MD: Agency for Healthcare Research and Quality.
  • Dunkerley, M.C.A. (2016). Health literacy: The knowledge, attitudes, and behavior of beginning baccalaureate nursing students of the use of health literacy strategies to improve patient understanding (Doctoral dissertation). Available from ProQuest Dissertations and Theses Global database. (Order No. 10137832)
  • Hartman, E. (2014). Nurses lack skills to teach: Increasing undergraduate nursing skills related to patient education (Doctoral dissertation). Retrieved from
  • Haun, J.N., Patel, N.R., French, D.D., Campbell, R.R., Bradham, D.D. & Lapcevic, W.A. (2015). Association between health literacy and medical care costs in an integrated healthcare system: A regional population based study. BMC Health Services Research, 15, 249. doi:10.1186/s12913-015-0887-z [CrossRef]
  • Howe, C.J., Walker, D. & Watts, J. (2017). Use of recommended communication techniques by diabetes educators. HLRP: Health Literacy Research and Practice, 1, e145–e152.
  • Jukkala, A., Deupree, J.P. & Graham, S. (2009). Knowledge of limited health literacy at an academic health center. The Journal of Continuing Education in Nursing, 40, 298–302. doi:10.3928/00220124-20090623-01 [CrossRef]
  • Knight, G.D. (2011). An evaluation of the health literacy knowledge and experience of registered nurses in Georgia (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (Order No. 3464456)
  • Koo, L.W., Horowitz, A.M., Radice, S.D., Wang, M.Q. & Kleinman, D.V. (2016). Nurse practitioners' use of communication techniques: Results of a Maryland oral health literacy survey. PLoS ONE, 11, e0146545. doi:10.1371/journal.pone.0146545 [CrossRef]
  • Loan, L.A., Parnell, T.A., Stichler, J.F., Boyle, D.K., Allen, P., VanFosson, C.A. & Barton, A.J. (2018). Call for action: Nurses must play a critical role to enhance health literacy. Nursing Outlook, 66, 97–100. doi:10.1016/j.outlook.2017.11.003 [CrossRef]
  • McDonald, M.E. (2008). Developing trustworthy classroom tests. In Penn, B. (Ed.), Mastering the teaching role: A guide for nurse educators (pp. 275–286). Philadelphia, PA: F.A. Davis.
  • Mullan, J., Burns, P., Weston, K., McLennan, P., Rich, W., Crowther, S. & Osborne, R. (2017). Health literacy amongst health professional university students: A study using the Health Literacy Questionnaire. Education Sciences, 7, 54. doi:10.3390/educsci7020054 [CrossRef]
  • Patient Protection and Affordable Care Act, 42 U.S.C.A. § 18001 et seq. (2010).
  • Polit, D.F. & Beck, C.T. (2006). The content validity index: Are you sure you know what's being reported? Critique and recommendations. Research in Nursing & Health, 29, 489–497. doi:10.1002/nur.20147 [CrossRef]
  • Rajah, R., Ahmad Hassali, M.A., Jou, L.C. & Murugiah, M.K. (2018). The perspective of healthcare providers and patients on health literacy: A systematic review of the quantitative and qualitative studies. Perspectives in Public Health, 138, 122–132. doi:10.1177/1757913917733775 [CrossRef]
  • Rikard, R.V., Thompson, M.S., McKinney, J. & Beauchamp, A. (2016). Examining health literacy disparities in the United States: A third look at the National Assessment of Adult Literacy (NAAL). BMC Public Health, 16, 975. doi:10.1186/s12889-016-3621-9 [CrossRef]
  • Rubio, D.M., Berg-Weger, M., Tebb, S.S., Lee, E.S. & Rauch, S. (2003). Objectifying content validity: Conducting a content validity study. Social Work Research, 27, 94–104. doi:10.1093/swr/27.2.94 [CrossRef]
  • Sentell, T. & Braun, K.L. (2012). Low health literacy, limited English proficiency, and health status in Asians, Latinos, and other racial/ethnic groups in California. Journal of Health Communication, 17, 82–99. doi:10.1080/10810730.2012.712621 [CrossRef]
  • Smith, J.A. & Zsohar, H. (2011). Teaching health literacy in the undergraduate curriculum: Beyond traditional methods. Nursing Education Perspectives, 32, 48–50. doi:10.5480/1536-5026-32.1.48 [CrossRef]
  • Sudore, R.L., Mehta, K.M., Simonsick, E.M., Harris, T.B., Newman, A.B., Satterfield, S. & Ayonayon, H.N. (2006). Limited literacy in older people and disparities in health and healthcare access. Journal of the American Geriatrics Society, 54, 770–776. doi:10.1111/j.1532-5415.2006.00691.x [CrossRef]
  • U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. (2015). AHRQ health literacy universal precautions toolkit (2nd ed.). Retrieved from
  • Vernon, J.A., Trujillo, A., Rosenbaum, S.J. & DeBuono, B. (2007). Low health literacy: Implications for national health policy. Washington, DC: Department of Health Policy, School of Public Health and Health Services, The George Washington University. Retrieved from
  • World Health Organization. (1998). Health promotion glossary. Retrieved from
  • World Health Organization. (2013). Health literacy: The solid facts. Retrieved from

HLKES-2: Knowledge Scale

Question Stem and Answer Choices (Question Difficulty; Question Discrimination)

Low health literacy is most prevalent among which age group? (.37; .136)
  15 to 30 years
  31 to 44 years
  45 to 60 years
  65 to 85 years
What should a nurse consider when conducting health teaching with a patient? (.59; .301)
  The last grade completed in school accurately reflects a patient's reading ability
  Most patients read three to five grade levels lower than the last year of school completed
  Most patients with low literacy will ask questions if they do not understand information
  Literacy levels of high school graduates are adequate to manage health care needs
What is the likelihood that a nurse will encounter a patient with low health literacy? (.65; .284)
  1 in 3 patients
  1 in 6 patients
  1 in 9 patients
  1 in 12 patients
Which health behavior is common among patients with low health literacy? (.69; .266)
  Lack of participation in preventative health care
  Disinterest in learning about health care problems
  Unwillingness to make lifestyle changes to improve health
  Frequently asking questions to clarify health care instructions
Patients with low health literacy skills compared to those with adequate health literacy skills: (.77; .234)
  Regularly participate in preventative health care
  Are less likely to use emergency room services
  Consistently see the same health care provider for medical treatment
  Are hospitalized more frequently for management of chronic illness
What should the nurse consider when developing a plan of care for a client with low health literacy? (.93; .376)
  These patients often seek health care prematurely
  It is relatively easy to identify patients with low literacy
  Patients with low literacy may avoid asking questions
  Patients with low literacy will readily admit difficulty reading
What is the priority action of the nurse when conducting health teaching? (.95; .185)
  Speak slowly
  Draw pictures
  Provide a handout
  Use simple language
What is the best method for the nurse to evaluate the effectiveness of health care teaching? (.85; .254)
  Administer a pretest and posttest with instructions
  Have the patient teach back the information to the nurse
  Ask “Do you understand the information I just gave you?”
  Verbally ask the patient a series of questions following instructions
The nurse is caring for a patient newly diagnosed with a health condition. What should be the priority focus during the first teaching session? (.58; .292)
  A detailed explanation of the disease pathophysiology
  All treatment options available to manage the health condition
  Information related to the incidence and prevalence of the health condition
  One main message and a specific action for management of the health condition
Which of the following questions would provide the nurse with the best estimate of reading skills of the patient? (.47; .259)
  “Do you have difficulty reading?”
  “Do you need eye glasses to read?”
  “What is the last grade you completed in school?”
  “Would you read the label of this medication bottle for me?”
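The difficulty and discrimination values reported above are standard classical test theory item statistics: difficulty is the proportion of examinees answering an item correctly, and discrimination is typically a corrected item-total correlation. A minimal Python sketch of how such indices are computed, using hypothetical response data (not the study's):

```python
from statistics import mean, pstdev

# Hypothetical item-response matrix (1 = correct, 0 = incorrect) for six
# examinees on four items; illustrative only, not the HLKES-2 data.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def item_difficulty(resp, j):
    """Proportion of examinees answering item j correctly."""
    return mean(row[j] for row in resp)

def item_discrimination(resp, j):
    """Corrected point-biserial: correlation between item j and the
    total score on the remaining items (item j excluded to avoid
    inflating the correlation with itself)."""
    item = [row[j] for row in resp]
    rest = [sum(row) - row[j] for row in resp]
    mi, mr = mean(item), mean(rest)
    cov = mean((x - mi) * (y - mr) for x, y in zip(item, rest))
    return cov / (pstdev(item) * pstdev(rest))

for j in range(4):
    print(f"Item {j + 1}: difficulty = {item_difficulty(responses, j):.2f}, "
          f"discrimination = {item_discrimination(responses, j):.3f}")
```

With real data, items in the .30–.90 difficulty range with positive discrimination (as in the table above) are generally retained.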

HLKES-2: Experience Scale

Question Stem: Senior Students (n = 250), M (SD); First-Semester Students (n = 95), M (SD); U; p

How often do you evaluate the reading level of written health care materials before using them for patient teaching?
  Seniors: M = 1.13, SD = 0.823; First-semester: M = 1.02, SD = 0.989; U = 10680, p = .116
How often do you evaluate the cultural appropriateness of health care materials?
  Seniors: M = 1.33, SD = 0.903; First-semester: M = 1.32, SD = 0.992; U = 11648, p = .771
How often do you evaluate the use of illustrations in written health care materials before using them for patient teaching?
  Seniors: M = 1.39, SD = 0.868; First-semester: M = 1.40, SD = 1.036; U = 11866, p = .991
How often do you use written materials to provide health care information to a patient or community group?
  Seniors: M = 1.58, SD = 0.784; First-semester: M = 1.27, SD = 0.972; U = 9513.5, p = .002
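The group comparisons above rest on the Mann-Whitney U statistic, which ranks the pooled responses and sums the ranks for one group. A minimal Python sketch with hypothetical Likert-type responses (not the study's data), using midranks for tied values:

```python
from itertools import chain

# Hypothetical 0-3 Likert responses from two student groups;
# illustrative values only, not the HLKES-2 samples.
seniors = [2, 1, 2, 3, 1, 2, 1, 0, 2, 1]
first_semester = [1, 0, 1, 1, 2, 0, 1, 1]

def mann_whitney_u(a, b):
    """Mann-Whitney U for group a vs. group b, assigning the average
    (mid) rank to tied values in the pooled sample."""
    combined = sorted(chain(a, b))
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        # Tied values at 0-based positions i..j-1 share ranks i+1..j,
        # so each gets the average rank (i + 1 + j) / 2.
        ranks[combined[i]] = (i + 1 + j) / 2
        i = j
    rank_sum_a = sum(ranks[x] for x in a)
    return rank_sum_a - len(a) * (len(a) + 1) / 2

u = mann_whitney_u(seniors, first_semester)
print(u)
```

In practice the p values reported in the table would come from a statistical package (e.g., SciPy's `scipy.stats.mannwhitneyu`), which adds the tie-corrected significance test on top of this statistic.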

Dr. Walker is Assistant Professor and Dr. Howe is Assistant Professor, Texas Christian University, Fort Worth, Texas; Dr. Dunkerley is Assistant Professor, California Baptist University, Riverside, California; Dr. Deupree is Assistant Professor, University of Alabama Birmingham, Birmingham, Alabama; and Dr. Cormier is Associate Professor, Louisiana State University Alexandria, Alexandria, Louisiana.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Danielle Walker, PhD, RN, CNE, Assistant Professor, Texas Christian University, TCU Box 298620, Fort Worth, TX 76129; e-mail:

Received: July 27, 2018
Accepted: October 11, 2018

