Journal of Nursing Education

Major Article 

Validation of the Self-Assessment of Nursing Informatics Competencies Scale Among Undergraduate and Graduate Nursing Students

Jeungok Choi, PhD, RN; Suzanne Bakken, DNSc, RN

Abstract

This study investigated the psychometrics of the Self-Assessment of Nursing Informatics Competencies Scale for nursing students in undergraduate (n = 131) and graduate (n = 171) programs. The scale had a valid five-factor structure, accounting for 69.38% of the variance, high internal consistency reliabilities (0.96 for the total scale and 0.84 to 0.94 for subscales), and good responsiveness (standardized response mean = 0.99), as well as significantly improved scores in nursing students with diverse demographic and educational backgrounds after taking an informatics course. Our factor structure was similar to that of the original scale, differing slightly in four items’ loadings. This difference may reflect current informatics practice or the greater diversity of our sample. Further research is needed on the data/information management skills factor and its item loadings. This scale could be used to assess informatics competencies and develop educational strategies that prepare nursing students as informatics-competent graduates in information technology–rich environments. [J Nurs Educ. 2013;52(5):275–282.]

Dr. Choi is Assistant Professor, School of Nursing, University of Massachusetts, Amherst, Massachusetts; and Dr. Bakken is Alumni Professor of Nursing, Professor of Biomedical Informatics, and Director, Center for Evidence-Based Practice in the Underserved, Columbia University, New York, New York.

This project was supported by Health Resources and Services Administration grant D11 HP07346.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Jeungok Choi, PhD, RN, Assistant Professor, School of Nursing, University of Massachusetts, 120 Skinner Hall, 651 North Pleasant Street, Amherst, MA 01003-9304; e-mail: jeungokc@nursing.umass.edu.

Received: March 19, 2012
Accepted: December 05, 2012
Posted Online: April 12, 2013

The call for change in nursing education has been resonating since 1999 when the Institute of Medicine (IOM) focused attention on health care delivery, highlighting recommendations for dramatic restructuring of all health professionals’ education (Gebbie, Rosenstock, Hernandez, & Committee on Educating Public Health Professionals for the 21st Century, 2003; Greiner, Knebel, & Committee on the Health Professions Education Summit, 2003; Kohn, Corrigan, & Donaldson, 2000). Key among these recommendations is that health professionals should be educated to deliver patient-centered care as members of an interdisciplinary team, with emphasis on the application of best evidence into practice through quality improvement efforts based in a solid foundation of informatics knowledge and clinical application. More recently, the IOM and Robert Wood Johnson Foundation (2011) recommended that competency-based teaching should be the hallmark of professional education, again highlighting technology as an essential component of nursing education for optimal application to practice.

To ensure that all nurse graduates are competent in the fundamentals of informatics, particularly in an era of expanding electronic health care delivery, assessment of nursing informatics competency must be incorporated into the nursing curriculum. This assessment is needed so that informatics courses and courses that thread key requisites of informatics throughout the curricula can be refined as needed. As a first step, nursing students’ current level of informatics competencies must be accurately assessed (Jenkins, Wilson, & Ozbolt, 2007).

However, standardized tools for evaluating informatics competencies of nursing students were lacking in the mid-2000s (Hart, 2008). To respond to the increasing need for instruments to produce reliable and valid scores, the Self-Assessment of Nursing Informatics Competencies Scale (SANICS) was developed to assess informatics competencies in nursing students and practicing nurses (Yoon, Yen, & Bakken, 2009). The 30-item SANICS was based on published and locally developed competency statements (Staggers, Gassert, & Curran, 2001, 2002) to ensure that graduates were prepared to use information technologies to promote safe and evidence-based nursing care.

The SANICS is a standardized instrument with well-established factorial validity, internal consistency reliability, and responsiveness, but it was developed and tested in a sample of baccalaureate nursing students who were young and predominantly White, non-Hispanic (61.8%), and who had a high level of computer knowledge and skills (Yoon et al., 2009). Because an instrument’s reliability and validity scores depend on the characteristics of the sample, the psychometric properties of the SANICS must be assessed in nursing students with diverse educational and demographic backgrounds. In addition, considering the rapidly evolving nature of informatics competencies (Chang, Poynton, Gassert, & Staggers, 2011), it is essential to validate that the SANICS, which was developed in 2006, reflects current and pertinent nursing informatics practice.

Background

Definition of Competency Assessment

Evaluating nursing students’ informatics competencies has been of interest since the mid-1980s when computers became available (Grobe, 1988; Peterson & Gerdin-Jelger, 1988). In the early 2000s, informatics competencies were defined using the Delphi method and were validated for nurses at four levels of practice: beginning, experienced, informatics nurse specialist, and informatics innovator (Staggers et al., 2001, 2002). These role definitions and related competencies were incorporated into the American Nurses Association’s (2008) position statement on the scope and standards of nursing informatics practice.

In 2011, the early definitions of informatics competencies (Staggers et al., 2002) were updated (Chang et al., 2011) by extending the literature review from 1998 to 2004, abstracting 45 additional items, and adding two new subcategories (evidence-based and information literacy) to the informatics knowledge category. Using a Delphi technique, Chang et al. (2011) tested a list of 323 informatics competencies with nurse administrators and nurse educators in Taiwan and validated 317 (98%) of the original competencies and most of the added competency statements. That study supports the early findings (Staggers et al., 2002) and adds competency statements reflecting current practice environments. However, these updated competencies need to be validated with nurses in the United States.

Recently, nursing informatics competencies were defined by two initiatives: the Technology Informatics Guiding Education Reform (TIGER) initiative (2012a) and the Quality and Safety Education for Nurses (QSEN) project (Cronenwett et al., 2009). The TIGER initiative established the TIGER Informatics Competencies Collaborative to formulate a unified vision of nursing informatics as a core competency for all practicing nurses and graduating nursing students. That collaborative identified competencies in three categories: basic computer competencies, information literacy, and information management (TIGER Initiative, 2012b).

The QSEN project was initiated to help nursing faculty address the IOM’s recommendations to improve the education of health professionals by emphasizing evidence-based practice, quality improvement approaches, and informatics (Cronenwett et al., 2009). To overcome the barrier of insufficient teaching examples for nurse educators, the QSEN project developed IOM-inspired role definitions and competencies, including those for informatics. The QSEN project used the knowledge, skills, and attitudes framework, which is applicable to all levels of nursing. Efforts continue today to enhance the ability of faculty to teach quality and safety competencies, supported by a collection of specific teaching ideas available to nursing faculty on the QSEN Web site (QSEN, 2012).

Informatics Competency Assessment

The standard for articulating, leveling, and measuring informatics competencies today is based on the original work of Staggers et al. (2002). For example, informatics competency items abstracted from the work of Staggers et al. (2002) were expanded for nurse practitioners by adding competencies relevant to evidence-based practice (Curran, 2003). In 2005, the original list of informatics competencies (Staggers et al., 2002) was extended to nursing students to assess the effect of an informatics curriculum on nursing informatics competencies (Desjardins, Cook, Jenkins, & Bakken, 2005); items related specifically to informatics for evidence-based practice and information literacy were added to the original list of competencies. Similarly, Ornes and Gassert (2007) created a tool based on the original categories of informatics competencies for beginning nurses (Staggers et al., 2002) to evaluate whether nursing informatics competencies were included in a baccalaureate program. In 2009, 43 competency items, based on Staggers et al.’s (2001) articulation of novice nurse competencies, were used to examine graduating baccalaureate nurses’ information technology skills (Fetter, 2009). However, the validity and reliability of these tools have not been examined, and their psychometric properties remain unestablished. Considering that instrument reliability and validity are critical to study validity (Burns & Grove, 2009), psychometrically sound measurement tools are needed to assess informatics competencies.

Self-Assessment of Nursing Informatics Competencies Scale

The SANICS (Yoon et al., 2009) was developed based on a set of informatics competency statements for beginning and experienced nurses (Staggers et al., 2001, 2002), with additional items related to standardized terminologies, evidence-based practice, and wireless communications (Desjardins et al., 2005). Each item is rated on a 5-point Likert scale from 1 = not competent to 5 = expert. Exploratory principal components analysis with oblique promax rotation found the 30-item SANICS to have a five-factor structure that explained 63.7% of the variance with high internal consistency reliability (Yoon et al., 2009). Correlations among factors ranged from 0.30 to 0.75. The five factors were: (a) basic computer knowledge and skills (Cronbach’s alpha = 0.94 [15 items]), (b) applied computer skills: clinical informatics (Cronbach’s alpha = 0.89 [four items]), (c) clinical informatics role (Cronbach’s alpha = 0.91 [five items]), (d) clinical informatics attitudes (Cronbach’s alpha = 0.94 [four items]), and (e) wireless device skills (Cronbach’s alpha = 0.90 [two items]). The responsiveness of the SANICS was supported by significantly higher factor scores following an informatics course. The psychometric properties of the SANICS were examined in a sample of 336 nursing students in the baccalaureate portion of a combined baccalaureate–master’s nursing program. The sample was predominantly female (76.8%) and aged 20 to 30 years (55.4%). Most (87.1%) respondents used a computer several times a day, and most had used computers for more than 2 years (98.7%; Yoon et al., 2009).

This review indicates the increasing need for psychometrically valid and reliable informatics competency assessment tools. Although research has supported the psychometric properties of the SANICS, it was developed and validated in only one group of undergraduate nursing students. To maximize use of the SANICS, it needs to be validated in students with diverse educational and demographic backgrounds, which represent the entire target population. Furthermore, the evolving nature of informatics competencies calls for measurement tools to reflect the current nursing informatics practice environment. Thus, the purpose of this study was to assess the psychometric properties (factor structure, internal consistency reliability, and responsiveness) of the SANICS among nursing students with diverse educational and demographic backgrounds. We surveyed students enrolled in three undergraduate tracks (traditional baccalaureate, registered nurse to baccalaureate in nursing [RN-to-BSN], and accelerated baccalaureate in nursing [accelerated BSN]) and two graduate programs (Clinical Nurse Leader [CNL] and Doctor of Nursing Practice [DNP]).

Method

Sample

On the basis of recommended practice (Nunnally & Bernstein, 1994; Tabachnick & Fidell, 2006), we targeted a minimum of 300 participants (10 participants per item for the 30-item scale). Of 537 nursing students enrolled in three undergraduate tracks (traditional baccalaureate, RN-to-BSN, and accelerated BSN) and two graduate nursing programs (CNL and DNP) at a northeastern state university, 302 completed the survey between September and December 2011.

Procedure

After the university institutional review board approved the study, the first author (J.C.) obtained the distribution lists of students in the undergraduate tracks and graduate programs from the director of each track and program. Student anonymity was preserved by using the university’s electronic mailing list for each track and program to contact students about the study.

At the beginning of the semester (September 2011), students were sent e-mails that explained the study and its purpose and introduced the SANICS. The e-mail also contained a link to the Internet survey package, SurveyMonkey, with enhanced security (SSL [https:]), which included the SANICS, demographic questions, and an online consent form. SurveyMonkey was chosen because of its security, usability, and ability not only to analyze data in real time but also to export data directly into SPSS® statistical software for further analysis. Students indicated consent to participate by checking the appropriate consent box and returning the completed SANICS via the Internet to the SurveyMonkey secure site. In mid-November, the electronic mailing list was used to send reminder e-mails to all students to encourage their participation in the study.

To test the responsiveness of the SANICS, a separate sample of DNP students was recruited in fall 2011 from Informatics for Nursing Practice, the only informatics course offered at the first author’s (J.C.) university. All 31 DNP students enrolled in this course were recruited. The SANICS was administered twice, in the first and last weeks of the semester, using the online survey tool embedded in the Blackboard Vista learning management system.

The 14-week online informatics course provided an overview of the current information systems/technology-enabled health care environment, functions of the informatics nurse specialist, and cutting-edge issues in health care informatics that impact the role of the DNP-prepared nurse in various health care settings. Throughout the semester, students participated in two projects—one individual (e.g., health literacy assessment of health-related Web pages) and one group (e.g., development of a proposal for a computerized system for evidence-based clinical practice guidelines). For these projects, students used three online communication boards: (a) chat/whiteboard, (b) discussion board, and (c) Wimba voice discussion board (Wimba, Inc., 2012). The first board is a synchronous system in which the instructor and students interact in real time. The second board is an asynchronous system in which activities are self-directed and students have greater freedom to process information and pace their learning and understanding. The third board is a synchronous system to record, post, and share audio messages with others.

Data Analysis

To facilitate comparisons across scales, the mean scores of each subscale and the full scale were calculated for each student respondent. The psychometric properties of the SANICS were evaluated in several ways. First, a principal component analysis with oblique (promax) rotation was used to examine the factor structure of the SANICS. After reviewing the correlation matrix, communalities, and factor loadings, the promax rotation with Kaiser normalization was chosen because of correlations among factors (Ho, 2006; Tabachnick & Fidell, 2006). The minimum number of factors to extract was evaluated using multiple criteria, including Kaiser’s eigenvalue greater-than-one rule, Cattell’s scree test, and a factor loading of > 0.40 on one factor (Nesselroade & Cattell, 1988; Tabachnick & Fidell, 2006). Items were selected based on salient loadings, after which the quality of each item within the subscale was assessed regarding its intercorrelation with other items. Item–item correlations and item–total correlations were examined to determine whether any items had lower correlations with the set of remaining items (Tabachnick & Fidell, 2006). Second, the internal consistency of the instrument was assessed using Cronbach’s alpha for the entire scale and for each subscale. Third, the construct validity of the SANICS was evaluated using a known group approach (Burns & Grove, 2009) by comparing the scores of students in undergraduate and graduate programs. We hypothesized that graduate students would score higher on the SANICS than undergraduate students. Differences were examined by independent t tests. Fourth, responsiveness of the SANICS was examined by calculating a standardized response mean (SRM; Husted, Cook, Farewell, & Gladman, 2000). Responsiveness is an indicator of an instrument’s sensitivity to detect score change over time. The SRM has been widely used as an indicator of responsiveness (Bremander et al., 2012; Husted et al., 2000; Kassam, Glozier, Leese, Henderson, & Thornicroft, 2010). The SRM was calculated as the difference between mean scores before and after taking the informatics course, divided by the standard deviation of the difference (Husted et al., 2000). Values of 0.20, 0.50, and 0.80 have been proposed to represent small, moderate, and large responsiveness, respectively (Husted et al., 2000). Differences between scores before and after taking the informatics course were determined by paired t test. SPSS version 19.0 software was used for all analyses.
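
The scoring and the reliability and responsiveness statistics above are simple enough to express compactly. The following is a minimal sketch in Python; the authors used SPSS version 19.0, so this is an illustrative assumption rather than their toolchain, and the names `responses`, `cols`, `pre`, and `post` are hypothetical:

```python
# Minimal sketch of the scoring and psychometric computations described above,
# assuming `responses` is an (n_students x 30) NumPy array of 1-5 item ratings.
import numpy as np
from scipy import stats

def subscale_mean(responses: np.ndarray, cols: list) -> np.ndarray:
    """Mean score per respondent across the item columns of one subscale."""
    return responses[:, cols].mean(axis=1)

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def standardized_response_mean(pre: np.ndarray, post: np.ndarray) -> float:
    """SRM = mean change / SD of change; 0.20, 0.50, and 0.80 suggest small,
    moderate, and large responsiveness (Husted et al., 2000)."""
    change = post - pre
    return change.mean() / change.std(ddof=1)

# Known-group construct validity: independent t test of graduate vs. undergraduate totals.
# t_stat, p_value = stats.ttest_ind(grad_totals, undergrad_totals)
# Responsiveness: paired t test on the same students' pre- and post-course scores.
# t_stat, p_value = stats.ttest_rel(post, pre)
```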

Results

The initial survey response rate was 49.7% (n = 267). After sending the e-mail reminder, the response rate increased to 56.2% (n = 302). The samples before and after sending the reminder did not differ significantly in demographics and computer use: age (χ2 = 1.72, p = 0.63), gender (χ2 = 0.64, p = 0.50), frequency of computer use (χ2 = 1.59, p = 0.66), race/ethnicity (χ2 = 5.69, p = 0.58), length of computer use (χ2 = 2.70, p = 0.26), and years of working in nursing (χ2 = 3.99, p = 0.55).

Sample Characteristics

The majority (171 [56.6%]) of students were in graduate programs: 158 (52.3%) in the DNP program and 13 (4.3%) in the CNL program. The remaining 131 (43.4%) students were in undergraduate tracks: 78 (25.8%) in the traditional baccalaureate, 32 (10.6%) in the RN-to-BSN, and 21 (7%) in the accelerated BSN track. The majority of survey respondents in the RN-to-BSN (24 [75%]) and accelerated BSN (18 [85.7%]) tracks were in their first year of the program. In the traditional baccalaureate track, 24.4%, 26.9%, 28.2%, and 20.5% of respondents were in their first, second, third, and fourth years, respectively.

The sample was mostly female (93.6%), and the majority (50.7%) were aged 20 to 30 years. Respondents were racially diverse (White, non-Hispanic = 72%; Black, non-Hispanic = 10.5%; Hispanic/Latino = 6.9%; Asian/Pacific Islander = 4.9%; and Other [includes Native American Indian] = 5.7%). Most (97.2%) respondents used a computer several times per day, and most (93.2%) had used computers for more than 2 years. Students’ work experience in a field of nursing varied from none (28.0%) to 2 years or less (41.2%) to more than 10 years (32.8%). Detailed respondent characteristics are shown in Table 1.

Table 1: Study Respondents’ Characteristics

SANICS Psychometrics

Overall, students’ mean SANICS score was 3.15 (SD = 0.71), indicating competence in informatics as defined by a minimum SANICS score of 3. Mean scores ranged from 3.60 (SD = 0.55) for traditional undergraduate students to 3.23 (SD = 0.70) for DNP students.

Factor Structure. We first conducted separate factor analyses for the undergraduate and graduate samples, to examine whether their factor structures differed, and then analyzed the combined sample. The separate factor analyses revealed two similar factor structures, with the same five-factor solution and similar factor loadings. Comparison with the combined sample yielded a few differences in factor loadings. For example, the items “use operating systems,” “navigate Windows®,” and “identify the basic components of the computer system” had smaller factor loadings in the undergraduate sample (0.31, 0.33, and 0.33) than in the graduate sample (0.51, 0.75, and 0.59) and in the combined sample (0.41, 0.57, and 0.53). Given that the factor analyses for the undergraduate and graduate samples were similar, we report only the combined sample factor analysis.

The results of principal component analysis with promax rotation in the combined sample are presented in Table 2. First, the factorability of the correlation matrix was confirmed: sampling adequacy was high, as shown by the Kaiser-Meyer-Olkin measure (0.93) and a significant Bartlett’s test of sphericity (χ2 = 4974.56, p < 0.001; Tabachnick & Fidell, 2006).
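
For readers reproducing this step outside SPSS, the same factorability checks and five-factor promax extraction can be approximated with the open-source Python package factor_analyzer. This is an assumption for illustration, not the authors’ toolchain, and `df` (an n × 30 data frame of item responses) is a hypothetical name:

```python
# Sketch of the factorability checks and five-factor promax extraction using
# the factor_analyzer package; `df` is an n x 30 pandas DataFrame of item ratings.
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

chi_square, p_value = calculate_bartlett_sphericity(df)  # significant p -> matrix is factorable
kmo_per_item, kmo_overall = calculate_kmo(df)            # values near 0.90+ -> high sampling adequacy

fa = FactorAnalyzer(n_factors=5, rotation="promax", method="principal")
fa.fit(df)
pattern_matrix = fa.loadings_    # item loadings; salient loadings > 0.40
factor_correlations = fa.phi_    # inter-factor correlations (oblique rotation)
variance_explained = fa.get_factor_variance()  # per-factor and cumulative variance
```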

Table 2: Study Results of Principal Component Analysis: Five-Factor Solution

Factor analysis of the 30-item SANICS yielded five factors (basic computer knowledge and skills; applied computer skills: clinical informatics; clinical informatics role; clinical informatics attitudes; and data/information management skills), which together accounted for 69.38% of the variance. All 30 items loaded across the five factors, with factor loadings > 0.40. After factor extraction, communalities had values > 0.5, with the lowest value (0.52) for the item “use networks to navigate systems.” Each subscale comprised a coherent set of 4 to 11 items (Table 2).

The component correlation matrix indicates moderate correlations among subscales, ranging from 0.41 (clinical informatics role with both basic computer knowledge and skills and applied computer skills, and clinical informatics attitudes with applied computer skills) to 0.66 (basic computer knowledge and skills with data/information management skills; Table 3).

Table 3: Component Correlation Matrix of the Self-Assessment of Nursing Informatics Competencies Subscales

Internal Consistency Reliability. The total SANICS and its subscales had high internal consistency reliabilities; the total scale had Cronbach’s alpha of 0.96, with Cronbach’s alphas for subscales ranging from 0.94 for basic computer knowledge and skills to 0.84 for data/information management skills. Table 4 shows comparisons of Cronbach’s alphas with those of the original study (Yoon et al., 2009).

Table 4: Comparison of Internal Consistency Reliability of the Self-Assessment of Nursing Informatics Competencies Scale

Construct Validity Using a Known Group Approach. The total mean SANICS score for graduate students (mean = 3.23, SD = 0.69) was significantly higher than for undergraduate students (mean = 3.00, SD = 0.72; t = 2.50, p = 0.01). For mean subscale SANICS scores, graduate students scored higher than undergraduate students on all five subscales, but the differences were significant only on three subscales: applied computer skills: clinical informatics (t = 1.97, p = 0.04), clinical informatics attitudes (t = 4.07, p < 0.001), and clinical informatics role (t = 5.23, p < 0.001).

Responsiveness. The responsiveness of the SANICS was measured by its SRM for 31 DNP students who took the informatics course. The SRM for the total SANICS was large (0.99), indicating the SANICS is responsive. The SRMs for all five subscales were moderate to large: 0.67 for clinical informatics attitudes, 0.92 for data/information management skills, 0.96 for clinical informatics role, 1.00 for applied computer skills, and 1.14 for basic computer knowledge and skills.

Responsiveness was also indicated by a significant improvement in DNP students’ total SANICS scores after taking the informatics course (paired t(30) = 9.26, p < 0.001). The improvement was also significant for all subscales (t = 8.36 for clinical informatics role, 9.30 for basic computer knowledge and skills, 7.19 for applied computer skills, 4.46 for clinical informatics attitudes, and 7.16 for data/information management skills; p < 0.001 for all subscales).

Discussion

Our study findings show that the SANICS is a psychometrically sound instrument for nursing students with diverse demographic and educational backgrounds. The exploratory principal component analysis with promax rotation revealed a reliable and valid five-factor structure with high internal consistency reliabilities and good responsiveness for all five subscales and the entire scale. Construct validity, determined using a known group approach, was supported by significantly higher mean scores for graduate students than for undergraduate students on the total SANICS and three of its subscales.

We also found moderate intercorrelations between subscales, ranging from 0.41 (clinical informatics role with both basic computer knowledge and skills and applied computer skills, and clinical informatics attitudes with applied computer skills) to 0.66 (basic computer knowledge and skills with data/information management skills). These findings are consistent with the original report of moderate to high correlations between subscales (Yoon et al., 2009).

The SANICS five-factor structure found in this study is similar to that reported in the original study (Yoon et al., 2009). Items loaded similarly across the five factors, but four items had slightly different loadings. The item “use networks to navigate systems” loaded on the applied computer skills: clinical informatics factor, whereas it loaded on the basic computer knowledge and skills factor in the original study. This difference may reflect, in part, participants’ different perceptions of using networks to navigate systems. Our study participants were older (49.3% were > 30 years) and less competent in computer knowledge and skills (mean score = 3.56) than participants in the original study (11.3% were > 30 years; mean score = 3.81; Yoon et al., 2009). Our older, less computer-competent students might have perceived using networks as an applied computer skill rather than a basic computer skill.

Differences were also found in the data/information management skills factor and its item loadings. In our study, five items loaded on this factor: “use wireless device to locate and download resources for patient safety and quality care,” “use wireless device to enter data,” “communicate with other systems (e.g., access data, upload, download) using different options for connecting to the Internet (phone line, mobile phone, cable, wireless, satellite),” “use database management program to develop a simple database and/or table,” and “use database applications to enter and retrieve information” (Table 2). In the original study (Yoon et al., 2009), the last three items loaded on basic computer knowledge and skills. The different findings may reflect, in part, the different time periods of data collection. Since the original study was conducted in 2006, wireless devices have become so common that there may now be little distinction between wired and wireless skills. Thus, our findings suggest a new factor of data/information management skills (e.g., access, upload, download) using wireless or wired devices, reflecting current informatics practice. Further studies are necessary to investigate this factor and its item loadings.

The study findings indicate that the standard deviations of the mean SANICS scores fall within a small range (0.76 to 0.92), indicating that scores were homogeneous and that students across programs responded similarly on informatics competencies despite their diverse educational backgrounds. Future research should examine whether informatics competencies differ among the programs.

Finally, we note one finding regarding the applicability of the SANICS across student populations. Our initial separate factor analyses in the undergraduate and graduate samples showed similar five-factor structures, except for a few differences in item loadings compared with the combined sample. These findings may support the applicability of the SANICS across undergraduate and graduate students. However, sample sizes in these analyses were smaller than the minimum recommended for factor analysis (10 participants per item for the 30-item scale), especially for the undergraduate sample (n = 109); thus, further research is needed with a larger sample.

Limitations and Future Research

This study had several limitations. First, the participants were volunteers, and the response rate was 56.2%, which increases the potential for nonresponse bias. Second, respondents and nonrespondents might have differed in their perceptions of nursing informatics competencies; unfortunately, we could not compare their characteristics because we used the electronic mailing list, rather than individual e-mail addresses, for data collection. Third, all respondents were recruited from one northeastern state university, which may limit the generalizability of the findings. Fourth, informatics competencies were measured by self-report rather than by respondents’ actual informatics knowledge and skills. Students’ actual informatics competencies might have been lower, given the tendency to present oneself favorably; indeed, nursing students in one study rated their computer skills higher than their actual performance (Elder & Koehn, 2009). This bias might have inflated some of the validity and reliability coefficients, the effect size, and the paired t test statistics. Therefore, further studies with direct measures of respondents’ informatics knowledge and skills are suggested. Finally, although the study sample was more diverse in demographics than the original sample, its ethnic diversity remained limited, which might have contributed to the similarity of the factor structures; further studies with a more ethnically diverse sample are necessary.

Conclusion

Given the demands of an increasingly electronic health care environment, the nursing workforce must be adequately prepared to use health care information technologies. To ensure that all nurse graduates are fully prepared to use electronic technologies to support patient care, it is essential to assess informatics competencies using valid instruments. Our study results indicate that the SANICS is valid and internally consistent in nursing students with diverse educational and demographic backgrounds. The high responsiveness of the SANICS indicates its strong ability to detect changes in informatics competencies. Further research is needed to investigate the data/information management skills factor and its item loadings to make this tool more reflective of current informatics practice.

For nurse educators and instructors, the next step suggested by this study is to embrace the assessment of informatics competencies and develop educational strategies that prepare nursing students as informatics-competent graduates in information technology–rich environments.

References

  • American Nurses Association. (2008). Nursing informatics: Scope and standards of practice. Silver Spring, MD: Author.
  • Bremander, A., Wikström, I., Larsson, I., Bengtsson, M., Hagel, S., & Strömbeck, B. (2012). Cultural adaptation, validity, reliability and responsiveness of the Swedish version of the Effective Musculoskeletal Consumer Scale (EC-17). Musculoskeletal Care, 10, 43–50. doi:10.1002/msc.1006
  • Burns, N., & Grove, S.K. (2009). The practice of nursing research: Appraisal, synthesis, and generation of evidence (6th ed.). St. Louis, MO: Elsevier Saunders.
  • Chang, J., Poynton, M.R., Gassert, C.A., & Staggers, N. (2011). Nursing informatics competencies required of nurses in Taiwan. International Journal of Medical Informatics, 80, 332–340. doi:10.1016/j.ijmedinf.2011.01.011
  • Cronenwett, L., Sherwood, G., Pohl, J., Barnsteiner, J., Moore, S., Sullivan, D.T., & Warren, J. (2009). Quality and safety education for advanced nursing practice. Nursing Outlook, 57, 338–348. doi:10.1016/j.outlook.2009.07.009
  • Curran, C.R. (2003). Informatics competencies for nurse practitioners. AACN Clinical Issues, 14, 320–330. doi:10.1097/00044067-200308000-00007
  • Desjardins, K.S., Cook, S.S., Jenkins, M., & Bakken, S. (2005). Effect of an informatics for evidence-based practice curriculum on nursing informatics competencies. International Journal of Medical Informatics, 74, 1012–1020. doi:10.1016/j.ijmedinf.2005.07.001
  • Elder, B.L., & Koehn, M.L. (2009). Assessment tool for nursing student computer competencies. Nursing Education Perspectives, 30, 148–152.
  • Fetter, M.S. (2009). Graduating nurses’ self-evaluation of information technology competencies. Journal of Nursing Education, 48, 86–90. doi:10.3928/01484834-20090201-05
  • Gebbie, K., Rosenstock, L., Hernandez, L.M., & Committee on Educating Public Health Professionals for the 21st Century. (2003). Who will keep the public healthy? Educating public health professionals for the 21st century. Washington, DC: National Academies Press.
  • Greiner, A.C., Knebel, E., & Committee on the Health Professions Education Summit. (2003). Health professions education: A bridge to quality. Washington, DC: National Academies Press.
  • Grobe, S.J. (1988). Nursing informatics competencies for nurse educators and researchers. NLN Publications, May(14–2234), 25–40.
  • Hart, M. (2008). Informatics competency and development within the US nursing population workforce: A systematic literature review. Computers, Informatics, Nursing: CIN, 26, 320–329. doi:10.1097/01.NCN.0000336462.94939.4c
  • Ho, R. (2006). Handbook of univariate and multivariate data analysis and interpretation with SPSS. Boca Raton, FL: Chapman & Hall/CRC. doi:10.1201/9781420011111
  • Husted, J.A., Cook, R.J., Farewell, V.T., & Gladman, D.D. (2000). Methods for assessing responsiveness: A critical review and recommendations. Journal of Clinical Epidemiology, 53, 459–468. doi:10.1016/S0895-4356(99)00206-1
  • Institute of Medicine, & Robert Wood Johnson Foundation. (2011). The future of nursing: Leading change, advancing health. Washington, DC: National Academies Press.
  • Jenkins, M., Wilson, M., & Ozbolt, J. (2007). Informatics in the doctor of nursing practice curriculum. In Teich, J.M., Hripcsak, G., & Suermondt, J. (Eds.), American Medical Informatics Association annual symposium: Biomedical and health informatics: From foundations to applications to policy (Vol. 1, pp. 364–368). Red Hook, NY: Curran Associates. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2655779/pdf/amia-0364-s2007.pdf
  • Kassam, A., Glozier, N., Leese, M., Henderson, C., & Thornicroft, G. (2010). Development and responsiveness of a scale to measure clinicians’ attitudes to people with mental illness (medical student version). Acta Psychiatrica Scandinavica, 122, 153–161. doi:10.1111/j.1600-0447.2010.01562.x
  • Kohn, L.T., Corrigan, J.M., Donaldson, M.S., & Committee on Quality of Health Care in America. (2000). To err is human: Building a safer health system. Washington, DC: National Academies Press.
  • Nesselroade, J.R., & Cattell, R.B. (1988). Handbook of multivariate experimental psychology (2nd ed.). New York, NY: Plenum Press. doi:10.1007/978-1-4613-0893-5
  • Nunnally, J.C., & Bernstein, I.H. (1994). Psychometric theory. New York, NY: McGraw-Hill.
  • Ornes, L.L., & Gassert, C. (2007). Computer competencies in a BSN program. Journal of Nursing Education, 46, 75–78.
  • Peterson, H.E., & Gerdin-Jelger, U. (1988). Preparing nurses for using information systems: Recommended informatics competencies. NLN Publications, May(14–2234), 1–141.
  • Quality and Safety Education for Nurses. (2012). Search teaching strategies. Retrieved from http://qsen.org/teaching-strategies/
  • Staggers, N., Gassert, C.A., & Curran, C. (2001). Informatics competencies for nurses at four levels of practice. Journal of Nursing Education, 40, 303–316.
  • Staggers, N., Gassert, C.A., & Curran, C. (2002). A Delphi study to determine informatics competencies for nurses at four levels of practice. Nursing Research, 51, 383–390. doi:10.1097/00006199-200211000-00006
  • Tabachnick, B.G., & Fidell, L.S. (2006). Using multivariate statistics (5th ed.). Boston, MA: Pearson.
  • TIGER Initiative. (2012a). Collaborating to integrate evidence and informatics into nursing practice and education: An executive summary. Retrieved from http://www.tigersummit.com/uploads/TIGER_Collaborative_Exec_Summary_040509.pdf
  • TIGER Initiative. (2012b). Informatics competencies for every practicing nurse: Recommendations from the TIGER Collaborative. Retrieved from http://www.tigersummit.com/uploads/3.Tiger.Report_Competencies_final.pdf
  • Wimba, Inc. (2012). Wimba classroom for higher & further education. Retrieved from http://www.wimba.com/solutions/higher-education/wimba_classroom_for_higher_education/
  • Yoon, S., Yen, P.Y., & Bakken, S. (2009). Psychometric properties of the self-assessment of nursing informatics competencies scale. Studies in Health Technology and Informatics, 146, 546–550.

Table 1: Study Respondents’ Characteristics

Values are n (%). Columns are the graduate programs (DNP, CNL) and undergraduate tracks (Traditional, RN-to-BSN, Accelerated BSN); a dash indicates no respondents in that category.

Variable | DNP | CNL | Traditional | RN-to-BSN | Accelerated BSN
Gender
  Female | 142 (92.2) | 11 (84.6) | 74 (96.1) | 30 (93.8) | 21 (100)
  Male | 12 (7.8) | 2 (15.4) | 3 (3.9) | 2 (6.2) | –
Race/ethnicity
  Asian/Pacific Islander | 4 (2.6) | – | 6 (7.7) | 4 (12.5) | 1 (4.8)
  Black, non-Hispanic | 28 (18.2) | 1 (7.7) | 2 (2.6) | 1 (3.1) | –
  Hispanic/Latino | 17 (11) | – | 2 (2.6) | 1 (3.1) | 1 (4.8)
  White, non-Hispanic | 101 (65.6) | 11 (84.6) | 65 (83.3) | 23 (71.9) | 18 (85.6)
  Other | 4 (2.6) | 1 (7.7) | 3 (3.8) | 3 (9.4) | 1 (4.8)
Age (years)
  20 to 29 | 33 (23.6) | 4 (33.3) | 78 (100) | 16 (50) | 13 (62)
  30 to 39 | 27 (19.3) | 5 (41.7) | – | 3 (9.4) | 4 (19)
  40 to 49 | 56 (40) | 3 (25) | – | 11 (34.4) | 4 (19)
  ⩾ 50 | 24 (17.1) | – | – | 2 (6.2) | –
Nursing experience (years)
  < 2 | 8 (7.2) | 2 (28.6) | 73 (93.6) | 4 (12.4) | 16 (76.2)
  2 to 5 | 19 (17.1) | 2 (28.6) | 4 (5.1) | 8 (25) | 1 (4.8)
  5 to 10 | 12 (10.8) | 1 (14.2) | 1 (1.3) | 14 (43.8) | 2 (9.5)
  > 10 | 72 (64.9) | 2 (28.6) | – | 6 (18.8) | 2 (9.5)
Frequency of computer use
  Several times per day | 110 (99.1) | 7 (53.8) | 77 (98.7) | 27 (84.4) | 21 (100)
  Once per day | 1 (0.9) | 6 (46.2) | 1 (1.3) | 1 (3.1) | –
  Several times per week | – | – | – | 3 (9.4) | –
  Several times per month | – | – | – | 1 (3.1) | –
Computer experience
  < 6 months | 3 (1.9) | 1 (7.6) | 7 (9) | 2 (6.2) | 1 (4.8)
  ⩽ 2 years | 47 (29.7) | 6 (46.2) | 2 (2.6) | – | –
  > 2 years | 108 (68.4) | 6 (46.2) | 69 (88.4) | 30 (93.8) | 20 (95.2)

Table 2: Study Results of Principal Component Analysis: Five-Factor Solution

Self-Assessment of Nursing Informatics Competencies Subscale | Loading
Basic computer knowledge and skills (11 items, Cronbach’s alpha = 0.94, mean = 3.56, SD = 0.76)
  Use word processing | 0.93
  Use presentation graphics | 0.92
  Use multimedia presentations | 0.91
  Use computer technology safely | 0.79
  Use existing external storage devices | 0.75
  Conduct online literature searches | 0.58
  Navigate Windows® | 0.57
  Identify the basic components of the computer system | 0.53
  Perform basic trouble shooting in applications | 0.47
  Use the Internet to locate and download items of interest | 0.40
  Use operating systems | 0.41
Applied computer skills: Clinical informatics (5 items, Cronbach’s alpha = 0.89, mean = 2.19, SD = 0.92)
  Access shared data sets | 0.81
  Extract data from clinical data sets | 0.80
  Use applications for diagnostic coding | 0.80
  Use applications to develop testing materials | 0.66
  Use networks to navigate systems | 0.47
Clinical informatics role (5 items, Cronbach’s alpha = 0.90, mean = 2.61, SD = 0.91)
  Market self, system, or application | 0.83
  As a clinician (nurse), participate in selecting, designing, implementing and evaluating systems | 0.79
  Promote the integrity of access to information to include confidentiality, legal, ethical, and security issues | 0.76
  Seek available resources to help formulate ethical decisions in computing | 0.76
  Act as advocate of leaders for incorporating innovations and informatics concepts into their area of specialty | 0.74
Clinical informatics attitudes (4 items, Cronbach’s alpha = 0.89, mean = 3.94, SD = 0.86)
  Recognize that one does not have to be a computer programmer to effectively use the computer in nursing | 0.91
  Recognize that the computer is only a tool to provide better nursing care | 0.90
  Recognize that health computing will become more common | 0.83
  Recognize the value of clinician involvement in designing, implementing, and evaluating applications | 0.78
Data/information management skills (5 items, Cronbach’s alpha = 0.84, mean = 3.07, SD = 0.92)
  Use wireless device (personal digital assistant or cell phone) to enter data | 0.81
  Use wireless device to locate and download resources for patient safety and quality care | 0.75
  Use database applications to enter and retrieve information | 0.73
  Use database management program to develop a simple database or table | 0.65
  Communicate with other systems (e.g., access data, upload, download) using different options for connecting to the Internet (phone line, mobile phone, cable, wireless, satellite) | 0.60

Table 3: Component Correlation Matrix of the Self-Assessment of Nursing Informatics Competencies Subscales

Subscale | 1 | 2 | 3 | 4
1. Basic computer knowledge and skills | – | | |
2. Applied computer skills | 0.50 | – | |
3. Clinical informatics role | 0.41 | 0.41 | – |
4. Clinical informatics attitudes | 0.52 | 0.41 | 0.45 | –
5. Data/information management skills | 0.66 | 0.54 | 0.51 | 0.52

Table 4: Comparison of Internal Consistency Reliability of the Self-Assessment of Nursing Informatics Competencies Scale

Scale | Current Study | Yoon et al. (2009)
Total | 30 items, α = 0.96 | 30 items, α = 0.95
Factor 1 | Basic computer knowledge and skills (11 items, α = 0.94) | Basic computer knowledge and skills (15 items, α = 0.94)
Factor 2 | Applied computer skills: Clinical informatics (5 items, α = 0.89) | Applied computer skills: Clinical informatics (4 items, α = 0.89)
Factor 3 | Clinical informatics role (5 items, α = 0.90) | Clinical informatics role (5 items, α = 0.91)
Factor 4 | Clinical informatics attitudes (4 items, α = 0.89) | Clinical informatics attitudes (4 items, α = 0.94)
Factor 5 | Data/information management skills (5 items, α = 0.84) | Wireless device skills (2 items, α = 0.90)

doi:10.3928/01484834-20130412-01
