Translation of genome science to patient care requires a nursing workforce that is competent to understand, deliver, and explain genetic- and genomic-based care (Calzone et al., 2010). The rapid pace of genome discovery means that nurses today must prepare to incorporate genomics into patient care in ways that might have only been imagined in the past. For this reason, nurses require a clear understanding of foundational genetic and genomic concepts (Giarelli & Reiff, 2012) as a basis for building new knowledge while genome science continues to transform health care. Effective, efficient, evidence-based genomic nursing education must be formulated rationally, comparing what students already know to what they need to know (Collins & Stiles, 2011). However, to date, genetic and genomic knowledge among nursing students remains conceptually and empirically unexplored. The purpose of the current research was to develop and psychometrically evaluate the Genomic Nursing Concept Inventory (GNCI)—a scale designed to measure the understanding of genetic and genomic concepts that are most critical to nursing practice.
Genomics and Nursing
Nurses worldwide have committed to prepare for the genomic era of health care (American Nurses Association & International Society of Nurses in Genetics, 2007; Benner, Sutphen, Leonard, & Day, 2010; Kirk et al., 2003). Specific benchmarks outline the knowledge, attitudes, and skills that are indispensable to the delivery of genomically competent care (Consensus Panel on Genetic/Genomic Nursing Competencies, 2006; Kirk et al., 2003). In the United States, nursing professional groups widely endorse a set of essential genetic and genomic competencies that define the minimal degree of competency expected of every nurse “regardless of academic preparation, practice setting, role, or specialty” (Consensus Panel on Genetic/Genomic Nursing Competencies, 2009, p. 1). The competencies denote specific underlying knowledge that might logically represent genetic and genomic literacy for nurses. Similar competencies have been endorsed in the United Kingdom (Kirk et al., 2003) and Japan (Arimori et al., 2007), and graduate-level competencies have been adopted in the United States (Greco, Tinley, & Seibert, 2012). Historically, genetic content in nursing curricula has been limited, challenging nurse educators to develop, implement, and evaluate strategies to integrate these new competencies.
The call for evidence-based curricular design (Giarelli & Reiff, 2012) logically begins with assessing students’ preinstructional knowledge (Bransford, Brown, & Cocking, 2000); however, genetic and genomic knowledge among nursing students remains largely unexplored. Recent studies focused on perceived, rather than actual, knowledge (Collins & Stiles, 2011; Dodson & Lewallen, 2011; Maradiegue, Edwards, Seibert, Macri, & Sitzer, 2005). Lacking evidence about students’ preinstructional understanding of genetics and genomics, faculty are left to formulate a priori strategies to integrate genomic content. A validated instrument to measure genomic literacy among nursing students is needed to inform educational design and facilitate evaluation.
Concept inventories are educational assessment tools designed to measure conceptual understanding in a particular knowledge domain. Unassuming in appearance, they look like any other multiple-choice examination: they typically have no more than 30 items, can be administered in 30 minutes, and can be scored electronically. Their ease of administration and scoring allows efficient data collection about conceptual understanding within a domain (Dufresne, Leonard, & Gerace, 2002). However, concept inventories are theory- and research-based.
Ausubel’s assimilation theory is a cognitive learning theory on which concept inventories are based. The assimilation theory perceives the goal of education to be meaningful learning, which is the “long-term acquisition and retention of a complex network of interrelated ideas characterizing an organized body of knowledge that learners must incorporate into their cognitive structures” (Ausubel, Novak, & Hanesian, 1978, p. 12). Meaningful learning is contrasted with rote learning, which is the “short-term acquisition of single, somewhat contrived concepts, the solution of artificial problems, or the learning of arbitrary associations” (Ausubel et al., p. 12). As new content is assimilated during meaningful learning, a complex bond forms between preexisting and new knowledge and the meanings of both ideas are subject to modification. Meaningful learning has occurred when component concepts, which reflect both the new and preexisting knowledge, are available in the learner’s cognitive structure with a sufficient degree of clarity. According to Ausubel et al. (1978), the most crucial and variable determinants of meaningful learning are found in “the availability of…relevant content in different learners’ cognitive structures” (p. 44). It is for this reason that genomic nursing education must begin with assessment. To be meaningful, genomic content must be anchored to the preinstructional cognitive structures of nursing students. Although students are likely to enter nursing school with some knowledge of genetics and genomics, almost nothing is known about the level of that knowledge or prevailing misconceptions. Assessing preinstructional genetic and genomic knowledge of beginning nursing students is a necessary first step in planning curricula.
Concept inventories are framed on a psychometric model known as the Assessment Triangle. Consequently, concept inventories are carefully designed to align with the following three factors: (a) how students learn in the domain of interest, (b) the most appropriate means of measuring understanding, and (c) the most robust means of interpreting evidence of learning (National Research Council, 2001; Richardson, 2004; Streveler et al., 2011). Items are often written so that rote information is insufficient to arrive at the correct answer, and item distractors (incorrect answers) reflect misconceptions identified in the target population. Concept inventories are therefore useful to measure understanding of critical concepts, as well as to identify specific misconceptions that may hinder meaningful learning. When used as a posttest, concept inventories can measure learning gains and identify persisting misconceptions. Widely used in science, technology, engineering, and mathematics education, concept inventories have both formative and summative utility and are increasingly used to inform curricular and course design, measure student proficiency, and evaluate teaching effectiveness (Richardson, 2004; Streveler et al., 2011). They have been shown to discriminate between students who understand basic concepts and those who simply memorize unconnected ideas (D’Avanzo, 2008).
Nursing lacks a concept inventory specifically oriented to the discipline, although at least three genetic concept inventories have been developed for undergraduate students. These include the Genetics Concept Assessment (Smith, Wood, & Knight, 2008), the Genetics Concept Inventory (Elrod, 2007), and the Genetics Literacy Assessment Instrument (GLAI; Bowling et al., 2008). Of these, the GLAI has been suggested as a measure of genetic literacy among nursing students (Daack-Hirsch, Driessnack, Perkhounkova, Furukawa, & Ramirez, 2012; Giarelli & Reiff, 2012). Designed to measure genetic literacy among undergraduate nonscience majors, the GLAI was initially validated with 395 students in nonmajor biology and genetics classes. In that student population, the GLAI demonstrated a robust psychometric performance consistent with concept inventories in other disciplines (Bowling et al., 2008). However, in two subsequent studies with undergraduate nursing students (Crane, Read, & Twomey, 2013; Daack-Hirsch et al., 2012), the GLAI yielded different estimates of scale difficulty and internal consistency reliability (Table 1). Of particular interest is the difference in scale difficulty; that is, nursing students in the Crane et al. (2013) and Daack-Hirsch et al. (2012) studies appeared to find the GLAI easier than did students in the Bowling et al. (2008) study. Because concept inventories intentionally measure understanding of difficult concepts, they logically exhibit high difficulty. For example, a review of 13 concept inventories across disciplines revealed a mean pretest difficulty of 43%. Inventories with low or moderate pretest difficulty provide limited utility in measuring learning gains.
Psychometric Features of the Genetics Literacy Assessment Instrument in Three Studies
Perhaps a greater concern regarding the use of the GLAI in nursing education relates to the conceptualization of genetic literacy underpinning that instrument. In general, literacy is understood to be content and context specific—the capacity necessary to achieve some task or competency (Ratzan & Parker, 2006). For the GLAI, genetic literacy was defined as “sufficient knowledge and appreciation of genetics principles to allow informed decision-making for personal well-being and effective participation in social decisions on genetic issues” (Bowling et al., 2008, p. 16). The GLAI content domain includes the structure, function, and inheritance of genetic material; evolution; and the role of genetics in society. Although that content domain is relevant to nursing, the level of genetic literacy that constitutes the GLAI target is insufficient for nurses, who are expected to achieve a degree of genetic and genomic competency exceeding that of the general public. Genetic literacy or, more fittingly, genomic literacy, for nurses is appropriately defined as knowledge sufficient to develop genetic and genomic competency, as outlined in the Essentials competency documents (Consensus Panel on Genetic/Genomic Nursing Competencies, 2009). A reliable and valid scale specifically designed to measure that specific level of literacy will facilitate the implementation of genomic nursing education. The purpose of the current instrument development study was to design and test such a scale.
Instrument development was conducted in five sequential steps: (a) establishing the content domain, (b) exploring student understanding of key concepts and developing an item pool, (c) pretesting and refining items, (d) pilot testing and inventory reduction, and (e) field testing of the revised scale. The process was synthesized from approaches used to create other concept inventories (Bailey, 2008; Bowling et al., 2008; Garvin-Doxas, Klymkowsky, & Elrod, 2007; Streveler et al., 2011) and generally followed the method of Treagust (1988). Steps (b) through (d) were conducted with discrete convenience samples of Bachelor of Science in Nursing (BSN) students in a large 2-year, upper division, prelicensure program in the northwestern United States. These students complete two semesters in each of their junior and senior years. The students in the current study did not take a designated genetics or genomics course during their nursing program, and the hours of genetic content embedded within the curriculum were not estimated. Some students reported taking a genetics course prior to nursing school. Data collection for steps (b) and (d) occurred during scheduled classes, which constrained access to some student cohorts. The study was granted exempt status by the Washington State University Institutional Review Board.
Establishing the Content Domain
From specific knowledge areas of the essential genetic and genomic nursing competencies (Consensus Panel on Genetic/Genomic Nursing Competencies, 2009), 65 foundational concepts were extracted, grouped into 14 topical categories, and imported into a Web-based survey. A panel of nurses with genetics expertise was compiled from three sources—nurses who served on the steering committee for the Consensus Panel on Genetic/Genomic Nursing Competencies (N = 17); nurses in education, research, or practice in the United States who held membership in the International Society of Nurses in Genetics (N = 277); and nurse educators in the National Human Genome Research Institute Community of Genetic Educators (N = 23). Membership in these three groups overlapped broadly, and a panel of 289 experts was recruited. Each expert was provided with a unique survey link, and no duplicate responses occurred. In total, 104 experts completed the survey (36% response rate), using a 4-point Likert scale to rate concepts according to relevance to nursing practice (4 = critical to know, 1 = not important). The concept list was rank ordered according to the mean relevance scores and was then reduced, with a goal of retaining crucial concepts while maintaining a broad content domain. Therefore, concepts were considered in the context of their topical categories. An iterative process was applied, first eliminating the lowest-ranking concepts within each category. The category structure of the remaining concepts was then examined; in some instances, categories were combined or renamed. Relevance scores within each category were then calculated and compared. Decisions about concept elimination balanced the relevance scores with logical decisions to maintain a breadth of content. The process was repeated until an optimal set of 21 concepts in five topical categories remained. 
Although 21 concepts were anticipated to exceed the capacity of the inventory, some concepts were expected to be eliminated, for example, if they were found to be well understood by students or if robust questions to assess understanding were unable to be written.
Developing the Item Pool
Open-ended questions to probe student understanding of each concept in the initial content domain were presented to a convenience sample of junior BSN students (session attendance ranged from 96 to 134) at a single college of nursing over several weeks. Bailey’s (2008) method of student-supplied response surveys was used. During 12 scheduled classroom sessions, the students wrote nearly 6,000 textual responses describing their ideas about 21 genetic and genomic concepts. Content analysis (Krippendorff, 1980) revealed the level of student understanding and misconceptions related to each concept. One or more multiple-choice questions were then written for each poorly understood concept, using the most common misconceptions as item distractors (incorrect answers). Recommendations from the item-writing literature were applied (Frey, Petersen, Edwards, Pedrotti, & Peyton, 2005; Haladyna, Downing, & Rodriguez, 2002; Nunnally & Bernstein, 1994; Taylor & Smith, 2009), as were recommendations specific to concept inventory development (Hufnagle, 2001; Martin, Mitchell, & Newell, 2003; Richardson, 2004). When possible, items were written within a framework of nursing practice. The resulting draft inventory contained 52 items that mapped to 21 concepts.
Refining Inventory Items
The draft GNCI was pretested with 15 junior nursing students, using individual, cognitive, think-aloud interviews (Conrad & Blair, 1996). The purpose was to minimize measurement error by identifying difficulties (other than knowledge deficit related to the concept of interest) encountered by students in answering the questions. Students were provided a paper copy of the draft inventory and were asked to complete the inventory and verbalize their thoughts as they considered each question. The investigator (L.D.W.) used occasional verbal probes to clarify student thought processes and took notes on a coding form, which included a matrix designed by Conrad and Blair (1996) to objectify coding. Interviews were audiotaped. Data were analyzed on completion of each interview by reviewing notes and audio recordings to identify problems, and promising solutions to increase clarity were formulated. Revised items were then retested in their new form during subsequent interviews.
Pilot Testing and Inventory Reduction
The 52-item draft inventory was administered in proctored settings to two cohorts of students (N = 238) near the beginning and end of a 2-year, upper-division BSN program, using a paper inventory and Scantron® cards. Data were entered into SPSS® version 17.0 software to explore student demographics and calculate item and scale psychometrics. The purpose of pilot testing was to explore psychometric features of the 52 items so that the best items could be retained. Item difficulty and item discrimination were primary criteria.
Item difficulty was measured as a p-value representing the proportion of students who answered an item correctly; difficult items have low p-values (Kaplan & Saccuzzo, 1997). In general, items answered correctly by more than 85% of respondents are considered too easy, whereas items answered correctly by less than 25% of respondents are considered too difficult, and both types of questions limit scale validity (Statistical Analysis of Multiple Choice Exams, n.d.). Within those parameters, including items of varying difficulty improves a scale’s ability to discriminate among students with varying levels of the measured attribute (Nunnally & Bernstein, 1994). Concept inventories typically exhibit lower difficulty indices (indicating greater difficulty) than traditional achievement tests (Nelson, Geist, Miller, Streveler, & Olds, 2007), and target item difficulty was set at 0.15 to 0.85. Target scale difficulty, measured as the percent of correct answers, was set at 40% to 55%. These psychometric targets are consistent with other concept inventories.
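The difficulty calculation described above can be sketched in a few lines of Python. This is an illustration of the statistic with hypothetical scored responses, not the authors' analysis code:

```python
# Hypothetical scored responses for a 4-item inventory:
# each row is one student, 1 = correct, 0 = incorrect.
responses = [
    [1, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
]

n_students = len(responses)
n_items = len(responses[0])

# Item difficulty (p-value): proportion of students answering the item
# correctly. Note that LOWER p-values indicate HARDER items.
difficulty = [sum(row[i] for row in responses) / n_students
              for i in range(n_items)]

# Flag items falling outside the target range of 0.15 to 0.85.
flagged = [i for i, p in enumerate(difficulty) if not 0.15 <= p <= 0.85]

print(difficulty)  # [0.75, 0.25, 0.75, 0.75]
print(flagged)     # [] — every item falls within the target range
```

Scale difficulty (percent of correct answers) is simply the mean of these p-values expressed as a percentage.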
Item discrimination signifies how well an item differentiates among students who scored high or low on an entire inventory. It was measured as a corrected item–total correlation for each scale item. Nunnally and Bernstein (1994) recommended a lower limit of 0.30, but the heterogeneous content domain of the GNCI was anticipated to limit the item–total correlation, so the target lower limit was set at 0.15 (Kehoe, 1995).
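The corrected item–total correlation can likewise be sketched directly from a scored-response matrix. Again, the data here are hypothetical and the code is an illustration of the statistic, not the study's analysis:

```python
def pearson(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def corrected_item_total(responses):
    """For each item, correlate its scores with the total of the REMAINING
    items ("corrected" so the item is not correlated with itself)."""
    n_items = len(responses[0])
    correlations = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        rest = [sum(row) - row[i] for row in responses]
        correlations.append(pearson(item, rest))
    return correlations

# Hypothetical scored responses (rows = students, 1 = correct).
responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

discrimination = corrected_item_total(responses)
# Items correlating below the GNCI target lower limit of 0.15 are flagged.
weak = [i for i, r in enumerate(discrimination) if r < 0.15]
```

In this toy data set, the second item correlates only weakly with the rest of the scale and would be flagged for scrutiny.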
The scale was reduced, applying an iterative process and following the method from Nunnally and Bernstein (1994). The goal was to create a parsimonious inventory of the most robust 25 to 30 items, while providing the broadest coverage of the content domain. On the basis of the reliability analysis data, items falling outside the target ranges for difficulty or discrimination were flagged for elimination, revision, or replacement, carefully balancing each item’s psychometric features with its relationship to the content domain (Nunnally & Bernstein, 1994). In the first rounds of inventory reduction, items with the poorest psychometric features were eliminated. Scale psychometrics were recalculated after each round. Less robust items were occasionally retained, based on rational grounds. One underperforming item was replaced with a similar question from the Test of Genetic Concepts (Sadler, 2003; permission obtained). At the end of this iterative process, an optimal set of retained items had been identified. The resulting GNCI beta version included 31 items covering 18 concepts in four topical categories (Table 2).
Genomic Nursing Concept Inventory, Beta Version, Content Domain
Six proctored cohorts of BSN students (N = 705) completed the GNCI during 2011 to 2013, using paper inventories and Scantron cards. Four groups of junior students (n = 514) were beginning the nursing program, and two groups of seniors (n = 191) were nearing completion of the program. Scantron card data were entered into SPSS Version 21 software for analysis, to measure item and scale difficulty (percent of correct answers), item discrimination (corrected item–total correlation), and scale internal consistency reliability (Cronbach’s alpha). Parameters for difficulty and discrimination were as previously described; target internal consistency reliability was set at a modest Cronbach’s alpha of 0.70, which was recommended by Nunnally and Bernstein (1994) as a reasonable goal for a new scale. Mann-Whitney U tests and Pearson’s r were used to examine differences in scale scores across demographic variables, where p ⩽ 0.05 indicated statistical significance.
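Cronbach's alpha, the internal consistency statistic targeted above, can be computed from the same kind of scored-response matrix. The following is an illustrative sketch with hypothetical data, not the study's analysis code:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(totals)),
    where k is the number of items. Sample (n-1) variance is used
    consistently for both items and totals."""
    k = len(responses[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(row) for row in responses]
    item_vars = [var([row[i] for row in responses]) for i in range(k)]
    return k / (k - 1) * (1 - sum(item_vars) / var(totals))

# Hypothetical scored responses (rows = students, 1 = correct).
responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

alpha = cronbach_alpha(responses)  # ≈ 0.74, near the 0.70 target for a new scale
```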
Of 705 participants in field testing, 84% were women and 16% were men. Ages ranged from 18 to 60 years (M = 25.4, SD = 7.11). Most students (73%) were juniors beginning nursing school, 27% were seniors approaching graduation, and 8.4% had taken a previous genetics course. Potential GNCI scores range from 0 to 31, and higher scores signify greater knowledge. Actual scores (Table 3) ranged from 2 to 31 (M = 14.45, SD = 5.12) and approached normal distribution (skewness = −0.355, kurtosis = −0.314). Scores were higher for male students (U = 26,170, p < 0.001), students who had previously taken a genetics course (U = 28,204, p < 0.001), and students in their last semester of nursing school (U = 68,670, p < 0.001). Minimal correlation was noted between score and age (r = 0.074, p = 0.05).
Student (N = 705) Demographics and Total Scores of Genomic Nursing Concept Inventory Beta Testing
The mean scale score was 14.59 (47.1% correct responses). Item difficulty (Table 4) ranged from 0.13 to 0.84 (M = 0.47, SD = 0.186). One item (#11) had difficulty outside the target range (p-value = 0.13). Item–total correlations ranged from 0.026 to 0.536 (M = 0.273), with five items (#11, #13, #14, #23, and #30) correlating below 0.15 to the total score. Cronbach’s alpha was 0.77.
Item (N = 31) Analysis of the Genomic Nursing Concept Inventory, Beta Version
Difficulty of the GNCI (47%) is comparable to other concept inventories and supports using a pretest–posttest design to measure learning gains. The most difficult item on the GNCI (#11) also showed poor item–total correlation (0.134) and warrants revision or replacement, as it is the only item addressing a key concept (gene expression). Four additional items (#13, #14, #23, and #30) showed acceptable difficulty, but item–total correlations were below the target level of 0.15; these also warrant scrutiny. Cronbach’s alpha of 0.77 met the target for internal consistency reliability.
The promising psychometric performance of the GNCI in the current study must be considered in the context of study limitations, in particular, the use of students in a single baccalaureate program. Broader testing is imperative. Of interest, a pilot study by Ricciardi, McCabe, and Ward (2012), in which the GNCI was administered to 75 practicing nurses, revealed similar psychometric features (scale difficulty 44%, item difficulty 8% to 89%, mean item discrimination 0.264, Cronbach’s alpha 0.761). The authors are currently administering the GNCI to students in BSN programs in several states to further explore instrument reliability and to test a process for multisite, large-scale administration.
Another study limitation is that inventory items have been mapped to genetic and genomic concepts on rational grounds only; analysis to explore the relationship between individual items and the content domain is about to begin. Finally, the GNCI content domain was drawn from competencies endorsed in the United States, threatening the utility of the GNCI among nursing students in other countries. However, the content domain is foundational to the delivery of genome-based nursing care worldwide, thus inviting the validation of the GNCI among nursing students globally.
The utility of the GNCI as a formative and summative assessment is directly related to its reliability and validity in measuring knowledge in a wide range of students. The students in the current study represented both genders, individuals who had or had not taken a previous genetics course, and those at the beginning and at the end of a nursing curriculum. Demographic data will continue to be collected to support the examination of between-group differences in scores and to validate the inventory across a broad population of nursing students.
For 2 years, the authors have used the GNCI in a large BSN program to measure student knowledge across the curriculum and to collect data for further psychometric analysis. Validation of the inventory with postlicensure BSN students, graduate nursing students, and practicing nurses has begun. In addition to ongoing reliability testing, affirmation of content validity and improvement of underperforming items are underway. Using methodologies from Bowling et al. (2008) and Streveler et al. (2011), genetic nurse experts will be recruited to a Delphi study to reestablish the relevance of GNCI concepts to nursing practice and to rank the degree to which each item tests its related concept. Further cognitive interviews using student focus groups and personal response systems are being conducted to identify problems with underperforming items. Within classical test theory, exploration of test–retest reliability and factor structure is about to begin. Finally, item response theory is being applied to assess the relationship of each item to the entire scale. Collectively, these steps will inform scale revision. The GNCI will then be ready for large-scale testing in multiple colleges of nursing.
Even as the GNCI is being refined, it represents a useful tool to support genomic nursing education. The inventory is easy to administer, requires only 30 minutes to complete, and can be scored electronically. Because each item maps to a specific concept, data reveal which concepts students understand, which concepts they do not understand, and which alternate conceptions (or misconceptions) they hold. The inventory is therefore useful as a formative assessment, allowing instructors to target their teaching to poorly understood concepts and, in particular, to address misconceptions directly. For example, the science education literature indicates a common misconception among high school and college students is the belief that the function of a gene is to determine a particular trait (Mills Shaw, Van Horne, Zhang, & Boughman, 2008). Although it is true that occasionally a single gene is sufficient to determine a trait, the one gene–one trait concept is inconsistent with current prevailing scientific thought. As of fall 2013, after administering the GNCI to a total of 634 beginning nursing students, the authors found that 59% of students share that misconception (Ward, 2013). Such information is directly actionable, allowing targeted instruction to correct the misconception.
Concept inventories are also used as summative assessments to measure learning gains and teaching effectiveness. The posttest reliability of the GNCI has not yet been established, but the authors’ experience in administering the instrument at several time points provides an example of the utility of using a concept inventory to measure both short-term and sustained learning. Administering the GNCI at the completion of a 2 credit-hour genomics course (N = 353) revealed that the percentage of students who held the one gene–one trait misconception fell from 59% to 7% (Ward, 2013). However, much of that learning gain was not sustained. When the GNCI was administered again 1 year later (shortly before graduation), 25% of 207 students indicated the primary function of a gene was to determine a specific trait. This example demonstrates not only the difficulty of correcting well-engrained misconceptions but also the utility of the GNCI as a summative assessment to measure learning gains and support evidence-based education.
Concept inventories have been described as “million dollar instruments” (D’Avanzo, 2008, p. 1081) because they take years to develop and require the expertise of content experts, educators, and psychometricians. Even as inventory refinement proceeds, the GNCI represents a practical tool, allowing faculty to quickly assess student understanding and identify misconceptions. Inventory data support curriculum design and evaluation and, pending validation of test–retest reliability, facilitate measurement of learning gains. Following validation with other populations, the GNCI use may be extended to students in non-baccalaureate or graduate nursing curricula, nursing faculty, and practicing nurses. This conceptually derived, empirically tested tool fills a compelling need in the planning, execution, and evaluation of outcomes-based genomic nursing education.
- American Nurses Association & International Society of Nurses in Genetics. (2007). Genetics/genomics nursing: Scope and standards of practice. Silver Spring, MD: American Nurses Association.
- Arimori, N., Nakagomi, S., Mizoguchi, M., Morita, M., Ando, H., Mori, A. & Holzemer, W.L. (2007). Competencies of genetic nursing practise in Japan: A comparison between basic and advanced levels. Japan Journal of Nursing Science, 4, 45–55. doi:10.1111/j.1742-7924.2007.00075.x [CrossRef]
- Ausubel, D.P., Novak, J.D. & Hanesian, H. (1978). Educational psychology: A cognitive view (2nd ed). New York, NY: Holt, Rinehart, and Winston.
- Bailey, J.M. (2008). Development of a concept inventory to assess students’ understanding and reasoning difficulties about the properties and formation of stars. Astronomy Education Review, 6, 133–139. doi:10.3847/AER2007028 [CrossRef]
- Benner, P., Sutphen, M., Leonard, V. & Day, L. (2010). Educating nurses: A call for radical transformation. San Francisco, CA: Jossey-Bass.
- Bowling, B.V., Acra, E.E., Wang, L., Myers, M.F., Dean, G.E., Markle, G.C. & Huether, C.A. (2008). Development and evaluation of a genetics literacy assessment instrument for undergraduates. Genetics, 178, 15–22. doi:10.1534/genetics.107.079533 [CrossRef]
- Bransford, J.D., Brown, A.L. & Cocking, R.R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academies Press.
- Calzone, K.A., Cashion, A., Feetham, S., Jenkins, J., Prows, C.A., Williams, J.K. & Wung, S.F. (2010). Nurses transforming health care using genetics and genomics. Nursing Outlook, 58, 26–35. doi:10.1016/j.outlook.2009.05.001 [CrossRef]
- Collins, C.A. & Stiles, A.S. (2011). Predictors of student outcomes on perceived knowledge and competence of genetic family history risk assessment. Journal of Professional Nursing, 27, 101–107. doi:10.1016/j.profnurs.2010.09.007 [CrossRef]
- Conrad, F. & Blair, J. (1996). From impressions to data: Increasing the objectivity of cognitive interviews. Proceedings of the Section on Survey Research Methods (pp. 1–9). Alexandria, VA: American Statistical Association.
- Consensus Panel on Genetic/Genomic Nursing Competencies. (2006). Essential nursing competencies and curricula guidelines for genetics and genomics. Silver Spring, MD: American Nurses Association.
- Consensus Panel on Genetic/Genomic Nursing Competencies. (2009). Essentials of genetic and genomic nursing: Competencies, curricula guidelines, and outcome indicators. Silver Spring, MD: American Nurses Association.
- Crane, M., Read, C.Y. & Twomey, J. (2013). Assessing the genetic literacy of baccalaureate nursing students: A quantitative research study. Manuscript submitted for publication.
- Daack-Hirsch, S., Driessnack, M., Perkhounkova, Y., Furukawa, R. & Ramirez, A. (2012). A practical first step to integrating genetics into the curriculum. Journal of Nursing Education, 51, 294–298. doi:10.3928/01484834-20120309-02 [CrossRef]
- D’Avanzo, C. (2008). Biology concept inventories: Overview, status, and next steps. BioScience, 58, 1079–1085. doi:10.1641/B581111 [CrossRef]
- Dodson, C.H. & Lewallen, L.P. (2011). Nursing students’ perceived knowledge and attitude towards genetics. Nurse Education Today, 31, 333–339. doi:10.1016/j.nedt.2010.07.001 [CrossRef]
- Dufresne, R.J., Leonard, W.J. & Gerace, W.J. (2002). Making sense of students’ answers to multiple-choice questions. The Physics Teacher, 40, 174–180. doi:10.1119/1.1466554 [CrossRef]
- Elrod, S. (2007). Genetics concepts inventory. Retrieved from http://bioliteracy.colorado.edu/Readings/papersSubmittedPDF/Elrod.pdf
- Frey, B.B., Petersen, S., Edwards, L.M., Pedrotti, J.T. & Peyton, V. (2005). Item-writing rules: Collective wisdom. Teaching & Teacher Education, 21, 357–364. doi:10.1016/j.tate.2005.01.008 [CrossRef]
- Garvin-Doxas, K., Klymkowsky, M. & Elrod, S. (2007). Building, using, and maximizing the impact of concept inventories in the biological sciences: Report on a National Science Foundation-sponsored conference on the construction of concept inventories in the biological sciences. CBE Life Sciences Education, 6, 277–282. doi:10.1187/cbe.07-05-0031 [CrossRef]
- Giarelli, E. & Reiff, M. (2012). Genomic literacy and competent practice: Call for research on genetics in nursing education. Nursing Clinics of North America, 47, 529–545. doi:10.1016/j.cnur.2012.07.006 [CrossRef]
- Greco, K.E., Tinley, S. & Seibert, D. (2012). Essential genetic and genomic competencies for nurses with graduate degrees. Silver Spring, MD: American Nurses Association and International Society of Nurses in Genetics.
- Haladyna, T.M., Downing, S.M. & Rodriguez, M.C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15, 309–334. doi:10.1207/S15324818AME1503_5 [CrossRef]
- Hufnagle, B. (2001). Development of the astronomy diagnostic test. Astronomy Education Review, 1, 47–51. doi:10.3847/AER2001004 [CrossRef]
- Kaplan, R.M. & Saccuzzo, D.P. (1997). Psychological testing: Principles, applications, and issues. Pacific Grove, CA: Brooks/Cole.
- Kehoe, J. (1995). Basic item analysis for multiple-choice tests. Practical Assessment, Research & Evaluation, 4, Article 10. Retrieved from http://pareonline.net/getvn.asp?v=4&n=10
- Kirk, M., McDonald, K., Longley, M., Anstey, S., Anionwu, E. & Benjamin, C. (2003). Fit for practice in the genetics era: A competence-based education framework for nurses, midwives and health visitors. Pontypridd, Wales, United Kingdom: University of Glamorgan. Retrieved from http://www.geneticseducation.nhs.uk/downloads/reportsdocs/16206/FitforPractice_Extendedsummary.pdf
- Krippendorff, K. (1980). Content analysis: An introduction to its methodology. Newbury Park, CA: Sage.
- Maradiegue, A., Edwards, Q.T., Seibert, D., Macri, C. & Sitzer, L. (2005). Knowledge, perceptions, and attitudes of advanced practice nursing students regarding medical genetics. Journal of the American Academy of Nurse Practitioners, 17, 472–479. doi:10.1111/j.1745-7599.2005.00076.x [CrossRef]
- Martin, J., Mitchell, J. & Newell, T. (2003, November). Development of a concept inventory for fluid mechanics. Paper presented at the 33rd ASEE/IEEE Frontiers in Education Conference, Boulder, CO.
- Mills Shaw, K.R., Van Horne, K., Zhang, H. & Boughman, J. (2008). Essay contest reveals misconceptions of high school students in genetics content. Genetics, 178, 1157–1168. doi:10.1534/genetics.107.084194 [CrossRef]
- National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. Pellegrino, J., Chudowsky, N. & Glaser, R. (Eds.). Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academies Press.
- Nelson, M.A., Geist, M.R., Miller, R.L., Streveler, R.A. & Olds, B.M. (2007). How to create a concept inventory: The thermal and transport concept inventory. Retrieved from http://www.thermalinventory.com/papers/2007HowCreateConceptInv.pdf
- Nunnally, J.C. & Bernstein, I.H. (1994). Psychometric theory (3rd ed.). New York, NY: McGraw-Hill.
- Ratzan, S.C. & Parker, R.M. (2006). Health literacy—identification and response. Journal of Health Communication, 11, 713–715. doi:10.1080/10810730601031090 [CrossRef]
- Ricciardi, C., McCabe, M. & Ward, L. (2012, October). Application of a genomic knowledge inventory to practicing pediatric nurses. Poster session presented at the meeting of the International Association of Clinical Research Nurses, Houston, TX.
- Richardson, J. (2004). Concept inventories: Tools for uncovering STEM students’ misconceptions. In Invention and impact: Building excellence in undergraduate science, technology, engineering and mathematics (STEM) education (pp. 19–25). Washington, DC: American Association for the Advancement of Science.
- Sadler, T. (2003). Informal reasoning regarding socioscientific issues: The influence of morality and content knowledge (Unpublished doctoral dissertation). University of South Florida, Tampa, FL.
- Smith, M.K., Wood, W.B. & Knight, J.K. (2008). The genetics concept assessment: A new concept inventory for gauging student understanding of genetics. CBE Life Sciences Education, 7, 422–430. doi:10.1187/cbe.08-08-0045 [CrossRef]
- Statistical Analysis of Multiple Choice Exams. (n.d.). Retrieved from http://chemed.chem.purdue.edu/chemed/stats.html
- Streveler, R.A., Miller, R.L., Santiago-Roman, A.I., Nelson, M.A., Geist, M.R. & Olds, B.M. (2011). Rigorous methodology for concept inventory development: Using the ‘assessment triangle’ to develop and test the Thermal and Transport Science Concept Inventory (TTCI). International Journal of Engineering Education, 27, 968–984.
- Taylor, M. & Smith, S. (2009). How do you know if they’re getting it? Writing assessment items that reveal student understanding. Science Scope, 32(5), 60–64.
- Treagust, D. (1988). Development and use of a diagnostic test to evaluate students’ misconceptions in science. International Journal of Science Education, 10, 159–169. doi:10.1080/0950069880100204 [CrossRef]
- Ward, L.D. (2013, October). A model for evaluating genomic nursing education. Paper presented at the 25th Anniversary Conference of the International Society of Nurses in Genetics, Bethesda, MD.
Psychometric Features of the Genetics Literacy Assessment Instrument in Three Studies
| Target Population | Study | Scale Difficulty (%) | Internal Consistency Reliability |
| --- | --- | --- | --- |
| Undergraduate nonscience students (N = 395) | Bowling et al. (2008) | 43 | >0.99a |
| Undergraduate prelicensure BSN students (N = 84) | Daack-Hirsch, Driessnack, Perkhounkova, Furukawa, & Ramirez (2012) | 73 | 0.58a |
| Undergraduate prelicensure BSN students (N = 272) | Crane, Read, & Twomey (2013) | 61 | 0.79b |
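The reliability coefficients in the table above are not defined in this excerpt; for dichotomously scored concept-inventory items, internal consistency is commonly reported as KR-20 (equivalent to Cronbach's alpha for 0/1 items). A minimal sketch of that computation, assuming a student-by-item matrix of 0/1 responses:

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability for dichotomous (0/1) item scores.

    responses: list of lists; one inner list of 0/1 item scores per student.
    """
    n_students = len(responses)
    n_items = len(responses[0])
    # Proportion of students answering each item correctly
    p = [sum(student[i] for student in responses) / n_students
         for i in range(n_items)]
    # Population variance of total test scores
    totals = [sum(student) for student in responses]
    mean_total = sum(totals) / n_students
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_students
    # KR-20: (k / (k - 1)) * (1 - sum(p * q) / total-score variance)
    return (n_items / (n_items - 1)) * (
        1 - sum(pi * (1 - pi) for pi in p) / var_total)
```

By this metric, higher values indicate that items rank students consistently; the nursing samples in the table fall between 0.58 and 0.79.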
Genomic Nursing Concept Inventory, Beta Version, Content Domaina
| Topical Category/Concept | Inventory Item No. |
| --- | --- |
| Genome basics (12 items) | |
| Genome composition and organization | 2, 4, 5, 8 |
| Homozygosity and heterozygosity | 13, 29 |
| Gene function | 1, 6, 9 |
| Gene expression | 11 |
| Genotype–phenotype association | 7 |
| Human genome homogeneity | 3 |
| Mutations (3 items) | |
| Mutations and disease | 19, 21 |
| Germline and somatic mutations | 18 |
| Inheritance (8 items) | |
| Autosomal inheritance | 24 |
| Autosomal dominant | 30, 31 |
| Autosomal recessive | 15, 16 |
| Genomic health care (8 items) | |
| Family history | 23, 26 |
| Pharmacogenomics | 12, 27, 28 |
| Cancer genetics | 20 |
| Genetic testing | 14, 22 |
Student (N = 705) Demographics and Total Scores of Genomic Nursing Concept Inventory Beta Testing
| Cohort (Semester)a,b | n | Mean Age (Range; y) | Gender (n) | Previous Genetics Course (n) | Mean Scorec (SD) | Score Range |
| --- | --- | --- | --- | --- | --- | --- |
| Junior (Fall 2011) | 134 | 25.2 (19 to 56) | Male (21), Female (113) | Yes (14), No (120) | 13.99 (4.99) | 2 to 26 |
| Senior (Fall 2011) | 83 | 26.5 (21 to 56) | Male (15), Female (68) | Yes (2), No (81) | 17.34 (5.18) | 7 to 28 |
| Junior (Spring 2012) | 126 | 25.5 (18 to 60) | Male (22), Female (104) | Yes (14), No (112) | 12.90 (4.55) | 4 to 25 |
| Junior (Fall 2012) | 129 | 24.8 (18 to 52) | Male (18), Female (111) | Yes (2), No (127) | 13.01 (4.78) | 3 to 27 |
| Senior (Fall 2012) | 108 | 26.4 (21 to 57) | Male (22), Female (86) | Yes (12), No (96) | 16.87 (4.83) | 5 to 31 |
| Junior (Spring 2013) | 125 | 25.2 (18 to 58) | Male (17), Female (108) | Yes (16), No (109) | 14.02 (4.91) | 4 to 27 |
Item (N = 31) Analysis of the Genomic Nursing Concept Inventory, Beta Version
| Item No. | Topical Category | Concept Description | Difficultya | Discriminationb |
| --- | --- | --- | --- | --- |
| 1 | Genome basics | Gene function | 0.36 | 0.386 |
| 2 | Genome basics | Genome organization | 0.68 | 0.363 |
| 3 | Genome basics | Human genome homogeneity | 0.51 | 0.399 |
| 4 | Genome basics | Genome organization | 0.30 | 0.536 |
| 5 | Genome basics | Genome composition | 0.54 | 0.191 |
| 6 | Genome basics | Gene function | 0.31 | 0.238 |
| 7 | Genome basics | Genotype–phenotype association | 0.28 | 0.317 |
| 8 | Genome basics | Genome organization | 0.29 | 0.347 |
| 9 | Genome basics | Gene function | 0.31 | 0.247 |
| 11 | Genome basics | Gene expression | 0.13 | 0.134 |
| 12 | Genomic health care | Pharmacogenomics | 0.76 | 0.257 |
| 13 | Genome basics | Homozygosity and heterozygosity | 0.19 | 0.088 |
| 14 | Genomic health care | Genetic screening tests | 0.64 | 0.139 |
| 20 | Genomic health care | Gene testing in cancer | 0.31 | 0.342 |
| 21 | Mutations | Role of mutations in disease | 0.61 | 0.284 |
| 22 | Genomic health care | Carrier testing | 0.48 | 0.291 |
| 23 | Genomic health care | Family history—red flags | 0.25 | 0.132 |
| 26 | Genomic health care | Family health history—benefit | 0.84 | 0.177 |
| 27 | Genomic health care | Pharmacogenomics | 0.47 | 0.295 |
| 28 | Genomic health care | Pharmacogenomics | 0.65 | 0.233 |
| 29 | Genome basics | Homozygosity and heterozygosity | 0.35 | 0.179 |
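The difficulty and discrimination indices above can be computed directly from a 0/1 response matrix. Difficulty is conventionally the proportion of students answering the item correctly; the discrimination index used in the article is not defined in this excerpt, so the sketch below assumes a corrected item-total point-biserial correlation (item score vs. total score excluding that item), a common choice in classical item analysis:

```python
from statistics import mean, pstdev

def item_stats(responses):
    """Per-item (difficulty, discrimination) for a 0/1 response matrix.

    responses: list of lists; one inner list of 0/1 item scores per student.
    Discrimination here is the corrected item-total point-biserial
    correlation — an assumption, since the source's index is unspecified.
    """
    n_items = len(responses[0])
    stats = []
    for i in range(n_items):
        item = [s[i] for s in responses]
        rest = [sum(s) - s[i] for s in responses]  # total excluding item i
        difficulty = mean(item)
        mi, mr = mean(item), mean(rest)
        # Pearson correlation between item score and rest-of-test score
        cov = mean((x - mi) * (y - mr) for x, y in zip(item, rest))
        sd_i, sd_r = pstdev(item), pstdev(rest)
        discrimination = cov / (sd_i * sd_r) if sd_i and sd_r else 0.0
        stats.append((difficulty, discrimination))
    return stats
```

Under these definitions, very easy or very hard items (e.g., difficulty 0.84 or 0.13) tend to show weak discrimination, the pattern visible in the table.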