Journal of Gerontological Nursing

Feature Article 

Qualitative Validation of the Nursing Home IT Maturity Staging Model

Kimberly R. Powell, PhD, RN; Gregory L. Alexander, PhD, RN, FAAN, FACMI

Abstract

The goal of the current study was to qualitatively explore issues of validity, specificity, and sensitivity regarding the nursing home (NH) information technology (IT) maturity survey and staging model. Participants who completed the NH IT maturity survey were recruited during pilot testing of the survey and staging model. Cognitive interviewing was used to collect qualitative data. Findings indicate the NH IT maturity survey and staging model is a straightforward and acceptable instrument. Every participant in our study agreed with the IT maturity stage assigned to their facility, based on their total score on the IT maturity survey. However, some participants were not sure how to answer some questions on the survey because they did not have in-depth knowledge of IT processes that took place outside of their NH facility and others experienced difficulty interpreting items because their NH facility was in a time of transition. The next step in development is quantitative psychometric testing and use of the instrument in a 3-year national study. [Journal of Gerontological Nursing, 46(7), 47–54].

Over the past 20 years, there has been an explosive trend in information technology (IT) system implementation in health care. Billions of dollars and countless resources have been spent implementing IT systems to improve efficiency, quality, and safety across different health care settings, including nursing homes (NHs). NHs are a critical component of health care systems around the globe, but particularly in the United States, where more than 16,000 facilities provide care to more than 1.7 million residents (National Center for Health Statistics, 2014). The need for NH services is expected to grow significantly in the coming years as the world's population is aging at an unprecedented rate. Virtually every country in the world is experiencing growth in the number and proportion of older persons in its population. It is estimated that by 2050, one in six people in the world will be older than 65 (16%), an increase from one in 11 in 2019 (9%) (United Nations, Department of Economic and Social Affairs, 2019). As older adults live longer, increased incidence of chronic conditions and complexity of care are straining health resources in NHs, a setting historically plagued by limited resources (Anderson & Horvath, 2004; Castle & Ferguson, 2010).

Health IT systems have the potential to improve quality, safety, communication, and efficiency in NHs. Despite being ineligible for federal incentives promoting adoption of health IT, NHs are growing in their IT maturity, defined by how IT implementation moves through maturation processes from disparate fragmented systems to fully integrated systems (Alexander, Powell, et al., 2019). A recent national study of U.S. NHs found that 95% had computerized medical records and 46% had some capability for health information exchange (Powell et al., 2020). As NHs continue to make progress in IT implementation, they now face challenges related to evaluating systems and maximizing IT performance.

IT evaluation methods should be guided by a theoretically grounded maturity model, not only to enable comparisons of current IT stages but also to show IT evolution over time. In other words, NH administrators can benefit from understanding the evolutionary characteristics of IT by systematically identifying how these systems can be used and improved to meet strategic plans through continuous quality improvement. In its simplest conception, maturity refers to the state of being fully developed, and maturity stage refers to a succession of changes that affect an entity (e.g., a species, an industry, or a society) (Seong Leem et al., 2008). Maturity models, composed of several stages, provide a progressive, directional set of changes that improve performance with the passage of time. Nolan's stages of growth model, developed in the 1970s, was one of the first models to address IT maturity within an organization. According to Nolan, organizations slowly begin to use IT in the initiation stage (Stage 1) followed by a period of rapid growth of IT in the contagion stage (Stage 2). The rapid proliferation of systems that occurs in Stage 2 creates the need for formalized controls (Stage 3). As people become savvier IT users in a system and begin to perceive its value, integration (Stage 4) of systems and optimization of data management occur. Data administration (Stage 5) addresses issues of data ownership and security. Finally, steady growth produces maturity (Stage 6) (Nolan, 1979).

Over time, Nolan's model has been validated, modified, and used as a foundation for building new models. A recent literature review found 14 health IT maturity models had been developed between 2006 and 2015. However, these models were not developed to assess the general health care environment but rather specific IT components such as mobile health, electronic medical records (EMRs), interoperability, telemedicine, and usability (Carvalho et al., 2016). No models in the review were developed to address NH IT maturity. Prior to our work, no universally accepted NH IT maturity model existed to trend IT adoption and determine the impact of increasing IT maturity on NH quality. Situating NH IT adoption within a maturity model provides a roadmap for NH stakeholders about technological capabilities and promotes progression to higher stages of IT maturity. This model assumes that higher stages of IT maturity help facilitate improved quality performance outcomes. If true, then determining staging criteria and trends in IT maturity level is an important pathway to understand what technologies make a difference in NH quality performance. This type of evidence will help NH leaders make IT adoption decisions based on the potential to improve resident outcomes. In addition, understanding IT adoption may be helpful when negotiating contracts and establishing relationships for value-based care.

The NH IT Maturity Staging Model

Since 2016, our research team has been developing and testing a NH IT maturity survey and staging model (Alexander & Madsen, 2018; Alexander et al., 2017; Alexander, Madsen, et al., 2019). The survey measures IT adoption across three IT dimensions (capabilities, extent of use, and degree of integration) and three health care domains (resident care, clinical support, and administrative activities). This results in a 3 × 3 (dimension/domain) model with nine subscales (scored from 0 to 100) and a total IT maturity score (maximum 900). The IT maturity staging model indicates a range of total IT maturity scores that correspond to each stage in the model, from Stage 0 (nonexistent IT solutions or EMR) to Stage 6 (use of data by resident or resident representative to generate clinical data and drive self-management) (Alexander, Powell, et al., 2019). The instrument and staging model have been tested and the model refined through rigorous techniques, including a four-round Delphi study and a pilot study (Alexander, Powell, et al., 2019). The next step in development is validation of the instrument and staging model using quantitative and qualitative methods. Quantitatively, we have evaluated performance of the survey and model through pilot testing and by identifying the specific effect of question measurement error on survey estimates (component factor analysis). Quantitative results are not reported in the current article. Qualitatively, our goal was to further validate the questions and answers contained in the survey, identifying how and where each question accomplished or failed to achieve its measurement purpose. The goal of the current study was to qualitatively explore issues of validity, specificity, and sensitivity regarding the NH IT maturity survey and staging model.
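As a concrete illustration of the staging arithmetic, the score ranges reported in Table 3 divide the 0-to-900 total (the sum of the nine 0-to-100 subscales) into seven equal bands of 900/7 ≈ 128.57 points each. A minimal sketch of that mapping follows; the function name and the cap-at-Stage-6 logic are ours for illustration, not part of the published instrument.

```python
# Map a total NH IT maturity score (0-900) to a maturity stage (0-6),
# using the equal-width score ranges reported in Table 3.
# Illustrative sketch only; names are ours, not the instrument's.

STAGE_WIDTH = 900 / 7  # approximately 128.57 points per stage

def it_maturity_stage(total_score: float) -> int:
    """Return the IT maturity stage (0-6) for a total score (0-900)."""
    if not 0 <= total_score <= 900:
        raise ValueError("total score must be between 0 and 900")
    # Integer division gives the band; a score of exactly 900 would
    # yield 7, so cap the result at Stage 6.
    return min(int(total_score // STAGE_WIDTH), 6)

# Participant exemplars from Table 3:
# ID 301, ITM = 119.4 -> Stage 0; ID 307, ITM = 430 -> Stage 3
```

Applied to the exemplars in Table 3, a facility scoring 119.4 falls in Stage 0 and one scoring 823.5 falls in Stage 6, matching the ranges reported there.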

Qualitative techniques have been developed to explore cognitive issues in the process of responding to survey questionnaires. Implicit in survey research are the assumptions that respondents can understand the questions being asked, that questions are understood in the same way by all respondents, and that respondents are willing and able to answer such questions (DeVellis, 2012). Qualitative pre-testing is an established practice in survey methodology. Questions and response options included in survey instruments are based on a shared understanding of language, and differences in respondent interpretation of survey questions can undermine data validity (Mallinson, 2002). A considerable body of literature has been published in which these techniques are discussed and their validity and reliability assessed (Esposito & Rothgeb, 2012). By collecting qualitative data from a small number of respondents, we hoped to identify how questions are interpreted across survey respondents, leading to valid results.

Method

Study Design

The study design included qualitative interviews to validate survey and staging model outcomes. We used cognitive interviewing to collect qualitative data. Cognitive interviewing can be used as a diagnostic tool, which pays explicit attention to the mental processes respondents use to answer survey questions and thus allows covert as well as overt problems to be identified (Collins, 2003). This technique is flexible in nature and is complementary to, rather than a replacement for, traditional pilot testing. This study was approved by the Institutional Review Board at the University of Missouri, Columbia.

Participants and Sampling Strategy

We recruited participants who completed the NH IT maturity survey during pilot testing of the survey and staging model in a prior study. Participants were from facilities representing a range of IT maturity scores and stages (Table 1). In contrast to psychometric validation studies that are concerned with generalizability, analytic accuracy, and insight, qualitative validation studies depend on the in-depth data collected from each individual participant and the diversity among participants. Semi-structured, cognitive interviews were conducted with NH administrative leaders (N = 12) using Zoom videoconferencing and took place between April 15, 2019 and May 3, 2019.

Table 1: Characteristics of Study Participants (N = 12)

Data Analysis

Cognitive interviewing involves probing questions about specific items on the survey and general inquiries about participant reaction to the survey and staging model. Each interview was digitally recorded and transcribed verbatim. Finished transcripts were verified by members of the research team for accuracy against the audio recording. Responses were systematically documented and organized according to the specific probe asked. Codes were developed for themes that emerged from data analysis of the general perceptions. Trustworthiness of findings was achieved using a variety of methods. Credibility was addressed through member checking (i.e., deliberate probing during data collection to ensure that participants' meanings were understood). An audit trail was maintained for dependability, and an interview guide was used to maintain consistency. Confirmability was addressed as all transcripts were independently coded by the primary author (K.P.) and three research assistants. Any differences in coding were discussed among the research team until consensus was reached. Transferability was enhanced by using field notes and comparing findings across participants using exemplar quotes. We used a shared file, accessible only to the research team and updated in real time, to review and organize data. The primary and secondary authors met frequently to discuss questions about methods, meanings, and interpretations. This technique is in the same spirit as interrater reliability in quantitative research (Creswell, 2012).

We began each interview by asking participants to share their general perceptions of the NH IT maturity survey and staging model and then probed into specific sections of the survey to validate comprehension, retrieval, judgment, and response processes. Lastly, we asked participants to validate the stage assigned to their facility based on their total IT maturity score and the NH IT maturity staging model.

Results

Results are reported in two descriptive formats. Table 2 is a linear presentation of themes, sub-themes, and defining characteristics from overall perceptions of the survey and staging model. Table 3 offers exemplars used to validate the survey and staging model taken directly from interview transcripts.

Table 2: Themes, Sub-Themes, and Defining Characteristics of the NH IT Maturity Survey and Staging Model

Table 3: Exemplars Validating the NH IT Maturity Staging Model

General Perceptions of the Survey and Staging Model

Overall, participants stated the survey was detailed, easy to understand, and complete (i.e., comprehensive). Two themes emerged from participants' general thoughts about the survey and staging model: (1) uncertainty about specific external IT systems, and (2) ambiguity during times of transition. Some participants (n = 4) were not sure how to answer some of the questions on the survey because they did not have in-depth knowledge of IT processes that took place external to their facility. For example, one participant said, “I was not real sure that some of the questions applied to us with regard to our pharmacy or our therapy department because we don't have our own in-house therapy department; that's contracted out.” Another participant described this uncertainty because some IT decisions are made at the local level and others at the corporate level:

There were some questions related to the survey that were very specific to my role and what we do here. There were some that were kind of more on the corporate side for IT. It was difficult to really gauge what an appropriate answer would be simply because I'm not fully aware of some of the IT configurations and projects that they have going on.

The theme of ambiguity during times of transition refers to comments made by participants (n = 5) about their difficulty interpreting items on the survey because their NH was in a time of transition. One participant described this in terms of the transition from paper to digital: “We are half [paper] and half [electronic]. It was difficult to really explain what's going on here.”

Validation of the NH IT Maturity Staging Model

Throughout the cognitive interviews, we asked participants about specific items on the survey to validate the accuracy of the staging model. Participants were asked specifically about items that fell within their IT maturity stage as well as items that fell in the stage below and above them. For example, if the participant being interviewed represented a facility with an IT maturity Stage 2, we would ask questions about items in Stage 1, 2, and 3. Findings revealed all participants (N = 12) were able to validate their IT maturity stage and all agreed with the stage assigned to them based on their survey results (Table 3).

Discussion and Implications

Findings from the current study shed light on the utility of the NH IT maturity survey and the accuracy of the NH IT maturity staging model in reflecting a facility's stage. Findings indicate the NH IT maturity survey and staging model is a straightforward and acceptable instrument. Every participant in our study agreed with the IT maturity stage assigned to their facility based on their total score on the IT maturity survey. Although every participant agreed with their IT maturity stage, some questions arose related to uncertainty about external IT systems and ambiguity during times of transition. Uncertainty related to external IT systems stemmed from a lack of in-depth knowledge of IT processes that took place outside the NH. For example, NHs often use third-party vendors for pharmacy services; thus, NH administrators were not clear regarding which specific pharmacy processes (e.g., drug interaction checking, allergy alerts, duplicate order checking) were computerized.

IT is the most outsourced function in U.S. health care, as it is seen as a strategic tool for controlling costs without affecting quality of care (Roberts, 2001). Vendors providing hardware, software, networking, and IT consulting solutions are ubiquitous, and billions of dollars have been spent in the quest to become and remain current with IT standards. A recent study found that strategic outsourcing, including IT, in NHs reduced operational costs and led to strong relationships with vendors without negatively affecting quality of patient care (Hui-chuan, 2018). Our findings highlight the need for enhanced communication between vendors and NH leaders to ensure they are aware of, and maximizing the benefits of, IT systems available to staff, residents, and family members.

The other question that arose during this qualitative validation study was ambiguity during times of transition. It was not surprising to find that some NHs in our study were in the process of transitioning from paper-based to computerized systems. Some participants explained their difficulty responding to items on the survey because they were amid a transition. In fact, many respondents indicated that between the time they completed the survey and when they were interviewed (all within a 2-month time frame), their IT capabilities had changed. Although NHs have been slow adopters of technology compared to acute care providers, such as hospitals, they have made significant progress in recent years. A recent study found that 84% of NHs had adopted an EMR, although many reported not fully using the EMR and far fewer (3%) reported that their organization was fully engaged in interoperability (i.e., able to find, send, receive, and integrate patient information with no manual effort) (Vest et al., 2019). That study demonstrates substantial improvement in IT adoption compared to a 2016 study, which estimated that only 64% of NHs had an EMR (Alvarado et al., 2017). Although the issue of ambiguity during times of transition did not affect respondents' ability to complete the survey, it indicates a need for repeated measures and collection of longitudinal data to further understand how IT matures over time.

In the current study, we interviewed a heterogeneous sample of NH leaders, to qualitatively explore issues of validity, specificity, and sensitivity of the NH IT maturity survey and staging model. Validity is a fundamental psychometric property, ensuring the instrument measures what it intends to measure (DeVon et al., 2007). Content validity refers to the extent to which the sampled items adequately reflect the domain and operational definition of the construct (Almanasreh et al., 2019). For some, content validity is the most important property in instrument validation, as unfavorable results could introduce bias into all other measurements. We relied on a panel of experts to evaluate elements of the instrument and describe their relevance and representativeness to the content domain. We used cognitive interviewing, an evidence-based qualitative method designed specifically to investigate whether a survey question fulfills its intended purpose. All participants (N = 12) in our study found the NH IT maturity survey and staging model to be acceptable and valid. Had there been any evidence that the instrument was unacceptable or flawed, we would have amended the instrument before proceeding with further evaluation and testing.

Limitations

It is not possible to draw quantitative inferences from the current study. For example, exact estimation or computation of the effect of misinterpretations of the instrument on resulting NH IT maturity scores or stages is not possible. However, we plan to quantitatively evaluate the IT maturity survey and staging model in a future study. The small number of participants in this study should not be considered a limitation. One of the strengths of this validation process is the ability to obtain a first-hand account of the cognitive work involved in responding to survey items from those who are involved in the process. By collecting this rich data even from a small number of participants, we were able to establish content validity before moving on to quantitative psychometric testing.

Conclusion

The NH IT maturity survey and staging model is an acceptable and valid instrument. Overall, participants found the survey to be detailed, easy to understand, and complete. During this qualitative validation study, two questions arose related to uncertainty regarding external IT systems and ambiguity during times of transition. As the instrument is further developed and tested, these questions offer insight into the cognitive processes respondents are using when they engage with the instrument. The next step in development is quantitative psychometric testing and use of the instrument in a 3-year national study.

References

  • Alexander, G. L. & Madsen, R. (2018). A national report of nursing home quality and information technology: Two-year trends. Journal of Nursing Care Quality, 33(3), 200–207 doi:10.1097/NCQ.0000000000000328 [CrossRef] PMID:29787455
  • Alexander, G. L., Madsen, R., Deroche, C. B., Alexander, R. & Miller, E. (2019). Ternary trends in nursing home information technology and quality measures in the United States. Journal of Applied Gerontology. Advance online publication. doi:10.1177/0733464819862928 [CrossRef] PMID:31311420
  • Alexander, G. L., Madsen, R. W., Miller, E. L., Schaumberg, M. K., Holm, A. E., Alexander, R. L., Wise, K. K., Dougherty, M. L. & Gugerty, B. (2017). A national report of nursing home information technology: Year 1 results. Journal of the American Medical Informatics Association, 24(1), 67–73 doi:10.1093/jamia/ocw051 [CrossRef] PMID:27107444
  • Alexander, G. L., Powell, K., Deroche, C. B., Popejoy, L., Mosa, A. S. M., Koopman, R., Pettit, L. & Dougherty, M. (2019). Building consensus toward a national nursing home information technology maturity model. Journal of the American Medical Informatics Association, 26(6), 495–505 doi:10.1093/jamia/ocz006 [CrossRef] PMID:30889245
  • Almanasreh, E., Moles, R. & Chen, T. F. (2019). Evaluation of methods used for estimating content validity. Research in Social & Administrative Pharmacy, 15(2), 214–221 doi:10.1016/j.sapharm.2018.03.066 [CrossRef] PMID:29606610
  • Alvarado, C., Zook, K. & Henry, J. (2017). Electronic health record adoption and interoperability among U.S. nursing facilities in 2016. https://dashboard.healthit.gov/evaluations/data-briefs/electronic-health-record-adoption-interoperability-nursing-facilities-2016.php
  • Anderson, G. & Horvath, J. (2004). The growing burden of chronic disease in America. Public Health Reports, 119(3), 263–270 doi:10.1016/j.phr.2004.04.005 [CrossRef] PMID:15158105
  • Carvalho, J., Rocha, Á. & Abreu, A. (2016). Maturity models of healthcare information systems and technologies: A literature review. Journal of Medical Systems, 40(6), 1–10 doi:10.1007/s10916-016-0486-5 [CrossRef]
  • Castle, N. G. & Ferguson, J. C. (2010). What is nursing home quality and how is it measured? The Gerontologist, 50(4), 426–442 doi:10.1093/geront/gnq052 [CrossRef] PMID:20631035
  • Collins, D. (2003). Pretesting survey instruments: An overview of cognitive methods. Quality of Life Research: An International Journal of Quality of Life Aspects of Treatment, Care and Rehabilitation, 12(3), 229–238 doi:10.1023/A:1023254226592 [CrossRef] PMID:12769135
  • Creswell, J. W. (2012). Qualitative inquiry & research design: Choosing among five approaches (3rd ed.). Sage.
  • DeVellis, R. F. (2012). Scale development: Theory and applications (3rd ed.). Sage.
  • DeVon, H. A., Block, M. E., Moyle-Wright, P., Ernst, D. M., Hayden, S. J., Lazzara, D. J., Savoy, S. M. & Kostas-Polston, E. (2007). A psychometric toolbox for testing validity and reliability. Journal of Nursing Scholarship, 39(2), 155–164 doi:10.1111/j.1547-5069.2007.00161.x [CrossRef] PMID:17535316
  • Esposito, J. L. & Rothgeb, J. M. (2012). Evaluating survey data: Making the transition from pretesting to quality assessment. In Lyberg, L., Biemer, P., Collins, M., De Leeuw, E., Dippo, C., Schwarz, N. & Trewin, D. (Eds.), Survey measurement and process quality (pp. 541–571). John Wiley & Sons, Inc.
  • Hui-chuan, C. (2018). Assessing outsourcing strategy on quality and performance of US long-term healthcare. Journal of Management Policy & Practice, 19(2), 118–128 doi:10.33423/jmpp.v19i2.1281 [CrossRef]
  • Mallinson, S. (2002). Listening to respondents: A qualitative assessment of the Short-Form 36 Health Status Questionnaire. Social Science & Medicine, 54(1), 11–21 doi:10.1016/S0277-9536(01)00003-X [CrossRef] PMID:11820675
  • National Center for Health Statistics. (2014). Nursing home care. https://www.cdc.gov/nchs/fastats/nursing-home-care.htm
  • Nolan, R. L. (1979). Managing the crises in data processing. https://hbr.org/1979/03/managing-the-crises-in-data-processing
  • Powell, K., Deroche, C. & Alexander, G. (2020). Health data sharing among U.S. nursing homes: A mixed methods study. Journal of the American Medical Directors Association. Advance online publication. doi:10.1016/j.jamda.2020.02.009 [CrossRef] PMID:32224261
  • Roberts, V. (2001). Managing strategic outsourcing in the healthcare industry. Journal of Healthcare Management, 46(4), 239–249 doi:10.1097/00115514-200107000-00007 [CrossRef] PMID:11482242
  • Seong Leem, C., Wan Kim, B., Jung Yu, E. & Ho Paek, M. (2008). Information technology maturity stages and enterprise benchmarking: An empirical study. Industrial Management & Data Systems, 108(9), 1200–1218 doi:10.1108/02635570810914892 [CrossRef]
  • United Nations, Department of Economic and Social Affairs. (2019). World population prospects 2019: Highlights. https://population.un.org/wpp/Publications/Files/WPP2019_Highlights.pdf
  • Vest, J. R., Jung, H. Y., Wiley, K., Jr., Kooreman, H., Pettit, L. & Unruh, M. A. (2019). Adoption of health information technology among US nursing facilities. Journal of the American Medical Directors Association, 20(8), 995–1000.e4 doi:10.1016/j.jamda.2018.11.002 [CrossRef] PMID:30579920

Characteristics of Study Participants (N = 12)

Variable: n (%)
Gender
  Female: 6 (50)
  Male: 6 (50)
Job title
  Administrator: 8 (66)
  Executive Director: 2 (16)
  Chief Operating Officer: 1 (8)
  Security Officer: 1 (8)
IT maturity stage
  0: 1 (8)
  1: 2 (16)
  2: 2 (16)
  3: 2 (16)
  4: 1 (8)
  5: 3 (25)
  6: 1 (8)
Variable: Mean (SD) (Range)
  Years in position: 9.4 (8.2) (1.5 to 30)
  IT maturity score: 424.7 (282.6) (119.4 to 823.5)

Themes, Sub-Themes, and Defining Characteristics of the NH IT Maturity Survey and Staging Model

Theme: General perceptions
  Sub-theme: Detailed. Defining characteristic: The instrument contained detailed and specific information.
  Sub-theme: Easy to understand. Defining characteristic: The instructions (for each section) for completing the instrument were easy to understand, as were the items themselves.
  Sub-theme: Complete. Defining characteristic: The instrument contained a comprehensive list of IT capabilities.
Theme: Uncertainty about external IT systems
  Sub-theme: Not sure the NH had the IT capability described in the survey (i.e., did not know if the NH had the technology). Defining characteristic: The person completing the survey was not sure if the capability existed because decisions were being made outside of the facility (e.g., outsourcing of IT, adoption decisions being made at the corporate level).
  Sub-theme: Not sure if specific IT components are being used in the NH (i.e., did not know if the technology was being used in the NH). Defining characteristic: The survey respondent was unsure if staff (e.g., nurses, aides, therapists) are using specific IT features (e.g., clinical decision support).
Theme: Ambiguity during times of transition
  Sub-theme: Many NHs are in the process of transitioning from paper to digital systems. Defining characteristic: Survey respondents discussed the dynamic IT environment in NHs related to transitioning from paper to digital systems.

Exemplars Validating the NH IT Maturity Staging Model

Stage 0 (ITM score 0 to 128.57): Nonexistent IT solutions or EMR
  Definition: EMR not used; no overarching IT governance.
  Exemplar: "We're still on paper... we do have a laptop, you know, occasionally a laptop will find its way to the nursing station, but it's not used for nursing care." (ID 301, ITM = 119.4)

Stage 1 (ITM score 128.58 to 257.14): Incomplete or disparate fragmented IT solutions
  Definition: Different incongruous IT systems that have distinct functionality, with no integration; isolated systems; may use some standardized terminology in documentation (e.g., clinical diagnosis, nursing interventions, medical records, lab results).
  Exemplar: "It's not really a full blown like electronic medical record system. It's just a way to input information. The MDS and all of the billing stuff is in there but like as far as any of the like physician order sheets, results, the MARs all of that stuff is paper." (ID 310, ITM = 239.3)

Stage 2 (ITM score 257.15 to 385.71): Established IT leadership that governs and coordinates structures, procedures, processes, and policies
  Definition: IT leadership with specific duties and functions; incorporates super-users (e.g., staff knowledgeable about IT use) to assist in building, troubleshooting, implementing, and supporting front line staff with IT tasks. Implementing IT governance and data stewardship processes (e.g., ensuring data quality, capturing appropriate information for each data element). Some techniques are available to join data across disparate systems and are used for data analytics and reporting.
  Exemplar: "We have two IT people that work for the company. And there are 20 facilities and 20 assisted living facilities and so, can those two people handle all of our IT needs? So basic stuff you know someone's having a hard time with Google, for example, then, there's a first layer that tries to do a first run of assistance. When they can't figure it out, which unfortunately is frequently, they then have to kick it up to our own guys. The company does have policies and procedures for any kind of email, computer, anything, social media." (ID 305, ITM = 320)

Stage 3 (ITM score 385.72 to 514.29): Automated internal connectivity and reporting
  Definition: Uses common interfaces that permit secure sharing of data across multiple internal applications. Uses master data sources and classifications to establish data relationships between systems. Implementation of new applications requires adherence to standards for connectivity and internal reporting.
  Exemplar: "Well with our lab, when we order labs, we will get the results electronically submitted back to us. Plus, we'll get a hard copy off a fax machine. So all that is in the EHR. They're integrated with the lab company to where those results go directly to our patient profile. PT and OT and speech therapy. We do have a software system for that, but I'm not sure why that is a separate system. It would be great if that was integrated into [the EHR] as well so that we would be able to pull up any PT notes, therapy notes, and evaluation screens as opposed to our current processes, we have to ask therapy for those hard copy notes." (ID 307, ITM = 430)

Stage 4 (ITM score 514.30 to 642.86): Automated external connectivity and reporting
  Definition: Uses standard interfaces that permit secure data sharing across external applications (e.g., interface with vendors, such as pharmacy, labs, radiology, therapies, and/or other procedures) for treatment-related purposes. Interface with third parties for revenue cycle or quality management. Incorporates Health Information Exchange (HIE) technology. Implementation of new systems requires adherence to standards for external connectivity. Includes nursing and ancillary services documentation used in care management, also for claims and billing purposes.
  Exemplar: "Like PT, OT for example, our therapy company uses [software] and [software] is integrated with [the EHR] but to access that information you can only see some of the information; summaries more than details in the EHR so our MDS nurse and director of nursing have access to [software] itself so they can look more specifically. So it's integrated but I kind of feel like it could be more." (ID 311, ITM = 563.7)

Stage 5 (ITM score 642.87 to 771.43): Clinical risk intervention and predictive analytics
  Definition: System-driven tools that influence the development of treatments and care plans, while minimizing risk. Includes clinical decision support. Analytics guide timely intervention to improve clinical outcomes. Interfaces allow delivery of all-inclusive clinical reporting using virtually all relevant data from internal and external systems. Enables association of external and internal data to predict outcomes and provide benchmarks.
  Exemplars: "So when they [nurses] go to do med[ication] passes if we don't have a med that's due until 10 p.m., it's not going to show up. It's only going to show the meds that you're going to give right now at this time. And it brings up the allergy alert, do not crush, if there's any assessments that have to be done prior to, it will make you do that first." (ID 309, ITM = 649) "So [the EHR] is kind of the core of our operations for electronic data. And so they can enter it and see it just as our nurses can, also speech therapy. So all three disciplines of therapy can bring up [the EHR] and see their individual residents. They can also see meds which is integrated with [the EHR]. So they get the whole picture. Virtually anybody in the building that's in a caregiving position can bring up the whole picture anytime they need to." (ID 308, ITM = 722)
6771.44 to 900Use of data by residents and/or resident representatives to generate clinical data and drive self-managementA secure and protected means for residents and/or resident representatives to generate and access clinical data. Increases transparency of their clinical data in a format that is easily understood by these types of end-users. Resident data are accessible electronically.“If the resident were to want to review medications or anything like that we could do that with our devices, we have computer on wheels that we could present that information to them with just like any scanned documentation about medications or any, you know, an advanced directive, all those are scanned and to their electronic health record, and able to be viewed as well.” (ID 302, ITM = 823.4)
Authors

Dr. Powell is Assistant Professor, University of Missouri Sinclair School of Nursing, Columbia, Missouri; and Dr. Alexander is Professor, Columbia University School of Nursing, New York, New York.

The authors have disclosed no potential conflicts of interest, financial or otherwise. This study was funded by grant R01HS02249 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of the AHRQ.

Address correspondence to Kimberly R. Powell, PhD, RN, Assistant Professor, University of Missouri, S335 Sinclair School of Nursing, Columbia, MO 65211; email: powellk@missouri.edu.

10.3928/00989134-20200605-08
