Journal of Gerontological Nursing

Technology Innovations 

Usability Testing of a Mobile Clinical Decision Support App for Urinary Tract Infection Diagnosis in Nursing Homes

Blaine Reeder, PhD; Cynthia Drake, MA; Mustafa Ozkaynak, PhD; Heidi L. Wald, MD, MSPH

Abstract

The aim of the current study was to conduct usability testing of a mobile clinical decision support (CDS) prototype designed for urinary tract infection (UTI) assessment by nurses in nursing homes (NHs). Usability of the UTIDecide smartphone application (app) was evaluated using cognitive walk-through and think-aloud protocol sessions with nurses (n = 6) at two NH sites. This evaluation was followed by unsupervised field tests lasting ≥1 week with nurses at one site (n = 4) and posttest interviews and administration of the System Usability Scale (SUS). Cognitive walk-through/think-aloud sessions yielded interface design recommendations that were implemented prior to field tests. All test sessions resulted in highly positive perceived usability and usefulness from participants. Average SUS score was 92.5 (n = 3), which equates to an “A” grade for usability. Design recommendations identified for future app versions are: (a) integration of the mobile CDS app with organizational information systems; and (b) expanded features to support assessment of other conditions. [Journal of Gerontological Nursing, 45(7), 11–17.]

Urinary tract infection (UTI) is the most commonly diagnosed infection in nursing homes (NHs) (Herzig et al., 2017; Montoya & Mody, 2011). Some UTI cases are caused by asymptomatic bacteriuria (ASB). Inappropriate treatment of ASB may contribute to antibiotic resistance in NHs (Morrill, Caffrey, Jump, Dosa, & LaPlante, 2016). Therefore, improved methods to increase UTI diagnostic accuracy and decrease inappropriate ASB treatment with antibiotics are important for antimicrobial stewardship strategies in NHs. However, the assessment component critical to accurately diagnosing UTI can be a challenge with frail older adults in NHs (Wald, 2016). To overcome this challenge, evidence-based mobile clinical decision support (CDS) applications (apps) included as part of antimicrobial stewardship strategies may help reduce incorrect UTI diagnosis in NHs by influencing clinician behavior at the point of care (Handler et al., 2013). Accordingly, the aim of the current study was to evaluate the usability of a mobile CDS app using laboratory and field-based methods.

UTIDecide Mobile App Description

The purpose of the current study was to develop and evaluate a high-fidelity prototype of a mobile CDS app for UTI assessment within the workflow of nurses in NHs (Jones et al., 2017; Ozkaynak, Reeder, et al., 2018; Reeder et al., 2019). UTIDecide runs on Android and iOS platforms, implementing a validated algorithm to reduce overtreatment of ASB in acute and long-term care settings (Trautner et al., 2013). After an opening screen that explains the purpose and use of the app, UTIDecide guides the user through symptom documentation. UTIDecide provides recommendations based on the implemented algorithm and documented symptoms, with the option of viewing a personalized Situation, Background, Assessment, Recommendation (SBAR) script to facilitate communication with an offsite provider. UTIDecide includes informational icons (e.g., “help”) within the app workflow to provide technical information derived from evidence-based guidelines. The high-fidelity prototype implemented a feature set robust enough to test app function and acceptability (Davis, Bagozzi, & Warshaw, 1989) with targeted NH end users and to improve designs for a final product. Figure 1 shows example screen captures of the UTIDecide mobile app.

Figure 1. Screen captures of UTIDecide interfaces for start, symptom documentation, recommendations, and Situation, Background, Assessment, Recommendation (SBAR) screens.
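The screen flow described above (symptom documentation, algorithm-based recommendation, optional SBAR script) can be sketched in code. This is an illustrative sketch only: the class, symptom names, threshold logic, and recommendation wording are hypothetical placeholders, not the validated Trautner et al. (2013) algorithm implemented in UTIDecide.

```python
from dataclasses import dataclass

# Illustrative sketch of the app's screen flow (symptom documentation ->
# recommendation -> optional SBAR script). The symptom names and the
# decision rule below are placeholders, NOT the validated algorithm.

@dataclass
class Assessment:
    resident_id: str
    symptoms: dict  # symptom name -> bool, as documented by the nurse

    def recommendation(self) -> str:
        # Placeholder rule: any documented UTI-concordant symptom triggers
        # a "consider urinalysis" path; otherwise supportive care.
        if any(self.symptoms.values()):
            return ("Symptoms concordant with UTI: consider urinalysis "
                    "and notify provider.")
        return ("Symptoms not concordant with UTI: continue close "
                "observation and begin oral hydration.")

    def sbar(self) -> str:
        # Personalized SBAR script assembled from the documented symptoms,
        # mirroring the app's communication-support feature.
        present = [s for s, v in self.symptoms.items() if v] or ["none documented"]
        return (f"Situation: change in condition for resident {self.resident_id}.\n"
                f"Background: documented symptoms: {', '.join(present)}.\n"
                f"Assessment: {self.recommendation()}\n"
                f"Recommendation: discuss plan of care with provider.")

a = Assessment("R-101", {"dysuria": True, "fever": False})
print(a.sbar())
```

The sketch captures only the structure of the flow: checklist-style input, a rule-driven recommendation, and a generated script for provider communication.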

Usability Background

Usability (or perceived ease of use) is user perception of the difficulty in accomplishing a task with the mobile app, whereas perceived usefulness is user perception of potential benefit of the app when used for such a task (Davis et al., 1989). A cognitive walk-through is a usability test method to identify barriers to completion of specific tasks (Kaufman et al., 2003) and is often used with think-aloud protocols where users “think-aloud” while using a system interface to perform these tasks (Jaspers, Steen, van den Bos, & Geenen, 2004; Turner, Reeder, & Wallace, 2013).

Method

Setting and Participants

The study setting was two for-profit NHs with >100 beds (Sites A and B) in a metropolitan area in the Mountain West region of the United States. Each site had a skilled nursing unit with most beds dedicated to long-term care (LTC) residents. Both sites were staffed with RNs, licensed practical nurses (LPNs), and certified nursing assistants (CNAs).

A convenience sample of usability test participants (N = 8) was recruited by a research assistant from the two sites via flyers, word-of-mouth, and face-to-face solicitation. Sample sizes were guided by evidence from studies of usability testing that showed three to five participants reveal the majority of usability problems for a given design iteration (Lewis, 1994; Nielsen & Landauer, 1993; Turner, Lewis, & Nielsen, 2006; Virzi, 1992). Five participants were RNs and three were LPNs. Three RN participants held supervisor roles as Director of Nursing, Assistant Director of Nursing, and Charge Nurse. Participants' work experience ranged from <1 to 22 years in a nursing role. Table 1 shows participant role, site, and test activities.

Table 1: UTIDecide Usability Test Participants and Activities

Test Procedures

Usability testing comprised: (a) a set of cognitive walk-throughs and think-aloud sessions, and (b) a set of field tests followed by interviews and administration of the System Usability Scale (SUS). The SUS is a 10-item survey of subjective usability (Brooke, 1996) that has been shown to be valid and reliable with a range of different technologies and users (Brooke, 2013). The first and second authors (B.R., C.D.) conducted all usability tests on site, alternating as moderator and note-taker. Usability test sessions lasted approximately 30 minutes and were recorded using a digital audio recorder. All participants provided verbal consent prior to testing and were reimbursed with a $50 gift card for each test session in which they participated. All study procedures were approved by The Colorado Multiple Institutional Review Board.

Six cognitive walk-through and think-aloud sessions were conducted at both study sites (Site A, n = 3; Site B, n = 3). Participants were asked to familiarize themselves with UTIDecide on a researcher-provided smartphone, encouraged to ask any clarifying questions about the app, and told to indicate when ready to begin the test session. Each participant assessed four hypothetical patients using UTIDecide from patient vignettes developed for testing purposes (Jones et al., 2017). The four vignettes comprised a brief description of hypothetical patient demographics and symptoms, with two vignettes having symptoms concordant with UTI. As participants stepped through each vignette using UTIDecide, they were encouraged to “think aloud” about their experiences using the app to identify aspects of its usability.

Only participants at Site B performed field tests of usability (n = 4). Field tests involved assisting participants with downloading UTIDecide, orienting them to its use on their own smartphones, and instructing them to use the app when assessing a resident with suspected UTI. Three participants conducted field tests for a 1-week period, as planned. One participant was unavailable during the planned follow-up visit and thus tested the app for a 3-week period prior to being interviewed. Field test participants engaged in posttest interviews and were asked questions about usability, usefulness, workflow, and design issues. Three participants completed the SUS to assess the usability of UTIDecide after the field test. One participant did not use the app during the field test because the opportunity did not present itself and thus declined to complete the SUS.

Data Analysis

The design team met weekly to iteratively discuss and interpret results for all usability testing procedures. Cognitive walk-through/think-aloud sessions were transcribed verbatim by a research assistant. The first and second authors independently reviewed think-aloud transcripts and made notes about usability. The second author then formally coded all transcripts based on usability notes and team-level discussions using three a priori codes: usability, usefulness, and design recommendations. Both coders met to discuss coded results and identify themes of usability, usefulness, workflow, and design.

Summary notes of usability, usefulness, workflow, and design issues were transcribed from post-field test interview recordings by a research assistant with direction from the first and second authors. The second author verified accuracy of transcriptions by spot-checking interview recordings against summary notes. The first and second authors analyzed field interview summaries to identify themes related to usability, usefulness, workflow, and design to improve app design. Both authors met to reconcile disagreements through in-person discussion. Disagreements were few and related to framing of results using common usability terminology. Given the goal and sample size of design iterations, the purpose of posttest interview analysis was to identify obvious issues arising from app use during field tests so the design could be improved. This purpose contrasts with reaching data saturation, as would be the goal of an interview study designed for open-ended exploration of a broader problem domain; such a study would recruit a larger sample and continue coding until few new codes were added to the codebook and a threshold of diminishing returns was reached. The first author calculated usability scores from raw SUS results for each of the three participants who completed the SUS survey.
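The SUS scores reported in this study follow Brooke's (1996) standard scoring procedure: each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the summed contributions are multiplied by 2.5 to yield a 0 to 100 score. A minimal sketch of that calculation (the function name and input format are illustrative, not part of the study's materials):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard SUS scoring (Brooke, 1996): odd-numbered items contribute
    (response - 1), even-numbered items contribute (5 - response); the
    summed contributions are multiplied by 2.5 for a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

# Most favorable response pattern yields the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Averaging the three participants' resulting scores (100, 97.5, and 80) gives the 92.5 reported in Table 2.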

Results

Cognitive Walk-Through and Think-Aloud Sessions

Five themes and a list of six design recommendations were identified from cognitive walk-through/think-aloud sessions. Usability test sessions lasted between 28 and 36 minutes, with an average duration of 33 minutes. Identified themes were: Overall Usability and Usefulness, Usefulness as Memory Aid for Evidence-Based Practice, Usefulness as a Training Tool, Usefulness as a Communication Tool, and Potential to Improve Workflow.

Overall Usability and Usefulness. There was overall positive agreement regarding the usability and potential usefulness of the UTIDecide app. A typical reaction to the app during usability testing is exemplified by the following quote from Participant 3: “Overall, I think it is great.”

Usefulness as a Memory Aid for Evidence-Based Practice. Participants stated the importance of upholding evidence-based practice. However, current practice requires that they rely on their own memory without the benefit of memory aids or best practice reminders. One example of app usefulness is the provision of reminders and recommendations to comply with evidence-based guidelines, as Participant 3 stated: “I think that it is really helpful because it tells me when I should get a UA [urinalysis] and when I shouldn't and recommends based on the evidence-based practice which is what we aim to always follow.” Participant 1 noted the usefulness of the app to look up relevant clinical information: “It does give me some things to consider reading, which are nice to kind of jog your memory.”

Usefulness as a Training Tool. Participants noted the usefulness of UTIDecide as a potential training tool to raise awareness about relevant information and practice guidelines. Those in nurse manager roles were enthusiastic about its use for new nurses and CNAs, as Participant 1 stated:

I think that this may help to develop that critical thinking piece that nurses don't get, especially LPNs in 2 years, so I think that really helps as well. It might get them to say, “Hey, I better gather this information up according to what it says.”

The app was also seen as useful as a supporting resource to find necessary information or information that is not commonly used:

It could come in handy because when I was fresh out of school, if I didn't know something or didn't have a resource available to me, then I would call my fellow nurses that I went to nursing school with.

That is great. Some people don't know what dysuria is. I don't care if you've been a nurse for 20 years, there is going to be some stuff you don't know.

Usefulness as a Communication Tool. UTIDecide was identified as useful for communications with on-site team members, off-site providers, and residents. For on-site team members, it could facilitate communication in the following way: “The recommendations of ‘continue close observation’ and ‘begin oral hydration’… that would be something that I would want to pass along to the CNA,” as noted by Participant 1. In addition, UTIDecide was seen as a useful tool to empower nursing staff in communication with off-site providers:

Any change of conditions. If you are going to contact the doctor you should have the SBAR there ready to give a quick report.

You can say “hey, this is what I am following”…. It just makes you sound like you understand what you're doing just a little bit more, and again you've got a guideline to follow that gives you more than we're just going to test because the family says so.

The app could also be used as a tool to engage residents in communications about their own health conditions, as Participant 6 noted:

It can even help the residents because they will be asking “What is going on with me?” and you can just give them your ideas, like it could be this or that, but the doctor might order a urinary sample or something. I would love to use this.

Potential to Improve Workflow. Participants broadly agreed that UTIDecide could improve workflow efficiency by reducing time spent on the assessment process and on looking up information, given high rates of smartphone adoption:

Any time you use the smartphone would be easier or quicker. Once you get used to it you know what they're asking on the paper and you just check, check, check. But on the smartphone it goes faster.

I feel like this app is quicker to access rather than having to go and get a binder.

Everybody is so attached to their phone. I think they would do it faster on the phone than faster on a piece of paper.

Design Recommendations From Cognitive Walk-Throughs. The following six design recommendations were identified from cognitive walk-through/think-aloud sessions.

  1. Provide feature to save and print output from assessment process.

  2. Harmonize app language with facility language (e.g., Fahrenheit versus Celsius).

  3. Simplify language and improve clarity of presentation in the information dialog boxes displayed after tapping the “i” information icon. For example, “functional decline” and “leukocytosis” were presented together under a single icon, which introduced confusion about whether both were required to check a checklist item as “true.” In addition, some dialog box language read as jargon because text was copied directly from guidelines.

  4. Provide supporting language in the SBAR regarding the evidence-based guidelines upon which the app is based to give nurses confidence in communicating SBAR with providers.

  5. Reorganize and cluster display of pain symptoms under a single group as options for selection during assessment process. Originally, pain symptoms were spread throughout the app workflow.

  6. Link to electronic health record (EHR).

Design Recommendations 1–5 were implemented in the app version used by field test participants (described below). Design Recommendation 6 would involve substantial development efforts to integrate the app with institutional EHR systems and was beyond the scope and resources of the project.

Field Test Results

Most participants indicated that they would use the app in their practice if it were available. Participants agreed that the app fit their current workflow. Specifically, they noted that computer stations are often full, so the app would improve access to online resources. In addition, they noted that general online searches often return non-relevant results, whereas an app could deliver targeted results that streamline processes. As an example identified in the field test (described below), a nurse who suspected a respiratory infection could use a mobile CDS function implementing guidance specific to the suspected condition rather than conducting an internet search that returns thousands of results from indexed web pages.

Based on participant estimates of UTI frequency at their facilities, the 1-week field test period was insufficient to capture incidents of suspected UTI. Participants identified 1 to 6 months as a more appropriate field testing period. However, despite the short field test periods, two examples of real-world use emerged. The first example confirmed earlier findings from cognitive walk-through/think-aloud sessions that UTIDecide could be used as a communication tool with on-site team members. This scenario occurred when a field test participant who suspected a UTI showed the app to a nurse practitioner as they discussed the patient's symptoms. In the second example, a field test participant used the app as a resource to support her clinical judgment in an unintended way by entering symptoms associated with a suspected upper respiratory infection. Although UTIDecide is designed only for assessment of UTI symptoms, it offers possible alternatives for further assessment if symptoms are not consistent with UTI. The participant wanted to see whether the app would suggest this as a possible alternative for assessment, which it did on the recommendations screen.

Participants identified specific barriers to app implementation and use. All participants noted that mobile phone use was restricted by facility rules for staff although there was flexibility if phone use was for work purposes or an emergency. In addition, Wi-Fi access at the facility was often unreliable.

Design Recommendations From Field Tests. Two design recommendations were identified from field tests of UTIDecide. Design Recommendation 1 was implemented in a new version of UTIDecide, whereas Design Recommendation 2 was beyond the scope of the current project:

  1. Enhance print feature for options to print to PDF and printer.

  2. Include features to allow for assessment of additional conditions.

System Usability Scale Results. The average score for all three field test participants who completed the SUS was 92.5 (Table 2). This average SUS score equates to an “A” grade for usability and is above the point (80.3) at which a user would recommend the app to a friend (Sauro, 2011).

Table 2: System Usability Scale (SUS) Results

Discussion

Nurses and other NH staff face many competing priorities in the care of medically complex NH residents and strive to follow evidence-based guidelines (Ozkaynak, Reeder, et al., 2018). Usability tests of UTIDecide found that participants perceived a mobile CDS app for diagnosis of UTI as highly usable and useful. Participants viewed UTIDecide as useful as a memory aid and reminder of evidence-based guidance; as a learning tool for new NH staff; and as a tool to facilitate communication with on-site team members, off-site providers, and residents. Participants provided design feedback, which was implemented when within the scope and resources of the project. Tethering the app to the EHR system and expanding app assessment functionality to include other common conditions were two design requests that were beyond the scope of the current project. UTIDecide was broadly seen as a way to improve time efficiency in assessment, especially if integrated with other information systems. However, organizational policies prohibiting smartphone use and unreliable Wi-Fi were seen as barriers to mobile CDS adoption in the NH. Both the value of integration and the infrastructure barriers to wireless connectivity in NHs have been documented in prior research (Alexander & Wakefield, 2009; Ko, Wagner, & Spetz, 2018). Because these barriers are common across NHs, the design results may generalize to inform future mobile CDS app development efforts in NHs.

Use of apps, such as the one described in the current study, can improve clinical workflow by providing nurses access to information at different physical locations in NHs to support decisions at different points of care (Bates et al., 2003; Ozkaynak, Wu, Hannah, Dayan, & Mistry, 2018). However, future efforts to implement mobile CDS apps into NH workflows should account for latent activities in that practice setting, such as establishing and maintaining baseline information, prioritizing tasks, and handling interruptions (Ozkaynak, Reeder, et al., 2018). The app's usefulness as a memory aid and reminder of evidence-based practice can be critical to these activities by reducing cognitive overload during busy times, whereas its use as a communication tool can enhance teamwork and team awareness. Effective teamwork and team awareness are essential to identify and address changes in resident condition. The current study provides a rich example of capturing relevant aspects of NH workflow using cognitive walk-through and think-aloud sessions to improve usability and designs that connect to that workflow. The themes identified highlight how these techniques can reveal insights about workflow in NHs.

Limitations

The two NHs in the current study were owned by the same company in the same metropolitan area, which may limit generalizability of app designs to NHs with different ownership structures and policies in other areas. The exploratory nature of the study revealed the limitation of 1-week field tests of new mobile apps in NHs, suggesting that future design studies for mobile CDS apps should assess the frequency of targeted conditions prior to field tests.

Conclusion

The current project relied on user-centered design to develop and test the usability of a mobile CDS app for assessment of UTI in NH residents. Results showed that these methods are feasible to use in this setting, and the app was perceived as usable and useful by nurses. Future work should seek to integrate mobile CDS with NH information systems and expand the capabilities of mobile CDS apps to include a broader set of conditions that staff in NHs regularly assess and manage.

References

  • Alexander, G.L. & Wakefield, D.S. (2009). Information technology sophistication in nursing homes. Journal of the American Medical Directors Association, 10, 398–407. doi:10.1016/j.jamda.2009.02.001 [CrossRef]
  • Bates, D.W., Kuperman, G.J., Wang, S., Gandhi, T., Kittler, A., Volk, L. & Middleton, B. (2003). Ten commandments for effective clinical decision support: Making the practice of evidence-based medicine a reality. Journal of the American Medical Informatics Association, 10, 523–530. doi:10.1197/jamia.M1370 [CrossRef]
  • Brooke, J. (1996). SUS: A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4–7.
  • Brooke, J. (2013). SUS: A retrospective. Journal of Usability Studies, 8(2), 29–40.
  • Davis, F.D., Bagozzi, R.P. & Warshaw, P.R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35, 982–1003. doi:10.1287/mnsc.35.8.982 [CrossRef]
  • Handler, S.M., Boyce, R.D., Ligons, F.M., Perera, S., Nace, D.A. & Hochheiser, H. (2013). Use and perceived benefits of mobile devices by physicians in preventing adverse drug events in the nursing home. Journal of the American Medical Directors Association, 14, 906–910. doi:10.1016/j.jamda.2013.08.014 [CrossRef]
  • Herzig, C.T., Dick, A.W., Sorbero, M., Pogorzelska-Maziarz, M., Cohen, C.C., Larson, E.L. & Stone, P.W. (2017). Infection trends in US nursing homes, 2006–2013. Journal of the American Medical Directors Association, 18, 635.e9–635.e20. doi:10.1016/j.jamda.2017.04.003 [CrossRef]
  • Jaspers, M.W., Steen, T., van den Bos, C. & Geenen, M. (2004). The think aloud method: A guide to user interface design. International Journal of Medical Informatics, 73, 781–795. doi:10.1016/j.ijmedinf.2004.08.003 [CrossRef]
  • Jones, W., Drake, C., Mack, D., Reeder, B., Trautner, B. & Wald, H. (2017). Developing mobile clinical decision support for nursing home staff assessment of urinary tract infection using goal-directed design. Applied Clinical Informatics, 8, 632–650. doi:10.4338/ACI-2016-12-RA-0209 [CrossRef]
  • Kaufman, D.R., Patel, V.L., Hilliman, C., Morin, P.C., Pevzner, J., Weinstock, R.S. & Starren, J. (2003). Usability in the real world: Assessing medical information technologies in patients' homes. Journal of Biomedical Informatics, 36, 45–60. doi:10.1016/S1532-0464(03)00056-X [CrossRef]
  • Ko, M., Wagner, L. & Spetz, J. (2018). Nursing home implementation of health information technology: Review of the literature finds inadequate investment in preparation, infrastructure, and training. Inquiry, 55, 46958018778902. doi:10.1177/0046958018778902 [CrossRef]
  • Lewis, J.R. (1994). Sample sizes for usability studies: Additional considerations. Human Factors, 36, 368–378. doi:10.1177/001872089403600215 [CrossRef]
  • Montoya, A. & Mody, L. (2011). Common infections in nursing homes: A review of current issues and challenges. Aging Health, 7, 889–899. doi:10.2217/ahe.11.80 [CrossRef]
  • Morrill, H.J., Caffrey, A.R., Jump, R.L., Dosa, D. & LaPlante, K.L. (2016). Antimicrobial stewardship in long-term care facilities: A call to action. Journal of the American Medical Directors Association, 17, 183.e1–16. doi:10.1016/j.jamda.2015.11.013 [CrossRef]
  • Nielsen, J. & Landauer, T.K. (1993). A mathematical model of the finding of usability problems. In Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems (pp. 206–213). New York, NY: ACM. Retrieved from http://peres.rihmlab.org/Classes/PSYC6419seminar/p206-Five%20Users%20nielsen.pdf
  • Ozkaynak, M., Reeder, B., Drake, C., Ferrarone, P., Trautner, B. & Wald, H. (2018). Characterizing workflow to inform clinical decision support systems in nursing homes. The Gerontologist. Advance online publication. doi:10.1093/geront/gny100 [CrossRef]
  • Ozkaynak, M., Wu, D.T.Y., Hannah, K., Dayan, P.S. & Mistry, R.D. (2018). Examining workflow in a pediatric emergency department to develop a clinical decision support for an antimicrobial stewardship program. Applied Clinical Informatics, 9, 248–260. doi:10.1055/s-0038-1641594 [CrossRef]
  • Reeder, B., Drake, C., Ozkaynak, M., Jones, W., Mack, D., David, A. & Wald, H. (2019, July). Usability inspection of a mobile clinical decision support app and a short form heuristic evaluation checklist. Paper to be presented at HCI International 2019, Orlando, FL.
  • Sauro, J. (2011, February 2). Measuring usability with the system usability scale (SUS). Retrieved from https://measuringu.com/sus
  • Trautner, B.W., Bhimani, R.D., Amspoker, A.B., Hysong, S.J., Garza, A., Kelly, P.A. & Naik, A.D. (2013). Development and validation of an algorithm to recalibrate mental models and reduce diagnostic errors associated with catheter-associated bacteriuria. BMC Medical Informatics and Decision Making, 13, 48. doi:10.1186/1472-6947-13-48 [CrossRef]
  • Turner, A.M., Reeder, B. & Wallace, J.C. (2013). A resource management tool for public health continuity of operations during disasters. Disaster Medicine and Public Health Preparedness, 7, 146–152. doi:10.1017/dmp.2013.24 [CrossRef]
  • Turner, C.W., Lewis, J.R. & Nielsen, J. (2006). Determining usability test sample size. International Encyclopedia of Ergonomics and Human Factors, 3, 3084–3088.
  • Virzi, R.A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34, 457–468. doi:10.1177/001872089203400407 [CrossRef]
  • Wald, H.L. (2016). Challenging the “culture of culturing”: The case for less testing and more clinical assessment. JAMA Internal Medicine, 176, 587–588. doi:10.1001/jamainternmed.2016.0525 [CrossRef]

Table 1: UTIDecide Usability Test Participants and Activities

Participant | Credential/Role  | Site | Cognitive Walk-Through | Field Test and Interview | System Usability Scale
1           | RN/DON           | A    | X                      |                          |
2           | LPN/Floor nurse  | A    | X                      |                          |
3           | RN/A-DON         | A    | X                      |                          |
4           | RN/Charge nurse  | B    | X                      |                          |
5           | LPN/Floor nurse  | B    | X                      | X^a                      | X
6           | RN/Floor nurse   | B    | X                      | X                        | X
7           | RN/Floor nurse   | B    |                        | X                        | X
8           | LPN/Floor nurse  | B    |                        | X                        |

Table 2: System Usability Scale (SUS) Results

Participant | Credential/Role | SUS Score^a
5           | LPN/Floor nurse | 100
6           | RN/Floor nurse  | 97.5
7           | RN/Floor nurse  | 80
Average     |                 | 92.5

Authors

Dr. Reeder is Assistant Professor, and Dr. Ozkaynak is Assistant Professor, College of Nursing, Ms. Drake is PhD Candidate, Colorado School of Public Health, and Dr. Wald is Adjunct Associate Professor of Medicine, University of Colorado School of Medicine, University of Colorado | Anschutz Medical Campus, Aurora, Colorado. Dr. Wald is also Vice President of Clinical Performance, SCL Health, Broomfield, Colorado.

The authors have disclosed no potential conflicts of interest, financial or otherwise. This work was supported by the COPIC Medical Foundation and Centers for Disease Control and Prevention (200-2016-92277).

Address correspondence to Blaine Reeder, PhD, Assistant Professor, College of Nursing, University of Colorado | Anschutz Medical Campus, Mail Stop C288-19, 13120 E. 19th Avenue, Ed2 North, Aurora, CO 80045; e-mail: blaine.reeder@ucdenver.edu.

Posted Online: April 12, 2019

10.3928/00989134-20190408-01
