Dr. Lasater is Associate Professor, School of Nursing, Oregon Health & Science University, Portland; and Dr. Sideras is Assistant Professor, School of Nursing, Oregon Health & Science University, Ashland, Oregon; Dr. Johnson is Assistant Professor of Nursing, Dr. Hodson-Carlton is Associate Director, School of Nursing and Director, Simulation & Information Technology, and Dr. Siktberg is Director, School of Nursing, Ball State University, Muncie, Indiana.
This study was funded in part by a grant from the National League for Nursing Foundation.
The authors have disclosed no potential conflicts of interest, financial or otherwise.
Address correspondence to Kathie Lasater, EdD, RN, ANEF, Associate Professor, School of Nursing, Oregon Health & Science University, Portland, OR 97239; e-mail: firstname.lastname@example.org.
More calls for research proposals are giving priority to multisite studies. In the past, such studies usually required costly travel between sites that quickly consumed precious budget dollars. Grantors have begun limiting expenses allocated for travel, making it essential to consider alternative methods for conducting research at multiple sites. As more global partnerships form and electronic resources proliferate, the potential for conducting multisite studies will continue to increase. However, adherence to complex procedures and research protocols in a multisite study presents daunting challenges: maintaining rigorous standards, applying protocols uniformly, and replicating the study across all sites. The purposes of this article are to (a) describe the development of a digital toolkit of Web-based technologies to meet these challenges and (b) evaluate the digital toolkit’s effectiveness, including implications and recommendations for future multisite studies.
Overview of the Study
A brief description of the study will provide some background for the choice of technologies in the digital toolkit. In nursing education, outcomes from intervention research on the use and effectiveness of simulation technology are scattered, inconsistent, and vary in methodological rigor and substantive focus. Sample size is one limiting factor; logistically, it is difficult for one site to gather enough data to make evidence-based assertions about the outcomes. Multisite studies are necessary to obtain a sufficient sample size. However, as with many intervention studies, replication of complex study protocols across diverse educational sites can be problematic. Key variables must be identified and measured in the same manner at all sites (Flynn, 2009).
The challenges of limited sample size and protocol replication were addressed in this collaborative, mixed-methods study through the use of multiple sites and the creation of a digital toolkit, which was supported with a variety of communication strategies to promote uniform adherence to study protocols. Study findings about students’ clinical judgment will be reported in a subsequent article; however, one of the purposes of the study was to determine whether a digital toolkit would result in consistent replication of complex protocols. The toolkit comprised freeware and other Web-based technologies to manage and facilitate the research collaborative. In addition to accurate replication, other measures of toolkit success were decreases in the need for expensive travel and postage, distribution of large numbers of paper copies, and onsite consultation.
At each of the five participating research sites, the institutional review board approved the study for prelicensure nursing students’ participation in a three-phase, mannequin-based simulation. Each of the three simulation phases was complex, requiring intense attention to detail, thereby increasing the challenges of replication. Student participants had assigned preparation activities and assumed multiple roles, which also required adherence to protocols.
At each site, half of the students were exposed to a study intervention, consisting of a video exemplar of an expert nurse caring for a patient with needs similar to the simulation patient. Aspects of students’ clinical judgment exhibited in the simulation were rated using a validated, evidence-based instrument for clinical judgment, forming the quantitative data. Qualitative data were collected from guided reflections, based on the dimensions of the clinical judgment rating instrument, and obtained at two points in time—the day of the simulation experience and 4 weeks later. Each site needed to securely transfer the collected data to the co-principal investigators (PIs) (K.L., E.J.) at two of the sites for analysis. All of these study activities required similar organization and progression at each site to maintain the rigor of the study.
Precedents for Multisite Studies
The Oncology Nursing Society (2005) offered a clear definition of multisite research in their Multisite Research Strategic Plan:
Multisite research is a study conducted by a consortium of investigators using the same research plan in several different regional, national, or international sites. Data are pooled for analyses to accomplish the specific aims of the study and results are disseminated by the consortium of investigators.
Although this definition accurately describes the “what,” it does not attempt to represent the “how” of conducting such research.
According to the literature, multisite studies increase the number of participants and the diversity of the sample, resulting in higher statistical power, more generalizable findings, and, ultimately, a better likelihood of informing and shaping evidence-based practice (Flynn, 2009; Graham, Spano, & Manning, 2007; Lindquist, Treat-Jacobson, & Watanuki, 2000; Nielsen & Quirk, 1997; Vessey, Broome, & Carlson, 2003). Collaborating with other sites often sets the stage for developing future research proposals, further enhancing the strength of the evidence (Bossert, Evans, Van Cleve, & Savedra, 2002; Iida et al., 2005).
However, a multisite study often has numerous challenges. Although the current study did not encounter significant problems with multiple institutional review boards (IRBs), it was clear from the literature that application to multiple IRBs was a difficult hurdle for others (Flynn, 2009; Graham et al., 2007; Nowak, Bankert, & Nelson, 2006; Racine, Bell, & Deslauriers, 2010). As multisite studies become more accepted and professionals acknowledge the increased strength of the evidence from such studies, perhaps IRBs can work more closely with researchers to utilize technology and other resources to resolve this barrier.
Management of the various aspects of a multisite study is also challenging. Several of the multisite studies reviewed used previously developed guides or checklists for proposing and managing such studies; these guides can provide excellent direction for organizing the work and anticipating challenges before the study begins (Bossert et al., 2002; Nail et al., 1998; Vessey et al., 2003). Although a few Web-based resources for data management were described outside of nursing education (Winget et al., 2005), the nursing literature revealed little evidence of a comprehensive digital toolkit being used to manage a multisite study. For the purposes of the current study, the literature review revealed three broader themes: collaboration, coordination, and communication. These themes guided the creation of the digital toolkit, thereby promoting the collaborative efforts of the study.
Although some studies required multisite participation, particularly in governmental organizations (Graham et al., 2007; Racine et al., 2010), collaborative voluntary recruitment of participants must begin with a shared vision for the project (Berger, Neumark, & Chamberlain, 2007; Vessey et al., 2003). Other collaborative activities reported in the literature included negotiation for meeting times, deadlines, and publication teams (Bossert et al., 2002; Flynn, 2009; Nail et al., 1998; Vessey et al., 2003). The literature reveals that replicating the detailed protocols across multiple sites obviously requires structure (Vessey et al., 2003) and careful coordination.
Multiple metaphors from the literature for the role of coordinator provided rich word pictures, such as the “importance of an epicenter” (Flynn, 2009, p. 390). Lindquist et al. (2000) described the role of the coordinator as adhesive to “glue together the distant sites” (p. 275), and Berger et al. (2007) referred to the need for coordination to nurture relationships among the sites. A common understanding among those authors was that the lead PI should assume this role with multiple responsibilities, including initiating and tracking IRB applications, budget allocation, serving as liaison for contractual arrangements, and setting meeting times for all collaborators to convene.
Consistent application of the procedures and protocols is critical to ensure the highest possible scientific rigor, thereby supporting the usefulness of the study (Flynn, 2009; Lindquist et al., 2000). Perhaps the most important responsibility of the coordinator to ensure adherence to the study protocols is leading the participants to consensus in decisions, answering questions, and frequently checking in. The coordinator’s flexibility to maintain and support communication is an essential element with multiple research partners (Berger et al., 2007).
The need for excellent communication among all sites was a clear theme from the literature review. Team building, regularly scheduled communication, setting clear expectations from the beginning, and discussing shared recognition from the study were significant factors in minimizing misunderstandings that could become distracting problems (Lindquist et al., 2000; Nail et al., 1998). Relationships built on trust among the grantees and site leaders were critical to the success of the multisite studies reported.
Recruitment of research partners from the five sites for the current study, including one international site, occurred in ways that were similar to other multisite studies—through previous collaborations and partnerships, as well as a shared vision among the partners (Berger et al., 2007; Nail et al., 1998; Vessey et al., 2003). It was critical that all five sites were philosophically and practically committed to using an evidence-based model of clinical judgment (Tanner, 2006) and the instrument for rating four aspects of clinical judgment in prelicensure students (Lasater, 2007) and that both were compatible with their curricula.
Because there were so many research sites, coordination and communication to implement the study over 6 months, development and replication of the study protocols, and maintenance of the collaborative relationships were critical. The development and use of a digital toolkit incorporating appropriate Web-based technologies supported these purposes. Several Web-based communication approaches served as important adjuncts to the digital toolkit.
Digital Toolkit Description and Purposes
Frequent communication was key to the development of the digital toolkit. The primary research partner institution had robust technological resources available and so assumed the role of lead technology support. Participating sites had access to biweekly teleconferences using a centralized conference calling program, videoconferencing as needed, individual technical support, and the digital toolkit. Biweekly teleconferences created a forum for development and problem solving, such as refinement of the toolkit and project Web site, completion of IRB materials, and standardization of data collection. Videoconferences occurred when face-to-face meetings were important; for example, for rater training to establish interrater reliability. Individual technical support was provided for diverse needs, ranging from learning to access and use Web-based software to downloading video-streamed materials. Telephone and e-mail contact information shared among all participating faculty and technical support staff promoted last-minute clarification of details as the study was implemented at each institution over a 6-month period.
The digital toolkit provided critical elements in the successful replication of the three-phase simulation and research protocols at each of the five sites. It included (a) a wiki to facilitate collaboration among the team members; (b) a project Web site housing the operational details, including digital images for the simulation setup and podcasts or vodcasts for faculty development; (c) the video exemplar; (d) document organization software to simulate an electronic medical record; and (e) a secure data management system for transport and storage of study data. The Table summarizes the contents of the digital toolkit, which could be used or customized for any type of multisite research study. An overview of the design and use of the digital toolkit elements follows.
Table: Digital Toolkit Contents
Wiki Collaborative Workspace
The Simulation Collaboration Wiki (enterprise version of PBworks®) was established at the initiation of the multisite collaboration to create a shared online workspace for the project team. A wiki is a Web-based tool that allows a group to develop a community through shared communication and resources (Kardong-Edgren et al., 2009). Lead technical support personnel assumed administrative responsibility for the establishment and operation of the wiki, including user instruction and help functions throughout the project. After the PBworks Web site was established, the partners received an invitation with access instructions to establish a wiki account and a link to the software user manual. Each partner was given site writer rights, with the ability to edit pages collaboratively, share documents, upload new files, and create new pages.
When the multisite partners accepted the wiki invitation and established their own PBworks accounts, they received status update e-mails when other partners made changes in the wiki. The e-mails kept the entire team aware of new communications, documents, and revisions. Materials contained in the wiki were placed into the following categories: grant document, participant contact information, project rollout plans, biweekly updates and minutes of collaborative meetings, data files, digital toolkit updates, presentations, IRB, and research-related materials.
Participating sites also used the wiki to collect field notes related to the use of the digital toolkit during implementation. The field notes preserved a history of issues that might differ from other sites or identify variables that could affect the data analysis. One such example described challenges encountered when a site attempted to download the video exemplar that was shown to the intervention group at each site:
As we have been preparing to begin the study, we ran into trouble viewing the exemplar videos. Our IT person worked with [the lead technology support site] people and found that the videos do not work on our campus—it was determined that it has something to do with our firewall on campus. [The lead site] is working to come up with a solution, which may be…to place videos in a site where we can download them.
Project Web Site
A common, secured project Web site became the centerpiece of the digital toolkit. This Web site allowed for standardization of the operational details required for replication of the complex procedures and protocols of the study across multiple sites. The lead technical support personnel designed the project Web site using Adobe® Dreamweaver® CS3, a Web development application. The Web site was hosted on the lead institution’s server; the address was provided only to participating sites.
As shown in the Figure, a published framework of simulation (Jeffries, 2007), with which all sites were familiar, provided the structural design layout for the homepage of the project Web site. Using an established framework provided the partners with a familiar and standard language for the organization of study documents and protocol replication. After the PIs and lead technical support personnel completed the development of the Web site materials, each partner received the URL of the project Web site to review and revise materials prior to the individual site’s deployment of the research study.
Figure. Framework of simulation that was used for the structural design layout for the homepage of the project Web site. Reprinted from “Simulation in Nursing Education: From Conceptualization to Evaluation,” by P.R. Jeffries, 2007. Copyright 2007 by National League for Nursing. Reprinted with permission.
The project Web site’s homepage hyperlinked the information and digital assets to the framework’s dimensions of teacher, student, outcomes, and simulation design characteristics. For example, selecting Teacher from the homepage presented faculty development materials for those at each site who would facilitate the protocols with student groups. In this area, a PowerPoint® presentation provided an overview of the protocols from the beginning through evaluation.
The inclusion of other digital tools in the Teacher area was aimed at minimizing the variance in faculty facilitation and implementation for the multisite research study. Because there were varying degrees of faculty academic and experiential preparation at the participating institutions, the investigators who were experts in simulation implementation and evaluation worked with the technical support personnel in the development and inclusion of faculty development podcasts, which were designed to orient faculty to some basic skills involved in the study.
The Teacher area also included virtual faculty notebooks in a digital format set up with Microsoft® OneNote 2007, a document organization software product. The standardized digital notebooks minimized variations in faculty instructions to students as the simulation was implemented across the five sites. The use of digital notebooks also decreased the expense of duplicating and mailing paper copies. The support personnel and faculty had easy access to these digitized files before and during the day of the simulation.
Selecting the Student link from the project homepage provided access to the virtual student notebooks set up with OneNote 2007. Through this link, students accessed all of the materials and instructions needed for preparation on the day of the simulation. Students also had access to hard copies of the materials specific to the phase of the simulation in which they were participating (Phase 1, 2, or 3).
Areas hyperlinked with digital information from the simulation framework included the objectives, fidelity, student support, and debriefing sections. For example, the Objectives area contained the learning goals for each of the three phases of the simulation. The Fidelity section contained information related to the physical setup of all activities related to the simulation, such as information and multiple digital images needed by the simulation laboratory personnel at the multisite locations (i.e., setup and supplies, patient charts, and guidelines for simulation roles). The patient charts were created with OneNote 2007; students needed them for preparation purposes and at the patient’s bedside during the three phases of the simulation. The Roles/Guidelines for Roles section explained the roles of the participants and “family members” during the simulation phases. The Student Support area contained a variety of resources that students would use before and during the simulation. Selection of the Debriefing link led to faculty guides and tips for the debriefing sessions.
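The homepage organization described above amounts to a mapping from framework dimensions to the materials each hyperlink exposed. The following is an illustrative sketch only (the actual site was built in Dreamweaver, not Python); the section names come from the description above, and the material labels are paraphrased summaries, not actual file names:

```python
# Illustrative mapping of the project Web site's homepage sections
# to the materials each hyperlink exposed. Section names follow the
# simulation framework; material labels are paraphrased, hypothetical summaries.
SITE_MAP = {
    "Teacher": ["protocol overview presentation", "faculty development podcasts",
                "virtual faculty notebooks"],
    "Student": ["virtual student notebooks", "preparation instructions"],
    "Objectives": ["learning goals for Phases 1, 2, and 3"],
    "Fidelity": ["setup and supplies", "patient charts",
                 "guidelines for simulation roles"],
    "Student Support": ["resources for use before and during the simulation"],
    "Debriefing": ["faculty guides and tips for debriefing sessions"],
    "Outcomes": ["evaluation instruments", "electronic reflection activities"],
}

def materials_for(section: str) -> list:
    """Return the materials linked from a given homepage section."""
    return SITE_MAP.get(section, [])

print(materials_for("Fidelity")[0])  # setup and supplies
```

Organizing the materials under a framework familiar to all sites, as the study did, is what made a single shared index like this navigable for every partner.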
The Outcomes link on the homepage contained digital forms of all of the qualitative tools used in the study, including the electronic reflection activities. This area served as the common repository from which each site accessed the study evaluation instruments for administration to their students.
Document Organization Software
A variety of paper instructional guides, presimulation learning activities, patient charts, and supportive documents from the pilot testing of the simulation existed in several formats. Microsoft’s OneNote 2007 interface, an electronic version of a tabbed ring binder, was a solution to the need to share standardized, well-organized project materials across sites. Technical and simulation personnel at the lead technology support site converted paper patient charts to a digital format, which could be downloaded by participants at the multisite locations through the project Web site (Figure). The digital files, separated by the three phases of the simulation, could then be reviewed at each location on computer stations with the OneNote software. As noted previously, other uses of the document organization software included the virtual faculty and student notebooks, which were downloadable at the individual study sites.
Video Exemplar
Access to the video recording that served as the intervention was central to the study purpose of examining the effects of the intervention on students’ thinking. The sites accessed the videos by choosing the Teacher link on the project’s Web site. All sites could access the video through a Mediasite® format (a recording system that allows recorded video to be viewed later through the Internet as streaming video) or in an offline version, which could be downloaded at each site (published in a Mediasite to-go format).
Secure Data Collection and Transfer
Secure sharing of data files is a cornerstone of multisite research. Dropbox® is a server-based, Cloud computing storage service that provides an encrypted method to transmit and store files between multiple computers. As with the wiki, the lead technology support personnel assumed administrative responsibility for the establishment of the Dropbox account, including user instruction and help functions. Areas were created within the Dropbox account so that each site could upload collected data. Folders were set up for each site and study personnel directly involved with the project received invitations for access to their own individual areas. Only the PIs for the project were given access to all areas. Because each site completed its own data collection, the site director could upload the quantitative and qualitative files into Dropbox, which the co-PIs could then access for analysis.
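The workflow above (each site staging its quantitative and qualitative files in its own folder for the co-PIs to retrieve) can be sketched in code. The sketch below is hypothetical and not from the study: the article describes Dropbox's encrypted transfer but not any site-side tooling, so the folder layout, file names, and the checksum manifest (an integrity check the receiving co-PIs could use to confirm files arrived intact) are all illustrative assumptions:

```python
# Hypothetical sketch: stage one site's data files for upload and record
# SHA-256 checksums so the receiving co-PIs can verify file integrity.
# Folder names, file names, and the manifest are illustrative assumptions,
# not details reported by the study.
import hashlib
import json
import tempfile
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def stage_site_data(site_dir: Path) -> Path:
    """Write a manifest of per-file checksums alongside a site's data files."""
    manifest = {p.name: checksum(p) for p in sorted(site_dir.glob("*.csv"))}
    manifest_path = site_dir / "manifest.json"
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest_path

# Example: stage a temporary site folder containing one quantitative file.
with tempfile.TemporaryDirectory() as tmp:
    site = Path(tmp) / "site_a"
    site.mkdir()
    (site / "clinical_judgment_scores.csv").write_text("student,score\n1,28\n")
    print(stage_site_data(site).name)  # manifest.json
```

After upload, the co-PIs could recompute each digest and compare it against the manifest before beginning analysis, catching any truncated or corrupted transfer.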
Digital Toolkit Evaluation and Recommendations for Future Multisite Studies
As stated previously, one of the study purposes was to determine if a digital toolkit would result in consistent replication of complex study protocols across the five sites in the research collaborative. In addition to consistent replication, other measures of success of the toolkit were a decrease in the necessity of expensive travel and postage, distribution of large numbers of paper copies, and onsite consultation.
In fact, the costs to the grant were successfully contained. Through the use of the freeware and existing resources, such as Web-based course management systems at each site, all study documents were transmitted digitally; collected data were securely filed and transmitted as well. The digital toolkit and regular, frequent communication alleviated the need for travel to any of the study sites. The only exception was that one co-investigator (S.S.) had to travel to another site to assist with the production of the video exemplar that was the study’s intervention. Clearly, there were study costs in the form of personnel and use of established resources, which are difficult to quantify. These costs should be considered for their return on investment benefit; that is, all of the study partners learned new skills or became familiar with Web-based technologies, which can be used in future multisite studies.
Use of an established and familiar framework (e.g., the Jeffries model) as an organizational structure for the project Web site materials was helpful because the hyperlinks to all of the study materials worked effectively; hence, consistent replication of the materials and procedures was highly likely. However, the quantity and variety of information used in this project was voluminous, and at times site personnel found it difficult to locate an exact document in a timely fashion. In keeping with effective use of technology, a recommendation would be to design a descriptive submenu that appears when the mouse hovers over a term.
A few unexpected issues, which could have been addressed through the digital toolkit, arose throughout the 6-month roll-out. Despite standardized protocols and site inclusion criteria, the investigators discovered a range of levels of research and simulation experience, diverse technology supporting simulation, and different faculty approaches for implementing simulation, especially the debriefing phase. For example, the clinical faculty at some sites actively partnered in the simulation process, whereas at other sites dedicated simulation faculty implemented the simulation protocols.
A prospective plan in the form of digitally available faculty development tools, such as the podcasts, was created to manage these differences; however, the tools were offered as a resource, not a requisite. In retrospect, requiring the study personnel at each site to access the tools as a group would have contributed to consistency and may have built a stronger team for the current study as well as for future simulation research. In addition, a debriefing guide, based on the scoring tool, would have facilitated more consistent data collection. Communication strategies and technology support were critical elements in enhancing replication and containing costs.
Frequent communication was essential for planning, preventive problem solving, and answering process questions. Because the biweekly meetings were scheduled several months before the study launched, with at least two faculty present from each site, representation at the meetings was optimized. The co-PIs were available between meetings on an as-needed basis. Because communication occurred over the 6-month rollout, each subsequent site knew what to anticipate and could proactively problem solve for protocol replication. Anticipating varying levels of research skills and support mechanisms across project sites may require additional resources, but identifying an individual at each site for mentoring could be one solution.
Centralized technology support was vital to the successful development and management of the digital elements in the toolkit. The lead technical personnel provided assistance to technology personnel at the other sites, thereby promoting consistent implementation of the study. On the basis of our experience from this study, technology personnel at each site should be involved early in the planning stage to perform a thorough assessment of the available technology platforms. This step will identify the best way to deliver digital materials in a usable format prior to the study launch.
Mentoring study personnel in the digital toolkit’s use was another important role for the technology support staff. Many of the study’s site coordinators had to learn new technologies to facilitate the online collaboration and protocol adherence. For example, most coordinators had not previously used a wiki. Having the wiki built before the study launch would have allowed for some experiential learning, thereby expediting its use. Because students did not have access to the Web site, another technology challenge was the need to move the student learning materials from the project Web site to the different Web-based learning management systems used by the sites, such as Sakai® or Blackboard™. OneNote 2007 notebooks were not compatible with some systems; therefore, alternative methods took the place of the electronic notebooks in some cases. Future upgrades to the learning management systems or consultation with campus-specific technology support personnel at each site may help avoid such a rework.
Multisite studies offer the potential for larger, more diverse participant groups and greater statistical power, setting the stage for more generalizable findings. However, multisite studies can increase the complexity of protocol replication. Current technologies offer strategies to help multisite studies meet these challenges.
The five participating sites in this mixed-methods study collaboratively created and used a digital toolkit, comprising freeware and online resources already in place, to foster protocol replication and diminish the costs of travel, onsite consultation, postage, and paper use. The toolkit for this study included (a) a wiki; (b) a project Web site; (c) the video exemplar; (d) document organization software to simulate an electronic medical record; and (e) a secure data management system. In addition to the digital toolkit, the collaborative identified that frequent communication and designation of a lead technology support site, with adequate technology support at each partner site, were the most critical factors in implementing a multisite study spanning 5,000 miles.
- Berger, A.M., Neumark, D.E. & Chamberlain, J. (2007). Enhancing recruitment and retention in randomized clinical trials of cancer symptom management. Oncology Nursing Forum, 34(2), E17–E22. doi:10.1188/07.ONF.E17-E22 [CrossRef]
- Bossert, E.A., Evans, S., Van Cleve, L. & Savedra, M.C. (2002). Multisite research: A systems approach. Journal of Pediatric Nursing, 17, 38–48. doi:10.1053/jpdn.2002.30932 [CrossRef]
- Flynn, L. (2009). The benefits and challenges of multisite studies: Lessons learned. AACN Advanced Critical Care, 20, 388–391. doi:10.1097/NCI.0b013e3181ac228a [CrossRef]
- Graham, D.G., Spano, M.S. & Manning, B. (2007). The IRB challenge for practice-based research: Strategies of the American Academy of Family Physicians National Research Network (AAFP NRN). Journal of the American Board of Family Medicine, 20, 181–187. doi:10.3122/jabfm.2007.02.060110 [CrossRef]
- Iida, E.E., Springer, J.F., Pecora, P.J., Bandstra, E.S., Edwards, M.C. & Basen, M.M. (2005). The SESS multisite collaborative research initiative: Establishing common ground. Child and Family Social Work, 10, 217–228. doi:10.1111/j.1365-2206.2005.00377.x [CrossRef]
- Jeffries, P.R. (Ed.). (2007). Simulation in nursing education: From conceptualization to evaluation. New York, NY: National League for Nursing.
- Kardong-Edgren, S.E., Oermann, M.H., Ha, Y., Tennant, M.N., Snelson, C., Hallmark, E., … Hurd, D. (2009). Using a wiki in nursing education and research. International Journal of Nursing Education Scholarship, 6(1), Article 6. doi:10.2202/1548-923X.1787 [CrossRef]
- Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46, 496–503.
- Lindquist, R., Treat-Jacobson, D. & Watanuki, S. (2000). A case for multi-site studies in critical care. Heart & Lung, 29, 269–277. doi:10.1067/mhl.2000.106939 [CrossRef]
- Nail, L.M., Barsevick, A.M., Meek, P.M., Beck, S.L., Jones, L.S., Walker, L., … King, M.E. (1998). Planning and conducting a multi-institutional project on fatigue. Oncology Nursing Forum, 25, 1399–1403.
- Nielsen, K.M. & Quirk, A.G. (1997). The process for initiating nursing practice changes in the intrapartum: Findings from a multisite research utilization project. Journal of Obstetric, Gynecologic, and Neonatal Nursing, 26, 709–717. doi:10.1111/j.1552-6909.1997.tb02746.x [CrossRef]
- Nowak, K.S., Bankert, E.A. & Nelson, R.M. (2006). Reforming the oversight of multi-site clinical research: A review of two possible solutions. Accountability in Research, 13, 11–24.
- Oncology Nursing Society. (2005). ONS approves multisite research strategic plan. Oncology Nursing Society News, 20(13), 14.
- Racine, E., Bell, E. & Deslauriers, C. (2010). Canadian Research Ethics Boards and multisite research: Experiences from two minimal-risk studies. IRB: Ethics & Human Research, 32(3), 12–18.
- Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment. Journal of Nursing Education, 45, 204–211.
- Vessey, J.A., Broome, M.E. & Carlson, K. (2003). Conduct of multisite clinical studies by professional organizations. Journal for Specialists in Pediatric Nursing, 8, 13–22. doi:10.1111/j.1744-6155.2003.tb00179.x [CrossRef]
- Winget, M., Kincaid, H., Lin, P., Li, L., Kelly, S. & Thornquist, M. (2005). A web-based system for managing and co-ordinating multiple multisite studies. Clinical Trials, 2, 42–49. doi:10.1191/1740774505cn62oa [CrossRef]
Digital Toolkit Contents
| Technology | Purpose | Platform or Format | Web Site |
| --- | --- | --- | --- |
| Wiki | Facilitate communication, such as field note sharing | PBworks® | http://pbworks.com |
| Project Web site | Standardize operational details and research protocols | Adobe® Dreamweaver® CS3 | http://www.adobe.com |
| Video exemplar | House and provide access for the research intervention | Mediasite® | http://www.sonicfoundry.com/mediasite |
| OneNote | Organize documents; provide tabs for simulated electronic medical record files | Microsoft® | http://office.microsoft.com/en-us/onenote/ (online demo: http://office.microsoft.com/en-us/onenote-help/demo-what-is-onenote-HA010168634.aspx) |
| Secured data site | Secure data files for electronic transfer | Dropbox® | http://www.dropbox.com |