Journal of Nursing Education

Major Article 

Enhancing Critical Thinking in Graduate Nursing Online Asynchronous Discussions

Nancy L. Novotny, PhD, RN, CNE; Stephen J. Stapleton, PhD, RN, CEN, FAEN; Elaine C. Hardy, PhD, RN

Abstract

Background:

Graduate nursing students in online courses often have limited success in developing the critical thinking (CT) skills essential for advanced roles. This study describes the use of complementary strategies in a graduate-level nursing course to enhance CT in online discussions.

Method:

Using Paul and Elder's framework for understanding the components of CT, the authors designed an asynchronous online course using multiple strategies to promote CT. We used mixed methods to collect descriptive and numerical data and content and repeated measures analyses to identify changes in CT skills and student perceptions across the semester.

Results:

CT scores increased significantly and aligned with students' perceived improvements in CT.

Conclusion:

Evidence of CT in online discussions increased significantly across the semester with the use of multiple instructional strategies and substantial student and faculty efforts. The findings are a useful benchmark for future studies comparing combinations of strategies to identify those most effective and least arduous. [J Nurs Educ. 2016;55(9):514–521.]

Graduate nursing students are increasingly taking online courses but often with limited success in developing the critical thinking (CT) skills that are essential for them to engage in advanced professional roles. Because CT skills, such as conceptualizing, interpreting, analyzing, synthesizing, evaluating, and applying information in new situations, are needed both by nurses in graduate-level programs to integrate their expanding knowledge and by advanced professional nurses to solve problems and make clinical decisions, the American Association of Colleges of Nursing (2011) recommends CT as an essential competency in master's-level nursing education. Students who have foundational knowledge and experience in applying CT within authentic nursing practice are well poised for further developing their CT skills in graduate-level nursing education. The current study describes the use of complementary instructional strategies, such as embedding direct instruction in CT within the course content, which significantly enhanced CT in online discussions in a master's-level nursing course.

Background

Online asynchronous formats have the potential to stimulate CT because they offer students opportunities to engage in dialog about course content (Greenlaw & DeLoach, 2003), as well as time to formulate well-reasoned posts. However, master's-level students' discussions in online courses rarely provide evidence of high-level CT, a shortcoming that may relate to instructional design, strategies, goals, and guidance (Bai, 2009; Maurino, 2007). Of course, even traditional courses do not always succeed in developing CT, especially if instructors have a poor understanding of CT, rely on teacher-centric strategies, or offer little support for CT skill practice (Chan, 2013). In an online environment, where instructors face the continual challenge of providing needed interaction, practice, and feedback, the quandary is compounded (Cook et al., 2010).

Components of CT

Helping students develop CT requires an understanding of its nature. According to a highly useful framework developed by Paul and Elder (2008), CT has three components: (a) elements of thought, to which we, as emerging critical thinkers, apply (b) intellectual standards of thinking, in a process by which we develop (c) the intellectual traits that make CT habitual. The first step is learning to recognize the elements of our thought, or parts of our thinking:

  • Our purpose in reasoning.
  • The question we are trying to answer.
  • The assumptions we are making.
  • The points of view we consider.
  • The information we use.
  • The concepts through which we express our reasoning.
  • The interpretations by which we give meaning to information.
  • The implications we draw from our reasoning.

After we have identified the elements of our thought, we learn to evaluate the way we are using these elements by applying intellectual standards, such as completeness, depth, breadth, logic, significance, clarity, accuracy, fairness, and relevance. The first two components of the Paul-Elder framework help thinkers formulate clear, precise questions; evaluate information to form accurate interpretations; articulate well-reasoned conclusions; and evaluate alternative ideas, assumptions, and consequences in critical dialogue. Practice using these components, according to Paul and Elder (2008), helps to develop intellectual traits of humility, autonomy, integrity, courage, perseverance, confidence in reason, and fair-mindedness. It also deepens the tendency to regularly use higher cognitive skills (Yang & Chou, 2008). The Paul-Elder framework provides students and instructors with a common language and systematic approach to thinking critically (Ralston & Bays, 2013) and helps observers to identify evidence of CT in discussions. Critically thoughtful discussion posts should clearly demonstrate the components of the framework.

Online Discussion Strategies to Promote CT

Little research exists about strategies for increasing CT in online courses, specifically for nursing students. The effectiveness of CT-enhancing strategies within online discussions has been most frequently documented outside of the nursing educational literature. Strategies that facilitate CT in traditional nursing courses also apply in online discussions, such as questioning, interactive learning, role modeling, reflective writing, problem-based learning, peer guidance, and instructor feedback (Chan, 2013; Halstead, 2005; Lovatt, 2014; Staib, 2003).

However, the potential value of these CT strategies in online discussions will be limited if instructors design the discussions poorly. Many online discussion designs and approaches believed to promote the development of CT do so by creating constructivist learning environments.

Several factors may influence the success of a design. The pace of discussions must allow time for students to explore the various elements of a topic and respond thoughtfully (Greenlaw & DeLoach, 2003; Lim, Cheung, & Hew, 2011; Richardson & Ice, 2010). Discussion groups should be small to modest in size. Having fewer posts to read and mentally process helps avoid undue cognitive load and supports a higher order of thinking by allowing members to reflect and respond in more depth (Darabi & Jin, 2013). Instructor guidance, such as specific instructions about how to engage in high-quality discussion, examples of high-quality posts, and detailed rubrics for evaluating posts, yields improvements in the quality and extent of CT in online discussions (Darabi & Jin, 2013; Panadero & Jonsson, 2013; Zhou, 2015). Discussion facilitators can help students learn by modeling critically thoughtful comments or questions in discussions and explaining the cognitive techniques being used (DeWever, Van Keer, Schellens, & Valcke, 2009; Ertmer & Koehler, 2015; Nouri, Alhani, & Ahmadizadeh, 2013). Although many aspects of the facilitator role may be enacted with equal effectiveness by student peers or instructors (DeWever, Van Winckel, & Valcke, 2008), peers who are aware of their own thinking and open-minded may facilitate higher quality discussions than peers who are not (Hew & Cheung, 2011). Finally, having students engage in structured reflection helps them to self-assess and recognize the existence and influence of assumptions, possible alternatives, and contexts within problem solving (Forneris & Peden-McAlpine, 2007).

Before discussions even begin, instructors can convey to students the importance of developing their CT skills by incorporating learning objectives that are specifically related to CT within a discipline-specific course. Studies that integrated CT with course content, by providing explicit instruction about how to meet the course objectives through applying CT principles to the subject matter, have resulted in improved CT skills (Bensley & Spero, 2014).

Using Multiple Interventions to Promote CT

Comparison studies of online instructional discussion interventions support the use of multiple interventions. Different strategies are associated with different CT skills or levels of cognition, and higher CT occurs with the use of several strategies. Richardson and Ice (2010) identified differences of up to 20% in several levels of CT evident in posts between groups that each used one of three distinct instructional strategies, and they found rare evidence of the highest level of CT in any group. Likewise, the comparisons of posts by Darabi, Arrastia, Nelson, Cornille, and Liang (2011) revealed different strengths among four strategies, with each contributing the most to a different type of cognitive presence. DeWever et al. (2008) found that students using several discussion roles constructed higher levels of knowledge than students using only one.

Recent research further supports the idea that overall improvement in CT may require a combination of strategies to address the varied types of cognitive processes used in CT. Heijltjes, Van Gog, and Paas (2014) found that explicit CT instruction and practice were needed to improve CT. In addition, a meta-analysis by Abrami et al. (2015) indicates that multiple strategies to teach CT (i.e., dialogue, authentic instruction, or mentoring) produced larger effect sizes than single or paired strategies. Moreover, the effect of any strategy on specific students may be influenced by those students' individual characteristics, such as prior experience with CT, comfort with technology, goal orientations, motivational beliefs, or overall academic performance (Bolger, Mackey, Wang, & Grigorenko, 2014; Kovanović, Gašević, Joksimović, Hatala, & Adesope, 2015; Kwan & Wong, 2015; McMullen & McMullen, 2008; Williams, Oliver, Allin, Winn, & Booher, 2003).

These findings suggest that individual differences in characteristics, capacities, or development account for the usefulness of different instructional strategies. Developing CT requires practice using several cognitive skills. Given the differential impact of various strategies on CT, the importance of student-level factors, and variation in baseline CT skill development, using multiple strategies to develop CT may best meet the range of individual student needs.

The following research questions were asked:

  • In a course incorporating complementary strategies to promote use of CT, does CT in online discussions change during a semester?
  • How do students' concepts of CT change during a semester in which CT is emphasized?
  • In a course in which multiple CT-enhancing strategies are used, what changes occurred in student perceptions concerning online discussion, aspects of CT involved in critical discussion, and improvement in CT skills?
  • Which strategies used to improve CT skills within online discussions are most valuable?

Method

To evaluate the use of multiple strategies to enhance CT within online discussions, we designed a fully online master's-level nursing course in the midwestern United States aimed at accommodating anticipated variations among students and maximizing students' improvement in critically thoughtful course discussions. We designed a mixed-methods study that captured both quantitative and qualitative data to evaluate CT and ascertain students' perceptions about CT and online course discussions. We used a quasi-experimental, within-subjects approach to identify changes in students' CT in three discussions across the semester.

Questionnaires were administered at the start and the end of the course, allowing students to describe their concepts of CT and to rate the value of CT in discussions, the improvement in their own CT skills, and the usefulness of strategies used to enhance CT. The students' discussion posts were also analyzed using a rubric identifying the components of CT. All 22 students enrolled in the course were invited to participate in the study. The university's institutional review board approved the administration and analysis of the questionnaire with student consent and $15 compensation for completing each questionnaire, as well as the rating of all students' deidentified posts from three selected discussions. The principal investigator (N.L.N.) was the course instructor.

Discussion Strategies

The course instructor (N.L.N.) consulted an expert in online teaching to identify appropriate course design, structure, and organization, and then implemented several strategies aimed at stimulating students' use of CT within asynchronous online discussions. Figure 1 shows the time line used to implement these strategies across the semester.


Figure 1. Instructional strategies used to enhance critical thinking (CT) in discussions.

The course included learning objectives related specifically to CT. At the beginning of the course, the instructor presented a synchronous orientation explaining several key course components, including the relationship between the course learning objectives and CT within discussions, the instructor's expectation of quality peer dialogue, and the need for students to invest time in the discussions. The instructor reviewed specific instructions for engaging in the discussions and the grading rubric, which comprised two criteria (i.e., thinking and writing), each defined by four levels of quality.

The assignments included eight short video presentations on CT. The first two defined CT and explained how attention to components of the Paul-Elder framework contributes to the development of important analytical skills in advanced nursing roles. The final six shared examples of elements of thought and intellectual standards pertinent to critically considering various professional issues.

Discussion topics on a range of professional issues were carefully worded to stimulate use of higher level cognitive skills. For the semester, students were assigned to a four- or five-member discussion team and were required to post an initial response and a minimum of two substantial responses in each 1- to 2-week discussion.

The instructor shared information and suggestions about facilitation techniques (Lim et al., 2011) and actively modeled the facilitator role in each team's first discussion. Each student was then required to facilitate the team discussion twice, to post an end-of-discussion synthesis, and to reflect on both enacting the facilitator role and the quality of the team's discussion.

After each discussion, the instructor provided each student with rubric-based feedback and periodically encouraged students to reflect on the quality of thinking in their contributions to discussions.

Data Collection

A total of 63 initial posts in three discussions by 21 enrolled students provided data used to answer research question one. One student's data were excluded due to missing posts. Week 1, 3, and 13 posts were de-identified and copied into a text file, which the three investigators used to score posts. Nine students (41%) consented to complete a questionnaire at the beginning and end of the semester, administered by an investigator who did not teach the course, via secure software. One student completed only the first questionnaire. After grades were submitted, the responses were downloaded, shared with all investigators, and used to answer research questions two, three, and four.

Instruments

To analyze changes in all three components of CT in discussions across the semester, we modified an instrument based on the Paul-Elder framework previously used with adequate reliability by Ralston and Bays (2013). We wanted the instrument to fully include all three components, so that we could identify not only evidence that students had applied standards of thinking to elements of thought, but also evidence of students' CT-associated traits. We modified Ralston and Bays' four criteria slightly to identify evidence of the application of standards of thinking to elements of thought: (a) clarity of purpose and identification of complexities of relevant questions; (b) clarity of ideas, relevance of concepts, and accuracy and completeness of credible evidence; (c) fairness of interpretations and inferences considering relevant assumptions, points of view, and context; and (d) identification of significant, logical implications and consequences based on relevant evidence. We also added a fifth criterion to identify evidence of students' CT-associated traits: (e) indications of CT characteristics, such as self-reflection, self-corrective actions, unbiased self-awareness, or recognition of flawed thinking. Quality levels for each criterion ranged from 4 (fully developed) to 1 (development not evident). Summed criterion scores could range from 5 to 20.
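To make the scoring scheme concrete, the following minimal Python sketch (our own illustration; the criterion labels and data layout are assumptions, not part of the published instrument) sums one rater's five criterion ratings, each on the 1-to-4 quality scale, into a post total of 5 to 20.

# Minimal sketch of the five-criterion CT rubric described above.
# Criterion labels and data layout are illustrative assumptions.
CRITERIA = [
    "purpose_and_question",   # (a) clarity of purpose, complexity of the question
    "ideas_and_evidence",     # (b) clarity of ideas, relevant concepts, credible evidence
    "interpretations",        # (c) fairness of interpretations and inferences
    "implications",           # (d) significant, logical implications and consequences
    "ct_traits",              # (e) indications of CT-associated traits
]

def total_ct_score(ratings: dict) -> int:
    """Sum the five criterion ratings (1 = development not evident .. 4 = fully developed)."""
    assert set(ratings) == set(CRITERIA), "rate every criterion exactly once"
    assert all(1 <= r <= 4 for r in ratings.values()), "quality levels are 1 to 4"
    return sum(ratings.values())  # possible totals range from 5 to 20

# Example: one rater's scores for a single post.
example_post = {c: 3 for c in CRITERIA}
example_post["ct_traits"] = 2
print(total_ct_score(example_post))  # prints 14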

To develop a shared understanding of how to apply the CT rubric, the investigators initially trained together by discussing fictitious posts. After we reviewed our scores and shared rationales, we individually scored the actual students' posts over several months. On examining our own patterns, we found that all of us had tended to score earlier posts lower than later posts; to avoid this rater drift, we decided to individually rescore all posts over a shorter period of 3 weeks. All of our final criterion scores were within one quality level of each other, and at least two scores were identical. We identified a difference of one or more points for only 3% (10 of 315) of the criterion ratings across all posts. To evaluate internal consistency, we used Cronbach's alpha and found reliability of .814 for week 1 posts, .692 for week 3 posts, and .843 for week 13 posts. Deleting the trait criterion would have decreased Cronbach's alpha by .05 to .13. To determine interrater reliability, we analyzed the three total CT scores for each post. We obtained intraclass correlation coefficients (ICC) using a two-way mixed ANOVA model for consistency. The reliability of the discussion total scores across all raters was high for both the first (ICC = .846, confidence interval [CI] = .682 to .933, p < .001) and final (ICC = .898, CI = .788 to .955, p < .001) discussions, and slightly lower for the second discussion (ICC = .700, CI = .38 to .869, p = .001).
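For readers who wish to reproduce this kind of reliability check, the sketch below shows one way to compute Cronbach's alpha across the three raters' total scores and a two-way mixed, consistency ICC. The toy data, column names, and the use of the pandas and pingouin Python libraries are our own assumptions; the original analyses were conducted in SPSS.

# Illustrative rater-reliability checks; the data here are made up.
import pandas as pd
import pingouin as pg

# Total CT scores (5-20) that three raters assigned to the same six posts.
scores = pd.DataFrame({
    "rater1": [12, 14, 11, 15, 13, 16],
    "rater2": [13, 14, 12, 16, 12, 15],
    "rater3": [12, 15, 11, 15, 13, 17],
})

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the summed score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

print("Cronbach's alpha:", round(cronbach_alpha(scores), 3))

# ICC via pingouin: reshape to long format (one row per post x rater).
long_scores = scores.reset_index().rename(columns={"index": "post"}).melt(
    id_vars="post", var_name="rater", value_name="score")
icc = pg.intraclass_corr(data=long_scores, targets="post", raters="rater", ratings="score")
# ICC3/ICC3k correspond to the two-way mixed-effects, consistency model.
print(icc.set_index("Type").loc[["ICC3", "ICC3k"], ["ICC", "CI95%"]])

In this model, ICC3 estimates the consistency of a single rater, whereas ICC3k estimates the consistency of ratings averaged across the three raters.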

The investigators designed a pre- and postquestionnaire that captured students' prior and current experiences and perceptions about CT and online course discussions. Students identified their perceptions about the extent to which critical discussion involves various aspects of CT using a scale from 0 (not at all) to 10 (very much). Respondents were asked to describe their conceptions of CT and to identify the factors limiting their online discussion participation from a list derived from the literature. Two items were included only at the start of the course: hours per week that would be reasonable to invest in course discussions, and prior experience with online courses and discussions. Three items were included only at the end of the course: the hours invested weekly in discussions; the effectiveness of strategies used to promote CT in discussions, using a scale from 0 (not at all effective) to 10 (very effective); and self-assessed improvement of CT skills, using a scale from 0 (not at all) to 10 (very much).

Analyses

We analyzed data using IBM® SPSS® Statistics version 20.0 software. We reported categorical variables using frequencies and percentages. We reported interval variables with normal distributions using means and standard deviations, and those with non-normal distributions using medians and ranges. We reported 95% confidence intervals and set a significance level of .05 for all analyses. We calculated the total CT score of each post by averaging the three reviewer scores. We used repeated measures general linear model analysis to examine mean differences among the 21 students' total CT scores on three posts across the semester. Mauchly's test of sphericity (χ²(2) = .837, p = .184) indicated that the assumption of sphericity was met. We described changes in students' pre- and postcourse ratings by differences in means or medians. We used a t test to compare pre- and postcourse estimates of time invested in course discussions. We compared students' pre- and postcourse narrative descriptions of CT by analyzing the descriptive content.
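As a concrete illustration of this workflow, the sketch below runs the same type of analysis (Mauchly's test of sphericity followed by a repeated-measures ANOVA on each student's three total CT scores) in Python with the pingouin and statsmodels libraries instead of SPSS. The data are simulated to match the reported group means and standard deviations, and the variable names are our own assumptions.

# Illustrative repeated-measures analysis; data are simulated, not the study's scores.
import numpy as np
import pandas as pd
import pingouin as pg
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n = 21  # students with complete posts in all three discussions

# One row per student x discussion; 'ct' is the post's total CT score,
# already averaged across the three raters.
scores = pd.DataFrame({
    "student": np.tile(np.arange(1, n + 1), 3),
    "week": np.repeat(["wk01", "wk03", "wk13"], n),
    "ct": np.concatenate([
        rng.normal(12.4, 1.5, n),   # discussion one
        rng.normal(12.8, 1.2, n),   # discussion two
        rng.normal(14.0, 1.9, n),   # final discussion
    ]),
})

# Mauchly's test of sphericity (assumption check for the repeated-measures model).
spher = pg.sphericity(scores, dv="ct", within="week", subject="student")
print(f"sphericity met: {spher.spher}, W = {spher.W:.3f}, p = {spher.pval:.3f}")

# Repeated-measures ANOVA: does mean CT differ across the three discussions?
fit = AnovaRM(data=scores, depvar="ct", subject="student", within=["week"]).fit()
print(fit.anova_table)

Had sphericity been violated, a Greenhouse-Geisser correction (available, for example, through pingouin's rm_anova) would be the usual adjustment.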

Results

All questionnaire respondents had prior experience with blended or fully online courses. Students had previously taken a median of four fully online classes, with a range of one to 15.

Question 1: Changes in CT Between Discussions

After all strategies had been fully implemented, there was a statistically significant improvement in CT scores over the semester. Of a possible 20 points, the total mean CT score for discussion one was 12.4 (SD = 1.5); for discussion two, 12.8 (SD = 1.2); and for the final discussion, 14.0 (SD = 1.9). Figure 2 depicts the mean total CT discussion scores, with a significant overall effect, F(2, 20) = 5.217, p = .010. Although instructional strategies to enhance CT were introduced immediately after the first discussion, the second discussion showed no significant change. However, between the second and final discussions there was a mean increase of 1.186 (CI = .284 to 2.008, p = .013), and between the first and final discussions the mean increase was even greater: 1.554 (CI = .314 to 2.794, p = .017).


Figure 2. Changes in mean critical thinking (CT) scores between the three discussions.

Question 2: Changes in Students' Conceptions of CT

Students' conceptions of CT became more complex over the semester. Initially, students described CT as a process to arrive at an understanding, decision, or plan. Some students' descriptions were deeper, such as “CT is a process of analyzing and carefully reviewing pertinent information in the attempt to make the most appropriate decision or conclusion about an issue or problem.” Other initial descriptions were vague or unclear, such as “[CT is an] in-depth analysis of information to formulate a consecutive idea.”

We compared the frequency of pre- and postcourse comments pertaining to the CT elements of thought, standards of thinking, and traits. Ideas associated with purpose, information or concepts, and interpretations and inferences were evident in both early and later descriptions. In later descriptions, the question at issue was identified twice as often and point of view three times as often as in earlier descriptions. Although the early descriptions did not indicate any consideration of assumptions, in later descriptions, several students made implicit references to assumptions. In the later descriptions, students included more specific or indirect references to standards of thinking or traits of a critical thinker. Early descriptions alluded only to relevance or depth, whereas later descriptions also included clarity and logic. In later descriptions, several students acknowledged a higher quality of thinking as essential for CT, as exemplified in a comment describing CT as “a way of thinking where one thinks at a deeper level by analyzing and trying to bring clarity, logic, and relevance into their thinking.” Many early descriptions implied that a critical thinker demonstrates autonomy or perseverance, whereas later descriptions also included confidence in reason and fair-mindedness.

At the semester's end, students started demonstrating awareness of metacognition. One student shared, “I'm not sure that I have ever thought critically. This class has made me more sensitive to those that do not evaluate all possible and likely outcomes and assign value to information based on their preconceived beliefs and experience.” Another student mentioned developing an “awareness of how I think about any material or concept by systematic evaluation.”

Question 3: Changes in Students' Perceptions

In comparison with prior online courses, students perceived fewer problems in this course with superficial or low-value discussions. Although adequacy of discussion instructions and user-friendliness of the discussion structure were barriers in prior courses, they were not problems in this course. The one barrier cited 25% more often in this course was unclear discussion topics.

We found differences between the students' pre- and postcourse ratings of the extent to which elements of thought and intellectual standards are involved in critical discussions. Means of the three elements of thought that were initially rated the lowest increased by one point: inferences or conclusions, 7.5 (SD = 1.1); assumptions, 7.4 (SD = 1.2); and implications and consequences, 7.3 (SD = .9). The mean score for purpose decreased over the semester from 7.6 (SD = 1.5) at the beginning to 6.5 (SD = 2.3) at the end. The mean scores for the other elements differed by no more than 0.2 points. Students rated two standards of thinking higher at the end of the semester. Median values increased for clarity by 2 points (median = 8.5, range = 6 to 10) and for fairness by 1.5 points (median = 7.0, range = 1 to 10). Other standards differed by ⩽ 0.5 points.

Between expected and reported time invested in discussions, there was a significant mean increase of 7.2 hours (CI = 2.11 to 12.26, t(10.1) = 3.041, p = .012). At the beginning of the course, students estimated that it would be reasonable to invest an average of 4.2 hours (SD = 2.9) each week in online discussions; however, at the end of the semester, they reported having spent an average of 11.4 hours (SD = 6) weekly. Students reported exerting high effort in discussions, with ratings ranging from 8 to 10 and a median of 8.5.

Students rated the improvement in their CT skills since the start of the course from 2 to 9, with a median of 6.5. The one outlier who rated improvement as 2 commented that “the level of discourse was hampered greatly by low performers” on the team.

Question 4: Effectiveness of Strategies to Enhance CT in Discussions

As shown in Figure 3, at the end of the semester, students rated the value of all strategies at a mean of ⩾ 5.0. Overall, students perceived instructor-centered strategies involving direct interaction with a student or team to be most effective, whereas all student-facilitator activities were rated as least effective. The discussion rubric and the wording and focus of the discussion topics were also highly rated, followed by self-reflection. The video presentations about CT earned middle ratings.


Figure 3. Students' mean ratings of the value of strategies to enhance critical thinking (CT) in discussions on a scale of 1 (least effective) to 10 (most effective).

Discussion

The nature, breadth, and depth of students' conceptualizations of CT evolved. Later descriptions were fuller, clearer, and more specific and indicated greater awareness of the importance of points of view, the question at issue, CT standards, CT traits, and metacognition. Changes in students' pre- and postquantifications of the extent to which elements and standards of thinking are involved in critical dialogue also indicated increased recognition of the components of CT. Self-reported improvement in CT skills was positive, although the range was wide.

Our finding that students rated instructor-centric strategies higher than student-centric strategies is consistent with Zhou's (2015) finding that students highly valued interacting with instructors. Although students rated instructor-active strategies most highly, the instructor perceived the student-centric activities involved in the facilitator role to be effective. Student facilitators often elicited more substantial analysis from team members throughout a discussion than other team members' posts did. Student posts rarely demonstrated synthesis unless the student was also facilitating. Most facilitator reflections also included syntheses that were well formulated rather than mere summaries of the discussion. In addition, facilitators addressed assumptions slightly more often in their reflections than within ongoing discussions. The instructor-facilitated discussions during week 1, as distinct from the later discussions facilitated by students, did not demonstrate higher CT than subsequent discussions. The most improvement in CT was evident between the second and final student-facilitated discussions. Evidence of higher CT after all strategies were implemented was compatible with student-reported changes across the semester.

Low student ratings of the effectiveness of facilitator-related strategies may have been influenced by limited prior exposure to student-centered approaches. Student discomfort with a CT-enhancing strategy has been shown to lessen its effectiveness (Lim et al., 2011; Richardson & Ice, 2010). Alternatively, students and the instructor may have used different criteria to assess effectiveness. On the basis of findings that students' CT abilities were generally lower when they used their preferred online instructional strategies, and vice versa, Richardson and Ice (2010) wrote, “students don't always realize what is good for them” (p. 57).

Student assessments of the discussion topics appear contradictory, with many students rating the wording of topics as both unclear and effective for increasing CT. However, the apparent contradiction is readily resolved: the topics seemed unclear, and were effective, precisely because they required careful thinking rather than straightforward, memorizable answers.

This study has several limitations. We do not know whether all students watched the mini presentations or reviewed the instructor feedback. Only the initial posts in each discussion were analyzed, which may not fully represent the CT evident throughout the discussions. We selected discussion topics specifically to elicit CT, but not all topics may have had equivalent potential to do so. The study was not designed to identify the most effective CT-enhancing strategies or to determine the duration or transferability of effects. Despite congruent findings derived from the questionnaire, the small number of respondents requires cautious interpretation of related findings. Finally, course and enrollment constraints necessitated a quasi-experimental design without a control group, which precludes asserting that the multiple CT-enhancing strategies directly caused the improvements in CT.

However, we believe these changes are unlikely to have occurred without the explicit embedding of CT with the course material, through multiple references to the common language provided by the Paul-Elder framework in direct instruction and individual feedback. We see the course's emphasis on engaging in thoughtful discussions as contributing to the decreases in student reports of superficial and low value discussion in this course, compared with their prior experiences. Students' positive perceptions about adequacy of instructions and user-friendly structure indicate that the implementation of effective online design features was successful. Our findings overall are consistent with prior studies in non-nursing students, which identified heightened effectiveness when using concomitant strategies to improve CT (Abrami et al., 2015).

Aligning with Ku's (2009) emphasis on assessing both the cognitive and dispositional components of CT, the rubric, which included the trait criterion of the full Paul-Elder framework, maintained reliable internal consistency. Using a specific CT framework both to embed CT principles within the course content and to evaluate CT within posts helped to make the study cohesive. Our discussions to establish a consistent approach to applying the rubric helped develop our ability to differentiate the quality of thinking within posts and facilitated higher reliability. The credibility of our findings rests on collecting different types of data from different sources and analyzing them by different means, all of which yielded congruent results.

Conclusion

Using multiple strategies to embed CT with the subject matter of an online master's-level nursing course and elicit higher level cognitive skills in discussions of course content yielded a significant improvement in CT within online posts. After all strategies were implemented, both instructors and students perceived an improvement in CT skills.

Promoting CT by using multiple instructional strategies required substantial student and faculty efforts. Despite the small discussion teams, students spent significantly more time in discussions than they had anticipated would be reasonable, and the course instructor spent a comparable amount of time in role modeling and providing rubric-based feedback. Given the effort involved for all stakeholders and the importance of developing CT skills for advanced nursing roles, nursing education research must explore the most efficient teaching–learning strategies, including ways to ease instructor burden in efforts to enhance CT within online course discussions. Preliminary indications of the value of having students act as facilitators suggest that further studies are needed to explore ways to promote student comfort in the facilitator role.

A more efficient and beneficial approach than intensive single-course strategies might be to embed CT development throughout a program across multiple courses. Coordinated, repetitive, and consistent efforts would lighten the workload within single courses, expose students to a variety of CT-enhancing strategies, and increase the program's potential to address a wider range of student-related, mediating factors. Therefore, we recommend exploring the use of a programmatic approach to furthering graduate nursing students' development of CT.

This study adds to the limited body of quantitative evidence documenting the effectiveness of multiple discussion strategies in fully online nursing courses, and the findings are a useful benchmark for future studies comparing combinations of strategies to identify those most effective and least arduous.

References

  • Abrami, P.C., Bernard, R.M., Borokhovski, E., Waddington, D.I., Wade, A. & Persson, T. (2015). Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research, 85, 275–314. doi:10.3102/0034654314551063
  • American Association of Colleges of Nursing. (2011). The essentials of master's education in nursing. Washington, DC: Author.
  • Bai, H. (2009). Facilitating students' critical thinking in online discussion: An instructor's experience. Journal of Interactive Online Learning, 8, 156–164.
  • Bensley, D.A. & Spero, R.A. (2014). Improving critical thinking skills and metacognitive monitoring through direct infusion. Thinking Skills and Creativity, 12, 55–68. doi:10.1016/j.tsc.2014.02.001
  • Bolger, D.J., Mackey, A.P., Wang, M. & Grigorenko, E.L. (2014). The role and sources of individual differences in critical-analytic thinking: A capsule overview. Educational Psychology Review, 26, 495–518. doi:10.1007/s10648-014-9279-x
  • Chan, Z.C. (2013). A systematic review of critical thinking in nursing education. Nurse Education Today, 33, 236–240. doi:10.1016/j.nedt.2013.01.007
  • Cook, D.A., Levinson, A., Garside, S., Dupras, M., Erwin, P.G. & Montori, V.M. (2010). Instructional design variations in internet-based learning for health professions education: A systematic review and meta-analysis. Academic Medicine, 85, 909–922. doi:10.1097/ACM.0b013e3181d6c319
  • Darabi, A., Arrastia, M.C., Nelson, D.W., Cornille, T. & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning, 27, 216–227. doi:10.1111/j.1365-2729.2010.00392.x
  • Darabi, A. & Jin, L. (2013). Improving the quality of online discussion: The effects of strategies designed based on cognitive load theory principles. Distance Education, 34, 21–36. doi:10.1080/01587919.2013.770429
  • DeWever, B., Van Keer, H., Schellens, T. & Valcke, M. (2009). Structuring asynchronous discussion groups: The impact of role assignment and self-assessment on student's levels of knowledge construction through social negotiation. Journal of Computer Assisted Learning, 25, 177–188. doi:10.1111/j.1365-2729.2008.00292.x
  • DeWever, B., Van Winckel, M. & Valcke, M. (2008). Discussing patient management online: The impact of roles on knowledge construction for students interning at the paediatric ward. Advances in Health Sciences Education, 13, 25–42. doi:10.1007/s10459-006-9022-6
  • Ertmer, P.A. & Koehler, A.A. (2015). Facilitated versus non-facilitated online case discussions: Comparing differences in problem space coverage. Journal of Computing in Higher Education, 27, 69–93. doi:10.1007/s12528-015-9094-5
  • Forneris, S.G. & Peden-McAlpine, C. (2007). Evaluation of a reflective learning intervention to improve critical thinking in novice nurses. Journal of Advanced Nursing, 57, 410–421. doi:10.1111/j.1365-2648.2007.04120.x
  • Greenlaw, S.A. & DeLoach, S.B. (2003). Teaching critical thinking with electronic discussion. The Journal of Economic Education, 34, 36–52. doi:10.1080/00220480309595199
  • Halstead, J. (2005). Promoting critical thinking through online discussion. In Oermann, M.H. & Heinrich, K.T. (Eds.), Annual review of nursing education (3rd ed., pp. 143–163). New York, NY: Springer.
  • Heijltjes, A., Van Gog, T. & Paas, F. (2014). Improving students' critical thinking: Empirical support for explicit instructions combined with practice. Applied Cognitive Psychology, 28, 518–530. doi:10.1002/acp.3025
  • Hew, K.F. & Cheung, W.S. (2011). Student facilitators' habits of mind and their influences on higher-level knowledge construction occurrences in online discussion: A case study. Innovations in Education and Teaching International, 48, 275–285. doi:10.1080/14703297.2011.593704
  • Kovanović, V., Gašević, D., Joksimović, S., Hatala, M. & Adesope, O. (2015). Analytics of communities of inquiry: Effects of learning technology use on cognitive presence in asynchronous online discussions. Internet and Higher Education, 27, 74–89. doi:10.1016/j.iheduc.2015.06.002
  • Ku, K.Y.L. (2009). Assessing students' critical thinking performance: Urging measurements using multi-response format. Thinking Skills and Creativity, 4, 70–76. doi:10.1016/j.tsc.2009.02.001
  • Kwan, Y.W. & Wong, A.F.L. (2015). Effects of the constructivist learning environment on students' critical thinking ability: Cognitive and motivational variables as mediators. International Journal of Educational Research, 70, 68–79. doi:10.1016/j.ijer.2015.02.006
  • Lim, S.C.R., Cheung, W.S. & Hew, K.F. (2011). Critical thinking in asynchronous online discussion: An investigation of student facilitation techniques. New Horizons in Education, 59, 52–65.
  • Lovatt, A. (2014). Defining critical thoughts. Nurse Education Today, 34, 670–672. doi:10.1016/j.nedt.2013.12.003
  • Maurino, P.S. (2007). Looking for critical thinking in online threaded discussions. Journal of Educational Technology Systems, 35, 241–260. doi:10.2190/P4W3-8117-K32G-R34M
  • McMullen, M.A. & McMullen, W.F. (2008). Examining patterns of change in the critical thinking skills of graduate nursing students. Journal of Nursing Education, 48, 310–317. doi:10.3928/01484834-20090515-03
  • Nouri, J.M., Alhani, F. & Ahmadizadeh, M.J. (2013). Qualitative study of humanization-based nursing education focused on role modeling by instructors. Nursing and Health Sciences, 15, 137–143. doi:10.1111/j.1442-2018.2012.00732.x
  • Panadero, E. & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129–144. doi:10.1016/j.edurev.2013.01.002
  • Paul, R. & Elder, L. (2008). The miniature guide to critical thinking: Concepts and tools (5th ed.). Berkeley, CA: Foundation for Critical Thinking.
  • Ralston, P.A. & Bays, C.L. (2013). Enhancing critical thinking across the undergraduate experience: An exemplar from engineering. American Journal of Engineering Education, 4, 119–125. doi:10.19030/ajee.v4i2.8228
  • Richardson, J.C. & Ice, P. (2010). Investigating students' level of critical thinking across instructional strategies in online discussions. Internet and Higher Education, 13, 52–59. doi:10.1016/j.iheduc.2009.10.009
  • Staib, S. (2003). Teaching and measuring critical thinking. Journal of Nursing Education, 42, 498–508.
  • Williams, R.L., Oliver, R., Allin, J.L., Winn, B. & Booher, C.S. (2003). Psychological critical thinking as a course predictor and outcome variable. Teaching of Psychology, 30, 220–223. doi:10.1207/S15328023TOP3003_04
  • Yang, Y.C. & Chou, H. (2008). Beyond critical thinking skills: Investigating the relationship between critical thinking skills and dispositions through different online instructional strategies. British Journal of Educational Technology, 39, 666–684. doi:10.1111/j.1467-8535.2007.00767.x
  • Zhou, H. (2015). A systematic review of empirical studies on participants' interactions in internet-mediated discussion boards as a course component in formal higher education settings. Online Learning Journal, 19. Retrieved from http://olj.onlinelearningconsortium.org/index.php/olj/article/view/495
Authors

Dr. Novotny is Assistant Professor and Dr. Stapleton is Associate Professor, Mennonite College of Nursing, Illinois State University, Normal; and Dr. Hardy is Director and Clinical Assistant Professor, Peoria Regional Campus, College of Nursing, University of Illinois at Chicago, Peoria, Illinois.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

The authors thank Linda Summers, the Coordinator of Blended and Online Instruction at Illinois State University Center for Teaching and Learning, for her support in the design of the online structure and formulation of discussion topics. They also thank Ryan J. Smith for the preparation of the images and Dr. Carol Cavalier for her review and suggestions to improve this article.

Address correspondence to Nancy L. Novotny, PhD, RN, CNE, Assistant Professor, Mennonite College of Nursing, Illinois State University, PO Box 5810, Normal, IL 61790-5810; e-mail: nlnovot@ilstu.edu.

Received: December 23, 2015
Accepted: June 09, 2016

10.3928/01484834-20160816-05
