Journal of Nursing Education

Major Article 

Relationship Between Student Engagement and Outcomes for Online Master of Science in Nursing Students

Joanne Farley Serembus, EdD, RN, CCRN, CNE; Patricia A. Riccio, PhD, RN

Abstract

Background:

The purpose of this study was to examine associations between student engagement and student outcomes for online Master of Science in Nursing students using course analytics.

Method:

A retrospective, correlational design was used to analyze the relationships among admission grade point average (GPA); course analytics measuring course accesses, minutes, interactions, and submissions; and the outcome of course grade. Additional associations with age, gender, major, and geography were tested.

Results:

Interactions and submissions had the greatest impact on the course grade. Each additional submission was associated with a 0.33% increase in course grade (p < .0001). Additionally, each 1-point increase in entry-level GPA was associated with a 1.93% increase in course grade (p = .0289), and each 1-year increase in age was associated with a 0.17% decrease in course grade (p < .0001).

Conclusion:

The two factors that most affected grade were interactions and submissions. Course grade was associated with entry-level GPA, age, access, and minutes. [J Nurs Educ. 2019;58(4):207–213.]


Online graduate nursing programs throughout the United States are seeing unprecedented growth (American Association of Colleges of Nursing, 2018). Although enrollments are increasing in online learning, overall graduation rates are not. This is problematic for nurse educators as the nursing shortage continues to grow. Even more troubling is that attrition in online learning is higher than in traditional face-to-face programs (Gazza & Hunker, 2014; Rice, Rojjanasrirat, & Trachsel, 2013). Multiple reasons for attrition in online programs have been reported, and one of the main reasons is a lack of engagement (Purarjomandlangrudi, Chen, & Nguyen, 2016; Scott, 2014). Online learning can be isolating for students, leading to attrition. Minimal instructor interaction and lack of course information are among the most common reasons for this problem (Hannum, Irvin, Lei, & Farmer, 2008). Engagement serves as a foundation for successful student retention initiatives. The more engaged students are, the more likely they are to remain enrolled in a course or in the institution as a whole (Lundberg & Sheridan, 2015). Additionally, students who are more engaged with one another online also spend more time involved in their own learning (Young & Bruce, 2011). The purpose of this study was to examine associations between student engagement and student outcomes for online Master of Science in Nursing (MSN) students using course analytics available in the learning management system (LMS).

Literature Review

An engaging online classroom contains many of the same design and implementation features as an engaging on-campus classroom. Updating the Seven Principles for Good Practice, Chickering and Ehrmann (1996) stressed that the instructional strategies embedded in the seven principles remain essential for technology-mediated education. According to Chickering and Ehrmann (1996), the following are important components of an engaged online learning environment: contacts between students and faculty, prompt feedback for students, cooperation and reciprocity among students, and the use of active learning techniques. Several strategies have been recommended for structuring online courses to promote cooperative and active learning. These include applying concepts learned to case studies or problem-based activities, role-playing experiences, and discussion forums about concepts, group projects, and interactive activities such as simulations and learning games (Britt, Goon, & Timmerman, 2015; Dietz-Uhler & Hurn, 2013a; Dixson, 2010).

The term engagement has evolved over time. In 1985, Astin defined student engagement by using the term involvement. He described involvement as the amount of physical or psychological activity or student energy devoted to the academic experience. The theory was developed during a longitudinal study of college dropouts. Astin (1975) specifically focused on factors in the college environment that significantly affected students' persistence.

According to Astin, the theory of involvement comprises three elements: inputs, environments, and outcomes. Inputs include the variables the student brings to the educational endeavor (i.e., demographics, background, previous experiences). The student's environment, also termed resources, consists of all the experiences a student has during college (e.g., classroom environment, activities, events, interactions, conversations). Finally, outcomes entail student characteristics upon graduation (i.e., knowledge, changed behavior, attitudes, beliefs, values) (Astin, 1984, 1993).

Astin (1985) described a highly involved student as one who expends considerable energy studying, spends extensive time on campus, participates actively in student organizations, and interacts frequently with faculty and other students. Basically, the more activities students are engaged in, the more they feel involved. Similarly, the involved student is one who is highly satisfied with the course and achieves success.

While the term involved has evolved over time into engaged, the basic constructs of the theory remain present in the work of others; thus, the words are used interchangeably (Pace, 1990; Pike & Kuh, 2005). Kuh (2003) used the term engagement to represent constructs such as the quality of student effort and involvement in learning activities. Cooke (2016) viewed student engagement as the level of interest students show toward the subject matter being taught; their interaction with the content, instructor, and peers; and their motivation to learn and progress through the course. He further stated that engaging learners in their learning tasks is one of the first necessary steps toward successful online learning (Cooke, 2016).

Given the pivotal role that student engagement has in keeping students connected with their course and their learning, measuring engagement in online learning is extremely important. Various methods have been used to quantify the level of engagement; however, only two main tools are designed for use in the online environment: the Online Student Engagement Scale and the National Survey of Student Engagement. Both rely on student self-report, so their accuracy has been called into question (Campbell & Cabrera, 2011; Pascarella, Seifert, & Blaich, 2010; Pike & Kuh, 2005; Porter, Rumann, & Pontius, 2011). Furthermore, survey data are collected retrospectively and do not allow faculty to make changes in their courses in time to assist learners (Bodily, Graham, & Bush, 2017). Analytics provided by the LMS can supply information on student engagement in real time. Data obtained can also be used to improve the quality of online courses by making changes in learning activities, assignments, and the learning environment. Additionally, analytics can be used to examine student activity data within a course to make predictions about learning outcomes and institute appropriate interventions to improve those outcomes in the future. Given these issues and recent improvements made to LMS data collection, many educators have turned to learning analytics as a measure of student engagement. This study focused on the predictive aspects of learning analytics to identify the measures most predictive of positive student outcomes.

Learning analytics are defined as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Dietz-Uhler & Hurn, 2013b, p. 17). LMSs provide educators with information about students' navigational patterns. These can include the number of times they access (called hits) items in the LMS such as discussion boards, course videos, linked web pages, journal articles, quiz completion, or submission of assignments via digital drop box. Minutes spent on each of these activities are also captured by the LMS. These data can be helpful to educators as they consider the teaching strategies used and design courses that increase student engagement, decrease cognitive load, and foster reflection. It is posited that such courses lead to decreased attrition in higher education. Several studies showed a significant relationship between the various types of interactions in the LMS and students' academic performance. Agudo-Peregrina, Iglesias-Pradas, Conde-González, and Hernández-García (2014) found that the following interactions, as quantified by the LMS learning analytics, significantly influenced academic performance: interactions with peers and teachers; interactions related to student assessment; and interactions involving active participation within the course. Zhang (2016) examined the association between online students' behaviors in the LMS and their learning performance, as measured by final course grades. He discovered a significant negative correlation between the number of days students delayed in accessing weekly lecture materials and the final grade, whereas an increased number of discussion board postings had a positive correlation with the final grade. Dvorak and Jia (2016) found that data in the LMS offered insights into factors beyond intellect for student success. 
They observed that higher achieving students began working earlier on assignments than their lower achieving peers. Furthermore, Yu and Jo (2014), determined that total studying time in the LMS, interaction with peers, regularity of learning interval, and number of downloads were significant factors for students' academic achievement in the online learning environment and significantly predicted learning outcomes.

Although learning analytics have been used to analyze the association between student activity and performance in online higher education, studies on this topic were not found in the nursing literature. Most investigations on this topic are found in recent years as a result of improvements in LMSs and the data that can be mined. Nurse educators can learn about the engagement of their students in the online environment using their LMS analytics. These data can be used when designing online courses that will best engage learners and decrease attrition in online nursing programs. The purpose of this study was to examine associations between student engagement and student outcomes for online MSN students.

Conceptual Framework

The conceptual framework used in this study was student involvement theory, as described by Astin (1975, 1985) (Figure 1). Astin's framework of student involvement consists of inputs, resources (also called environment), and outputs. His theory was adapted from the physical classroom to the online learning environment for this study. Therefore, inputs include the student's grade point average (GPA) and selectivity (current major) on admission to the graduate program. Resources include the quantity of online access, such as the total minutes a student spends in the course, as well as the student–teacher interactions evident in student introductions, discussion board postings, and other places within the course, captured as total interactions and total submissions in the learning analytics. Outputs include the course grade (Astin, 1975, 1984, 1985, 1993). According to Astin (1975, 1985, 1993), students are described as highly involved if they (a) interact with faculty frequently, (b) participate actively with fellow students, and (c) devote considerable time to studying. Conversely, students who neglect studying and interact infrequently with faculty fall at the opposite end of the involvement spectrum (Astin, 1985).

Figure 1.

Conceptual framework, online student engagement using learning analytics, adapted from Astin's work (1975, 1985).

Analytics Terms Defined

Accesses refers to the number of times the selected student accessed the course in which they are enrolled during the selected term. Interactions refers to the total number of clicks on content items, tools, and assessments within the Blackboard™ site for a course. Minutes refers to the total number of minutes spent within the Blackboard site for a course, calculated from when the student accesses the Blackboard Learn subject site until they log out or access another Blackboard Learn subject site. After 3 hours of inactivity, the student is logged out and the minutes recorded are based on the time of their last interaction (or click). Submissions refers to the total number of times a student has submitted work through the Blackboard site for a course (e.g., assignments, tests, discussion board posts, journal entries).
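
As a rough illustration, these four metrics could be derived from a raw click log as sketched below. The event format and field names here are assumptions for illustration, not Blackboard's actual data model; only the 3-hour inactivity rule comes from the definitions above.

```python
from datetime import datetime, timedelta

# Hypothetical event log for one student in one course: (timestamp, event type).
# "submission" events are a subset of clicks, so they also count as interactions.
events = [
    (datetime(2019, 1, 7, 9, 0), "click"),
    (datetime(2019, 1, 7, 9, 20), "click"),
    (datetime(2019, 1, 7, 9, 45), "submission"),
    (datetime(2019, 1, 9, 14, 0), "click"),   # long gap -> a new access
    (datetime(2019, 1, 9, 18, 30), "click"),  # >3 h idle -> another new access
]

SESSION_TIMEOUT = timedelta(hours=3)

accesses = 0
minutes = 0.0
session_start = None
prev_time = None
for ts, _ in sorted(events):
    if prev_time is None or ts - prev_time > SESSION_TIMEOUT:
        # Close the previous session: minutes run only up to its last click.
        if session_start is not None:
            minutes += (prev_time - session_start).total_seconds() / 60
        accesses += 1
        session_start = ts
    prev_time = ts
if session_start is not None:
    minutes += (prev_time - session_start).total_seconds() / 60

interactions = len(events)  # every click counts as an interaction
submissions = sum(1 for _, kind in events if kind == "submission")

print(accesses, interactions, submissions, round(minutes))
```

Note that, per the definition above, a session containing a single click contributes zero minutes, because time is only accumulated up to the last recorded interaction.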

Method

This study used a retrospective, correlational design. The investigation took place at a large, private, urban university hosting an online MSN program. The sample consisted of online graduate nursing students enrolled in one of two courses: Health Policy and Politics or Research Methods (N = 360). The students included those from the nurse anesthesia, nurse practitioner, and advanced role programs (clinical nurse leader, nursing education, clinical trials, and nursing leadership). Data were obtained from course learning analytics in Blackboard Learn, as well as from student files containing the variables of entry GPA and major selected. Engagement variables consisted of the number of times students accessed the course, minutes spent within the course, interactions within the course, and the number of submissions made within the course. The output variable of course grade was located in Blackboard Learn.

Ethical Considerations

Permission from the institutional review board was obtained prior to the start of this study. Every online MSN student completing a Research Methods or a Health Policy and Politics course during the chosen 10-week term (N = 360) was selected. Data were retrieved by an honest broker from the learning analytics and the course grade column in Blackboard Learn, as well as from student records. The honest broker at the university managed and organized the abstracted data, removed all identifiers, and then sent the files, along with demographic information for each student, to the nurse researchers.

Data Analysis

Data were analyzed for the following input variables: entry-level GPA; major selected upon entry into the program (selectivity); and engagement variables, which consisted of the number of times students accessed the course, the minutes and interactions spent within the course, and the number of submissions made within the course. The output variable was the course grade.

Descriptive statistics were computed for all the variables, including the outcome variable of grade. Descriptive analyses included means, standard deviations, and histograms for continuous variables and contingency tables with proportions for categorical data. Analysis methods included the assessment of correlation coefficients, analysis of variance (ANOVA), and multiple linear regression using the backwards elimination method. Specifically, Pearson product–moment correlation coefficients and Spearman's correlation coefficients were produced to assess relationships between the input variables (GPA, major, access, minutes, interactions, submissions) and the output variable (grade). ANOVA was used to assess the impact of categorical predictor variables on grade, and backwards elimination within a multiple linear regression analysis was used to produce a model that only included variables that significantly predicted grade at the alpha = .05 level. The non-parametric Kruskal-Wallis test was used to confirm the results of the ANOVA. All analyses were performed using SAS® software package version 9.4.
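
The correlational step can be sketched in Python (the study itself used SAS). The values below are invented for illustration, and Spearman's coefficient is computed as the Pearson correlation of ranks (ties are not handled):

```python
import math

# Toy illustration of the correlational analysis; these values are invented,
# since the study's raw data are not public.
grade       = [0.95, 0.90, 0.88, 0.72, 0.93, 0.85]
submissions = [25, 22, 20, 12, 24, 18]

def pearson(xs, ys):
    # Pearson product-moment correlation coefficient.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ranks(xs):
    # Rank transform (no tie handling; adequate for this toy data).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

pearson_r = pearson(grade, submissions)
# Spearman's coefficient is the Pearson correlation of the ranks.
spearman_rho = pearson(ranks(grade), ranks(submissions))

print(round(pearson_r, 2), round(spearman_rho, 2))
```

With this monotone toy data the two coefficients nearly coincide; in the study, the Spearman results similarly confirmed the Pearson analysis.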

Results

The distribution of the outcome variable of grade was first examined using a histogram (Figure 2). As seen in Figure 2, most students received passing grades; however, a few students failed a course. The histogram and the Shapiro-Wilk (W = 0.711) and Kolmogorov-Smirnov (D = 0.108) tests revealed that, although the distribution of grades was skewed to the left, there was no extreme deviation from a normal distribution. Nonetheless, nonparametric tests were run to validate the parametric results. Table 1 describes all the variables in the study.

Figure 2.

Histogram of distribution of the outcome variable of grade.

Table 1:

Descriptive Statistics for Variables (N = 360)

As described in Table 1, the average grade was a B+ (based on the grading scale for the nursing program), with a mean of 0.910 (SD = 0.060); grades ranged from 0.372 to 0.995. Students tended to be older than 30 years, with a mean age of 32.5 years. The mean entry-level GPA was 3.5, the mean number of accesses was 97.5, the mean number of minutes was 3,120, the mean number of interactions was 817.3, and the mean number of submissions was 22.6. Compared with data from the entire nursing program, the sample averages were the same for accesses and submissions but higher for minutes spent and interactions within the online classroom.

Table 2 reveals that although all the correlation coefficients were significantly different from zero, grade was only mildly associated with entry-level GPA, age, access, and minutes (Pearson r < 0.20). Surprisingly, the two observed factors with the greatest influence on grade were interactions (Pearson r = 0.28) and submissions (Pearson r = 0.45). Spearman correlation coefficients, although not shown here, were also computed and produced results similar to those of the Pearson analysis.

Table 2:

Associations of Variables Using Pearson Correlation Coefficients (N = 360)

The average grade was then calculated across each of the categorical variables (i.e., race, major, gender, geography) to test whether there was a significant difference in grade outcome within these demographic variables. The distribution of grade across the categorical variables with the corresponding p values from the ANOVA can be seen in Table 3. Consistent results were observed with the p values from the nonparametric Kruskal-Wallis test. It can be concluded from the analyses that the outcome variable of grade did not differ significantly across the groups within each categorical demographic variable.
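
A one-way ANOVA of grade across a categorical variable reduces to comparing between-group and within-group variability. A minimal sketch follows; the study used SAS, and the group labels and grade values below are assumptions for illustration:

```python
import statistics as st

# One-way ANOVA on made-up grade proportions for three hypothetical majors.
groups = {
    "nurse anesthesia":   [0.92, 0.95, 0.90, 0.93],
    "nurse practitioner": [0.89, 0.91, 0.94, 0.90],
    "nursing education":  [0.93, 0.88, 0.92, 0.91],
}

all_vals = [v for vs in groups.values() for v in vs]
grand_mean = st.mean(all_vals)
k, n = len(groups), len(all_vals)

# Between-group and within-group sums of squares.
ss_between = sum(len(vs) * (st.mean(vs) - grand_mean) ** 2
                 for vs in groups.values())
ss_within = sum(sum((v - st.mean(vs)) ** 2 for v in vs)
                for vs in groups.values())

# F statistic: ratio of the two mean squares.
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(f_stat, 2))
```

A small F (as with this toy data) corresponds to a large p value, i.e., no significant difference in grade across the groups, which mirrors the study's finding for race, major, gender, and geography.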

Table 3:

Categorical Demographic Variable and Grade as an Outcome Variable (N = 360)

A backwards elimination variable selection method was used within a general linear model regression to develop the final model for the outcome variable, students' average grades, presented as percentages ranging from 0 to 100. Access, age, gender, geography, entry-level GPA, interactions, major, minutes, race, and submissions were all included as potential independent predictors in the model, and variables were removed one at a time until the final model consisted only of variables significant at the .05 level. Race, major, gender, and geography were treated as categorical predictors in the model. The continuous predictors in the original model were access, age, entry-level GPA, interactions, minutes, and submissions. The variables remaining in the final model included entry-level GPA, age, and submissions. The final model results are displayed in Table 4.
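
The backwards elimination procedure can be sketched as follows. This is not the study's SAS code: the data are invented, and a rough |t| >= 2 cutoff stands in for the alpha = .05 p-value criterion used in the study.

```python
import math

def invert(A):
    # Gauss-Jordan matrix inversion with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        d = M[col][col]
        M[col] = [v / d for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [rv - f * cv for rv, cv in zip(M[r], M[col])]
    return [row[n:] for row in M]

def ols_t(y, cols):
    # Ordinary least squares via normal equations; returns coefficients and
    # a t statistic for each named predictor (intercept added automatically).
    names = list(cols)
    n, p = len(y), len(names) + 1
    X = [[1.0] + [cols[nm][i] for nm in names] for i in range(n)]
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
           for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    inv = invert(XtX)
    beta = [sum(inv[a][b] * Xty[b] for b in range(p)) for a in range(p)]
    resid = [y[i] - sum(X[i][a] * beta[a] for a in range(p)) for i in range(n)]
    s2 = sum(r * r for r in resid) / (n - p)
    t = {nm: beta[j + 1] / math.sqrt(s2 * inv[j + 1][j + 1])
         for j, nm in enumerate(names)}
    return beta, t

# Made-up data: grade depends on GPA and age but not on minutes.
predictors = {
    "gpa":     [3.0, 3.2, 3.4, 3.6, 3.8, 4.0, 3.1, 3.9],
    "age":     [25, 30, 35, 40, 45, 50, 28, 33],
    "minutes": [3000, 3200, 2800, 3500, 3100, 2900, 3300, 3400],
}
noise = [0.05, -0.05, 0.05, -0.05, 0.05, -0.05, 0.05, -0.05]
grade = [60 + 5 * g - 0.2 * a + e
         for g, a, e in zip(predictors["gpa"], predictors["age"], noise)]

remaining = dict(predictors)
while True:
    beta, t = ols_t(grade, remaining)
    weakest = min(t, key=lambda nm: abs(t[nm]))
    if abs(t[weakest]) >= 2 or len(remaining) == 1:
        break
    del remaining[weakest]  # drop the least significant predictor and refit

print(sorted(remaining), [round(b, 2) for b in beta])
```

The loop removes one predictor per iteration, exactly as described above; the surviving model recovers coefficients close to the generating values for GPA and age.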

Table 4:

Predictors of Grade by Entry-Level Grade Point Average, Age, and Submissions (N = 360)

As seen in Table 4, for every 1-point increase in entry-level GPA, the expected course outcome grade increased by 1.93% (p = .0289), after adjusting for the other variables in the model. Likewise, for every 1-year increase in age, the expected grade decreased by 0.17% (p < .0001), and for each additional submission, the expected grade increased by 0.33% (p < .0001), each after adjusting for the other variables in the model.
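
Because the intercept of the final model is not reported, the Table 4 coefficients are best read as differences between student profiles. A small worked example, in which the two student profiles are hypothetical:

```python
# Reading Table 4's coefficients: predicted difference in course grade
# (in percentage points) between two hypothetical students. Without the
# (unreported) intercept, only differences can be computed, not absolute grades.
B_GPA, B_AGE, B_SUBMISSIONS = 1.93, -0.17, 0.33  # coefficients from Table 4

def grade_difference(d_gpa, d_age, d_submissions):
    """Predicted grade difference, holding all other variables equal."""
    return B_GPA * d_gpa + B_AGE * d_age + B_SUBMISSIONS * d_submissions

# Hypothetical Student A: GPA 3.8, age 28, 25 submissions.
# Hypothetical Student B: GPA 3.3, age 40, 15 submissions.
diff = grade_difference(3.8 - 3.3, 28 - 40, 25 - 15)
print(f"Predicted grade difference: {diff:.2f} percentage points")
```

In this example the submissions term contributes the most, consistent with the study's finding that submissions had the strongest association with grade.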

Discussion

A primary finding from this research was that interactions and submissions had the greatest impact on the final outcome of course grade. This is consistent with Astin's model of involvement (engagement), which holds that students who participate actively with faculty and fellow students and devote more time to studying are highly involved. These same students, according to Astin (1984), are more likely to be successful, as reflected in their course grades. Astin (1984, 1985) long held that the extent to which students learn and develop personally is directly proportional to the quality and quantity of student involvement in the educational process. These results are also comparable to those of other researchers who studied the relationship of LMS analytics with academic achievement. Macfadyen and Dawson (2010) investigated which student online activities accurately predicted academic achievement as a means of early identification of at-risk students. Their analysis of LMS tracking data from a Blackboard Vista–supported online undergraduate biology course revealed that the most significant predictive variable for academic success was the total number of student posts to course discussion forums. They posited that the results supported the proposition that learning is a social process and further confirmed that the degree of student engagement with peers is an important indicator of success in a course.

Similarly, Yu and Jo (2014) and Zhang (2016) found that the interactions in the LMS were significant factors in predicting academic achievement in the online learning environment. Likewise, Agudo-Peregrina et al. (2014) found that the interactions in the LMS learning analytics also influenced academic performance. They reported that the results highlighted the importance of promoting student contributions and teacher–student interactions to improve learning outcomes.

These results indicated to the investigators that increased student–student and student–faculty interaction would increase student success. At the time of the study, interactions in the courses consisted predominantly of discussion boards. Both the Health Policy and Politics and Research Methods courses had three discussion boards during a 10-week term. Faculty were active in responding at least once to almost every student in each discussion, and students were required to make a minimum of three posts per discussion. Given these results, faculty sought to further enhance interactions. Faculty for the Health Policy and Politics course added writing workshops, using the Zoom web-conferencing platform, to review drafts of two assigned papers with groups of students. Faculty for the research course added one-to-one meetings with students, via telephone or web conferencing, to review research study critiques. These course additions were followed by a further decrease in attrition and an increase in course grades. Although these outcomes were not part of this study, they impressed on the faculty that continuing quality improvements can be made to courses based on the use of learning analytics.

Another important finding from this research was that for every 1-point increase in entry-level GPA, the expected course outcome grade increased by 1.93%. This finding is consistent with several studies of graduate nursing students. An investigation of student registered nurse anesthetists (SRNAs) found a statistically significant relationship between admission GPA and current GPA (Burns, 2011). Similarly, Knestrick et al. (2016) analyzed predictors for 847 online nurse practitioner students enrolled over a 2-year period; undergraduate GPA was a statistically significant predictor (odds ratio [OR] = 0.76; 95% confidence interval [0.23, 1.8]) of attrition, and every 0.1 increase in undergraduate GPA was found to decrease the odds of attrition by 2.5%. Patzer et al. (2017) investigated retrospective data for 37 graduating nurse practitioners and found that admission GPA had the highest relative importance in predicting nursing graduate school success (b = .63, t = 7.31, p < .001). Finally, Suhayda, Hicks, and Fogg (2008) evaluated data over a 5-year period from the records of 738 graduate nursing students in advanced practice programs; the combination of a cumulative GPA of 3.25 and a nursing science GPA of 3.0 predicted student success in 99% of cases.

In contrast, the results also revealed that for every 1-year increase in age, the expected course outcome grade decreased by 0.17% (p < .0001). Comparable results were found by Wilson, Gibbons, and Wofford (2015), who examined data from 180 SRNAs enrolled over a 6-year period to identify reasons for high attrition rates. They found that successful SRNAs were significantly younger (p < .01) and that each additional year of age was associated with a 13% decrease in the odds of success (OR = 0.87).

Likewise, the study by Knestrick et al. (2016) of nurse practitioner students determined that students older than 40 years had almost double the odds of leaving the program (OR = 1.83; 95% confidence interval [1.12, 2.95]), and those between the ages of 30 and 40 years had 15% higher odds of attrition than students under age 30 years. The inverse relationship with age may be attributed to the length of time since students were previously enrolled in a formal education program. This is an important result because it may be beneficial to offer additional supervision and tutorial assistance to students over the age of 40 years.

This investigation did not find a significant relationship between grade and the demographic variables of major, race, gender, or geography. This is consistent with most studies examining the effect of these variables on academic success in graduate nursing. However, one study of graduate nursing students found that gender predicted academic success: Wilson, Gibbons, and Wofford (2015) discovered that a greater proportion of female students than male students successfully completed a nurse anesthesia program (p = .01), with 3.32 times the odds of success.

Limitations

A limitation of the study is that data were collected from only two online graduate nursing courses at one college of nursing over a single term. Ideally, the sample size in future studies should be based on an a priori power analysis, and studies should be conducted at more than one site. Moreover, the generalizability of the findings related to program engagement may be influenced by the teaching strategies used in the courses included in this investigation, and similar results regarding engagement may not be found in other online nursing programs.

Implications for Future Practice

Further study with larger samples is needed to determine whether Astin's model holds, that is, whether entry-level GPA predicts student grades. Although this association was mild in the present study, more work should be done to see whether it persists. The relationships found here between grade and age, access, and minutes likewise require corroboration. One notable finding was that older students tended to have lower grades. Although grades appear to be related to interactions and submissions, replication of the study could determine whether courses that vary in the number of assignments still demonstrate this association.

References

  • Agudo-Peregrina, Á.F., Iglesias-Pradas, S., Conde-González, M.Á. & Hernández-García, Á. (2014). Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Computers in Human Behavior, 31, 542–550. doi:10.1016/j.chb.2013.05.031
  • American Association of Colleges of Nursing. (2018). 2017–2018 enrollment and graduations in baccalaureate and graduate programs in nursing. Retrieved from http://www.aacnnursing.org/News-Information/Research-Data-Center/Standard-Data-Reports
  • Astin, A.W. (1975). Preventing students from dropping out. San Francisco, CA: Jossey-Bass.
  • Astin, A.W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Development, 25, 297–308.
  • Astin, A.W. (1985). Involvement: The cornerstone of excellence. Change, 17(4), 34–39. doi:10.1080/00091383.1985.9940532
  • Astin, A.W. (1993). What matters in college? Four critical years revisited. San Francisco, CA: Jossey-Bass.
  • Bodily, R., Graham, C.R. & Bush, M.D. (2017). Online learner engagement: Opportunities and challenges with using data analysis. Educational Technology, 57, 10–18.
  • Britt, M., Goon, D. & Timmerman, M. (2015). How to better engage online students with online strategies. College Student Journal, 49, 399–404.
  • Burns, S.M. (2011). Predicting academic progression for student registered nurse anesthetists. AANA Journal, 79, 193–201.
  • Campbell, C.M. & Cabrera, A.F. (2011). How sound is NSSE? Investigating the psychometric properties of NSSE at a public, research-extensive institution. Review of Higher Education, 35, 77–103. doi:10.1353/rhe.2011.0035
  • Chickering, A.W. & Ehrmann, S.C. (1996). Implementing the seven principles: Technology as lever. AAHE Bulletin, 3–6. Retrieved from https://www.uab.edu/elearning/images/facultytoolkit/SevenPrinciplesTechnology.pdf
  • Cooke, N.A. (2016). Information sharing, community development, and deindividuation in the eLearning domain. Online Learning, 20, 244–260. doi:10.24059/olj.v20i2.614
  • Dietz-Uhler, B. & Hurn, J.E. (2013a). Strategies for engagement in online courses: Engaging with the content, instructor, and other students. Journal of Teaching and Learning with Technology, 2, 62–65.
  • Dietz-Uhler, B. & Hurn, J.E. (2013b). Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12, 17–26.
  • Dixson, M.D. (2010). Creating effective student engagement in online courses: What do students find engaging?Journal of the Scholarship of Teaching and Learning, 10(2), 1–13.
  • Dvorak, T. & Jia, M. (2016). Do the timeliness, regularity and intensity of online work habits predict academic performance?Journal of Learning Analytics, 3, 318–330. doi:10.18608/jla.2016.33.15 [CrossRef]
  • Gazza, E.A. & Hunker, D.F. (2014). Facilitating student retention in online graduate nursing education programs: A review of the literature. Nurse Education Today, 34, 1125–1129. doi:10.1016/j.nedt.2014.01.010 [CrossRef]
  • Hannum, W.H., Irvin, M.J., Lei, P.W. & Farmer, T.W. (2008). Effectiveness of using learner-centered principles on student retention in distance education courses in rural schools. Distance Education, 29, 211–229. doi:10.1080/01587910802395763 [CrossRef]
  • Knestrick, J.M., Wilkinson, M.R., Pellathy, T.P., Lange-Kessler, J., Katz, R. & Compton, P. (2016). Predictors of retention of students in an online nurse practitioner program. Journal for Nurse Practitioners, 12, 635–640. doi:10.1016/j.nurpra.2016.06.011 [CrossRef]
  • Kuh, G.D. (2003). What we're learning about student engagement from NSSE: Benchmarks for effective educational practices. Change, 35(2), 24–32. doi:10.1080/00091380309604090 [CrossRef]
  • Lundberg, C.A. & Sheridan, D. (2015). Benefits of engagement with peers, faculty, and diversity for online learners. College Teaching, 63, 8–15. doi:10.1080/87567555.2014.972317 [CrossRef]
  • Macfadyen, L.P. & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54, 588–599. doi:10.1016/j.compedu.2009.09.008 [CrossRef]
  • Pace, C.R. (1990). The undergraduates: A report of their activities and progress in the 1980's. Los Angeles: Center for the Study of Evaluation, University of California, Los Angeles.
  • Pascarella, E.T., Seifert, T.A. & Blaich, C. (2010). How effective are the NSSE benchmarks in predicting important educational outcomes?Change, 42, 16–22. doi:10.1080/00091380903449060 [CrossRef]
  • Patzer, B., Lazzara, E.H., Keebler, J.R., Madi, M.H., Dwyer, P., Huckstadt, A.A. & Smith-Campbell, B. (2017). Predictors of nursing graduate school success. Nursing Education Perspectives, 38, 272–274. doi:10.1097/01.NEP.0000000000000172 [CrossRef]
  • Pike, G.R. & Kuh, G.D. (2005). A typology of student engagement for American colleges and universities. Research in Higher Education, 46, 185–209. doi:10.1007/s11162-004-1599-0 [CrossRef]
  • Porter, S.R., Rumann, C. & Pontius, J. (2011). The validity of student engagement survey questions: Can we accurately measure academic challenge?New Directions for Institutional Research, 2011(150), 87–98. doi:10.1002/ir.391 [CrossRef]
  • Purarjomandlangrudi, A., Chen, D. & Nguyen, A. (2016). Investigating the drivers of student interaction and engagement in online courses: A study of state-of-the-art. Informatics in Education, 15, 269–286. doi:10.15388/infedu.2016.14 [CrossRef]
  • Rice, J., Rojjanasrirat, W. & Trachsel, P. (2013). Attrition of on-line graduate nursing students before and after program structural changes. Journal of Professional Nursing, 29, 181–186. doi:10.1016/j.profnurs.2012.05.007 [CrossRef]
  • Scott, P. (2014). The perils of a lack of student engagement: Reflections of a “lonely, brave, and rather exposed” online instructor. British Journal of Educational Technology, 47, 51–64.
  • Suhayda, R., Hicks, F. & Fogg, L. (2008). A decision algorithm for admitting students to advanced practice programs in nursing. Journal of Professional Nursing, 24, 281–284. doi:10.1016/j.profnurs.2007.10.002 [CrossRef]
  • Wilson, J.T., Gibbons, S.W. & Wofford, K. (2015). Process improvement: Addressing attrition from the Uniformed Services University of the Health Sciences nurse anesthesia program. AANA Journal, 83, 351–356.
  • Young, S. & Bruce, M.A. (2011). Classroom community and student engagement in online courses. MERLOT Journal of Online Learning and Teaching, 7, 219–230.
  • Yu, T. & Jo, I.H. (2014). Educational technology approach toward learning analytics: Relationship between student online behavior and learning performance in higher education. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 269–270). New York, NY: Advanced Computing for Machinery. doi:10.1145/2567574.2567594 [CrossRef]
  • Zhang, X. (2016). An analysis of online students' behaviors on course sites and the effect on learning performance: A case study of four LIS online classes. Journal of Education for Library and Information Science, 57, 255–270. doi:10.3138/jelis.57.4.255 [CrossRef]

Descriptive Statistics for Variables (N = 360)

Variable               Mean      SD        Median    Minimum   Maximum
Grade                  0.91      0.06      0.92      0.372     1.00
Grade point average    3.50      0.31      3.55      2.45      4.00
Age, years             32.48     7.26      30.00     23.00     69.00
Access, n              97.00     44.18     89.50     22.00     256.00
Minutes, n             3,120     1,911     2,652     313.23    14,903
Interactions, n        817.31    402.22    733.00    189.00    2,736
Submissions, n         22.00     8.37      21.00     9.00      57.00

Associations of Variables Using Pearson Correlation Coefficients (N = 360)

Variable     GPA       Age       Access    Minutes   Interactions   Submissions
Grade (r)    0.118     −0.154    0.171     0.146     0.282          0.448
p            0.0250    0.0034    0.0011    0.0054    < 0.0001       < 0.0001
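Correlations like those above can be reproduced directly from exported learning management system analytics. The following Python sketch computes a Pearson coefficient from scratch; the variable names and sample values are illustrative only and are not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Illustrative values only: assignment submissions vs. course grade (0-1 scale)
submissions = [9, 15, 18, 21, 24, 30, 40, 57]
grades = [0.78, 0.85, 0.88, 0.90, 0.91, 0.93, 0.96, 0.99]
r = pearson_r(submissions, grades)
```

With real course analytics, each list would hold one value per student, and a statistical package would also supply the p values reported in the table.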

Categorical Demographic Variables and Grade as an Outcome Variable (N = 360)

Variable                              Mean (SD)      p
Major                                                .144
  Nurse practitioner                  0.91 (0.06)
  Anesthesiology                      0.95 (0.03)
  Education                           0.92 (0.04)
  Leadership                          0.91 (0.03)
  Other                               0.90 (0.09)
Race                                                 .645
  White                               0.91 (0.06)
  Black or African American           0.90 (0.07)
  American Indian or Alaskan Native   0.93 (0.03)
  Asian                               0.90 (0.05)
Gender                                               .167
  Male                                0.93 (0.05)
  Female                              0.91 (0.06)
Geography                                            .600
  Northeast                           0.91 (0.06)
  South                               0.91 (0.07)
  Midwest                             0.89 (0.05)
  West                                0.92 (0.03)

Predictors of Grade by Entry-Level Grade Point Average, Age, and Submissions (N = 360)

Variable              Slope Estimate (SE)    95% Confidence Limits    p
Grade point average   1.93 (0.88)            [0.20, 3.67]             .0289
Age                   −0.17 (0.04)           [−0.24, −0.10]           < .0001
Submissions           0.33 (0.03)            [0.27, 0.40]             < .0001
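Slope estimates like those above come from regressing course grade, expressed as a percentage, on the predictors. For a single predictor, the ordinary-least-squares slope is cov(x, y) / var(x); the Python sketch below illustrates that computation with made-up values, not the study's data:

```python
def ols_slope(x, y):
    """Ordinary-least-squares slope of y on x: cov(x, y) / var(x)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var = sum((a - mean_x) ** 2 for a in x)
    return cov / var

# Illustrative values only: submissions vs. course grade in percent
submissions = [9, 15, 18, 21, 24, 30, 40, 57]
grade_pct = [78, 85, 88, 90, 91, 93, 96, 99]
slope = ols_slope(submissions, grade_pct)
# The slope reads as percentage points of grade per additional submission
```

The published model is multivariable (grade point average, age, and submissions entered together), so its coefficients adjust for the other predictors; this single-predictor version only illustrates how a slope of this kind is interpreted.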
Authors

Dr. Serembus is Associate Clinical Professor, and Dr. Riccio is Assistant Clinical Professor, College of Nursing and Health Professions, Drexel University, Philadelphia, Pennsylvania.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Joanne Farley Serembus, EdD, RN, CCRN, CNE, Associate Clinical Professor, College of Nursing and Health Professions, Drexel University, 1601 Cherry Street, 3-Parkway Building, RM 9101, Philadelphia, PA 19382; e-mail: jmf64@drexel.edu.

Received: August 05, 2018
Accepted: January 08, 2019

10.3928/01484834-20190321-04
