Student Persistence in Web-based Courses:
Identifying a Profile for Success

Thomas Valasek
Raritan Valley Community College


            Like many other institutions of higher education that have recently ventured into Web-based instruction, Raritan Valley Community College has discovered a disturbing phenomenon about online classes, namely, that the student attrition rate is often two to three times as high as that of comparable classes with traditional, face-to-face instruction.  In the eight online classes (in English, Humanities, and Social Sciences) that were the focus of this study the average student dropout rate was 33%, as opposed to an average rate of 13% for traditionally taught English, Humanities, and Social Science classes.  This high attrition occurred even though the Welcome page of each course includes a “quiz” for prospective students to indicate whether they are “suited for online learning,” and even though all but two of the instructors for these classes said they took special measures to help their online students succeed.

            Moreover, the percentage of students in these eight online courses who received a final grade of “F” (15%) is also higher than the percentage of students (12.6%) who failed traditional English, Humanities, and Social Science classes.  Since the WebCT software used for online classes at RVCC allows instructors to monitor the last date and time each student logged in, it is possible to identify students in online classes who effectively dropped out (i.e., stopped logging in) but who did not officially withdraw (or were not administratively withdrawn) before the deadline for withdrawing elapsed.  Of the 26 students who failed in the eight online classes, 11 fall into this category.  If these 11 students were counted along with officially withdrawn students, the attrition rate for these online courses would be 40%.    
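This attrition arithmetic can be checked directly against the figures reported in this study (the tenth-day enrollment of 167 and the 55 official withdrawals appear in the course description below):

```python
# Figures from the study: 167 students enrolled on the tenth day,
# 55 who officially withdrew (or were administratively withdrawn),
# and 11 who stopped logging in but never formally withdrew.
enrolled = 167
withdrew = 55
ghost_failures = 11

official_rate = withdrew / enrolled                      # official attrition
effective_rate = (withdrew + ghost_failures) / enrolled  # counting "ghost" failures

print(f"official attrition:  {official_rate:.1%}")   # ~33%
print(f"effective attrition: {effective_rate:.1%}")  # ~40%
```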

            There has been abundant research, generated largely by Tinto’s theoretical model (Tinto, 1975 & 1982), on why distance education students (mainly in telecourses) drop out more frequently than students in traditional classes.  The literature attributes this attrition primarily to complex “barriers” that impede full student participation in distance education (Sweet, 1986; Garland, 1993; Pugliese, 1994; Brown, 1996; Morgan & Tam, 1999).  But there is considerably less research about the other side of this question, that is, why distance education students, despite all these barriers, persist in their courses.  And research related specifically to student persistence in Web-based courses is even scarcer.   Nevertheless, there are indications that online classes tend to be “more favorably accepted” by students who exhibit “a certain level of self-motivation and self-organization” (Palloff & Pratt, 1999), that students who express a high level of satisfaction with online classes are learners who find learning “conducive for thoughtful analysis of class questions and commentary” (Gibbs, 1998), and that most successful distance education students “place primary responsibility on themselves to learn” (Miller & Husmann, 1994). 

            The purpose of this study is to identify common characteristics of students who persisted until the end of the term in the eight online courses observed and to construct a profile of students most likely to complete online classes successfully.  The data for this study came from three sources: demographic information from college records about students in the eight online classes, an Online Student Survey administered during the last ten days of the semester, and a survey of the faculty who taught these online classes.


The following eight online courses from the disciplines of English, Humanities, and Social Sciences were chosen for this study: 

  • English II
  • American Film
  • Creative Writing I
  • American Literature
  • Introduction to Sociology
  • Global Patterns of Racism
  • Introduction to Psychology
  • Introduction to Cultural Anthropology

From the standpoint of content these are standard introductory and survey courses, except for Global Patterns of Racism, which is an interdisciplinary, team-taught course that surveys the causes and manifestations of racism in diverse cultures.  (Students may take Global Patterns of Racism for credit in English, History, or Anthropology.  The three instructors for this course come from these disciplines.)  From the standpoint of pedagogy these courses depend more on class discussion and student interaction than on lecture. 

A total of 167 students were enrolled in these eight online classes on the tenth day of the semester, after the allotted drop/add period.  Class size ranged from 13 to 29 students, with an average class size of 21 students.  By the last ten days of the semester, after the deadline for withdrawing from classes had elapsed, 55 students had withdrawn (or had been administratively withdrawn), leaving 112 students still enrolled. 

Ten days before the end of the semester the RVCC Coordinator for Innovative Teaching posted the Online Student Survey on the Home page of the eight online courses along with a letter encouraging students to participate in the survey.  At this point in the term only students still enrolled were able to access and complete the online survey.  The letter informed students that this survey, conducted with their instructor’s permission and cooperation, was voluntary and anonymous, and that the instructor would receive only summary results of the survey.   

The Online Student Survey was designed to gather some demographic information about the students in the study, particularly about any previous experience with online classes and about time commitments to other classes, jobs, and family responsibilities.  In addition, it focused on four factors that may affect student persistence/attrition in online classes:

  • The student’s confidence using a computer.

  • The number of times per week the student logs in to the course.

  • The amount of time the student spends per week working on the course.

  • The student’s overall satisfaction with the online course.

The instructors who taught the online courses in the study were also surveyed concerning their expectations about student time commitments to online classes, their responsiveness to students, and the overall effectiveness of their online instruction.

Results of the Online Student Survey

Demographic Data

            There were 66 responses to the Online Student Survey.  Five of the respondents were enrolled in more than one online course in this study.  (Four students were enrolled in two courses, and one student was enrolled in four courses.)  For demographic purposes, therefore, there are 59 students represented in the Online Student Survey.  The following information was compiled from the survey about these 59 students:

  • Twenty students (34%) had taken at least one online course previously, while 39 (66%) had never taken an online course before.
  • All but five students in the study (92%) were enrolled in one to four additional classes during the semester, with an average of 2.4 additional classes. 
  • Among those students taking additional classes, 44% were taking at least one other online course (including online courses not part of this study). 
  • The average age of the students who responded to the online survey is 28 years, which is the same as the average age of all students attending RVCC.
  • The percentage of “traditional” college-age students (18 to 24 years) in this group is 54% (versus 52% college-wide), while the percentage of students who were 30 or older is 36% (versus 35% college-wide).
  • Students in this group spend a significant amount of time working outside the home:  91% report working at a job more than ten hours per week, and 55% work more than 30 hours per week. 
  • More than one-third of the students in this group spend a significant amount of time on family responsibilities:  37% report that they are committed to family responsibilities that require more than 20 hours per week; 12% are committed to more than 40 hours per week.  

In order to supplement the demographic data gathered from the Online Student Survey and to move toward identifying a profile of students who not only persist in online classes but succeed in them, we also looked at demographic data from RVCC student records for those who passed these online classes with a grade of “C” or better.   There were a total of 70 students in this group, eight of whom passed two online courses in this study and one who passed four courses in the study.   The following is additional demographic data about this group of students:

  • 64% of the students who passed the online courses are female, and 34% are male.
  • The average age of students who passed the online courses is 27.9 years.
  • The ethnic distribution of students who passed the online courses is 73% Caucasian, 7% Asian, and 16% “unknown.”

This demographic snapshot of students who passed the online courses contrasts strikingly in one area with that of students who failed or withdrew from these courses:

  • The average age of students who failed the online classes is 22.1 years.  (80% of the students who failed are in the 18-to-24 age group, while 8% are in the 30-and-older age group.)
  • The average age of students who withdrew (or were withdrawn) from the online classes is 23.9 years.  (65% of the students who withdrew are in the 18-to-24 age group, while 23% are in the 30-and-older age group.)

Student Confidence Using a Computer

            When asked how much confidence they feel “about working with a computer” on a scale from 1 to 5 (where 1 is “very little” and 5 is “very much”), most of the students surveyed rated their confidence level at 4 or 5.  (The mean response to this question was 4.2.)  Only two students said their confidence was at 1, and none said it was at 2.  Three of the five respondents who were enrolled in more than one online course rated their confidence with a computer at 5.  Conversely, when asked how often they felt frustrated with computer problems on a scale from 1 to 5 (where 1 is “very seldom” and 5 is “very often”), the mean response was 2.0.  On this question only five students rated their frustration level at 5, and one student rated it at 4.

 Amount and Distribution of Time Spent Working on Online Courses

            In the Online Student Survey students reported that they logged in to their online course an average of 6.6 times per week, and spent an average of 28 minutes online each time they logged in.  The average amount of time students said they spent altogether on their course (online and off-line) is 8.6 hours per week.  There was a wide range among the responses to these survey questions, particularly for log-ins (ranging from 2 to 20 per week) and for total time spent working on the course (ranging from 2 to 35 hours per week).  About half of the students reported that they worked six or fewer hours per week on their online course, while 36% said they spent ten or more hours per week.  Together, these two groups of students at opposite ends of the “work spectrum” account for about 90% of the responses to this survey question.  

Curiously, when the instructors were asked how many times per week they expect students to log in, they were sharply divided because they understood this question in two different ways.  Five of the instructors understood the question as, “How many times per week do you [realistically] expect students to log in?”  These instructors expected students to log in twice a week.  The other four understood the question as “[ideally] expect,” and they expected students to log in an average of 4.7 times per week.  In both cases, however, the faculty expectations were lower than the average number of log-ins per week the students reported. 

There was an even more dramatic division between the two groups of instructors when asked how many hours per week they expect students to spend altogether on their online course.  The “realistic” group expected an average of 3.6 hours per week; the “idealistic” group expected 9.5 hours per week.   This division among the faculty, along with the wide range of student responses to these questions on the survey, suggests that there may be misconceptions, both among faculty and students, about how much time students actually need in order to complete online classes successfully. 

Another way to look at student commitment to online classes is to examine the number of “hits” recorded when students visit content pages in online courses, the number of times they “post” comments on discussion topics, and the number of postings by other students that they “read.”  WebCT tabulates these activities for each student individually, so this data can be correlated directly with each student’s final grade.  Because the range of hits, reads, and posts varies across the eight online courses, comparing all students in the study requires finding the median hits, reads, and posts for each course and calculating how far above or below the median the “A,” “B,” and “C” students participated in each activity.  The results of these calculations (see Table 1) show that “A” students exceed the median number of hits, reads, and posts by at least 33% (and by as much as 66% in the number of posts), while “B” and “C” students fall short of the median in all three categories.  There are some anomalies in these data, created by the small sample size of “C” and true “F” students (16 and 15 respectively) and by the fact that in three classes only one student received a “C” grade, which skews the range of responses for “C” students.  Nonetheless, the data clearly indicate that in the “posts” category, the most important measure of a student’s contribution to online discussions, “F” students participated much less than students who passed.

  Grade    Hits     Reads    Posts
  A        +42%     +33%     +66%
  B        -19%     -37%     -22%
  C        -46%    -119%     -13%
  F        -36%     -36%     -31%

Table 1:  Percentage Above (+) or Below (-) Median for Online Activities
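The study does not give the exact formula behind Table 1, but one plausible sketch of the per-course calculation is shown below.  The student records and activity counts are invented for illustration; the real WebCT data are not reproduced in the study.

```python
from statistics import median

# Hypothetical per-student activity counts for a single course.
students = [
    {"grade": "A", "hits": 310, "reads": 240, "posts": 58},
    {"grade": "A", "hits": 280, "reads": 200, "posts": 44},
    {"grade": "B", "hits": 150, "reads": 120, "posts": 20},
    {"grade": "B", "hits": 170, "reads": 90,  "posts": 25},
    {"grade": "C", "hits": 110, "reads": 60,  "posts": 22},
]

def pct_vs_median(students, activity, grade):
    """Average percentage above (+) or below (-) the course median of
    `activity` for students who earned `grade`."""
    med = median(s[activity] for s in students)
    group = [s[activity] for s in students if s["grade"] == grade]
    return sum(100 * (x - med) / med for x in group) / len(group)

for g in ("A", "B", "C"):
    print(g, round(pct_vs_median(students, "hits", g), 1))
```

Note that with this normalization a deviation below -100% is impossible, so the -119% figure in Table 1 suggests the original tabulation used a somewhat different denominator; the sketch only illustrates the general approach.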

The data related to hits, reads, and posts are especially interesting for the most successful students in this study.  Among the 26 “A” students there were 16 who were above the median in all three categories and who, as a group, averaged 61%, 51%, and 88% above the median for hits, reads, and posts.  The average age of these 16 students is 33.7 years, while the average age of all “A” students in the study is 31.4 years.  The analysis of the hits, reads, and posts tabulated in these courses reinforces evidence that students who persist and succeed in online classes are prepared to commit considerable time and effort to course work, and that students who pass with a grade of “A” are willing to work significantly longer and harder than students who pass with a “B” or “C.”  

To find out whether there is a correlation between the students’ final grades in these online classes and their overall GPA, we ran a chi-square independence test, comparing students who succeeded with students who did not, and also comparing students in each grade category.  The results in both cases confirmed a statistically strong association between the students’ final grades and their overall GPA, indicating that a student’s GPA is a reliable predictor of success in these online courses.  The frequency distributions for the chi-square analysis show that among the students who passed these courses only two have GPAs between 1.5 and 2.0 and none lower than 1.5, whereas among those who failed or withdrew there were 12 students with GPAs between 1.5 and 2.0 and 17 students with GPAs lower than 1.5.
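A chi-square independence test of this kind can be sketched as follows.  The counts in the two lower GPA bands come from the frequency distributions reported above; the counts above a 2.0 GPA are derived from the study’s totals (70 students passed; 55 withdrew plus 26 failed) and are approximate, since the actual analysis may have excluded some records.

```python
# Contingency table: outcome x GPA band.
#                     GPA<1.5  1.5-2.0  >2.0
observed = {
    "passed":           [0,      2,      68],
    "failed/withdrew":  [17,     12,     52],
}

def chi_square(table):
    """Pearson chi-square statistic for a contingency table given as
    {row_label: [counts...]}.  Compare against the critical value for
    df = (rows - 1) * (cols - 1)."""
    rows = list(table.values())
    row_totals = [sum(r) for r in rows]
    col_totals = [sum(c) for c in zip(*rows)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(rows):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# With these (partly derived) counts the statistic is roughly 25.6,
# well above the df=2 critical value of 5.99 at p = .05.
print(round(chi_square(observed), 1))
```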

Student Satisfaction with Online Courses

            The Online Student Survey includes three questions that measure student satisfaction with their online classes.  The first question asked students to rate, on a scale of 1 to 5 (where 1 is “very unresponsive” and 5 is “very responsive”), how responsive their instructor was to their questions and concerns.  The responses to this question were highly favorable, with a mean of 4.4, and included no responses at 1 and only five at 2.  The students, in fact, rated the instructors’ responsiveness more highly than the instructors rated themselves: when asked to rate their own responsiveness to students on a scale of 1 to 5, the instructors’ mean response was 3.8.  In the comments section of the survey several students specifically praised the responsiveness of the instructor, noting, for example, that their instructor “is extremely quick to respond via e-mail,” “follows up quickly on comments,” “responds promptly,” “offers positive suggestions for postings or writings,” or “gives very helpful feedback.”  Several students, however, commented that their instructors were not always prompt about “getting back to students,” and others observed that “better communication” with the professor would improve the course.

            The second question asked students to rate the “overall level of instruction” in the online classes on a scale of 1 to 5 (where 1 is “very ineffective” and 5 is “very effective”).  Responses to this question were also favorable, with a mean of 4.2, and again higher than the instructors’ own rating of instructional effectiveness, with a mean of 3.8.  One student wrote that what she liked most about her course was “the quality of the instruction, material, and student responses.”

            The third question asked students what recommendation they would give to another student interested in their online course, again on a scale of 1 to 5 (where 1 is “very weak” and 5 is “very strong”).  While still favorable, with a mean of 3.9, the responses to this question showed a wider range than the previous two.

Many students noted that what they like most about online courses are the convenience and flexibility of taking classes at home and the freedom to work on their own time and at their own pace.  Several students were enthusiastic about online discussions.  (Comments included:  “I believe the group discussions are truly superior in an online course.”  “I feel more at ease writing to a group, instead of being in front of a class talking.”  “Discussions in the online course make it much easier to express myself.”)   Some students, however, wanted “more instructor interaction and feedback.”  One student said that online discussions “still don’t replace classroom discussions.” 

            What students said they like least about online classes is that they require too much work and take up too much time.  One student stated, “I found I put in about twice as many hours per week in this [online] course as in others [traditional classes] I have taken here.”  Another said simply, “To be honest, it [the online class] was much more work than a regular class.”  Some students also complained that online classes are too impersonal and that they lack “in-person contact with instructor and other students.”  Two students said they missed “face-to-face contact” with the instructor. 


            The data from the Online Student Survey, along with collateral information from RVCC student records, suggest some demographic and behavioral characteristics that may be reliable indicators of student persistence and success in online classes.  Other studies have also identified some of these indicators: 

  • Non-traditional college students (age 30 and older) are more likely to persist and succeed in online classes than traditional college-age students (age 18 to 24).  Studies of dropout rates for distance education students report higher levels of persistence for students over 27 years of age (Rekkedal, 1983; Cookson, 1989).  And a current study at Atlantic Cape Community College finds that 66% of the students “surviving” online courses until the end of the semester are over the age of 29 (Russell, 2001).
  • Successful online students develop realistic expectations about how much time online learning will demand.  The most frequent comment from persisting students in this study was that they had to spend more time than they expected on these courses.  The instructors agreed that many students withdraw from online classes because they take more time than students expect.   Other studies indicate that online learning demands more time of students than traditional face-to-face instruction (Harasim, 1990; Gibbs, 1998; Bonk & Cummings, 1998; Palloff & Pratt, 1999).
  • Successful online students are organized and able to manage the demands of their classes along with their responsibilities at work and/or at home.  In an interview about her book, How to Be a Successful Online Student, Sara Dulaney Gilbert states that from the student’s perspective one of the biggest differences between online courses and traditional courses is time management, which is one key to success for online students (Young, 2001).  Other studies have shown that effective time management is critical for student success in distance learning (Bernt & Bugbee, 1993; Miller & Husmann, 1994), and specifically in online classes (Palloff & Pratt, 1999; Gilbert, 2001).
  • Successful online students feel confident about using a computer.  Students who persisted in this study, even those who acknowledged that they are not especially skilled with computers, did not indicate that computer problems were an impediment to success, even if they were in an online class for the first time.  These results appear to corroborate previous research indicating that computer experience makes no significant difference in outcomes for online courses (Hiltz, 1993; Harasim, 1995).
  • Successful online students keep pace with course work and assignments, logging in regularly and frequently.  Students who persisted to the end of the semester in this study reported logging in an average of 6.6 times per week.  In her book Sara Gilbert tells students straight out that the chances of success with an online course are higher if “you are able to discipline yourself to work at the course regularly and keep up with the deadlines” (217).  
  • Successful students participate actively in online class discussions.  Analysis of “hits,” “reads,” and “posts” indicates that students who contribute most to online discussions succeed with the highest grades, while students who contribute little are much less likely to succeed at all.  Some instructors in this study suggest that one way to help students succeed is to encourage more online participation and to monitor it more carefully.   Bonk and Cummings (1998) concur with this suggestion.  And in Learning Networks:  A Field Guide to Teaching and Learning Online Harasim et al. state that among the factors that make a difference in student success in online courses is “the self-discipline to participate regularly.”  They also suggest:  “To show that participation is important, grade it” (178).


The following are recommendations for improving student persistence and success rates in online courses based on the results of this study:

  • Use market research and community outreach to identify and recruit “niche” groups of students who may be especially well-suited and prepared for online learning.  Based on this study an especially promising target audience for online classes may be non-traditional female students, age 30 and older, employed nearly full-time either at home or in the work force.  Gilbert (2001) says that the “typical” online student is a “caregiver” over 25, employed, with some higher education (74).

  • Provide better screening and advising for prospective online students.  The Welcome page of every online course should include a realistic “self-test” that helps students determine whether they are suited for online classes.  Academic counselors need to advise students about the special demands of online classes.

  • Provide in-person orientation sessions for online students and encourage (perhaps require) them to attend, especially those enrolling in an online course for the first time.  The online instructors in this study strongly endorse this recommendation.

  • Encourage online instructors to make their course outlines and syllabi more concise than in traditional classes and to delineate clearly what they expect of students in online classes.  One of the “dozen recommendations” Bonk and Cummings (1998) make for more effective Web-based learning is to “provide clear expectations and prompt task structuring” (87).  Harasim et al. (1995) also stress the importance of clear expectations, stated in terms of minimal numbers of log-ins per week and/or “messages” posted per week.

  • Encourage instructors to develop a “sense of community” in online classes.  This is another major recommendation of Bonk and Cummings (1998), also strongly advocated by other researchers (Johnson-Lenz & Johnson-Lenz, 1990; Harasim et al., 1995; Rea et al., 2000).

  • Encourage online instructors to monitor student progress closely and make timely efforts to contact those who fall behind, using private email messages or telephone calls.  Most of the online classroom management techniques recommended in Palloff and Pratt (1999) begin with the words, “Make personal contact with the student” to determine the cause of a problem, to encourage participation, to offer coaching, reassurance, or supportive responses (52).

  • Encourage online instructors to arrange informal face-to-face meetings with students, if possible, either individually or as a group.  Both students and faculty in this study commented that an online course can be a “lonely” and impersonal experience.  Gibbs (1998) also observes that online students often feel isolated because they lack “face-to-face interaction with their peers and the instructor” (16).  And Rea et al. (2000) conclude that some students become disenchanted with online learning when they do not “feel involved in the class” and that these students “may need additional support from the instructors which may not be available in the total technology environment” (150).