Minnesota State University, Mankato
Undergraduate Research Center

Page address: http://www.mnsu.edu/urc/journal/2001/2001_students_msu_web.html
Student(s): Kristin Ruder
Mentor: Roland Nord
Department: English
Abstract

As part of ongoing usability studies of the MSU Web site, survey data was collected from convenience samples of MSU students to evaluate their self-assessed expertise using the Internet as well as their use of and satisfaction with the MSU Web site.

Review of literature

In fall 1999, Minnesota State University President Rush created the MSU Web Task Force and charged it to develop a new look for the University's Web site, one "reflective of Minnesota's other great public university" (1999). As a member of the Task Force, the senior author argued that efforts should be made to improve the usability of the site as well as its look and feel.

Consequently, the plan to improve the Web site required two major steps: 1) user/task analysis to determine the needs of the users of the site and 2) comparison of the success of users at completing tasks on the old and the revised sites. Stated in slightly different terms, development of the new Web pages corresponds to development of an interface: not development of a new interface, but rather the evolution of an existing one; only some pages would change and only some users would change. A number of authors (Hackos and Redish 1998; Schriver 1997) have treated Web development as part of interface development.

User/task analysis

Hackos and Redish (1998) begin the interface-development life cycle with user and task analysis; they argue that user and task analysis should parallel the development of corporate objectives (long-range goals, market decisions, and feasibility decisions) within system development. Consequently, we focused on two questions: 1) Who are the users of the MSU Web site? and 2) What are they using the Web site to accomplish?

Basing their analyses on the work of Dreyfus and Dreyfus (1986) and Ehn (1989; 1993), a number of researchers (Hackos and Stevens 1997; Barker 1998; Hackos and Redish 1998) have categorized users by stage of use.

Table 1: Classifications of users by stage of use

Hackos and Stevens (1997)   Barker (1998)   Hackos and Redish (1998)
Novices                     Novices         Novices
Advanced beginners          --              Advanced beginners
Competent performers        Experienced     Competent performers
Proficient performers       --              --
Expert performers           Experts         Experts

The argument is that if designers understand their audiences (namely, their stage[s] of use), they should be better able to design an interface (Web site) to meet the users' immediate needs and to help users learn the interface in order to meet their future needs.

Barker's classifications are based upon four criteria: 1) users' familiarity with software (the number of programs with which they are familiar), 2) users' technical knowledge, 3) users' attitude and ability to see relationships between the software they use and their work (the tasks they use the software to accomplish), and 4) users' learning behavior (the manner in which they learn to use software).

Hackos and Redish's classifications also appear to be based upon four criteria: 1) users' attitudes toward software and work, 2) users' needs to accomplish work, 3) users' learning and problem-solving styles, and 4) users' conceptual understanding of the interface.

University Web sites have, of course, broad audiences, consisting of employees (faculty and staff) and students, who use the site frequently, as well as visitors (prospective students, parents, alumni, and researchers), who use the site infrequently. In this manner, the university Web site combines features of both an extranet and an intranet. This causes difficulties because, as Nielsen (2000) points out, extranets are typically customer centered while intranets are employee centered. Furthermore, intranets tend to be the larger of the two and are frequently designed to reflect organizational structure. Nielsen cautions against using organizational structure to design Web sites for non-employees, citing results from usability testing that showed "the success rate was 80 percent when people used the navigation scheme structured according to most users' mental model and only 9 percent when using the navigation scheme structured according to the company's internal thinking" (202).
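
To make Nielsen's distinction concrete, the sketch below contrasts a navigation tree organized around an institution's org chart with one organized around user tasks. It is only an illustration of the idea under discussion; the Python structures and all of the labels are hypothetical and are not drawn from the actual MSU site.

    # Hypothetical sketch: two ways of structuring the same site navigation.
    # None of the labels below are taken from the MSU Web site.

    # Navigation that mirrors the organizational chart
    # (Nielsen's "company's internal thinking").
    org_based = {
        "Academic Affairs": ["Registrar", "Colleges", "Library"],
        "Student Affairs": ["Housing", "Financial Aid"],
        "Administration": ["Parking Services", "Help Desk"],
    }

    # Navigation structured around user tasks
    # (closer to most users' mental model).
    task_based = {
        "Apply and enroll": ["Apply for admission", "Register for classes"],
        "Pay for school": ["Financial aid", "Tuition and billing"],
        "Student life": ["Find a course page", "Check e-mail", "Find parking"],
    }

    # A prospective student who wants to "Register for classes" must first guess
    # the owning office in the first scheme, but can follow the task name
    # directly in the second.
    for label, tree in (("organization-based", org_based), ("task-based", task_based)):
        print(label, "->", list(tree.keys()))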

Document design

Researchers have found low success rates in usability tests when asking subjects to begin on the home page and then navigate to locate facts or complete an assigned task. Nielsen (2000, p. 164) reports that users in a study conducted by Spool were able to locate correct pages only 42 percent of the time, and in a study conducted by Nielsen and Hurst only 26 percent of the users were able to complete an online application. Unfortunately, Nielsen doesn't report the rate of false positives, that is, the number (or percent) of subjects who believed incorrectly that they had found the correct answer, as opposed to those who ran out of time or gave up on the task.

In a study testing the usability of manuals for electronic products, Schriver found that roughly 60 percent of users blamed themselves for errors they made with consumer electronics (1997, p. 216). Furthermore, in approximately two-thirds of the cases when users blamed themselves for difficulties they were having with a product, the problem originated with the product (the interface) or the manual or both, not with the user.

Spool (1999) notes that when users are asked to choose between two software packages, they typically prefer the package that provides the best usability; however, he found that when subjects are asked to choose between two Web sites (to pick the site that they liked best), their choice is frequently not linked to the usability of the site. That is, subjects often preferred Web sites that they were not able to use but which they found interesting or attractive.

 

Marketing

Poock and Lefond review the literature on the development of academic Web sites, concluding that "little is known about the use of the Web in higher education, particularly by college-bound high school students" (2001, 15). They note that market research has focused on e-business, though they cite studies (Abrahamson 2000; Strauss 1998) indicating that college-bound students increasingly use the Web in their college search process and cite writers (Middleton, McConnell, and Davidson 1999; Strauss 1998) arguing for increased resources for university Web development, especially for development of Web resources that promote and market the institution.

Poock and Lefond attempt to identify elements of college Web sites that increase or inhibit browsing or submission of applications by college-bound students. They identify eight characteristics, listed in order of importance: 1) content (admission and environmental), 2) organization/architecture, 3) ease of navigation, 4) speed of connection, 5) user enjoyment, 6) audience focus, 7) distinctiveness, and 8) graphics.

Purpose of this particular study

A number of studies were developed in order to conduct user and task analysis and to assess usability of the MSU Web site. Goals for this particular study were:

·         to identify user demographics and experience with the MSU site, including

- users' self-assessment of expertise using the Internet

- users' experience with the MSU site

- users' satisfaction with the MSU site

·         to identify user aesthetic satisfaction with the MSU site

·         to identify user navigation (usability) of the MSU site.

Results from the questionnaires would provide the Task Force with a better sense of (portions of) the audience (users) of the MSU Web site.

Indeed, discussion by Task Force members typically focused on tasks they performed, not tasks their users performed; the design firm that redesigned the MSU home page (spring 2000) did not conduct user/task analysis. A number of researchers, of course, have pointed to the paucity of research in university Web design (Poock and Lefond 2001; Middleton, McConnell, and Davidson 1999).

Even if such research enables designers to classify audiences/users, Schriver (1997, 154-64), in a review of the literature on audience analysis, questions the value of audience analysis by classification as a heuristic to guide document development. Citing her own research (1997, 455) as well as the work of others (Benson 1994; Lentz and Pander Maat 1992; Lentz and De Jong 1995), she concludes that experts are unlikely to predict more than half of the problems that readers will experience, and perhaps they will predict substantially fewer. Consequently, she argues for the importance of user feedback (usability testing) in document design.

Surveys

Surveys were designed and distributed during the fall semesters of 1999 and 2000. The survey asked students to provide demographic information, to self-assess their Internet expertise and their success using the MSU Web site, and to report their use of specific sites within the MSU Web site.

Two questionnaire forms were used, one to survey freshmen and one to survey juniors or seniors (Appendices A and B). The forms differed in one demographic question (class vs. age) and in the majority of the sites listed in the final section.

Participants

Because the MSU Web site was redesigned from 1999 to 2000, desired participants for the 2000 survey included 1) new entering students enrolled in First Year Experience (FYEX 100) or Composition (Eng 101) courses, who would be familiar only with the current site, and 2) undergraduate students enrolled in Technical Communication (Eng 271) courses, who would be familiar with both the former and the current sites.

Results

Demographics of Web users (2000 survey results)

The majority of the participants (77%) were between 18 and 22 years old. The remaining 23% of participants fell below 18 years (16%), between 23 and 30 years (5%), and above 30 years (2%).

The majority of the participants (54%) were male. In FYE 100 and Eng 101 classes, however, 57% of students were female, compared to 25% in Eng 271, which typically has a lower percentage of female enrollments.

The majority of the participants (63%) were freshmen, 17% were juniors, 11% were sophomores, and 7% were seniors. Juniors made up the largest group in Eng 271 (46%).

The majority of the participants (73%) owned a computer. Fewer students in FYE 100 (66%) and Eng 101 (70%) classes owned a computer than students in Eng 271 (85%) did.

The majority of the participants (83%) had a non-MSU email account. FYE 100 students reported the highest percentage (93%), compared to Eng 101 (70%) and Eng 271 (85%).

The majority of the participants (64%) used Internet Explorer as their primary browser, 28% used Netscape, 5% used another browser, and none used Lynx. FYE 100 students reported the highest percentage of IE use (73%), compared to Eng 101 (57%) and Eng 271 (61%).

The majority of the participants (56%) claimed ‘competent’ as their level of expertise using the Internet, 29% claimed ‘advanced’, 8% claimed ‘expert’, and 7% claimed ‘novice’. As shown in Table 2, both FYE 100 and Eng 101 students claimed less expertise (students claimed ‘competent’ more often than they claimed ‘advanced’ or ‘expert’) than students in Eng 271 did.

The largest group of participants (42%) used the MSU Web site 2-3 times per week (FYE 100, 38%; Eng 101, 44%; Eng 271, 45%). Students in FYE 100 (50% at 0-1 times per week) accessed the site less frequently than students in Eng 101 (37% at 0-1 times per week) and Eng 271 (28% at 0-1 times per week) did.

The majority of the participants (55%) rated the MSU site design ‘good’, 35% rated it ‘fair’, 5% ‘excellent’, and 5% ‘poor’. Both FYE 100 (35%) and Eng 101 (34%) students tended to rate the design lower (chose ‘fair’ or ‘poor’ more often than they chose ‘good’ or ‘excellent’) than students in Eng 271 (50%) did.

The majority of the participants (66%) were usually successful when using the MSU Web site. As shown in Table 3, more students (24%) were occasionally successful than were always successful (8%).

As shown in Table 4, the majority of the participants (89%) were successful with their first use of the MSU site. 

As shown in Table 5, the majority of the participants (52%) first accessed the site for a specific task, 43% for a general task, and 4% gave no response.

As shown in Table 6, the largest group of participants (37%) first used the site before applying, 23% before starting classes, 28% during orientation, and 11% after starting classes.
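
As a rough illustration of how percentages like those reported above could be tallied from the raw questionnaire responses, a minimal sketch follows. The file name, column names, and response labels are hypothetical; the study does not describe how its data were stored or processed.

    # Hypothetical sketch of tallying survey responses into percentages.
    # The file name, column names, and category labels are invented; the study
    # does not describe its data files or tooling.
    import csv
    from collections import Counter

    def percent_breakdown(rows, field):
        """Return {response value: percent of non-blank answers} for one question."""
        counts = Counter(row[field] for row in rows if row.get(field))
        total = sum(counts.values())
        return {value: round(100 * n / total) for value, n in counts.items()}

    with open("msu_web_survey_2000.csv", newline="") as f:  # hypothetical file
        rows = list(csv.DictReader(f))

    print(percent_breakdown(rows, "expertise"))      # e.g., {'competent': 56, 'advanced': 29, ...}
    print(percent_breakdown(rows, "owns_computer"))  # e.g., {'yes': 73, 'no': 27}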


Table 2: Self-assessed level of expertise using the Internet

Response      FYE 100 (%)  Eng 101 (%)  Eng 271 (%)  Mean (%)
Novice        9            9            3            7
Competent     64           65           40           56
Advanced      22           21           45           29
Expert        4            6            13           7

Table 3: Reported success using the MSU Web site

Response      FYE 100 (%)  Eng 101 (%)  Eng 271 (%)  Mean (%)
Rarely        3            1            1            2
Occasionally  32           18           21           25
Usually       57           74           68           67
Always        8            6            10           8

Table 4: Success with first use of the MSU site

Response      FYE 100 (%)  Eng 101 (%)  Eng 271 (%)  Mean (%)
Yes           90           92           85           89
No            10           8            15           11

Table 5: Purpose of first access to the MSU site

Response      FYE 100 (%)  Eng 101 (%)  Eng 271 (%)  Mean (%)
No response   2            6            5            4
General task  51           37           42           43
Specific task 46           57           54           52

Table 6: When participants first used the MSU site

Response                 FYE 100 (%)  Eng 101 (%)  Eng 271 (%)  Mean (%)
Before applying          46           33           33           37
Before starting classes  16           21           32           23
During orientation       20           30           35           28
After starting classes   17           16           1            11
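
The Mean (%) column in Tables 2 through 6 appears to average the three course groups, presumably weighted by the number of respondents in each; those counts are not reported here. As a rough illustration only, the sketch below computes both an unweighted and a weighted mean for one row of Table 3, using hypothetical sample sizes.

    # Illustrative only: how the "Mean (%)" column could be derived from the
    # per-course percentages. The respondent counts per course are not reported
    # in this study, so the counts below are hypothetical placeholders.

    course_pct = {"FYE 100": 57, "Eng 101": 74, "Eng 271": 68}  # 'Usually' row of Table 3
    course_n = {"FYE 100": 100, "Eng 101": 90, "Eng 271": 60}   # hypothetical sample sizes

    unweighted = sum(course_pct.values()) / len(course_pct)
    weighted = sum(course_pct[c] * course_n[c] for c in course_pct) / sum(course_n.values())

    print(f"unweighted mean: {unweighted:.0f}%")  # 66% for this row
    print(f"weighted mean: {weighted:.0f}%")      # depends on the assumed counts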

 

Extent of use of the MSU Web site (2000 survey results)

 

The top 5 most attempted tasks were also among the top 6 most successful tasks (each task had a success rate of over 90%). The bottom 5 most attempted tasks averaged a 70% success rate.

Top 5 uses (of all participants)

1. Accessed a college or department page (78%)

2. Checked userID (67%)

3. Used the Registrar's page (61%)

4. Used WebPALS (59%)

5. Used the Library page (56%)

Bottom 5 uses (of all participants)

16. Used the Campus Services page (21%)

17. Used the Parking page (19%)

18. Used the Hub (17%)

19. Contacted the Computer Help Desk (13%)

20. Checked MSU's mission statement (12%)

Extent of use of the MSU Web site (1999 survey results)

The top 5 most attempted tasks were also among the top 6 most successful tasks (each task had a success rate of over 85%). The bottom 5 most attempted tasks averaged a 75% success rate.

Top 5 uses (of all participants)

1. Checked userID (70%)

2. Accessed a college or department page (66%)

3. Used WebPALS (60%)

4. Located course materials (49%)

5. Used the Registrar's page (45%)

Bottom 5 uses (of all participants)

16. Used the Campus Services page (19%)

17. Used the Parking page (17%)

18. Used the Student Handbook (17%)

19. Contacted the Computer Help Desk (17%)

20. Checked MSU's mission statement (11%)

Comparisons by gender (2000 survey results)

The following list notes three notable differences between the male and female responses.

·         Females tended to report a higher success rate when using the MSU Web site.

·         Females claimed less expertise than males did.

·         Females tended to rate the design of the site lower than males did.

 

Discussion

Although a formal statistical analysis of the data was not conducted, the data suggests several leads for further research.

MSU Web site redesign

During late fall 1999, MSU dissolved the Web Task Force; then, in spring 2000, MSU contracted with a graphic designer to redesign the MSU home page and first tier of pages linked to the home page. The new pages were launched at the end of spring semester 2000.

The following observations can be noted when comparing results from the 1999 survey (based on the old Web site) and the 2000 survey (based on the redesigned MSU Web site):

·         No change in students' self-assessed levels of expertise

·         No change in students' frequency of use of the MSU site (or down slightly)

·         No change in students' ranking of the site design

·         No change in students' reported success rate

Students in the 2000 survey did, however, report an increase in their use of the site's search engine (compared to students in the 1999 survey). Furthermore, there is considerable evidence about the importance of the navigation and search engine to prospective and current students' success rate and satisfaction using the MSU Web site:

·         Poock and Lefond (2001) found that college-bound students increasingly use the Web in their college search process. In the MSU study, 40% of students first accessed the site before applying; 83% before beginning classes.

·         Again, according to Poock and Lefond (2001), organization/architecture and ease of navigation were among the top three elements of college Web sites that increase or inhibit browsing or submission of applications by college-bound students. In the MSU study, 89% of students reported that their first use was successful (Table 4).

·         It appears that students are becoming more inclined to search (30% attempted to conduct a search in 1999; 43% in 2000). However, of the most frequently attempted tasks, conducting a search on the MSU Web site was the least successful (combined, 37% of the participants attempted the task; 71% were successful).


In other words, more students appeared to be using search to navigate MSU's site, and search appeared to be the least successful strategy; still, students reported approximately the same level of satisfaction with the MSU site. (A rough sketch of the arithmetic behind the combined search figures follows below.)
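
Because the per-year success rates for the search task and the sample sizes of the two surveys are not reported, the following sketch only illustrates how the combined figures could be computed; the values marked as assumed in the comments are placeholders, not figures from the study.

    # Illustrative arithmetic for combining the two survey years.
    # Sample sizes and per-year success rates for the search task are not
    # reported, so the values marked "assumed" are placeholders.

    surveys = {
        1999: {"n": 200, "attempt_rate": 0.30, "success_rate": 0.65},  # n, success assumed
        2000: {"n": 200, "attempt_rate": 0.43, "success_rate": 0.75},  # n, success assumed
    }

    attempters = {year: s["n"] * s["attempt_rate"] for year, s in surveys.items()}
    combined_attempt = sum(attempters.values()) / sum(s["n"] for s in surveys.values())
    combined_success = (
        sum(attempters[year] * surveys[year]["success_rate"] for year in surveys)
        / sum(attempters.values())
    )

    print(f"combined attempt rate: {combined_attempt:.1%}")  # 36.5% with equal sample sizes
    print(f"combined success rate: {combined_success:.1%}")  # depends on the assumed rates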

Interestingly, in fall 2001, MSU implemented a new search engine. Initial usability testing (Nord, unpublished) indicated that use of the MSU search function appeared to vary inversely with length of time at MSU; in other words, new entering students were much more likely to use search than juniors or seniors, perhaps because juniors and seniors had established a better sense of where to locate information on the MSU Web site or because they had a history of unproductive use of the MSU search engine. Debriefings following usability testing supported the latter hypothesis: juniors and seniors had a history of unproductive searches; consequently, they were slow to use a (new) MSU search engine to solve their tasks.

User self-assessment

Students in the survey self-reported their level of expertise using the Web and their level of satisfaction with the MSU Web site:

·         Participants claimed ‘novice’ to ‘competent’ expertise using the Internet more often than they claimed ‘advanced’ to ‘expert’ (Table 2).

·         Even though female participants claimed less expertise than males did, they tended to report a higher success rate when using the MSU Web site.

·         The majority of the participants (66%) were ‘usually’ successful when using the MSU Web site (Table 3); more students (24%) were ‘occasionally’ successful than were ‘always’ successful (8%).

 

·         A clear majority of the participants (75%) reported that they were usually (or always) successful or satisfied using the MSU Web site.


 

Results from usability tests of the MSU Web site (Nord 1999) suggest that students in the survey may have exaggerated their level of expertise in using the Web and their level of success in using the MSU Web site. The 29 participants (8 freshmen, 9 juniors or seniors, and 12 faculty or staff) in the usability testing self-reported slightly lower levels of expertise (7% novice, 67% competent, 20% advanced, and 3% expert) than did the survey group (7% novice, 56% competent, 29% advanced, and 7% expert). Interestingly, faculty and staff, the participants with the most access to computers and the most familiarity with the organization of the MSU Web site, claimed the lowest level of expertise using the Internet; juniors and seniors (taking Technical Communication [Eng 271]) claimed the highest expertise.

Participants in the usability tests were asked to complete 8 tasks using the MSU Web site; faculty or staff completed 66% of the tasks (mean time 2:34), juniors or seniors completed 59% of the tasks (mean time 3:20), and freshmen completed 46% of the tasks (mean time 3:38). Faculty or staff, the group claiming the least expertise, were the most successful in using the MSU Web site to complete the assigned tasks.

These results support Nielsen's (2000) claim that Web sites structured similarly to the organization itself are difficult for nonmembers (or new members) of the organization to use. Faculty or staff, the participants most familiar with the organization of a university, were more successful at using the MSU Web site than students were.

But these results appear to contradict the survey results, where 75% or more of the 1999 survey participants and of the 2000 survey participants claimed to be usually or always successful when using the MSU Web site. Two explanations seem likely: 1) students are not likely to use the Web to complete difficult tasks or all tasks; in other words, students use the Web when they think it is most expeditious to do so. For more difficult tasks, they may use print publications, email, the phone, or inquire in person. 2) Students may overestimate their success because they do not want to call attention to (what they may view as their) failures (not the failures of the Web site). This reasoning seems a logical extension of Schriver's blame study (discussed in the literature review). Anecdotally, it was a rare participant in the usability study who did not blame himself or herself for not being able to complete a task or who did not want to know what the right answer was. Indeed, in the 1999 survey, new entering freshmen who claimed to be novice or competent users did, on average, attempt fewer tasks and report lower levels of success than did the advanced or expert users.

Site design and user satisfaction

This study (along with the 1999 survey and usability study) appears to support Spool's (1999) claim that users' satisfaction with a site is not directly tied to the usability of the site.

In the 1999 survey and usability testing, new entering freshmen (those students who claimed the least success and who displayed the least success in using the MSU Web site) rated the site design higher than the junior or senior students and the faculty or staff who claimed and displayed more success in using the MSU Web site; in fact, for the three groups, the better the group's performance using the site, the lower that group rated the site's design. This is not to suggest that the relationship is one of cause and effect; a simpler explanation might be that familiarity breeds contempt: those students, faculty, and staff most familiar with the MSU site were bored with the site, so they rated the design less favorably.

As mentioned earlier, the MSU home page and first-tier pages were redesigned in spring 2000. The redesign was primarily cosmetic and in all likelihood affected usability little, introducing an additional keystroke for all tasks other than searches.

In the 2000 survey, the student participants overall rated the design of the MSU site similarly to the students in the 1999 survey, except that this time the junior or senior students appear to have rated the design of the site more highly than did the new entering freshmen. (Perhaps this is an example of a cyber-Hawthorne effect for the junior and senior students.)

Given the increasing importance of Web sites as marketing tools for universities, the results of the 2000 survey may point to a worst-case scenario: prospective students who are surfing the Web to compare universities appear to be the group most vulnerable to usability problems and the group least satisfied with the design of the MSU Web site.

Further research is necessary to determine the effect of Web site design and usability on prospective students when they are considering and applying to universities.



Works Cited

Barker, T. (1998). Writing software documentation: A task-oriented approach. Boston: Allyn and Bacon.

Dreyfus, H., & Dreyfus, S. (1986). Mind over machine. New York: Macmillan.

Ehn, P. (1993). Scandinavian design: On participation and skill. In D. Schuler & A. Namioka  (Eds.), Participatory design: Principles and practices (pp. 41-77). Hillsdale, NJ: Lawrence Erlbaum Associates.

Ehn, P. (1989). Work-oriented design of computer artifacts. Hillsdale, NJ: Lawrence Erlbaum Associates.

Gullikson, S. (1999). The impact of information architecture on academic Web site usability. The Electronic Library, 293-304.

Hackos, J. & Redish, J. (1998). User and task analysis for interface design. New York: John Wiley & Sons.

Hackos, J., & Stevens, D. (1997). Standards for online communication: Publishing information for the Internet/World Wide Web/help systems/corporate intranets. New York: John Wiley & Sons.

Hartman, K. (1997). The Internet & college admissions: Implications and opportunities. Change, 30 (2), 32-38.

Hossler, D. (1998). Using the Internet in college admissions: Strategic choices. Journal of College Admissions, 162 (Winter), 12-19.

Middleton, I., McConnell, M., & Davidson, G. (1999). Presenting a model for the structure and content of a university World Wide Web site. Journal of Information Sciences, 25 (3), 219-27.

Nielsen, J. (2000). Designing web usability. Indianapolis, IN: New Riders.

Nord, R. (1999). MSU Web site observer's notebook. Roland Nord [online]. Available: http://krypton.mnsu.edu/~nord/wtf/UT.htm

Palmer, J. & Griffith, D. (1998). An emerging model of Web site design for marketing. Communications of the ACM, 41 (3), 45-51.

Poock, M. & Lefond, D. (2001). How college-bound prospects perceive university Web sites: Findings, implications, and turning browsers into applicants. C&U Journal, (summer), 15-21.

Schriver, K. (1997). Dynamics in document design: Creating texts for readers. New York: John Wiley & Sons.

Spool, J., Scanlon, T., Snyder, C., Schroeder, W., & DeAngelo, T. (1999). Web site usability: A designer's guide. San Francisco: Morgan Kaufmann.

Stoner, M. (1998). The Internet & admissions: Bonding for the future. On Target [online]. Available: http://cbweb1.collegeboard.org/aes/ontarget/ontarg15/html/ontar15.html

Strauss, D. (1998). The use of the World Wide Web as a source of information during the search and choice stages of the college selection process. Unpublished manuscript, Ohio State University.

Wright, S. (1997). More high school seniors using Internet to select colleges. Black Issues in Higher Education, 14 (May 29), 25.