Indirect measures are assessments that gauge opinions or perceptions: students' or alumni's views of their own knowledge, skills, attitudes, and learning experiences; their perceptions of services received; or employers' opinions. While these measures are important and necessary, they do not assess students' performance directly. Examples of indirect indicators of learning include information gathered from alumni, employers, and students; graduation rates; retention and transfer studies; graduate follow-up studies; success of students in subsequent institutional settings; and job placement data.
Direct measures are assessments that measure student learning outcomes directly; some call them authentic measures. Examples of direct indicators of learning include pre- and post-testing, evaluation of projects, standardized national inventories, locally developed inventories, and student portfolios.
| Tool | Advantages | Disadvantages |
|---|---|---|
| *Surveys and interviews* (measure student opinions about the importance of knowledge/skills and attitude changes) | | |
| Alumni surveys and/or meetings | Inexpensive; acknowledges importance of student (or alumni) opinions | Not a direct measure of learning; difficult to develop valid instruments |
| Senior exit surveys and/or interviews | Allows individualization and follow-up probes; may develop positive interactions with students | May be intimidating, biasing results |
| Focus group interviews | As for interviews, but allows more students to be "interviewed" in less time | A few students can skew the results if not carefully performed |
| *Employer and parent measures* (measure employer or parent satisfaction) | | |
| Employer survey and/or interview; parent survey and/or interview | | |
| Commercial standardized tests and inventories | Low time investment | May not match specific program goals |
| Locally developed pre/post tests and inventories | Matches local goals; development and scoring processes are informative | Difficult to develop valid instruments; time consuming to develop |
Table Source: Central Michigan University. (2002). Tools for Assessing Student Learning Outcomes.
See also Minnesota State University, Mankato Academic Affairs (2002) for more information on the use of portfolios and surveys.
Qualitative methods seek to gather a "detailed description of the situations, events, people, interactions, and observed behaviors; the use of direct quotations from people about their experiences, attitudes, beliefs, and thoughts; and the analysis of excerpts or entire passages from documents, correspondence, records and case histories" (Patton, 1990, as quoted in Upcraft and Schuh, 1996, p. 21). Common techniques within qualitative research include interviews (both group and individual), document analysis, and field observations of people (Schuh and Upcraft, 2001).
Quantitative methods are distinguished by their emphasis on numbers, measurement, experimental design, and statistical analysis. Often the emphasis is on analyzing a large number of cases using carefully constructed instruments that have been evaluated for their reliability and validity (Patton, 1990). "Common techniques include questionnaires, structured interviews, and tests" (Palomba and Banta, 1999). Findings from quantitative methods are often reported using descriptive statistics such as the mean, standard deviation, and correlation.
**A Brief Comparison of Qualitative and Quantitative Methods**

| | Qualitative | Quantitative |
|---|---|---|
| Focus of research | Quality (nature, essence) | Quantity (how much, how many) |
| Key concepts | Meaning, understanding, description | Statistical relationships, prediction, control, description, hypothesis testing |
| Sampling | Nonrepresentative, small, purposeful | Large, random, representative, stratified |
| Data | Field notes, people's own words | Measures, counts, numbers |
| Methods | Observations, interviews, reviewing documents | Experiments, surveys, instruments |
| Instruments | Researcher, tape recorder, camera, computer | Inventories, questionnaires |
| Data analysis | Ongoing, inductive (by researcher) | Deductive (by statistical analysis) |
| Findings | Comprehensive, holistic, richly descriptive | Precise, numerical |
| Advantages | Flexibility, emphasis on understanding large groups, hard-to-explain anomalies | Controlling intervening variables, oversimplification |
Sources: Bogdan and Biklen, 1992; Worthen and Sanders, 1987; Gall, Borg, and Gall, 1996; and Merriam, 1998.
Bogdan, R.C., and Biklen, S.K. Qualitative Research for Education. (2nd ed.) Needham Heights, Mass.: Allyn & Bacon, 1992.
Gall, M.D., Borg, W.R., and Gall, J.P. Educational Research: An Introduction. (6th ed.) New York: Longman, 1996.
Merriam, S. B. Qualitative Research and Case Study Applications in Education. San Francisco: Jossey-Bass, 1998.
Palomba, C. A., and Banta, T. W. Assessment Essentials. San Francisco: Jossey-Bass, 1999.
Patton, M. Q. Qualitative Research and Evaluation Methods. Thousand Oaks, Calif.: Sage, 1990.
Schuh, J. H., Upcraft, M. L., and Associates. Assessment Practice in Student Affairs: An Applications Manual. San Francisco: Jossey-Bass, 2001.
Upcraft, M. L., and Schuh, J. H. Assessment in Student Affairs: A Guide for Practitioners. San Francisco: Jossey-Bass, 1996.
Worthen, B. R., and Sanders, J. R. Educational Evaluation: Alternative Approaches and Practical Guidelines. New York: Longman, 1987.