Minnesota State University, Mankato

STUDENT LEARNING OUTCOMES ASSESSMENT GUIDELINES

PURPOSE

MSU's mission statement says that we "promote learning through effective undergraduate and graduate teaching . . ." In order to assure ourselves, our students and our public that we are effectively promoting learning, we assess students' mastery of the learning outcomes established by our various programs. Then, in a cycle of continuous program improvement, we examine and refine the curriculum, assignments and learning experiences supporting that learning.

HISTORY

A brief review of the history of assessment processes at Minnesota State University establishes a foundation for understanding the philosophy and guidelines that accompany them. As early as 1992-93, an Assessment Committee was established to begin formulating assessment plans and guidelines. This combined faculty/administration committee prepared and distributed the first Student Learning Outcomes Assessment Guidelines for MSU. By 1995-96 the Program Review and Assessment Sub Meet had been formed and charged with the responsibility of developing an institutional assessment plan and related guidelines, which the Sub Meet distributed to faculty. When the Sub Meet made its 1997-98 call for all departments to submit formal plans for assessment of student learning, suggestions supporting the development of effective plans were once again distributed. Throughout this history, assessment philosophy and guidelines have reflected the work of both the American Association for Higher Education and the North Central Association. Those philosophical principles and assessment guidelines are presented below.

PHILOSOPHY

Assessment of student learning at Minnesota State University is based on the "Principles of Good Practice for Assessing Student Learning" established in 1992 by the American Association for Higher Education (AAHE). These principles assume that assessment of student learning begins with identification of our educational purposes, for example, "to promote learning." Those purposes are further defined by the question, "What do we expect our students to know and be able to do when they have completed their educational programs?" The AAHE principles further suggest that, through assessment, we meet our responsibilities to students and to the public by providing evidence of our effectiveness in promoting learning.

The AAHE Principles also link assessment and the act of learning itself. Effective learning integrates knowledge, occurs on many levels, and is demonstrated in students' performance across time. For this reason, assessment works best when it is ongoing and able to identify changes in student knowledge, skills and attitudes. The kinds of learning experiences provided for students affect their learning; therefore, the involvement of all faculty in the assessment process is important in developing a complete understanding of students' success in learning.

GUIDELINES (Based on current national standards)

STUDENT LEARNING OUTCOMES

  • Faculty determine student learning outcomes for their programs. These outcomes may be shaped by the requirements of accrediting agencies.
  • Student learning outcomes help us meet department, college, and university missions and goals.
  • Student learning outcomes are written in clear, measurable terms.
  • Student learning outcomes address student knowledge, behavior/skills, and attitudes/dispositions.

ASSESSMENT INSTRUMENTS AND METHODS

  • Faculty determine the assessment instruments and methods used in assessing student learning outcomes established for their programs.
  • Different assessment methods and instruments may be appropriate for different programs, reflecting the diversity of our programs and their student learning outcomes.
  • Instruments and methods used are appropriate for the learning outcomes being assessed.
  • Criterion-referenced instruments should be used in assessing student learning. That is, learning is assessed against pre-determined standards or qualities of performance. Exceptions to this guideline may be made if departments wish to consider the results of norm-referenced, standardized examinations such as the GRE, national discipline-based exams, or national/state licensure or certification exams.
  • Student learning outcomes are assessed by a variety of direct and indirect measures chosen by the faculty. Examples of effective direct indicators of learning include pre- and post-testing; capstone courses; oral examinations; internships; portfolio assessments; evaluation of capstone projects, theses or dissertations; standardized national exams; locally developed tests; performance on licensure, certification or professional exams; and juried reviews and performances. Examples of effective indirect indicators of learning might include information gathered from alumni, employers and students; graduation rates; retention and transfer studies; graduate follow-up studies; success of students in subsequent institutional settings; and job placement data. Examples of effective data collection methods include paper-and-pencil testing; essays and writing samples; portfolio collections of student work; exit interviews; surveys; focused group interviews; the use of external evaluators; logs and journals; behavioral observations; and many other research tools. Research methods should be tailored to the type of data to be gathered and the degree of reliability required.
  • Assessment includes measures of "value added", such as pre- and post-tests.
  • Assessment occurs not only at the end, but also throughout the program.
  • Assessment of student learning outcomes does not rely on curriculum review reports, program evaluations, faculty recognition, student/faculty ratios, grades earned in courses, or GPAs; these measures do not directly reflect student learning.

DATA COLLECTION, ANALYSIS AND REPORTING

Effective assessment planning and processes include:

  • Clear indication of who will collect and analyze data.
  • Regular, frequent collection and analysis of data.
  • Clarity about who will receive reports on findings (e.g., faculty, students, the dean, the Assessment and Evaluation Sub Meet, alumni).

There should be annual assessment activity and reporting of findings to the College Dean and the Assessment and Evaluation Sub Meet. Not every outcome has to be assessed every year, but all outcomes must be addressed at least once during the program review cycle in order to provide data for that review. Additionally, at least one outcome from every academic degree program should be assessed each year.

Assessment focuses on the enhancement of student learning through the continual improvement of the educational process.

Feedback loops are essential in providing faculty with information useful for improving instruction and learning. Assessment is not an end in itself. Feedback from analysis of assessment data leads to conclusions about current program effectiveness. This, in turn, leads to recommendations and plans to improve the program. Effective feedback loops are characterized by clarity at this stage: what will happen or change to improve student learning as a result of the assessment findings. These changes are then implemented and evaluated within the assessment cycle.

PROCESS

  • Faculty design and implement the assessment program and use it to find ways to improve the education they provide.
  • Authority and responsibility for the design and operation of assessment is shared throughout the faculty and administration.
  • Strong campus-wide assessment committees and programs, including the Assessment and Evaluation Sub Meet and Confer and college and department assessment committees, support the assessment effort.
  • Assessment of student learning outcomes is linked to program review to encourage continual program improvement.
  • Students should understand the purposes of assessment. Departments and colleges at MSU are encouraged to provide information about assessment in a variety of student-oriented publications and to include students on assessment committees. Program-related student learning outcomes and related assessments should also be included in course syllabi.