
Guidelines for writing program outcomes

Source: http://www.asu.edu/oue/outcomes.html

  1. A program outcome must flow directly from, and support, the college and division/school/department mission. The connection between the mission and the outcome should be clear.
  2. A program outcome must be directly related to the academic discipline of the program. Focus on program outcomes that reflect the specific knowledge and skills you expect students to acquire as part of their educational experience in the program. Avoid program outcomes that are more closely related to the general education component of a student's education. Writing and critical thinking, for example, are important educational outcomes, but it is unlikely that your program can demonstrate that graduates acquired their writing or critical thinking skills through their coursework in the program. You may, however, incorporate writing and critical thinking into program outcomes directly linked to the academic discipline of the program. Consider these examples:
    • General: Graduates of the Criminal Justice program will be critical thinkers.
    • Program-specific: Graduates of the Criminal Justice program will analyze a current issue in criminal justice, evaluate evidence, and construct an argument.
  3. A program outcome must be observable and measurable. Write outcomes that focus on demonstrable behaviors rather than on what students know, think, understand, or appreciate. These internal states are invisible and cannot be measured directly. It is possible to measure how well a student solves a problem, presents an argument, or gives a dance performance.
    • Not observable: Graduates of the BA program will think critically.
    • Observable: Graduates of the BA program will interpret, analyze, evaluate and construct arguments.
  4. A program outcome must be focused on learning outcomes rather than curricular inputs. Be sure to focus on the knowledge and skills that program graduates should possess. Resist the temptation to write outcomes about curricular inputs, department resources, faculty characteristics, or instructional methods. Program outcomes are related to demonstrated behaviors of the students who graduate — not characteristics of the program or its faculty.
    • Input focused: Program faculty will improve their content knowledge through participation in professional development activities.
    • Input focused: All department labs will be equipped with state-of-the-art instruments.
    • Outcome focused: Graduates of the Art History program will analyze the religious and political influences on 18th century European artists.
  5. A program outcome must communicate a single outcome rather than combine multiple outcomes into a single statement.
    • Multiple outcomes: Graduates of the psychology program will be lifelong learners who understand the concepts of psychology and can apply those concepts to the design and application of real research problems.
    • Single outcome: Graduates of the psychology program will be able to design a research study.


Action Verbs for the Cognitive Domain

  • Knowledge: arrange, define, delineate, duplicate, label, list, match, memorize, name, order, outline, recall, recognize, relate, repeat, reproduce, specify, state, underline
  • Comprehension: classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, summarize, tell, translate
  • Application: apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, show, sketch, solve, use, write
  • Analysis: analyze, appraise, calculate, categorize, classify, compare, contrast, critique, debate, diagram, differentiate, discriminate, distinguish, examine, experiment, inspect, question, relate, test
  • Synthesis: arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, predict, prepare, propose, set up, write
  • Evaluation: appraise, argue, assess, attach, choose, compare, core, defend, estimate, evaluate, judge, measure, predict, rate, revise, score, select, support, value


Action Verbs for the Affective Domain

  • Receiving: ask, choose, describe, follow, give, identify, name, reply, select, use
  • Responding: answer, assist, comply, conform, discuss, help, perform, present, select, tell
  • Valuing: complete, describe, differentiate, explain, follow, initiate, join, justify, read, report, select, share, study, work
  • Organization: adhere, alter, arrange, defend, explain, generalize, identify, integrate, modify, organize, prepare, relate, synthesize
  • Characterization: act, discriminate, display, influence, listen, modify, perform, practice, propose, qualify, question, serve, solve, use


Examples of Evidence of Student Learning

Direct (Clear and Compelling) Evidence of What Students Are Learning

  • Ratings of student skills by field experience supervisors
  • Scores and pass rates on appropriate licensure/certification exams (e.g., Praxis, NLN) or other published tests (e.g., Major Field Tests) that assess key learning outcomes
  • “Capstone” experiences such as research projects, presentations, theses, dissertations, oral defenses, exhibitions, or performances, scored using a rubric
  • Other written work, performances, or presentations, scored using a rubric (C)
  • Portfolios of student work (C)
  • Scores on locally-designed multiple choice and/or essay tests such as final examinations in key courses, qualifying examinations, and comprehensive examinations, accompanied by test “blueprints” describing what the tests assess (C)
  • Score gains between entry and exit on published or local tests or writing samples (C)
  • Employer ratings of employee skills
  • Observations of student behavior (e.g., presentations, group discussions), undertaken systematically and with notes recorded systematically
  • Summaries/analyses of electronic discussion threads (C)
  • “Think-alouds” (C)
  • Classroom response systems (clickers) (C)
  • Knowledge maps (C)
  • Feedback from computer simulated tasks (e.g., information on patterns of actions, decisions, branches) (C)
  • Student reflections on their values, attitudes and beliefs, if developing those are intended outcomes of the course or program (C)
  • For four-year programs, admission rates into graduate programs and graduation rates from those programs
  • Quality/reputation of graduate and four-year programs into which alumni are accepted
  • Placement rates of graduates into appropriate career positions and starting salaries
  • Alumni perceptions of their career responsibilities and satisfaction
  • Student ratings of their knowledge and skills and reflections on what they have learned in the course or program (C)
  • Questions on end-of-course student evaluation forms that ask about the course rather than the instructor (C)
  • Student/alumni satisfaction with their learning, collected through surveys, exit interviews, or focus groups
  • Student participation rates in faculty research, publications, and conference presentations
  • Honors, awards, and scholarships earned by students and alumni

C = evidence suitable for assessing course-level as well as program-level student learning.

Adapted from: Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: Jossey-Bass.


Various Assessment Methods

For each type of data, the table below lists the assessment tools, who or what is analyzed, and what can be assessed.

Self-Reports

  • Assessment tools: classroom assessment techniques; focus groups; interviews; reflective essays; surveys (local or standardized)
  • Who/what analyzed: alumni; employers; off-campus supervisors; enrolled, graduating, and entering students; faculty
  • What can be assessed: perceptions about campus climate, perceived learning, processes, value-added, educational outcomes, attitudes, and values

Achievement Tests

  • Assessment tools: test score analysis; content analysis; scoring rubrics
  • Who/what analyzed: competitions; embedded questions; locally developed exams; oral defenses (thesis); oral exams/recitals; standardized tests
  • What can be assessed: mastery and knowledge of principles and skills; value-added

Observations

  • Assessment tools: case studies; observations
  • Who/what analyzed: campus events (sports, theatre); classes; club meetings; faculty offices; fieldwork sites; student services offices
  • What can be assessed: attitudes; campus climate; interactions; processes; services; student involvement; student learning

Academic Work

  • Assessment tools: content analysis; scoring rubrics
  • Who/what analyzed: capstone course products; homework papers; portfolios; presentations and performances; publications; research reports; term papers and theses; videotapes
  • What can be assessed: mastery and knowledge of principles and skills; values; processes; value-added

Campus Documents

  • Assessment tools: course X program objectives matrix; course assignment X program objectives matrix; content analysis; analysis of forms
  • Who/what analyzed: administrative units; departments; programs; student services offices; course syllabi, etc.; student transcripts
  • What can be assessed: accuracy; cohesion/consistency; efficiency; structure for promoting objectives; processes
Adapted from Program-Based Review and Assessment: Tools and Techniques for Program Improvement.
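The course X program objectives matrix listed under Campus Documents is simple enough to sketch in code. The following is a minimal, hypothetical illustration (the course numbers, outcome names, and coverage values are all invented); it records which courses address which program outcomes and flags any outcome no course covers:

```python
# Hypothetical course x program objectives matrix ("curriculum map").
# 1 = the course addresses the outcome, 0 = it does not.
curriculum_map = {
    "CJ 101": {"analyze_issue": 1, "evaluate_evidence": 0, "construct_argument": 0},
    "CJ 305": {"analyze_issue": 1, "evaluate_evidence": 1, "construct_argument": 0},
    "CJ 480": {"analyze_issue": 0, "evaluate_evidence": 1, "construct_argument": 1},
}

def uncovered_outcomes(cmap):
    """Return program outcomes that no course in the map addresses."""
    outcomes = {o for row in cmap.values() for o in row}
    return sorted(o for o in outcomes
                  if not any(row.get(o, 0) for row in cmap.values()))

print(uncovered_outcomes(curriculum_map))  # every outcome above is covered -> []
```

A review like this can show at a glance whether each stated program outcome is actually taught somewhere in the curriculum.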


Assessment Plan Template

For each intended outcome (a student learning outcome), record the assessment task, the criteria/expected level of achievement, the results of assessment, and the actions taken.

Assessment Task. Consider:

  • what the assessment is
  • who is responsible for implementing it
  • how the assessment will be implemented
  • when the assessment will occur
  • where the assessment will occur

Criteria/Expected Level of Achievement:

  • minimum expected score for “achieving” the outcome (%, fraction, number)
  • an acceptable “success” standard (a satisfactory rating, narrative indicator, or grade on the task, not on the course or test)
  • students to be included in the assessment (census or sample; majors, non-majors)

Results of Assessment:

  • Report the actual results
  • Compare results with original expectations
  • Highlight key findings
  • Develop supportable conclusions from the data

Actions Taken:

  • Report what the faculty, department, or program has done as a result of the findings
  • Identify any changes made
  • Indicate a timeline for reassessment of the outcome
  • Make actions precise and detailed
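Comparing results against the expected level of achievement is a straightforward calculation once the criterion is set in advance. A small sketch, assuming an invented criterion (80% of majors score 3 or higher on a 4-point rubric) and invented scores:

```python
# Hypothetical check of assessment results against an expected level of
# achievement. The rubric scale, threshold, and scores are illustrative only.

def outcome_achieved(scores, passing_score, expected_pass_rate):
    """Return (actual pass rate, whether it meets the expected rate)."""
    passed = sum(1 for s in scores if s >= passing_score)
    actual_rate = passed / len(scores)
    return actual_rate, actual_rate >= expected_pass_rate

# Rubric scores (1-4) for a sample of 10 graduating majors:
rubric_scores = [4, 3, 3, 2, 4, 3, 1, 3, 4, 3]
rate, met = outcome_achieved(rubric_scores, passing_score=3, expected_pass_rate=0.80)
print(f"{rate:.0%} met the criterion; outcome achieved: {met}")
# -> 80% met the criterion; outcome achieved: True
```

Reporting the actual rate alongside the pass/fail judgment supports the “compare results with original expectations” step above.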