Developing an Assessment Plan
Owens Community College
Assessment Day Workshops
November 13-14, 2009
Anne Fulkerson, Ph.D.
Institutional Research
Agenda
- What is assessment?
- Why assess?
- Types of assessment
- Basic steps in the assessment planning process
  - Identifying outcomes
  - Defining measures
  - Evaluating measures
- Other important considerations
- OCC assessment plan template & rubric
- Sample assessment plan
- How IR can help

What is Assessment?

“Assessment is the ongoing process of:
- Establishing clear, measurable expected outcomes of student learning.
- Ensuring that students have sufficient opportunities to achieve those outcomes.
- Systematically gathering, analyzing, and interpreting evidence to determine how well student learning matches our expectations.
- Using the resulting information to understand and improve student learning.”

Suskie (2004), p. 3
Why Assess?
- Demonstrate effectiveness
- Improve student learning

The assessment cycle: Establish Learning Outcomes → Provide Learning Opportunities → Measure Learning → Implement Change → (repeat)

Adapted from Suskie (2004) and Maki (2004)
Types of Assessment

Formative: assessment that takes place while learning is in progress (or while a new program is being developed) in order to provide feedback for improvement.

Summative: assessment that takes place at the end of a course or program to document that learning has occurred or to make judgments about the efficacy of a program.
Establish Mission Statement
Identify Student Learning Outcomes
Student Learning Outcomes: specifiable activities, products, skills, abilities, behaviors, attitudes, or pieces of knowledge that students attain as a result of their involvement in a particular set of educational experiences.
- Represent the most important competencies that all students should possess
- Do not reflect ALL learning that might occur
- Often aligned with accrediting body standards
- No magic number, but keep them manageable
Expressing Student Learning Outcomes

Too vague: Students will demonstrate information literacy skills.

Too specific: Students will be able to use institutional online services to retrieve information.

Better: Students will locate information and evaluate it critically for its validity and appropriateness.
Expressing Student Learning Outcomes

Fuzzy terms: know, understand, become aware of, appreciate, think critically, demonstrate knowledge, learn, write proficiently

Action words: remember, identify, perform, create, define, summarize, explain, discuss, describe, solve, find/locate, analyze, evaluate, apply, debate/argue, synthesize, integrate, research, choose, construct, design, develop, organize, use

Thinking Skills: Bloom’s Taxonomy (Revised)
Based on an adaptation of Anderson & Krathwohl (2001)
http://uwf.edu/cutla/assessstudent.cfm
Sample Student Learning Outcomes

Not measurable: Recognize a need for lifelong learning and plan for personal and professional growth.
Measurable: Describe and adopt a plan for ongoing professional development and lifelong learning.

Not measurable: Demonstrate an historical knowledge of the symphonic, string orchestra, and chamber ensemble repertoire.
Measurable: Describe the historical development of the symphonic, string orchestra, and chamber ensemble repertoire.

http://uwf.edu/cutla/assessstudent.cfm
Identify Opportunities for Students to Achieve Each Learning Outcome

[Curriculum map template: a table pairing each learning outcome (1–5) with the course, activity, or experience in which students can achieve it]
Define Measures for Each Outcome

Types of Measures:

Quantitative: numeric (e.g., test scores, placement rates, GPA, structured surveys)

Qualitative: described by words rather than numbers (e.g., interviews, focus groups, observations, rubrics)

Direct: require students to display knowledge or skill (e.g., tests, performances, assignments)

Indirect: second-hand evidence (e.g., surveys, course evaluations, journal reflections)
Word of Caution
- Don’t measure everything just because you can
- Pick the best measures
- Keep it manageable
Determine How Measures Will Be Evaluated
- Criterion-based benchmarks: compare student performance with a pre-established standard.
- Norm-referenced benchmarks: compare student performance with a standardized norm or a group of peers.
- Value-added approach: compares student performance at two points in time to see whether students have improved.
- Longitudinal/historical approach: compares current students with prior students.
Other Things to Consider
- What resources are needed?
- Who’s responsible for collecting and analyzing the data?
- Build support through participation in the planning process
- Assessment plans are perpetual drafts
OCC Assessment Plan Template
https://intranet.owens.edu/committees/outcomes/index.html
Program Mission:
Program Vision:

Program-Level Student Learning Outcomes

For each program student learning outcome (Program SLO #1 through #5), the template provides space to record (with room for multiple numbered entries in each field):
- Where taught?
- Where measured?
- Direct measure(s)
- Indirect measure(s)
- Resources needed
- Person(s) responsible
- Schedule
- Benchmark(s)
OCC Assessment Plan & Report Rubric
https://intranet.owens.edu/committees/outcomes/index.html
Element 1: Program goals and intended student learning outcomes (Appropriateness)
- Developing: Lack of or incomplete inclusion of some established criteria.
- Established: Describes how the goals & SLOs align with College and/or School mission & vision; describes how the goals & SLOs link to accreditation standards, if applicable; describes how the goals & SLOs link to industry and/or academic standards.
- Exemplary: Provides evidence supporting the statements in the “Established” category.

Element 2: Evidence is appropriate for the measurement of student learning (Evidence)
- Developing: Lack of or incomplete inclusion of some established criteria.
- Established: Evidence that the measurement instrument is aligned with goals and/or SLOs; benchmarks in place and evaluated regularly; artifacts, when utilized, are evaluated with rubrics, when appropriate; at least one direct & one indirect measure, as appropriate.
- Exemplary: Normed evidence; provides information on validity & reliability of instrumentation; rubrics are constructed and/or agreed upon by most stakeholders; multiple direct & indirect measures.

Element 3: Individuals responsible for delivering the program engage in systematic analysis of evidence (Analysis of Evidence & Shared Responsibility)
- Developing: Lack of or incomplete inclusion of some established criteria.
- Established: Describes how the data were analyzed; provides evidence that appropriate stakeholders were involved in the analysis of data; describes how the data were used to make decisions about program improvement; describes action steps needed to improve data collection and/or the program; provides a summary of the analysis with references to data.
- Exemplary: Provides evidence that the previous plan of action was implemented; provides evidence that the previous action plan was assessed; provides results of the implementation of the action plan.

Element 4: Results are shared with essential stakeholders (Shared Responsibility)
- Developing: Lack of or incomplete inclusion of some established criteria.
- Established: Evidence that the report was shared with all faculty and advisory committee members, where applicable.
- Exemplary: Evidence that the report was shared with additional stakeholder groups.
Sample Assessment Plans
- Owens: https://www.owens.edu/portrait/index.html
- College of Charleston, SC: http://spinner.cofc.edu/~oap/docs.html
- Supplemental Instruction
How IR Can Help
- Data support and analysis
- Assessment consulting
  - Survey development
  - Rubric development
  - Identifying appropriate metrics
  - Identifying existing data sources
References

Maki, P. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Herndon, VA: Stylus Publishing.

Palomba, C., & Banta, T. W. (1999). Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco, CA: John Wiley & Sons.

Suskie, L. (2004). Assessing Student Learning: A Common Sense Guide. San Francisco, CA: Jossey-Bass.

Owens Community College, Student Learning Assessment Committee (2009). Building Your SLAC Assessment Plan: Instructions for the SLAC Assessment Plan Template, Version #1. intranet.owens.edu/committees/outcomes/index.html

University of West Florida, Center for University Teaching, Learning, & Assessment. http://uwf.edu/cutla/Assessres.cfm