Mapping Student Learning Outcomes

Report
Where We Are in the Cycle
Types of Assessment Measures and Associated Assessment Instruments
Summative & Formative
Qualitative vs. Quantitative
Direct & Indirect
Note: Direct assessment methods are preferred over indirect.
Applied Experiences Assessment
Learning environments that provide in-the-field, hands-on learning and training experiences
can provide valuable assessment information. Applied experiences integrate what the student
has learned in their academic studies with “real life” situations and further develop the student
as a professional, citizen, and lifelong learner.
Examples of Applied Experiences include:
• Practicum
• Service Learning
• Internship
• Experiential Learning
• Field Experience
• Student-teaching
• Co-op
Applied Experiences Assessment Instruments:
• Journal
• Videotape of student skills
• Progress Reports
• Portfolio
• Midterm Evaluation
• Cumulative Report
• Reflective Papers
• Performance evaluation by mentor
• Student evaluation of internship experience
• Final project/creative project/presentation
Newer Wave of Assessment Instruments
Rubrics
Learning contracts
Observations with documentation
Reflective journals
Reflective conversations/writings
Case studies
Student interviews
Videotaping
Rubrics
• Several sample rubrics have been put into the Sample Rubric folder on the
SharePoint site. These are examples only, and they do not match the format of the
Core Curriculum rubric used by our academic counterparts.
• The CAS Domains and Dimensions rubric has been put into your Assessment Tools
folder. This rubric was modified to match the format of the Core Curriculum and
has been used by other institutions as a way to map learning outcomes.
• Additionally, the CampusLabs presentation “Rubrics 101: A Tool to Assess
Learning” has been put into your Assessment Tools folder. This presentation
provides valuable information on the use of rubrics as a tool to measure learning
directly and objectively.
Rubrics
The Preferred Format
Guidelines for Selecting an Assessment
Method and Instrument
• Select a method that is appropriate for your goals and objectives, i.e., the
method that will provide the most useful and relevant information.
• Not all methods work for all areas or are appropriate to all outcomes.
• Use the information you already have available.
• Choose an assessment method that allows you to assess both the strengths and
weaknesses of the program. Effective methods of assessment provide both
positive and negative feedback; finding out what is working well is only one
goal of your assessment work.
• Remember, the data you collect must have meaning and value to those who
will be asked to make changes based on the findings.
• Use multiple methods to assess each learning outcome. Many outcomes will be
difficult to assess using only one measure. The advantages of selecting more
than one method include:
1. Multiple measures can assess different components of a complex task.
2. There is no need to design a complicated all-purpose method.
3. Greater accuracy and authority are achieved when several methods of assessment
produce similar findings.
4. Contradictory findings provide an opportunity to pursue further inquiry.
Alumni surveys
Culminating assignments
Content analysis
Course-embedded assessment
Curriculum analysis
Delphi technique
ePortfolio
Employer surveys
Focus groups
Institutional data
Matrices
Observation
Performance assessment
Portfolio evaluations
Pre/post evaluation
Quasi-experiments
Reflective essays
Rubrics
Standardized test instrument
Student self-efficacy
Surveys
Syllabus analysis
Transcript analysis
• You give course grades
• The grades measure the specific outcome
• The courses are consistent over time
Suggested for ROTCs
• Your observers are experts
• The setting is controlled
• The outcome is Ethical Reasoning
Suggested for CAPS & ODoS Counseling
• Diverse people will do the assessments
• You want to enforce consistency
• You need to document the criteria
• You want to measure cumulative effects
• Your students will write the essays
• The outcome can be expressed in an individual’s own words
• The outcome is Written Communication
Suggested for OSRR, DRC & HORIZONS
• The outcome is best measured in the eyes of others
• The outcome is Leadership and Teamwork
• You have the opportunity to gather these evaluations
◦ (ideally at the close of a seminar or event)
Suggested for leadership training programs
• You conduct group sessions
• The outcome can be measured as an answer to a question
• Some students won’t know the answer at the start
• You can use Clickers
Suggested for CCO workshops & intramural sports training
• The location is unusual
• The learning is best measured in the moment
• You don’t need a statistically valid sample
• These surveys can be facilitated:
◦ by Clickers
◦ by Campus Labs Baseline with an iPhone or iPod
• You are fishing for details
• The outcome is Creative Thinking
• The outcome is Oral Communication
• You have lots of time
• You are determined to get lots of detail
• The outcome is Integrative Learning
• You trust the method more than people
◦ Experiments may create ethical concerns
• The portfolios exist
• The portfolios have value
• Criteria exist for assessing the portfolios
• Staff capacity exists to review students’ efforts
Suggested for future assessments
Instrument/Tool Mapping Process
• Now, review your original Student Learning Outcome mappings. Although you
may have identified several outcomes students take away from participating in
your program, you now want to decide which of them you can assess in 2012
(this academic year).
• Your original program worksheet is where you retain all of the learning
outcomes for your program and participants. You will want to map only those
learning outcomes you can assess in 2012. Over time, the worksheet will
record the year in which you assess each of the outcomes you have mapped.
• For each program you originally mapped a Student Learning Outcome for, your
SharePoint folder will contain an Assessment Tool spreadsheet that provides a
way to indicate which tool you want to use to assess each of the outcomes.
• Columns A, B, C, and D will be pre-populated with information on each of the
Student Learning Outcomes you mapped back to the CAS dimensions. (Refer to
the mapping spreadsheet, handout #1.)
• If, after mapping your SLOs, you determine that you can’t assess one of the
outcomes, or that you won’t be able to assess it in 2012, remove it from your
mapping tool spreadsheet.
• This mapping tool spreadsheet will become your authoritative source for
information on the mapping of your outcomes back to the CAS dimensions, the
Purdue Core Competencies, and the Strategic Plan.
Timeline
• Review your current student learning outcome statements, the ones you created
in phase 2. Make sure you have indicated in the Student Learning Outcomes
section only those outcomes you will assess in 2012.
• Your review will need to be complete by May 25. At that point, and for a period of
one week, all information in your folders will be frozen so that work can begin
on moving your information into the Tool Mapping worksheet.
• Your information will be moved into the Tool Mapping worksheet by June 4, and
you will have the entire month of June to complete the information requested in
the remaining columns of the Tool Mapping worksheet. All mappings will need to
be completed by June 29.
• At ANY point along the way, if you have any questions or need assistance, please
call or email either Dan or Andy.
• Dan: 4-7416 [email protected]
• Andy: 4-6743 [email protected]
Closing the Loop