Developing Assessments for and
of Deeper Learning
[Day 2b-afternoon session]
Santa Clara County Office of Education
June 25, 2014
Karin K. Hess, Ed.D.
[email protected] or
[email protected]
Presentation Overview
• Clarify understandings & common
misconceptions about rigor/DOK, deeper learning
• Use the Hess Validation Tools to examine sample
performance tasks
• Give rubrics the chocolate chip cookie “taste test”
• Be inspired by Karin’s performance assessment
coaching tips
• Plan & get feedback on future assessment
activities and/or support to teachers
What we know (from research) about High-Quality Assessment:
• Is defined by agreed-upon standards/ expectations
• Measures the individual’s learning & can take
different forms/formats
• Measures the effectiveness of instruction and
appropriateness of curriculum
• Is transparent:
– Students know what is expected of them and how they will
be assessed
– Assessment criteria are clear and training is provided to
educators and reviewers/raters.
• Communicates information effectively to students,
teachers, parents, administration and the public at
large
Simply put, HQ assessments have…
• Clarity of expectations
• Alignment to the intended content expectations (skills &
concepts)
• Reliability of scoring and interpretation of results
• Attention to the intended rigor (tasks & scoring guides)
• Opportunities for student engagement & decision
making
• Opportunities to make the assessment “fair” & unbiased
for all
• A link to instruction (opportunity to learn)
2. The DOK Instruction & Assessment Matrix

Instructional decisions determine the assessment path: selected response, constructed response, or performance tasks.

Each standard has an assigned Depth of Knowledge (DOK), and the DOK determines the cognitive level of instruction. The matrix crosses Bloom's levels (Remember, Understand, Apply, Analyze, Evaluate, Create) with Webb's DOK levels:

DOK 1 – Recall and Reproduction
• Recall, locate basic facts, definitions, details, events
• Select appropriate words for use when intended meaning is clearly evident

DOK 2 – Skills and Concepts
• Explain relationships
• Summarize
• State central idea
• Use context for word meanings
• Use information from text features

DOK 3 – Reasoning and Thinking
• Use concepts to solve non-routine problems and justify the approach
• Analyze or interpret author’s craft (e.g., literary devices, viewpoint, or potential bias) to critique a text
• Cite evidence and develop a logical argument for conjectures based on one text or problem
• Explain, generalize, or connect ideas using supporting evidence (quote, text evidence)
• Develop a complex model or approach for a given situation
• Develop an alternative solution

DOK 4 – Extended Thinking
• Explain how concepts or ideas specifically relate to other content domains
• Devise an approach among many alternatives to research a novel problem
• Analyze multiple sources or multiple texts
• Analyze complex abstract themes
• Evaluate relevancy, accuracy, and completeness of information across texts or sources
• Synthesize across multiple sources/texts
• Articulate a new voice, theme, or perspective
First we consider alignment…
• It’s really about validity: deciding the degree to which
there is a “strong match” between grade-level content
standards + performance and the assessment/test
questions/tasks
• And about making valid inferences about learning
from an assessment score
Alignment (validity) Questions:
• Is there a strong content match between
assessment/test questions/tasks and grade
level standards?
• Are the test questions/tasks (and the
assessment as a whole) more rigorous, less
rigorous, or of comparable rigor (DOK) to
grade level performance standards?
Some Common Misconceptions about DOK
1. All kids can’t think deeply; or, kids don’t need
scaffolding to get there.
2. Webb’s DOK model is a taxonomy (4 vs 1).
3. Bloom verbs & levels = Webb DOK.
4. DOK is about difficulty.
5. All DOK levels can be assessed with a
multiple-choice question (that’s just dumb!).
6. “Higher order” thinking = deeper learning.
7. Multi-step tasks, multiple texts, or complex
texts always mean deeper thinking.
Basic Task Validation Protocol Handout #2a
(K. Hess, Linking Research with Practice, Module 3, 2013)
• Table groups review the technical criteria and
descriptions of the Basic Validation Protocol
• Select a sample assessment task to review
(Handout 2b – Writing CRM)
• Discuss what you see in terms of these criteria:
– Purpose & use: how might you use results?
– Clarity: is it clear what is expected?
– Alignment: are task content + rigor/DOK appropriate for
grade level and use of data?
– Engagement: is there opportunity for student decision
making?
