Data Analysis within an RtI2 Framework: Linking Assessment to Intervention
Aimee R. Holt, PhD
Middle Tennessee State University
What is RTI2?
•A systematic and data-based method for addressing academic concerns:
–identifying,
–defining, &
–resolving them
Brown-Chidsey & Steege (2010)
RTI2 is a general education initiative….
•Components of RTI2
–High-quality instruction
–Frequent assessment of academic
skills
–Data-based decision making
Brown-Chidsey & Steege (2010)
Problem Solving
•At each tier within RTI2, a problem-solving model is employed to make decisions:
–Problem Identification: define the problem & develop an assessment plan
–Problem Analysis: analyze the assessment plan results & develop an intervention plan
–Plan Implementation: implement the plan & progress monitor
–Plan Evaluation: analyze the results of implementation & determine next steps
What would Assessment at Tier I look like?
Universal Screeners
•LEAs are required to administer a nationally normed, skills-based universal screener to students at their grade level
–For K-8, universal screeners should be administered 3X per year
–In grades 9-12, there are multiple sources of data that can be reviewed, such as:
•EXPLORE, PLAN, and ACT; the Tennessee Comprehensive Assessment Program (TCAP), which includes Writing (TCAP-WA), End of Course (EOC), and 3-8 Achievement and, in 2014-2015, the Partnership for Assessment of Readiness for College and Careers (PARCC); and TVAAS
Characteristics of Appropriate
Universal Screening Tools
•Help answer questions about the efficiency of the core program
–Align with the curriculum for each grade level
•Skills mastery aligns with the state-mandated year-end assessment
Ikeda, Neessen, & Witt (2008)
3 Types of CBMs (Curriculum-Based Measures)
•General Outcome Measures (GOMs)
•Skills-Based Measures (SBMs)
•Subskill Mastery Measures (SMMs)
General Outcome Measures
•GOMs sample performance across several goals at the same time via capstone tasks
–Ex. Oral reading fluency
•Can be used for:
–screening (benchmarking),
–survey & specific level assessment,
–progress monitoring
Skills-Based Measures
•SBMs are similar to GOMs but can be used when capstone tasks are not available
–Ex. Math computation
•Can be used for:
–screening (benchmarking),
–survey & specific level assessment,
–progress monitoring
Subskill Mastery Measures
•SMMs are very narrow in focus
–Ex. Names of letters
•Should not be used for benchmarking
–(exception: early skills such as Letter Naming Fluency, Letter Sound Fluency, and Number Naming Fluency)
Example Reading Skills Typically Assessed by Universal Screeners

Grade   Areas Typically Assessed by Universal Screeners
6th     Oral Reading Fluency; Reading for understanding
5th     Oral Reading Fluency; Reading for understanding
4th     Oral Reading Fluency; Reading for understanding
3rd     Oral Reading Fluency; Reading for understanding
2nd     Oral Reading Fluency; Reading for understanding
1st     Letter Naming Fluency (beginning); Phonemic Awareness; Phonics; Word Identification Fluency; Oral Reading Fluency (end)
K       Letter Naming Fluency; Phonemic Awareness; Early Phonics Skills including Letter Sound Fluency
What would Data Analysis at Tier I look like?
Making Decisions about Group Data
•Review universal screening data to answer
the following questions:
–Is there a class wide problem?
–Who needs a Tier II intervention?
•Be sure to examine students at the margin
–Does anyone need Tier III now?
Who needs a Tier II or Enrichment?
•Winter benchmark for ORF: 90th %ile – 153; 25th %ile – 72
•Winter benchmark for Maze: 90th %ile – 25; 25th %ile – 9

Student scores (fluency score / % correct):
    ORF          Maze
    154 / 100%   26 / 98%
    154 / 85%    26 / 79%
    68 / 95%     9 / 94%
    68 / 88%     8 / 80%

•Instructional level criteria:
–For contextual reading – 93-97% correct
–For most other academic skills – 85-90% correct
Examining Students at the Margins
•Winter benchmark for ORF: 90th %ile – 153; 25th %ile – 72
•Winter benchmark for Maze: 90th %ile – 25; 25th %ile – 9

Student scores (fluency score / % correct):
    ORF          Maze
    75 / 96%     11 / 100%
    80 / 100%    10 / 97%
    73 / 82%     11 / 75%

•Instructional level criteria:
–For contextual reading – 93-97% correct
Identifying Who Needs Tier III
•Winter benchmark for ORF: 25th %ile – 72; 10th %ile – 44
•Winter benchmark for Maze: 25th %ile – 9; 10th %ile – 6

Student scores (fluency score / % correct):
    ORF          Maze
    46 / 76%     6 / 80%
    42 / 83%     5 / 75%

•Instructional level criteria:
–For contextual reading – 93-97% correct
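To make these screening cutoffs concrete, the following is a minimal Python sketch of the decision rule implied by the last three slides. The function name and tier labels are illustrative, and the cutoffs are the example winter ORF benchmarks shown above (90th %ile = 153, 25th %ile = 72, 10th %ile = 44); a real decision would also weigh accuracy, Maze scores, and teacher input.

```python
# Illustrative only: ORF cutoffs taken from the example winter benchmarks above.
def screening_decision(orf_wcpm):
    """Map an ORF score to the screening decision suggested by these slides."""
    if orf_wcpm >= 153:   # at or above the 90th percentile
        return "consider enrichment"
    if orf_wcpm >= 72:    # at or above the 25th percentile
        return "core instruction (Tier I)"
    if orf_wcpm >= 44:    # between the 10th and 25th percentiles
        return "consider Tier II intervention"
    return "consider Tier III now"  # below the 10th percentile

for score in (154, 75, 68, 46):   # scores drawn from the example slides
    print(score, "->", screening_decision(score))
```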
Referral to Tier II Decision Tree
Core literacy instruction has been implemented with fidelity
≥80% of student needs are met by core instruction
Differentiated instruction has been provided in a small group
within core literacy instruction
Student has been present for ≥75% of instructional days
Student has passed vision and hearing screening
Data indicate performance below the 25th percentile on universal screening of student achievement compared to national norms
Additional assessment data support universal screening data
What do we mean by linking assessment to intervention?
Linking Assessment to Interventions….
• Research has shown that effective interventions have
certain features in common:
–Correctly targeted to the student’s deficit
–Appropriate level of challenge (instructional range)
–Explicit instruction in the skill
–Frequent opportunities to practice (respond)
–Provide immediate corrective feedback
(e.g., Brown-Chidsey & Steege, 2010; Burns, Riley-Tillman, & VanDerHeyden, 2013; Burns, VanDerHeyden, & Boice, 2008)
Academic Instruction in Reading
•Both NCLB and IDEA require that instruction in the
general education setting cover all 5 areas of
reading identified by the National Reading Panel
• Phonemic Awareness
• Phonics
• Fluency
• Vocabulary
• Text Comprehension Strategies
Linking the 5 skill areas to 3 SLD areas
•Basic Word Reading: Phonemic Awareness & Phonics
•Reading Fluency: Fluency
•Reading Comprehension: Vocabulary & Text Comprehension Strategies
Phonological Awareness
•A metacognitive understanding that words
we hear have internal structures based on
sound
–Research on PA has shown that it exerts an
independent causal influence on word-level reading.
(Berninger & Wagner, 2008)
–Phoneme – smallest unit of speech
•The English language has 44-46 phonemes
Phonics
•Alphabetic principle - Linking phonological
(sound) and orthographic (symbol) features of
language (Joseph, 2006)
–Important for learning how to read and spell
•National Reading Panel – students with explicit AP instruction showed benefits through the 6th grade
–Phonological awareness is a prerequisite skill
• Word Reading Skills - (McCormick, 2003)
–Word identification: the instance when a reader
accesses one or more strategies to aid in reading
words (e.g., applying phonic rules or using
analogies)
• Decoding – blending sounds in words or using
letters in words to cue the sounds of others in a
word (Joseph, 2006)
–Word recognition: the instant recall of words or
reading words by sight; automaticity
Fluency
•“The ability to read a text quickly, accurately, and with proper expression” (NRP, 2000, p. 3-5)
•Most definitions of fluency include an emphasis on prosody – the ability to read with correct expression, intonation, and phrasing (Fletcher et al., 2007)
•National Reading Panel – good reading fluency skills improved recognition of novel words, expression during reading, accuracy, and comprehension
Vocabulary & Text Comprehension Skills
•Vocabulary knowledge – including understanding multiple meanings of words, figurative language, etc.
•Identifying stated details
•Sequencing events
•Recognizing cause-and-effect relationships
•Differentiating facts from opinions
•Recognizing main ideas – getting the gist of the passage
•Making inferences
•Drawing conclusions
What Would Assessment at Tier II Look Like?
So you have identified your “at-risk” students – now what?
•You will need to conduct a Survey Level Assessment (SLA) for these students
•SLA can be used to: (a) provide information on the difference between prior knowledge and skill deficits, which is used to plan instructional interventions; & (b) serve as a baseline for progress monitoring
Why is it important to conduct Survey Level
Assessments before beginning Tier II interventions?
•The primary question being addressed by the survey level assessment at Tier II is:
–“What is the CATEGORY of the problem?”
–(What is the specific area of academic deficit?)
(e.g., Riley-Tillman, Burns, & Gibbons, 2013)
An Example of Survey Level Assessment Using DIBELS

Grade   CBM Assessed                    Benchmarked
6th     Oral Reading Fluency            Fall, Winter, Spring
5th     Oral Reading Fluency            Fall, Winter, Spring
4th     Oral Reading Fluency            Fall, Winter, Spring
3rd     Oral Reading Fluency            Fall, Winter, Spring
2nd     Oral Reading Fluency            Fall, Winter, Spring
1st     Oral Reading Fluency            Winter, Spring
1st     Nonsense Word Fluency           Fall, Winter, Spring
1st     Phoneme Segmentation Fluency    Fall, Winter, Spring
1st     Letter Naming Fluency           Fall
K       Nonsense Word Fluency           Winter, Spring
K       Phoneme Segmentation Fluency    Winter, Spring
K       Letter Naming Fluency           Fall, Winter, Spring
K       Initial Sound Fluency           Fall, Winter

1) Start at the student’s grade level
2) Test backwards by grade until the student has reached the “low risk” benchmark for a given skill
•Low risk/established indicates the student has “mastered” that skill
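As a rough illustration of the backward-testing procedure, here is a short Python sketch. The probe-administration and cutoff lookups are placeholders standing in for whatever assessment materials are in use; the example scores and cutoffs are invented, not real DIBELS benchmarks.

```python
def survey_level_assessment(student_grade, administer_probe, low_risk_cutoff):
    """Test backwards by grade until the student reaches the 'low risk'
    benchmark for the skill; return the highest grade level mastered."""
    for grade in range(student_grade, 0, -1):   # grade level, then down to 1st
        score = administer_probe(grade)         # placeholder: give the probe
        if score >= low_risk_cutoff(grade):     # placeholder: cutoff lookup
            return grade, score
    return None, None                           # skill not yet established

# Example: a 2nd grader with made-up scores and cutoffs
scores = {2: 30, 1: 35}
cutoffs = {2: 52, 1: 20}   # hypothetical "low risk" values
print(survey_level_assessment(2, scores.get, cutoffs.get))  # -> (1, 35)
```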
For example… in reading:
•Low comprehension & adequate fluency = comprehension intervention
•Low comprehension + low fluency, but adequate decoding = fluency intervention
•Low comprehension + low fluency + low decoding, but adequate phonemic awareness skills = decoding intervention
Riley-Tillman et al. (2013)
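The drill-down rule above, intervene at the lowest deficient skill in the hierarchy, can be sketched in a few lines of Python. The boolean flags are assumed to come from survey level assessment results; this illustrates the logic only.

```python
def reading_intervention_target(comprehension_ok, fluency_ok, decoding_ok, pa_ok):
    """Return the intervention category for the lowest deficient skill."""
    if not pa_ok:
        return "phonemic awareness intervention"
    if not decoding_ok:
        return "decoding intervention"
    if not fluency_ok:
        return "fluency intervention"
    if not comprehension_ok:
        return "comprehension intervention"
    return "no reading intervention indicated"

# Low comprehension and low fluency, but adequate decoding and PA -> fluency
print(reading_intervention_target(False, False, True, True))
```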
Let’s look at Michael, a 2nd grade student
Problem Identification
•At the fall benchmark, he was identified on ORF as being in the some-risk range
•His score was 30 wcpm
Problem Analysis
•Survey level assessments were conducted using:
–DORF 1st grade – (fluency)
–DNWF 1st grade – (decoding)
–DPSF 1st grade – (phonemic awareness)
Michael’s Scores
•DORF – 35 wcpm
•DNWF – 28 scpm
•DPSF – 38 pcpm

DIBELS Scores Representing Skills Mastery (1st grade benchmarks)
        Fall     Winter   Spring
DORF    -----    > 20     > 40
DNWF    > 24     > 50     > 50
DPSF    > 35     > 35     > 35
DLNF    > 37     -----    -----
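Reading Michael’s scores against the spring column of the mastery table points to decoding as the category of deficit. A brief Python sketch of that comparison (cutoff values copied from the table above):

```python
# Spring (end of 1st grade) mastery cutoffs from the table above.
mastery_cutoffs = {"DORF": 40, "DNWF": 50, "DPSF": 35}
michael = {"DORF": 35, "DNWF": 28, "DPSF": 38}

for skill, score in michael.items():
    status = "established" if score > mastery_cutoffs[skill] else "deficit"
    print(f"{skill}: {score} -> {status}")
# DPSF (phonemic awareness) is established; DNWF (decoding) and DORF (fluency)
# are not, so the lowest deficient skill, decoding, becomes the target.
```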
What next….
•You link your assessment data to an intervention
that targets the category of skill deficit that was
identified
•You select progress monitoring probe(s) that
assess that skill
•You set the student’s goal for improvement
– You can use ROI & Gap Analysis Worksheets to
help with this
What progress monitoring is not…
•It is NOT an instructional method or
intervention
•Think of progress monitoring as a template
that can be laid over goals and objectives from
an assortment of content areas
What Would Data Analysis at Tier II Look Like?
Referral to Tier III Decision Tree
Tier II intervention(s) have occurred daily for 30 minutes in
addition to core instruction
Intervention logs attached
(3) Fidelity checks completed and attached
Implementation integrity has occurred with at least 80% fidelity
Student has been present for ≥75% of intervention sessions
Tier II intervention(s) adequately addressed the student’s area
of need
Tier II intervention was appropriate and research-based
Research based interventions are:
□ Explicit
□ Systematic
□ Standardized
□ Peer reviewed
□ Reliable/valid
□ Able to be replicated
Progress monitoring has occurred with at least 10-15 weekly data points –OR– 8-10 bi-monthly data points
Gap analysis indicates that student’s progress is not sufficient for making adequate growth with current interventions
Does a student require Tier III
intervention?
•Step 1: Check to see if the data can be interpreted
–A minimum of 8-10 data points, if progress monitoring every other week, OR 10-15 data points, if progress monitoring weekly, is needed to make a data-based decision to change to Tier III
• Step 2: Examine Rate of Improvement
– You can compare the student’s actual
ROI to the goal that was established
– You can use the ROI worksheets
• Let’s complete one for Michael
Completing the ROI Worksheet for Michael
Assessment Used: DIBELS NWF
Student’s score on first probe administered: 28
Student’s score on last probe administered: 37
Fall benchmark expectation: 24
Spring benchmark expectation: 50

Step 1: (Spring benchmark expectation – Fall benchmark expectation) / Number of weeks = Typical ROI (slope)
(50 – 24) / 36 = 0.72
Step 2: (Score on last probe – Score on first probe) / Number of weeks of intervention = Student’s actual ROI
(37 – 28) / 13 = 0.69
Step 3: Multiply the typical ROI by 2.0 and by 1.5 to set a range for the goal ROI
0.72 × 2.0 = 1.44
0.72 × 1.5 = 1.08
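The three worksheet steps reduce to simple arithmetic. Here they are in Python using Michael’s DIBELS NWF numbers; the 1.5x and 2.0x multipliers in Step 3 follow the goal-setting convention assumed above.

```python
fall_benchmark, spring_benchmark, weeks_in_year = 24, 50, 36
first_probe, last_probe, weeks_of_intervention = 28, 37, 13

typical_roi = (spring_benchmark - fall_benchmark) / weeks_in_year  # Step 1: 0.72
actual_roi = (last_probe - first_probe) / weeks_of_intervention    # Step 2: ~0.69
goal_roi_low = 1.5 * typical_roi                                   # Step 3: 1.08
goal_roi_high = 2.0 * typical_roi                                  #         1.44

print(f"typical {typical_roi:.2f}, actual {actual_roi:.2f}, "
      f"goal range {goal_roi_low:.2f}-{goal_roi_high:.2f}")
```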
You can also visually analyze the graphed progress monitoring data
–Calculate the trend line of the intervention data points and compare it to the aim (goal) line
»If the slope of the trend line is less than the slope of the aim line, the student may need to be moved to Tier III
»Especially if it appears that, given the student’s current ROI, he or she will not meet year-end grade-level standards
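A least-squares fit is one simple way to get the trend line’s slope for this comparison. A minimal sketch using numpy, with invented weekly scores and an aim line drawn from Michael’s NWF baseline (28) to the spring benchmark (50):

```python
import numpy as np

weeks = np.arange(1, 11)                                     # 10 weekly probes
scores = np.array([28, 29, 30, 29, 31, 30, 32, 33, 32, 34])  # hypothetical data

trend_slope, _ = np.polyfit(weeks, scores, deg=1)   # slope of the trend line
aim_slope = (50 - 28) / 36                          # aim line: baseline to goal

if trend_slope < aim_slope:
    print(f"trend {trend_slope:.2f} < aim {aim_slope:.2f}: consider Tier III")
else:
    print(f"trend {trend_slope:.2f} >= aim {aim_slope:.2f}: continue and monitor")
```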
Dual Discrepancy
•A student should be deficient in level and have a poor response to evidence-based interventions (slope), to the degree that he/she is unlikely to meet benchmarks in a reasonable amount of time without intensive instruction, in order to move:
–from Tier II to Tier III, as well as from Tier III to referral for a comprehensive special education evaluation
(e.g., Brown-Chidsey & Steege, 2008; Lichtenstein, 2008)
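One way to express the dual-discrepancy check in code, as a hedged sketch: the 1.25 cutoff ratio is an illustrative convention, not a mandated criterion, and the example numbers come from Michael’s gap analysis later in this deck.

```python
def dual_discrepancy(level, expected_level, roi, expected_roi, ratio=1.25):
    """True if the student is discrepant in BOTH level and slope.
    The cutoff ratio is illustrative; actual criteria vary by district."""
    level_deficient = expected_level / level > ratio
    slope_deficient = expected_roi / max(roi, 1e-9) > ratio
    return level_deficient and slope_deficient

# Level 66 vs expected 90; ROI of 1.3 vs the 4.8 needed to close the gap:
print(dual_discrepancy(66, 90, 1.3, 4.8))  # True -> intensify intervention
```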
What Would Assessment at Tier III Look Like?
Specific Level Assessment
•Functional analysis of skills
–Used to:
•(a) identify specific skill deficits;
•(b) identify students’ prior knowledge; &
•(c) serve as a baseline for progress monitoring
–Specific level assessments rely primarily on subskill mastery measures
•“drill down” to specific deficits
Functional Analysis
RIOT/ICEL Matrix
•RIOT (assessment procedures):
–R – review
–I – interview
–O – observe
–T – test
•ICEL (assessment domains):
–I – instruction
–C – curriculum
–E – environment
–L – learner
Linking Assessment Data to Intervention at Tier III
Student + Task + Instruction Match = Success
•The learner
–focus on alterable learner variables
–identify academic entry-level skills
•The task
–level of the material the student is expected to master
•The instruction
–research-based methods and management strategies used to deliver the curriculum
Targets for Academic Instructional Materials
•Instructional level
–contextual reading – 93-97% correct
–other academic skills – 85-90% correct
–Instructional-level materials produce larger gains more quickly
Gravois, T. A., & Gickling, E. E. (2008). Best practices in instructional assessment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp. 503-518). Bethesda, MD: National Association of School Psychologists.
Phonemic Awareness Hierarchy
Alliteration
•identifying initial, final, & medial sounds in words
Blending
•blending individual sounds to make a whole word
Segmenting
•breaking a whole word into its individual parts
Manipulating
•Deleting: saying the new word created by omitting a syllable or individual sound in a word
•Substituting: changing the initial, final, or medial sound in a word to create a new word
•Reversing: saying the sounds of a word in reverse order to create a new word
Daly, Chafouleas, & Skinner (2005)
Let’s look at Michael again…
Problem Analysis
•Specific Level Assessment –
–Phonics:
•Decoding Skills Test
•Developmental Spelling Analysis
–Sight words:
•Graded word list
–Phonemic Awareness:
•LAC-3
Linking specific level assessment data to interventions…
•Basing interventions on direct samples of a student’s academic skills has been shown to result in larger effect sizes than interventions derived from other data
–This is also known as a skill-by-treatment interaction
(Burns, Codding, Boice, & Lukito, 2010)
What Would Data Analysis at Tier III Look Like?
•Need to look at 3 areas
»Level
»Slope
»Variability
Level
•Central location of data within a phase
–often compared to benchmark (goal/aim line)
–can also look at the mean or median for each phase
(e.g., Daly et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)
•Can conduct a Gap Analysis using the worksheet
Slope/Trend
•How the central location changes over time
•With academic data we are usually looking for an increase in skills
•The target student’s ROI can be compared with a peer group’s ROI or with benchmark expectations
(e.g., Daly et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)
2 approaches for analyzing slope
•Calculate ROI and compare to an
identified peer group using the ROI
worksheet
•Plot the trend line and compare the aim
(goal) line to the slope (trend) line
Variability
•Should be examined both within and between phases
–General rule: most of the variability in the data should be explained by the trend line
•80% of the data points should fall within 15% of the trend line
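The 80%-within-15% rule can be checked directly against a fitted trend line. A minimal sketch (numpy assumed; the data are invented):

```python
import numpy as np

def variability_ok(weeks, scores, band=0.15, required=0.80):
    """True if at least 80% of points fall within 15% of the trend-line value."""
    slope, intercept = np.polyfit(weeks, scores, deg=1)
    predicted = slope * np.asarray(weeks) + intercept
    within = np.abs(np.asarray(scores) - predicted) <= band * np.abs(predicted)
    return within.mean() >= required

weeks = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
scores = [28, 29, 28, 30, 29, 31, 30, 32, 31, 33]
print(variability_ok(weeks, scores))  # True: the data track the trend line closely
```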
Referral for SLD Evaluation Decision Tree
Tier III Intervention(s) have occurred daily for 60 minutes in
addition to core instruction
Intervention logs attached
(5) Fidelity checks completed and attached
Implementation integrity has occurred with at least 80% fidelity
Student has been present for ≥75% of intervention sessions
Tier III intervention(s) adequately addressed the student’s area of
need
Referral for SLD Evaluation Decision Tree
Tier III intervention was appropriate and research-based
Research based interventions are:
□ Explicit
□ Systematic
□ Standardized
□ Peer reviewed
□ Reliable/valid
□ Able to be replicated
Progress monitoring has occurred with at least 10-15 weekly data points –OR– 8-10 bi-monthly data points at Tier III
Gap analysis indicates that student’s progress is not sufficient for making adequate growth with current interventions
Referral for SLD Evaluation Decision Tree
The following have preliminarily been ruled out as the
primary cause of the student’s lack of response to
intervention
□ Visual, motor, or hearing disability
□ Emotional disturbance
□ Cultural factors
□ Environmental or economic factors
□ Limited English proficiency
□ Excessive absenteeism
Deciding to refer for SLD evaluation
•As part of the team’s decision to refer for an SLD evaluation, a Gap Analysis should be conducted
•Let’s look at how to complete the Gap Analysis worksheet with Michael
Gap Analysis
Assessment Used: 2nd grade ORF
Student’s current benchmark performance: 66
Student’s current rate of improvement (ROI): 1.3
Current benchmark expectation: 90
End-of-year benchmark expectation: 90
Number of weeks left in the school year: 5

Step 1: Current benchmark expectation / Current performance = Current gap
90 / 66 = 1.4
Is the gap significant? □ Yes □ No
Conducting a Gap Analysis
•Step 2
End-of-year benchmark expectation (90) – Current performance (66) = 24 points needed to close the gap
24 / 5 weeks remaining = 4.8 ROI needed to close the gap
24 / 1.3 (current ROI) ≈ 18 weeks needed at the current rate of improvement
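Both gap-analysis steps are simple arithmetic. A short Python sketch using Michael’s numbers from the worksheet above:

```python
current_performance = 66        # 2nd grade ORF
current_expectation = 90
end_of_year_expectation = 90
current_roi = 1.3
weeks_left = 5

gap = current_expectation / current_performance                  # Step 1: ~1.4
points_to_close = end_of_year_expectation - current_performance  # Step 2: 24
roi_needed = points_to_close / weeks_left                        # 4.8
weeks_needed = points_to_close / current_roi                     # ~18, only 5 left

print(f"gap {gap:.1f}; ROI needed {roi_needed:.1f} vs current {current_roi}; "
      f"weeks needed {weeks_needed:.0f} vs {weeks_left} remaining")
```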
Additional Consideration
SEM
•Additionally, we cannot ignore issues such as interpreting CBM scores in light of the standard error of measurement (SEM) or confidence intervals (CI) when those scores are used for purposes such as diagnosis and eligibility determinations
•For a more detailed discussion, including suggested SEM guidelines for oral reading fluency scores in grades 1-5, see:
–Christ, T. J., & Silberglitt, B. (2007). Estimates of the standard error of measurement for curriculum-based measures of oral reading fluency. School Psychology Review, 36, 130-146.
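As an illustration of interpreting a CBM score in light of its SEM, this small sketch brackets an observed ORF score with an approximate 95% confidence interval. The SEM of 8 wcpm is an assumption made for the example; see Christ & Silberglitt (2007) for empirically derived estimates by grade.

```python
def score_interval(score, sem, z=1.96):
    """Approximate 95% confidence interval: score +/- z * SEM."""
    return score - z * sem, score + z * sem

low, high = score_interval(score=66, sem=8)  # SEM of 8 wcpm is hypothetical
print(f"observed 66 wcpm; true score plausibly between {low:.0f} and {high:.0f}")
```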
Use of Progress Monitoring in Special Education
•Because CBM data
–can be directly tied to the skill development necessary to be successful in the curriculum,
–possess a higher level of sensitivity, and
–allow for graphic representation,
–they allow for the development of a higher-quality IEP
•Progress monitoring should continue after the IEP is initiated
•Exit criteria can be set to determine if early reevaluation can be completed due to student success
Helpful Resources
Helpful Resources from NASP
Additional Helpful Resources
•Guilford Press
