Institute on Beginning Reading
Day 2: Evaluating Performance: Schoolwide Assessment of Student Performance
Content Development
Content developed by:
Roland H. Good, Ph. D.
College of Education
University of Oregon
Beth Harn, Ph. D.
College of Education
University of Oregon
Edward J. Kame’enui, Ph. D.
Professor, College of Education
University of Oregon
Deborah C. Simmons, Ph. D.
Professor, College of Education
University of Oregon
Michael D. Coyne, Ph. D.
University of Connecticut
Prepared by:
Patrick Kennedy-Paine
University of Oregon
Katie Tate
University of Oregon
Good, Harn, Kame'enui, Simmons, & Coyne
© 2003
Acknowledgments
 Oregon Department of Education
 U.S. Department of Education, Office of
Special Education Programs
 Bethel School District, Eugene, Oregon
Dr. Drew Braun, Dr. Carl Cole, Lori Smith, Rhonda
Wolter, Administrators, Staff, and Students
 Dr. Sharon Vaughn, University of Texas at Austin,
Texas Center for Reading and Language Arts
Permissions
 Some video clips are used with the
permission of Reading Rockets, a project
of Greater Washington Educational
Telecommunications Association (WETA).
 More information is available at:
http://www.ReadingRockets.org/
Copyright
All materials are copyrighted and should
not be reproduced or used without
express permission of Dr. Edward J.
Kame’enui or Dr. Deborah C. Simmons.
Selected slides were reproduced from
other sources and original references cited.
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
Guiding Questions
1. Goals: What outcomes do we want for our students in our state, district, and schools?
2. Knowledge: What do we know and what guidance can we gain from scientifically based reading research?
3. Progress Monitoring Assessment (Today's Focus): How are we doing? What is our current level of performance as a school? As a grade? As a class? As an individual student?
4. Outcome Assessment (Today's Focus): How far do we need to go to reach our goals and outcomes?
5. Core Instruction: What are the critical components that need to be in place to reach our goals?
6. Differentiated Instruction: What more do we need to do and what instructional adjustments need to be made?
IBR Foundational Features:
Translating Research into Practice
Building an Effective Reading Program for
All Students: Essential Components
 Goals
 Assessment (for each student):
 Efficient
 Informative at the school, class, and individual level
 Instruction (for all students)
Start-Up Activity:
Reviewing Day 1
Answer the following questions based on what you learned
in Day 1.
1. By implementing scientifically-based instructional practices within a
prevention model, we will enable more students to be ________ .
2. The goal of the schoolwide reading model is to:
a) Help schools build capacity and sustained use of scientifically based
practices specifically tailored to their school
b) Maximize the number of students being readers by the end of grade 3
c) Prevent individual children from experiencing reading frustration by
improving instruction for all
d) All of the above
Start-Up Activity:
Reviewing Day 1
Answer the following questions based on what you learned
in Day 1.
3. What is the primary assessment system we will use to evaluate
our school’s progress in meeting the early literacy and reading
needs of all children? _______________
4. One way of achieving our goals is to systematically pace our
instruction of the big ideas. We can determine when to introduce
and how to sequence key instructional objectives by using:
a) Lock-step following of the curricular program without linkage to student
learning
b) Curriculum maps
c) Our instincts
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
Reading Assessment for Different
Purposes
An effective, comprehensive reading
program includes reading assessments for
four purposes:
 Outcome - Provides a bottom-line evaluation of the
effectiveness of the reading program in relation to
established performance levels.
 Screening - Designed as a first step in identifying
children who may be at high risk for delayed
development or academic failure and in need of
further diagnosis of their need for special services or
additional reading instruction.
Reading Assessment for Different
Purposes
An effective, comprehensive reading
program includes reading assessments for
four purposes:
 Diagnosis - Helps teachers plan instruction by
providing in-depth information about students’ skills
and instructional needs.
 Progress Monitoring - Determines through frequent
measurement if students are making adequate
progress or need more intervention to achieve grade-level reading outcomes.
Role of Assessment
 Role of Assessment: Video of Dr. Edward Kame’enui
 Purpose of Timely Assessment
Role of Assessment
 Role of Assessment: Video of Dr. Edward Kame’enui
 Purpose of Timely Assessment: Assessing the quality of our investment
 How well do we want the lowest reader in each grade to read?
 1st Grade: 40 wpm minimum, 60 wpm desirable
 2nd Grade: 90 wpm
 3rd Grade: 110 wpm
 What is the significance of reading this well? Good indicator of comprehension
Outcome Assessment
 Purpose: To determine level of proficiency in
relation to norm or criterion.
 When: Typically administered at end of year. Can
be administered pre/post to assess overall
growth.
 Who: All students
 Relation to instruction: Provides index of overall
efficacy but limited timely instructional
information.
Screening Assessment
 Purpose: To determine children who are likely to
require additional instructional support (predictive
validity).
 When: Early in the academic year or when new
students enter school.
 Who: All students
 Relation to instruction: Most valuable when used
to identify children who may need further
assessment or additional instructional support.
Diagnostic Assessment
 Purpose: To provide specific information on skills
and strategy needs of individual students.
 When: Following screening or at points during
the year when students are not making adequate
progress.
 Who: Selected students as indicated by
screening or progress monitoring measures or
teacher judgment.
 Relation to Instruction: Provides specific
information on target skills; highly relevant.
Progress Monitoring Assessment
 Purpose: Frequent, timely measures to
determine whether students are learning enough
of critical skills.
 When: At minimum 3 times per year at critical
decision making points.
 Who: All students
 Relation to Instruction: Indicates students who
require additional assessment and intervention.
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
Purposes of Assessment in the
Schoolwide Model
“Teaching without assessment is like driving a
car without headlights.”
 Assessment for all children must:
1. Focus on essential, important skills
2. Be instructionally relevant
3. Be efficient to administer
4. Be sensitive to change in skill performance
5. Measure fluency of performance
DIBELS provide the feedback to ensure our program
is meeting the needs of all children
Essential Features of DIBELS
(Dynamic Indicators of Basic Early Literacy Skills)
Preventing Reading Difficulties Through Early
Identification
 Dynamic – Responsive to Changes in Student
Performance
 Identifies students who need additional support
 Evaluates student response to intervention
 Indicators – Focused on an Essential Skill
 Enables assessment to be efficient
 Basic Early Literacy Skills – Relevant to Instructional
Planning
 Links essential literacy skills to prevent reading failure
Relation of DIBELS to Purposes
of Assessment
 Utility of DIBELS

Purpose of Assessment    Utility
Screening                Yes
Progress Monitoring      Yes
Diagnostic               Possibly with expert teachers
Outcome                  Selected measures
The Need for Results-Focused
Assessment
 Instructional Time is Precious: Need to spend
time teaching, not testing
 DIBELS measures do not assess all aspects of
reading
 Short duration fluency-based measures
 Some Skills are More Important Than Others:
 Assesses skills predictive of later reading proficiency
 Provides timely feedback to schools and teachers to
enable responsive instruction
 Allows early identification of students who need
instructional support
 Assesses whether children are learning enough
Acknowledgments
University of Oregon Research Team that
Developed the DIBELS Measures:
 Primary Researchers:
Roland Good and Ruth Kaminski
 Contributing Researchers:
Scott Baker
Cheri Cornachione
Hank Fien
Lisa Habedank Stewart
Rachell Katz
Debby Laimon
Karen Rush
Michelle Shinn
Joshua Wallin
John Bratten
Patricia Coyne
Kathleen Fleming
Beth Harn
Jennie Knutson
Elida Lopez
Dawn Sheldon-Johnson
Sylvia Smith
Jennifer Watson
Shaheen Chowdri
Shanna Davis
Jerry Gruba
Diane Hill
Katherine Kohler
Ambre ReMillard
Mark Shinn
David VanLoo
Acknowledgments
DIBELS research was supported and funded by:
Early Research Institute on Measuring Growth and
Development (H180M10006) and Student-Initiated
Grants (H023B90057; 90CD0819; H023B90057),
funded by the U. S. Department of Education,
Special Education Programs.
Further information and research on the measures
is available at: http://dibels.uoregon.edu
What DIBELS Assess: Critical
Outcomes and Indicators
 The NRP and NRC reports identified five
essential skills or “Big Ideas”:
 Phonological Awareness: The ability to hear and manipulate
sounds in words.
 Alphabetic Principle: The ability to associate sounds with letters
and use these sounds to read words.
 Accuracy and Fluency with Connected Text: The effortless,
automatic ability to read words in connected text to develop
understanding.
 Vocabulary: The ability to understand (receptive) and use
(expressive) words to acquire and convey meaning.
 Comprehension: The complex cognitive process involving the
intentional interaction between reader and text to extract
meaning.
Assessing Each Big Idea with
DIBELS
Big Idea                     DIBELS Measure
Phonological Awareness       Initial Sounds Fluency (ISF);
                             Phonemic Segmentation Fluency (PSF)
Alphabetic Principle         Nonsense Word Fluency (NWF)
Fluency and Accuracy         Oral Reading Fluency (ORF)
Vocabulary                   Word Use Fluency (WUF)
Comprehension                Oral Reading Fluency (ORF) &
                             Retell Fluency (RTF)
Why Focus on Fluency?
To gain meaning from text, students must
read fluently.
 Proficient readers are so automatic with each
component skill (phonological awareness,
decoding, vocabulary) that they focus their
attention on constructing meaning from the
print (Kuhn & Stahl, 2000).
 Component skills need to be well developed to
support understanding.
 It is not enough to be simply accurate; the skill
must be automatic.
Role of Automaticity or Fluency
 Role of Automaticity or Fluency: Video of Dr. Reid Lyon
Role of Automaticity or Fluency
 Role of Automaticity or Fluency: Video of Dr. Reid Lyon
 The focus of reading instruction is not only on getting students to know sounds or letters but to: get to the meaning
 Building automaticity in the component skills is analogous to: learning to ride a bike
First Grade Curriculum Map
A Qualitative Difference in
Beginning Readers
[Video clips of two first-grade readers reading the same passage]

Passage: "I’ve thrown a lot of rocks into the lake by our cabin. Sometimes I think I’ve thrown in enough to fill the whole lake. But it never seems to get full. As you can tell, I like to throw rocks. But throwing rocks is always a lot more fun with Grandpa. He can make anything…."

In one minute, we can obtain a reliable indicator of early reading proficiency. The two students require substantially different instruction toward the goal of being lifelong readers.
What Are the Skill Differences
Between These Readers?
 The “on-track” reader has a strategic
approach to reading:
 Alphabetic Principle: decodes words she does not know.
 Fluency with connected text: reads words with accuracy and speed to enable comprehension.
 Other attributes: ____________________
What Are the Skill Differences
Between These Readers?
 The struggling reader does not have an
effective strategy to gain access to the
meaning of the passages:
 Alphabetic Principle: Has an ineffective strategy for reading unknown words.
 Fluency with connected text: Limited fluency deters comprehension.
 Other attributes: ____________________
Prevention Oriented: Relation Between
ORF and Other Outcome Measures
[Scatter plot: Oral Reading Fluency, Spring, Grade 1 (x-axis) vs. OSA Reading/Literature, Spring, Grade 3 (y-axis)]
 88% of students who met the end-of-first-grade ORF goal went on
to meet or exceed Oregon’s State Benchmark Test in grade 3.
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade,
class and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
How Do We Change Reading
Outcomes?
1. Earlier rather than later: prevention
oriented
2. Schools not just programs
3. Results not just improvement
4. Science not just opinion
Results Focused: Evaluating
Progress At Multiple Levels
Schoolwide DIBELS can answer:
1. How are we doing as a school?
2. How are we doing at each grade?
3. How is each class doing?
4. How are individual students doing?
How Are We Doing as a School?
[End-of-year histogram of Oral Reading Fluency showing Low Risk, Some Risk, and At Risk groups (36% and 43% labeled); End of Year Benchmark: 40 CWPM]

How would you describe this
school’s end-of-year first graders?
Circle one of the following:
a) All on-track
b) Majority on-track
c) Some on-track
What Skills Did These First Graders
Have at the End of Kindergarten?
[End-of-year histogram of Phoneme Segmentation Fluency showing Established, Emerging, and Deficit groups (16% and 60% labeled); End of Year Benchmark: 35 correct phonemes]

 Almost half the kindergartners finished the year without
strong skills in phonological awareness
 Making these students at risk for reading difficulties, a
prediction in this case that came true.
DIBELS Tell Us if Odds Are in
Our Favor
Scatter Plot: The Relation Between Phonological
Awareness and Oral Reading Fluency
[Scatter plot annotations:]
 Students in one section had deficit alphabetic principle skills at the middle of first grade and ended the year as at-risk readers.
 Students in another section had established alphabetic principle skills at the middle of first grade and ended the year as established readers.
Odds of being an Established Reader on ORF in May of first grade when
Established on PSF in May of kindergarten: 37 out of 44, or 84%.
Odds of being an Established Reader on ORF in May of first grade when
Deficit on PSF in May of kindergarten: 1 out of 6, or 17%.
A Compass is Only Helpful If We
Know Our Destination (Outcomes)
 Each measure has a scientifically-based goal
 Two parts to every goal:
 How much / How well?
 By when?
Measure                          How Much?            By When?
Initial Sounds Fluency           25 or more           Middle of kindergarten
Phonemic Segmentation Fluency    35 or more           End of kindergarten
Nonsense Word Fluency            50 or more           Middle of first grade
Oral Reading Fluency             1st: 40 or more      1st: End of year
                                 2nd: 90 or more      2nd: End of year
                                 3rd: 110 or more     3rd: End of year
Stepping Stones of Early Literacy
[Video of Dr. Roland Good]
When to Administer DIBELS
 Monitoring student skill development
Allocating Resources More
Efficiently
 Early identification of students most in need of additional
instructional support
Mid-Year Kindergarten Class List

            Initial Sound Fluency              Letter Naming Fluency
Name        Score  %ile  Status               Score  %ile  Status      Instructional Recommendation
Mari        1      4     Deficit              35     67    Low risk    Strategic - Additional Intervention
Christian   2      5     Deficit              21     42    Some risk   Intensive - Needs Substantial Intervention
Ann         5      10    Deficit              13     29    At risk     Intensive - Needs Substantial Intervention
Debbie      10     25    Emerging             47     87    Low risk    Strategic - Additional Intervention
Yasmin      13     35    Emerging             40     77    Low risk    Strategic - Additional Intervention
Kaimana     16     45    Emerging             21     42    Some risk   Strategic - Additional Intervention
Jillian     19     54    Emerging             30     58    Low risk    Strategic - Additional Intervention
Chance      20     57    Emerging             41     79    Low risk    Benchmark - At Grade Level
Jimmy       21     60    Emerging             40     77    Low risk    Strategic - Additional Intervention
Sam         21     60    Emerging             50     90    Low risk    Strategic - Additional Intervention
Justin      23     65    Emerging             30     58    Low risk    Benchmark - At Grade Level
Adam        25     70    Established          5      13    At risk     Strategic - Additional Intervention
Jumpei      28     76    Established          28     54    Low risk    Benchmark - At Grade Level
Miyu        29     78    Established          28     54    Low risk    Benchmark - At Grade Level
Zach        34     85    Established          27     52    Low risk    Benchmark - At Grade Level
Kilia       42     92    Established          49     89    Low risk    Benchmark - At Grade Level
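A class list like this is ultimately a resource-allocation tool: tallying the instructional recommendations tells a school how many benchmark, strategic, and intensive groups to staff. A small sketch (the recommendation labels are transcribed from the class list above, in roster order):

```python
from collections import Counter

# One entry per student, in roster order, from the class list above.
recommendations = [
    "Strategic", "Intensive", "Intensive", "Strategic", "Strategic", "Strategic",
    "Strategic", "Benchmark", "Strategic", "Strategic", "Benchmark", "Strategic",
    "Benchmark", "Benchmark", "Benchmark", "Benchmark",
]

group_counts = Counter(recommendations)
for level in ("Benchmark", "Strategic", "Intensive"):
    print(level, group_counts[level])  # Benchmark 6, Strategic 8, Intensive 2
```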
How to Use DIBELS in Your
School: Schoolwide Administration
 Designed to Collect Data
Efficiently at the School
Level
 Short duration: 1-minute
administration
 Repeatable with 20 alternate
forms
 Reproducible and convenient
to use
 Fluency based
Training: Standardized Method of
Administration
 For scores to be useful, we must administer the
measures according to standardized administration and
scoring directions.
 Presenting each measure:
 Present the directions as written
 Use the specific materials
 Timing each measure:
 Use a stopwatch
 Scoring each measure:
 Follow scoring rules for each measure
 Score immediately after completing
 Standardization provides each child an equal
opportunity to display skills.
 Engage student to do his or her best
Separating Teaching & Testing
Time
 Scores will be used to assist in making
instructional decisions
 Therefore, we must administer the measures
without:
 Assisting the student during the task
 Modifying the task, materials, or time
Standardized, reliable data collection and
scoring are essential!
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
Learn the Measures
 Three things to consider for each measure:
 What essential skill does it assess?
 What is the appropriate time and grade?
 What is the goal (how much, by when)?
Phonemic Segmentation Fluency
(PSF):
 What important skill does it assess?
Phonological Awareness
 The ability to hear and manipulate sounds in
words at the phoneme level
 What is the appropriate time and grade?
 Mid-year kindergarten through first grade
 What is the goal?
 How well? 35 phonemes or more
 By when? End of kindergarten
What PSF Looks Like
As you view the video, attend to:
 The child:
 Characterize task performance (circle one):
 Complete Segmentation with Fluency
 Partial Segmentation with Fluency
 Partial Segmentation with No Fluency
 Some Segmentation with Errors
 The examiner:
 Comfortable with materials
 Comfortable with student
 Comfortable with administration
What PSF Looks Like
[Video clip of a PSF administration]
How Do We Administer and Score
the PSF Measure?

Materials:
1. Examiner copy of word list with phoneme
scoring columns. Student has no materials
when assessing phonological awareness.
2. Stopwatch
3. Pencil

Preparing the Student:
1. Good testing conditions (e.g., lighting, quiet,
comfortable)
2. Provide model in standardized manner and
follow correction procedures as necessary
How Do We Administer and Score
the PSF Measure?
1. Place the segmentation word list in front of you but shield
it so the student cannot see what you record.
2. Say these specific directions to the student:
I am going to say a word. After I say it, you tell me all the
sounds in the word. So, if I say “Sam,” you say /s/ /a/ /m/.
Let’s try one. (One second pause.) Tell me the sounds in
“mop.”
CORRECT RESPONSE: If student says /m/ /o/ /p/, you say:
"Very good."
INCORRECT RESPONSE: If student gives any other response, you say:
"The sounds in 'mop' are /m/ /o/ /p/. Your turn. Tell me the sounds in 'mop.'"
Then say: "OK. Here is your first word."
Maximizing Administration Time
 Stopwatch:
 Present the first word, start the stopwatch, and time for 1 minute.
 Scoring:
 Underline each different, correct sound segment produced. (See specific
scoring rules and examples.)
 Put a slash (/) through sounds produced incorrectly.
 Maintaining momentum:
 As soon as the student is finished saying the sounds, present the next
word.
 Allow the student 3 seconds for each sound segment.
 Discontinue:
 If a student has not given any correct sound segments in the first 5
words, discontinue the task and record a score of zero (0).
 Ending testing:
 At the end of 1 minute, stop timing and calculate the number of correct
phonemes per minute.
Scoring Rules for PSF
Correct Segmentation:
 A correct sound segment is any different, correct
part of the word. For example, the sound /t/ is a
correct segment of "trick", as are /tr/ and /tri/ (see rule
2, following page).
 Examiner says "trick," student says "t...r...i...k"
 Examiner says "cat," student says "k...a...t"
WORD     STUDENT SAYS       SCORING PROCEDURE    CORRECT SEGMENTS
trick    “t...r...i...k”    /t/ /r/ /i/ /k/      4/4
cat      “k...a...t”        /k/ /a/ /t/          3/3
Elongating Sounds
Correct Segmentation:
 No need for an audible pause between the sounds to
receive credit.
 If you can hear each individual sound when the
student runs them together, score each sound as
correct.
 Use your professional judgment based on the
response and your knowledge of your program. If still
not sure, do not give credit.
WORD     STUDENT SAYS        SCORING PROCEDURE    CORRECT SEGMENTS
rest     “rrrreeeessssttt”   /r/ /e/ /s/ /t/      4/4
Errors in Segmenting: No
Segmentation
No Segmentation:
 If student repeats the entire word, no credit is given for
any correct parts.
 Circle the word to indicate no segmented response
was given.
WORD     STUDENT SAYS    SCORING PROCEDURE    CORRECT SEGMENTS
trick    “trick”         /t/ /r/ /i/ /k/      0/4
cat      “cat”           /k/ /a/ /t/          0/3
Errors in Segmenting: Incomplete
Segmentation
Incomplete segmentation:
 Student is given partial credit for each sound segment
produced correctly, even if student has not
segmented at the phoneme level.
 The underline indicates the size of the sound segment.
 For example:
Examiner says “trick,” student says “tr...ick”
Examiner says “cat,” student says “c...at”
WORD     STUDENT SAYS    SCORING PROCEDURE    CORRECT SEGMENTS
trick    “tr...ik”       /t/ /r/ /i/ /k/      2/4
cat      “c…at”          /k/ /a/ /t/          2/3
Errors in Segmenting:
Overlapping Sounds
Overlapping:
 Student receives credit for each different, correct
sound segment of the word.
 Underline the different sound segments produced
 For example:
Examiner says “trick,” student says “tri...ick”
Examiner says “cat,” student says “c...cat”
WORD     STUDENT SAYS    SCORING PROCEDURE    CORRECT SEGMENTS
trick    “tri...ick”     /t/ /r/ /i/ /k/      2/4
cat      “c…cat”         /k/ /a/ /t/          1/3
Errors in Segmenting: Omission
of Sounds
Omission:
 Student does not receive credit for sound segments
not produced. If student provides the initial sound only,
be sure to wait 3 seconds for elaboration.
WORD     STUDENT SAYS       SCORING PROCEDURE    CORRECT SEGMENTS
trick    “t...ik”           /t/ /r/ /i/ /k/      2/4
cat      “c” (3 seconds)    /k/ /a/ /t/          1/3
Errors in Segmenting:
Mispronunciation of Sounds
Mispronunciation:
 Student does not receive credit for sound segments
that are mispronounced.
 Put a slash (/) through the incorrect sounds.
 For example, there is no /ks/ sound in the word "trick."
WORD     STUDENT SAYS        SCORING PROCEDURE    CORRECT SEGMENTS
trick    “t...r...i...ks”    /t/ /r/ /i/ /k/      3/4
cat      “b…a...t”           /k/ /a/ /t/          2/3
Student Characteristics
Pronunciation & Dialect:
 Student is not penalized for imperfect pronunciation
due to dialect or articulation.
 For example, if the student says /r/ /e/ /th/ /t/ for "rest"
because of articulation difficulties, give full credit. Use
professional judgment and prior knowledge of the student’s
speech pattern to assess skill performance.
Student Characteristics
Schwa Sounds:
 Schwa sounds (/u/) added to consonants are not
counted as errors.
WORD     STUDENT SAYS          SCORING PROCEDURE    CORRECT SEGMENTS
trick    “tu...ru...i...ku”    /t/ /r/ /i/ /k/      4/4
cat      “ku...a...tu”         /k/ /a/ /t/          3/3
Let’s Try Again
Benchmark K-2 DIBELS™ Phoneme Segmentation Fluency
[Video clip of practice scoring]

hat    /h/ /a/ /t/        hear   /h/ /ea/ /r/        5/6
as     /a/ /z/            punch  /p/ /u/ /n/ /ch/    5/6
means  /m/ /ea/ /n/ /z/   by     /b/ /ie/            5/6
seem   /s/ /ea/ /m/       ship   /sh/ /i/ /p/        0/6
ought  /o/ /t/            pack   /p/ /a/ /k/         3/5
jam    /j/ /a/ /m/        if     /i/ /f/             5/5
yell   /y/ /e/ /l/        ham    /h/ /a/ /m/         5/6
calls  /k/ /o/ /l/ /z/    as     /a/ /z/             5/6
key    /k/ /ea/           crowd  /k/ /r/ /ow/ /d/    2/6
Total: 35
Analyzing the Observation for
Instructional Implications
 Current Skills
 Emerging phonological awareness at the phoneme
level.
 Strong on initial and final consonants and medial
vowels.
 Inconsistent with the task.
 Instructional Needs
 Integrate with alphabetic principle instruction.
 Need more practice to build automaticity.
Tips for Scoring
 Score what you hear!
 Practice with at least 7 students before using the
scores to make programming decisions.
 One sound won’t make a major difference in skill
assessment, but pondering for 5 seconds on whether
to score 2 or 3 phonemes on a response will.
 Look over words you are presenting to increase
the pacing.
 Practice phonemes in the booklet to increase
reliability and consistency in scoring.
Breakout Activity: Practicing the
Measure
 Locate the “Phonemic Segmentation Fluency
Breakout Activity”
1. Form a 3-person group
2. Assign roles:
 Examiner
 Student
 Observer
3. Practice administering measure (3 rounds)
Initial Sounds Fluency (ISF):
 What important skill does it assess?
Phonological Awareness
 The ability to hear and manipulate sounds in
words.
 What is the appropriate time and grade?
 Beginning of the year, kindergarten
 What is the goal?
 How well? 25 initial sounds or more
 By when? Middle of kindergarten
What ISF Looks Like
 As you view the video, attend to:
 The child:
 Characterize task performance (circle one):
 Sound Isolation with Fluency
 Sound Isolation with Limited Fluency
 Sound Recognition with Limited Fluency
 Some Sound Recognition with Errors
 The examiner:
 Comfortable with materials
 Comfortable with student
 Comfortable with administration
What ISF Looks Like
[Video clip]
How Do We Administer and Score
the ISF Measure?
 Materials:
 1. Examiner probe
 2. Student picture pages
 3. Stopwatch
 4. Pencil
 Preparing the student:
  Good testing conditions (e.g., lighting, quiet, comfortable)
  Provide model in standardized manner and follow correction procedures as necessary
How Do We Administer and Score
the ISF Measure?
1. Place student copy of 4
randomized pictures in front of
child.
2. Say these specific directions to
the child:
“This is mouse, flowers, pillow,
letters (point to each picture while
saying its name). Mouse (point to
mouse) begins with the sound
/m/. Listen, /m/, mouse. Which
one begins with the sounds
/fl/?"
How Do We Administer and Score
the ISF Measure?
 Correct Response on Sample Item:
Student points to flowers, you say: “Good. Flowers
begins with the sounds /fl/.”
 Incorrect Response:
“Flowers (point to flowers) begins with the sounds /fl/.
Listen, /fl/, flowers. Let's try it again. Which one
begins with the sounds /fl/?”
How Do We Administer and Score
the ISF Measure?
 "Pillow (point to pillow) begins with the sound /p/.
Listen, /p/, pillow. What sound does letters (point to
letters) begin with?"
 Correct Response: If the student says /l/ you say: “Good.
Letters begins with the sound /l/.”
 Incorrect Response: If the student says any other
response, you say: “Letters (point to letters) begins with
the sound /l/. Listen, /l/, letters. Let's try it again. What
sound does letters (point to letters) begin with?”
 Then you say: "Here are some more pictures. Listen
carefully to the questions."
Maximizing Administration Time
 Stopwatch:
 Read the question, start stopwatch. After child gives response, stop
stopwatch. Record the total time to answer each of the 16 questions.
 When the examiner is talking, the watch is not running.
 Scoring:
 Score is correct or incorrect (see specific scoring rules and examples).
 Maintaining momentum:
 Make sure to introduce each picture page.
 Allow student 5 seconds to answer each question.
 Discontinue:
 If a student gets no items correct in the first 5 items, discontinue the task
and record a score of zero (0).
 Ending testing:
 After administering all 16 items, record the total duration of
thinking/response time found on your stopwatch.
 Count number of items correct.
 Calculate final score (see formula).
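The final-score formula referenced above converts the raw results into a rate: DIBELS computes ISF as 60 times the number of correct responses divided by the total seconds of thinking/response time on the stopwatch. A minimal sketch (the function name is hypothetical):

```python
def isf_score(num_correct, response_seconds):
    """Correct initial sounds per minute of thinking/response time.

    DIBELS computes the final ISF score as
    60 * (number correct / total seconds of response time),
    where response time is the stopwatch total described above.
    """
    if response_seconds <= 0:
        raise ValueError("response time must be positive")
    return 60.0 * num_correct / response_seconds

print(isf_score(12, 30))  # 12 correct over 30 s of response time -> 24.0
```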
Scoring Rules for ISF
 Identification Responses (“Which picture begins
with…?”)
 If the child points to the correct picture or names it,
score as correct.
PROMPT: Which picture begins with /p/?
STUDENT SAYS: “pie”
SCORE: 1 (correct under the renaming rule below)
 If the child names or renames the picture with a word
that begins with the target sound, score as correct.
Scoring Rules for ISF
 Production Responses (“What sound does …. begin
with?”)
 Correct Initial Sound or Sounds: If the word starts with
an initial consonant sound, the child can respond with the
first consonant or consonant-consonant blend. For
example, if the word is “clock,” a correct initial sound
would be /c/ or /cl/. The student must give the sound,
not the letter name.
Let’s Try Again
[Video clip]
Analyzing the Observation for
Instructional Implications
 Current Skills
 Emerging phonological awareness at the initial sound
level.
 Inconsistent production for initial sounds.
 Very accurate on identification of sounds.
 Instructional Needs
 Develop overall phonological awareness at the
phoneme level.
 Integrate skills in phonological awareness with
alphabetic principle.
Tips for Scoring
 Make sure to introduce each picture page.
 Score what you hear!
 Practice with at least 7 students before using the scores to make
programming decisions.
 Practice with stopwatch.
 Time how long it takes student to answer question.
 Make sure to record the total time at the end.
 Look over the words and pictures you are presenting to
increase pacing.
Quick Review
 PSF and ISF assess what big idea?
 Phonological awareness: Ability to hear and
manipulate sounds in words.
 When do we want students to have completely
established skills in phonological awareness at
the phoneme level?
 End of kindergarten (a score of 35 or more on the PSF
measure)
 Why? PA is not enough to make a reader…
but it is predictive.
 (see next pages for kindergarten curriculum maps)
Quick Review
[Kindergarten curriculum map]
Moving From Sound to Print:
Mapping Phonemes to the Print
Relation of PA to the Alphabetic
Principle
 The odds of having established alphabetic principle skills in time, given that a student had established PA skills at the end of kindergarten, were 29 of 38, or 76%.
 The odds of having established alphabetic principle skills in time, given that a student had limited PA skills at the end of kindergarten, were 0 of 2, or 0%.
Phonological awareness does not guarantee proficiency on the
alphabetic principle, but the skills are highly linked.
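The odds quoted above are simple proportions, which a one-line helper (hypothetical name) makes easy to verify:

```python
def benchmark_rate(reached, total):
    """Percentage of students reaching the alphabetic principle
    benchmark in time, rounded to the nearest whole percent."""
    return round(100 * reached / total)

print(benchmark_rate(29, 38))  # established PA at end of K -> 76
print(benchmark_rate(0, 2))    # limited PA at end of K -> 0
```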
Role of Alphabetic Principle:
Mapping the Phonemes to Print
 What is the Alphabetic Principle? The
ability to associate sounds with letters and
use these sounds to read words.
 Comprised of two parts:
 Alphabetic Understanding: Letter-sound
correspondences.
 Phonological Recoding: Using systematic relationships between letters and phonemes (letter-sound correspondence) to retrieve the pronunciation of an unknown “printed string” or to spell.
 (see next page for first grade curriculum map)
Role of Alphabetic Principle:
Mapping the Phonemes to Print
[First grade curriculum map]
Role of Alphabetic Principle
 Role of Alphabetic
Principle: Video of
Dr. Louisa Moats
Role of Alphabetic Principle
 Role of Alphabetic Principle: Video of
Louisa Moats
 If students can decode nonsense words then
students understand:
 Words are made up of sounds
 Sound-symbol correspondence
 Structure of words
 People who are proficient at reading nonsense
words are better at: Reading for meaning
Nonsense Word Fluency (NWF):
 What important skill does NWF assess?
 Alphabetic Principle: The ability to associate sounds
with letters and use these sounds to read words.
 What is the appropriate time and grade?
 Middle of the year in kindergarten and throughout first
grade
 What is the goal?
 First Grade:
 How well? 50 letter-sounds or more
 By when? Middle of first grade
 Kindergarten:
 How well? 25 letter-sounds or more by end of kindergarten
What NWF Looks Like
 As you view the video, attend to:
 The child:
 Characterize task performance (circle one):
   Reads at the word level with Fluency
   Reads at the word level with Limited Fluency
   Reads at the sound level with Fluency
   Reads at the sound level with Limited Fluency
 The examiner:
 Comfortable with materials
 Comfortable with student
 Comfortable with administration
What NWF Looks Like
[Video clip]
How Do We Administer and Score
the NWF Measure?
 Materials:
 1. Examiner probe
 2. Student pages (practice page “sim lut” and test page)
 3. Stopwatch
 4. Pencil
 Preparing the student:
  Good testing conditions (e.g., lighting, quiet, comfortable)
  Provide the model in standardized manner and follow correction procedures as necessary
How Do We Administer and Score
the NWF Measure?
Say these specific directions to the child:
“Look at this word (point to the first word on the practice
probe). It’s a make-believe word. Watch me read the
word: (point to the letter “s”) /s/, (point to the letter “i”)
/i/, (point to the letter “m”) /m/ “sim” (run your finger fast
through the whole word). I can say the sounds of the
letters, /s/ /i/ /m/ (point to each letter), or I can read the
whole word “sim” (run your finger fast through the
whole word).
“Your turn to read a make-believe word. Read this word
the best you can (point to the word “lut”). Make sure
you say any sounds you know.”
How Do We Administer and Score
the NWF Measure?
CORRECT RESPONSE: If the child responds “lut” or with some or all of the sounds, say:
That’s right. The sounds are /l/ /u/ /t/ or “lut.”

INCORRECT OR NO RESPONSE: If the child does not respond within 3 seconds or responds incorrectly, say:
Watch me: (point to the letter “l”) /l/, (point to the letter “u”) /u/, (point to the letter “t”) /t/. Altogether the sounds are /l/ /u/ /t/ (point to each letter) or “lut” (run your finger fast through the whole word). Remember, you can say the sounds or you can say the whole word. Let’s try again. Read this word the best you can (point to the word “lut”).
How Do We Administer and Score
the NWF Measure?
Student Copy:

kik  kaj  lan  yuf  bub
wuv  nif  suv  yaj  tig
woj  fek  nul  pos  dij
nij  vec  yig  zof  mak
sig  av   zem  vok  sij
pik  al   dit  um   sog
faj  zin  og   viv  vus
nok  boj  tum  vim  wot
yis  zez  nom  feg  tos
mot  nen  joj  vel  sav

Place the student copy of the probe in front of the child.
Here are some make-believe words (point to the student probe). Start here (point to the first word) and go across the page (point across the page). When I say “begin,” read the words the best you can. Point to each letter and tell me the sound or read the whole word. Read the words the best you can. Put your finger on the first word. Ready, begin.
Maximizing Administration Time
 Stopwatch:
 Start watch after student says the first word/sound and time for 1 minute.
 Scoring:
 Underline each correct letter sound produced (see specific scoring rules
and examples).
 Slash each incorrect letter sound produced.
 Maintaining momentum:
 Allow the student 3 seconds for each letter sound. After 3 seconds,
provide the sound to keep the student moving.
 Discontinue:
 If a student does not get any correct in the first row, discontinue the task
and record a score of zero (0).
 Ending testing:
 At the end of 1 minute, put a bracket after the last letter-sound/word
produced and calculate the total letter-sounds correct in one minute.
Scoring Rules for NWF
1. Correct Letter Sounds: A correct letter sound is scored as the most common sound in English.
   – For example, all the vowels are scored for the short sound, and the most common sound for the letter “c” is /k/. See the pronunciation guide for remaining letter sounds.
2. Marking the booklet: Underline exactly the way the student completes the task.
    For example, if the student goes sound-by-sound, underline each letter individually. If the student reads the target as a whole word, underline the entire word.
Scoring Rules for NWF
3. Partially Correct Responses If a word is partially correct,
underline the letter sounds produced correctly. Put a
slash (/) through the letter if the letter sound is incorrect.

For example, if stimulus word is "sim" and student says "sam,"
the letters "s" and "m" would be underlined because those letter
sounds were produced correctly, giving a score of 2.
4. Repeated sounds Letter sounds pronounced twice while
sounding out the word are given credit only once.

For example, if stimulus word is "sim" and the student says
/s/ /i/ /im/, the letter "i" is underlined once and the student
receives 1 point for the phoneme "i" even though the letter "i"
was pronounced correctly twice (a total of 3 for the entire word).
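Rules 3 and 4 can be sketched in a few lines of Python. This is a simplified matcher for the examples above, not an official DIBELS scoring tool; it ignores sound order, which is enough for these cases:

```python
def nwf_word_score(target_sounds, response_sounds):
    """Sketch of NWF partial-credit scoring (rules 3 and 4 above).

    Each target letter sound produced correctly earns one point, and a
    repeated sound is credited only once per target phoneme.
    """
    remaining = list(target_sounds)
    score = 0
    for sound in response_sounds:
        if sound in remaining:
            remaining.remove(sound)
            score += 1
    return score

# "sim" read as "sam": /s/ and /m/ correct -> 2
print(nwf_word_score(["s", "i", "m"], ["s", "a", "m"]))
# "sim" read as /s/ /i/ /im/ (i.e., sounds s, i, i, m): /i/ credited once -> 3
print(nwf_word_score(["s", "i", "m"], ["s", "i", "i", "m"]))
```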
Scoring Rules for NWF
5. 3-second rule - sound by sound If student hesitates for
3 seconds on a letter, score the letter sound incorrect,
provide the correct letter sound, point to the next letter,
and say, "What sound?"

This prompt may be repeated. For example, if the stimulus word
is "tob" and the student says /t/ (3 seconds), prompt by saying,
"/o/ (point to b) What sound?"
6. 3-second rule - word by word If student hesitates for 3
seconds on a word, score the word incorrect, provide
the correct word, point to the next word, and say, "What
word?"

This prompt may be repeated. For example, if the stimulus
words are "tob dos et" and the student says, "tob" (3 seconds),
prompt by saying "dos (point to et) What word?"
Scoring Rules for NWF
7. Insertions: Insertions are not scored as incorrect.
    For example, if the stimulus word is "sim" and the student says "stim," the letters "s," "i," and "m" would be underlined and full credit given for the word, with no penalty for the insertion of /t/.
8. Skipping Rows: If the student skips an entire row, draw a line through the row and do not count the row in scoring.
9. Self-corrections: If the student makes an error and then self-corrects within 3 seconds, write "SC" above the letter and count it as correct.
Let’s Try Again: Practice Scoring
[Video clip]

f o j    h o n    t u m    l e n    a j      _12_/14
s u v    k a m    r e s    k i c    f a v    _10_/15
i d      w o d    n e j    s o k    w i f     _7_/14
Total: 29
Analyzing the Observation for
Instructional Implications
 Current Skills
 Approaches at the whole word level initially
 Few letter-sound errors
 Can blend sounds together to the word level
 Instructional Needs
 Increase automaticity for all letter-sounds
 Increase automaticity in phonological
recoding (“fof” instead of /f/ /o/ /f/)
Breakout Activity
 Locate the “Nonsense Word Fluency Breakout Activity”
 Form a 3-person group
 Assign roles:
   Examiner
   Student
   Observer
 Practice administering measure (3 rounds)
Tips for Scoring
 Score for the most common sounds of the letters.
 Short vowels: i (big), e (beg), a (bag), u (bug), o (bog)
 “Hard” sounds: “c” = /k/, “g” = /g/, “j” = /j/
 A point for each letter, whether it is sound-by-sound or read as a whole word.
 Score what you hear!
 Underline exactly the way the student completes the
task.
 Practice with at least 7 students before using the
scores to make programming decisions.
 Look over words you are presenting to increase
pacing.
Letter Naming Fluency (LNF):
 What important skill does LNF assess?
 LNF not directly linked to a Big Idea: Used as a risk indicator
 What is the appropriate time and grade?
 Through kindergarten and fall of first grade
 What is the goal?
 While letter naming is a good predictor of early reading success,
knowledge of letter sounds is more important to word reading.
 Research indicates a score of 8 or below in the beginning of
kindergarten is predictive of later reading difficulty.
What LNF Looks Like
[Video clip]
How Do We Administer and Score
the LNF Measure?
 Materials:
 1. Examiner probe
 2. Student page
 3. Stopwatch
 4. Pencil
 Preparing the student:
  Good testing conditions (e.g., lighting, quiet, comfortable)
  Provide the model in standardized manner and follow correction procedures as necessary
How Do We Administer and Score
the LNF Measure?
Say these specific directions to the child:
"Here are some letters" (point). "Tell me the names
of as many letters as you can. When I say 'begin,'
start here" (point to first letter in upper left hand
corner) "and go across the page" (point). "Point to
each letter and tell me the name of that letter. Try to
name each letter. If you come to a letter you don't
know, I'll tell it to you. Put your finger on the first
letter. Ready?"
Maximizing Administration Time
 Stopwatch:
 Start watch after student says the first letter name and time for 1
minute.
 Scoring:
 Slash each incorrect letter name produced.
 Maintaining momentum:
 Allow student 3 seconds for each letter name; after 3 seconds,
say the name to keep the student moving.
 Discontinue:
 If student does not get any correct in the first row, discontinue the
task and record a score of zero (0).
 Ending testing:
 At the end of 1 minute, put a bracket after the last letter-name
produced and calculate the total letter-names correct in 1 minute.
Scoring Rules for LNF
1. Correct Letter Names Student must say the correct
letter name to receive credit.
–
If the student provides the letter sound rather than the letter
name, say, "Remember to tell me the letter name, not the
sound it makes." This prompt may be provided only once.
2. Self-corrections: If the student makes an error and self-corrects within 3 seconds, write "SC" above the letter
and do not count it as an error.
3. Skipping Rows If student skips an entire row, draw a
line through the row and do not count the row when
scoring.

Skipped or omitted letters are not counted in scoring.
Tips for Scoring
 Score for the letter names.
 If student skips a row, follow the student’s
lead and keep going.
 Give the student 3 seconds for each letter.
 Score what you hear!
 Practice with at least 7 students before using
the scores to make programming decisions.
Oral Reading Fluency (ORF):
 What important skill does it assess?
 Fluency and accuracy with connected text: The effortless,
automatic ability to read words in connected text leads to
understanding.
 What is the appropriate time and
grade?
 Middle of first grade through third grade
 What is the goal:
 To be fluent at the skill by end of first
grade.
 How well? 40 correct words or more
 By when? End of first grade
 What about second grade?
 How well? 90 correct words or more
 What about third grade?
 How well? 110 correct words or more
ORF Benchmark Levels
Progressive Benchmark Levels Indicative of Low Risk for Reading Difficulties

          Beginning of Year   Middle of Year   End of Year
Grade 1         n/a                 20              40
Grade 2          44                 68              90
Grade 3          76                 91             110
Instructional Priorities
Importance of Fluency with
Connected Text
 The ability to accurately and quickly apply word
reading strategies to reading connected text.
Automatic and fluent reading allows students to
allocate cognitive resources to comprehension.
 “Fluency may be almost a necessary condition for
good comprehension and enjoyable reading
experiences.” (Nathan & Stanovich, 1991)
 Oral reading fluency will not tell you everything you
need to know about student reading performance.
However, there is a strong relationship between oral
reading fluency and comprehension.
Role of Automaticity or Fluency
 Role of
Automaticity or
Fluency: Video of
Louisa Moats
Role of Automaticity or Fluency
 Role of Automaticity or Fluency: Video of
Louisa Moats
 Why do nonfluent readers “get worn out” after reading for a period of time?
  too much attention devoted to figuring out words
  takes too long to get to the end of the passage, and the student can’t remember the beginning
  lose the sense of the passage as they struggle, pause, and make word-reading errors
Fluent Readers Display
Orchestrated Reading Skills
 Fluent readers are
able to:
 Focus their attention on
understanding the text
 Synchronize skills of
decoding, vocabulary,
and comprehension
 Read with speed and
accuracy
 Interpret text and make
connections between
the ideas in the text
 Nonfluent readers:
 Focus attention on
decoding
 Allot attention to
accessing the meaning
of individual words
 Make frequent word
reading errors
 Have few cognitive
resources left to
comprehend
Frustration: Reading With Poor
Word Recognition
He had never seen dogs fight as these w______ish c___ f______t,
and his first ex________ t______t him an unf________able l______n.
It is true, it was a vi___ ex________, else he would not have lived to
pr___it by it. Curly was the v________. They were camped near the
log store, where she, in her friend__ way, made ad________ to a
husky dog the size of a full-_______ wolf, the_____ not half so large
as ____he. ____ere was no w___ing, only a leap in like a flash, a
met______ clip of teeth, a leap out equal__ swift, and Curly’s face
was ripped open from eye to jaw. It was the wolf manner of
fight_____, to st____ and leap away; but there was more to it than
this. Th____ or forty huskies ran _o the spot and not com_____d
that s______t circle. Buck did not com_______d that s______t
in_____, not the e__ way with which they were licking their chops.
What ORF Looks Like
[Video clip]
How Do We Administer and Score
the ORF Measure?
 Materials:
 1. Examiner probe
 2. Student passages
 3. Stopwatch
 4. Pencil
 Preparing the student:
  Good testing conditions (e.g., lighting, quiet, comfortable)
How Do We Administer and Score
the ORF Measure?
Say these specific directions to the child:
“Please read this (point) out loud. If
you get stuck, I will tell you the word
so you can keep reading. When I say
"stop," I may ask you to tell me about
what you read, so do your best
reading. Start here (point to the first
word of the passage). Begin.”
Maximizing Administration Time
 Stopwatch:
 Start watch after student says the first word and time for 1 minute.
 Scoring:
 Slash each word produced incorrectly.
 Maintaining momentum:
 Allow student 3 seconds for each word. After 3 seconds, say the word to
keep the student moving.
 Discontinue:
 If student does not get any correct in the first row, discontinue the task
and record a score of zero (0).
 If student scores less than 10 on the first passage, do not administer the
other two passages.
 Ending testing:
 At the end of 1 minute, put a bracket after the last word produced and
calculate the number of correct words in one minute.
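The end-of-minute tally reduces to words attempted minus slashed errors. A minimal sketch (hypothetical helper, not an official DIBELS worksheet):

```python
def orf_wrc(words_attempted, errors):
    """Words read correct (WRC) in one minute: the number of words up
    to the bracket minus the slashed errors."""
    if errors > words_attempted:
        raise ValueError("errors cannot exceed words attempted")
    return words_attempted - errors

print(orf_wrc(53, 6))  # 53 words attempted with 6 errors -> 47 WRC
```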
Scoring Rules for ORF:
Scoring Directions are Similar to Marston, D. (1989)
1. Correctly Read Words are pronounced correctly. A word must be pronounced correctly given the context of the sentence.
    Example: The word “read” must be pronounced /reed/ when presented in the context of the following sentence:
      Ben will read the story.   (WRC = 5)
    not as:
      “Ben will red the story.”   (WRC = 4)
2. Self-corrected Words are counted as correct. Words misread initially but corrected within 3 seconds are counted as correct.
    Example:
      Dad likes to watch sports.   (WRC = 5)
    read as:
      “Dad likes to watch spin...(3 seconds)…sports.”   (WRC = 5)
Scoring Rules for ORF
3. Repeated Words are counted as correct. Words said over again correctly are ignored.
    Example:
      I have a goldfish.   (WRC = 4)
    read as:
      “I have a ...have a goldfish.”   (WRC = 4)
4. Dialectic variations in pronunciation that are explainable by local language norms are not errors.
    Example:
      We took the short cut.   (WRC = 5)
    read as:
      “We took the shot cut.”   (WRC = 5)
Scoring Rules for ORF
5. Inserted Words are ignored. When students add extra words, they are counted neither as correct words nor as reading errors.
    Example:
      I ate too much.   (WRC = 4)
    read as:
      “I ate way too much.”   (WRC = 4)
6. Mispronounced or Substituted Words are counted as incorrect.
    Example:
      She lives in a pretty house.   (WRC = 6)
    read as:
      “She lives in a pretty home.”   (WRC = 5)
Scoring Rules for ORF
7. Omitted/Skipped Words are counted as errors.
    Example:
      Mario climbed the old oak tree.   (WRC = 6)
    read as:
      “Mario climbed the tree.”   (WRC = 4)
Scoring Rules for ORF
Words must be read in accordance with the
context of the passage
8. Hyphenated Words count as two words if both parts can
stand alone as individual words. Hyphenated words
count as one word if either part cannot stand alone as
an individual word.
9. Numerals and Dates must be read correctly in the
context of the sentence.
10. Abbreviations must be read as pronounced in normal
conversation. For example, “TV” could be read as
"teevee" or "television," but “Mr.” must be read as
"mister."
Breakout Activity
 Locate the “Oral Reading Fluency Breakout
Activity”
 Form a 3-person group
 Assign roles:
 Examiner
 Student
 Observer
 Practice administering measure (3 rounds)
Tips for Scoring
 Student must read exactly what is on the page.
 Self-corrections and insertions are ignored and not
counted as errors.
 Simply slash errors until you feel comfortable writing in
the error types.
 Score what you hear!
 Practice with at least 7 students before using the scores to make
programming decisions.
 Look over passages you are presenting to ensure pacing
is efficient.
 Use the middle score of the three passages read to
assess the student’s skill.
 Have student read all three passages in one sitting
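The middle-score rule above is just the median of three numbers; a tiny sketch (hypothetical function name):

```python
def orf_reported_score(passage_scores):
    """Use the middle score of the three passages read, per the tip
    above, as the student's ORF score."""
    if len(passage_scores) != 3:
        raise ValueError("ORF benchmark assessment uses three passages")
    return sorted(passage_scores)[1]

print(orf_reported_score([38, 52, 47]))  # middle score -> 47
```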
Kindergarten Benchmark
Assessment
[Video clip]
Grade 1 Benchmark Assessment
[Video clip]
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
Student Performance: Are We
Making Progress?
End of Year Histogram - ORF, Year 1
28% Low risk for reading difficulties
34% Some risk for reading difficulties
38% At risk for reading difficulties
Student Performance: Are We
Making Progress?
End of Year Histogram - ORF, Year 2
After changes in curricular program, instruction, time,
professional development:
57% Low risk for reading difficulties
20% Some risk for reading difficulties
22% At risk for reading difficulties
Student Performance: Are We
Making Progress?
After 4 years of sustained focused effort:
Cross-Year Boxplot
Class List Reports: Identifying At-Risk
Students in the Middle of First Grade
Name       PSF: Score %ile Status      NWF: Score %ile Status      ORF: Score %ile Status     Instructional Recommendation
Kevin         12   5   Emerging           11   3   Deficit            0   3   At Risk         Intensive - Needs Substantial Intervention
John           0  <1   Deficit            19   6   Deficit            0   3   At Risk         Intensive - Needs Substantial Intervention
Leone         44  33   Established        22   8   Deficit            1   6   At Risk         Intensive - Needs Substantial Intervention
Yvonne        20   8   Emerging           23   9   Deficit            0   3   At Risk         Intensive - Needs Substantial Intervention
Katrina        2   1   Deficit            27  14   Deficit            7  19   At Risk         Intensive - Needs Substantial Intervention
Brian          3   2   Deficit            27  14   Deficit            8  22   Some Risk       Intensive - Needs Substantial Intervention
Tara           5   2   Deficit            28  15   Deficit            1   6   At Risk         Intensive - Needs Substantial Intervention
Chiara        15   6   Emerging           28  15   Deficit           18  49   Some Risk       Intensive - Needs Substantial Intervention
Kawena         8   3   Deficit            31  19   Emerging           0   3   At Risk         Intensive - Needs Substantial Intervention
Levi          20   8   Emerging           34  23   Emerging          11  31   Some Risk       Strategic - Additional Intervention
Ryan           9   4   Deficit            37  27   Emerging          15  43   Some Risk       Strategic - Additional Intervention
Chester       15   6   Emerging           38  29   Emerging          85  94   Low Risk        Benchmark - At Grade Level
Jesse         18   7   Emerging           39  30   Emerging           3   9   At Risk         Intensive - Needs Substantial Intervention
Brian          7   3   Deficit            39  30   Emerging           8  22   Some Risk       Strategic - Additional Intervention
Sara          17   7   Emerging           40  32   Emerging          10  28   Some Risk       Strategic - Additional Intervention
Joshua        51  48   Established        41  34   Emerging           5  14   At Risk         Intensive - Needs Substantial Intervention
Lansen        46  38   Established        45  41   Emerging          32  70   Low Risk        Benchmark - At Grade Level
Miki          38  23   Established        52  52   Established       13  37   Some Risk       Strategic - Additional Intervention
Jennifer      19   8   Emerging           64  68   Established       31  69   Low Risk        Benchmark - At Grade Level
Travis        45  35   Established       127  95   Established       62  86   Low Risk        Benchmark - At Grade Level
Isaac         38  23   Established       129  96   Established      150 >99   Low Risk        Benchmark - At Grade Level
Instructional Status Terminology
For Each Measure

Quarterly Benchmark Goals    Final Benchmark Goals and Later
Low Risk                     Established
Some Risk                    Emerging
At Risk                      Deficit
143
Critical Values & Progressive
Benchmarks
146
Quick Review
 What are the two measures used to assess phonological
awareness?  ISF & PSF
 What is the only measure not administered for a full 60
seconds?  ISF
 Which measure do we use as a risk indicator for reading
difficulty, but is not directly linked to a big idea of early
literacy?  LNF
 This measure has students read made-up words to
assess phonetic analysis skills and avoid the chance the
student has the word memorized.  NWF
 Which measure has the strongest linkage to reading
comprehension without a direct assessment of it?  ORF
147
Benchmarks and Levels of Low
Risk for Each DIBELS Measure

Grade          Beginning                       Middle                                    End
Kindergarten   ISF ≥ 8, LNF ≥ 8                ISF ≥ 25, LNF ≥ 27, PSF ≥ 18, NWF ≥ 13    LNF ≥ 40, PSF ≥ 35, NWF ≥ 25
First          LNF ≥ 37, PSF ≥ 35, NWF ≥ 24    PSF ≥ 35, NWF ≥ 50, ORF ≥ 20              PSF ≥ 35, NWF ≥ 50, ORF ≥ 40
Second         ORF ≥ 44                        ORF ≥ 68                                  ORF ≥ 90
Third          ORF ≥ 77                        ORF ≥ 92                                  ORF ≥ 110
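As a rough illustration of how these goals can be checked in practice, here is a minimal Python sketch. The dictionary and function names are illustrative (not part of DIBELS materials), and only the end-of-year "established/low risk" thresholds listed above are encoded:

```python
# Minimal sketch: check a score against the end-of-year benchmark goals.
# Only the top ("established"/"low risk") thresholds from the slide are
# included; the lower cutoffs between bands are not shown here.

END_OF_YEAR_GOALS = {
    ("K", "LNF"): 40, ("K", "PSF"): 35, ("K", "NWF"): 25,
    ("1", "PSF"): 35, ("1", "NWF"): 50, ("1", "ORF"): 40,
    ("2", "ORF"): 90,
    ("3", "ORF"): 110,
}

def meets_goal(grade, measure, score):
    """Return True if the score meets the final benchmark goal."""
    return score >= END_OF_YEAR_GOALS[(grade, measure)]

print(meets_goal("1", "ORF", 44))   # True  (44 >= 40)
print(meets_goal("2", "ORF", 68))   # False (68 < 90)
```

Note how a score that meets the goal in one grade (44 words per minute at the end of first grade) falls short of it a year later.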
148
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
149
Developing a Plan To Collect
Schoolwide Data
Areas to Consider When
Developing a Plan:
1. Who will collect the data?
2. How long will it take?
3. How do we want to collect the data?
4. What materials does the school need?
5. How do I use the DIBELS Website?
6. How will the results be shared with the school?
More details are available in the document entitled
“Approaches and Considerations of Collecting Schoolwide Early Literacy
and Reading Performance Data” in your supplemental materials
150
Who Will Collect the Data?
 At the school level, determine who will
assist in collecting the data
 Each school is unique in terms of the
resources available for this purpose, but
consider the following:
 Teachers, principals, educational assistants, Title 1 staff,
special education staff, parent volunteers, practicum
students, PE/music specialist teachers
 The role of teachers in data collection:
 If they collect all the data, less time spent in teaching
 If they collect no data, the results have little meaning
151
How Do We Want to Collect Data?
 Common Approaches to Data Collection:
 Team Approach
 Class Approach
 Combination of the Class and Team
 Determining who will collect the data will
impact the approach to the collection
152
Team Approach
 Who? A core group of people will collect all the
data
 One or multiple days (e.g., afternoons)
 Where Does it Take Place?
 Team goes to the classroom
 Classrooms go to the team (e.g., cafeteria, library)
 Pros: Efficient way to collect and distribute
results, limited instructional disruption
 Cons: Need a team of people, place, materials,
limited teacher involvement, scheduling of
classrooms
153
Class Approach
 Who? Teachers collect the data
 Where Does it Take Place?
 The classroom
 Pros: Teachers receive immediate
feedback on student performance
 Cons: Data collection will occur over
multiple days, time taken away from
instruction, organization of materials
154
Combination of Team & Class
Approaches
 Who? Both teachers and a team
 Where Does it Take Place?
 Teachers collect the data
 Team goes to the classroom
 What Might it Look Like?
 Kindergarten and First grade teachers collect their
own data and a team collects 2nd-3rd grade
 Pros: Increases teacher participation, data can
be collected in a few days, limited instructional
disruption
 Cons: Need a team of people, place, materials,
scheduling
155
How Long Will It Take?
Kindergarten
Time of Year / Measure(s)     Approx. Time per Pupil   Number of Data Collectors   Pupils Assessed per 30-Minute Period
Beginning: ISF & LNF          4 min.                   1                           6-8
                                                       2                           12-16
                                                       3                           18-24
                                                       4-5                         24-40
                                                       6-8                         36-48
Middle: ISF, LNF, PSF         6-7 min.                 1                           4-5
                                                       2                           8-10
                                                       4-5                         16-25
                                                       6-8                         24-40
End: ISF, LNF, PSF, & NWF     9 min.                   1                           3-4
                                                       2                           6-8
                                                       4-5                         12-20
                                                       6-8                         18-32
156
How Long Will It Take?
First Grade
Time of Year / Measure(s)      Time per Pupil   Number of Data Collectors   Pupils Assessed per 30-Minute Period
Beginning: LNF, PSF, & NWF     6-7 min.         1                           4-5
                                                2                           8-10
                                                4-5                         16-25
                                                6-8                         24-40
Middle: PSF, NWF, & ORF        8-9 min.         1                           3-4
                                                2                           6-8
                                                4-5                         12-20
                                                6-8                         18-32
End of Year: NWF & ORF         7 min.           1                           4-5
                                                2                           8-10
                                                3                           12-15
                                                4-5                         16-25
                                                6-8                         24-40
157
How Long Will it Take?
Second & Third Grade
Measure: ORF     Time per Pupil: 5 min.

Number of Collectors   Pupils Assessed per 30-Minute Period
1                      6-7
2                      12-14
3                      18-21
4-5                    24-35
6-8                    36-56
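The rows in these planning tables follow from simple arithmetic: pupils per period is roughly the number of collectors times (period length ÷ minutes per pupil). A small Python sketch of that estimate (the function name is illustrative):

```python
# Planning arithmetic behind the tables above: how many pupils a team of
# data collectors can assess in one 30-minute period, given the per-pupil
# administration time. Actual totals vary, which is why the tables
# report ranges rather than single values.

def pupils_per_period(collectors, minutes_per_pupil, period_minutes=30):
    # Each collector completes about period_minutes // minutes_per_pupil pupils.
    return collectors * (period_minutes // minutes_per_pupil)

print(pupils_per_period(1, 5))   # one collector, ORF at ~5 min/pupil -> 6
print(pupils_per_period(3, 5))   # three collectors -> 18
```

These match the lower ends of the table ranges (e.g., one collector assesses 6-7 pupils; three assess 18-21).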
158
What Materials Does the School
Need?
 DIBELS Materials
 Benchmark booklets
 Color coding
 Labeling
 Student stimulus materials
 Binding, laminating, etc.
 Other Materials
 Stopwatches
 Pencils, clipboards
 Class rosters
See document entitled “Approaches and Considerations of
Collecting Schoolwide Early Literacy and Reading Performance
Data” at website:
http://dibels.uoregon.edu/logistics/data_collection.pdf
159
How Do I Use the DIBELS
Website?
 Entering and generating reports using the DIBELS
website begins with setting up your school.
 Sign up to get a user name and password at:
http://dibels.uoregon.edu
 Create your school in the system (a manual for using
the website is available on the website as well as in your
supplemental materials)
160
Using the DIBELS Website
Creating your school in DIBELS web:
1. Creating classrooms
2. Populating classrooms with students
3. Creating users
161
Entering Data on
DIBELS Website
After your school has created its classrooms and added
students, you can enter the data you collected by
selecting the classroom.
162
Generating Reports
 Two main types of
reports generated from
DIBELS Website:
 PDF Reports:
Downloadable reports
designed for printing.
The school and district
PDF reports combine
the most common
reports into a single
file.
 Web Reports:
Individual reports
designed for quick
online viewing. Select
the specific report you
would like.
163
How Will the Results Be Shared
With the School?
 Schedule time soon after data collection to
share and distribute results
 School-level: Staff meeting
 Grade-level: Team meetings
 Determine a method of addressing
concerns
 Identifying at-risk students
 Answering questions about the results
 Re-thinking the data collection approach
164
Web Resources
 Materials
 Administration and scoring manual
 All grade-level benchmark materials
 Progress monitoring materials for each measure (PSF, NWF,
ORF, etc.)
 Website
 Tutorial for training on each measure with video examples
 Manual for using the DIBELS Web Data Entry website
 Sample schoolwide reports and technical reports on the
measures
 Logistics
 Tips and suggestions for collecting schoolwide data (see website)
165
Objectives: What You Will
Learn and Do
The objectives of today’s session are to:
1. Differentiate purposes of assessment.
2. Delineate how the DIBELS assessment system differs from
traditional assessment systems.
3. Use DIBELS to evaluate outcomes at the school, grade, class
and student level.
4. Administer and score DIBELS.
5. Interpret DIBELS results.
6. Develop a plan to use DIBELS quarterly with all students.
7. Evaluate the current assessment system in your school.
166
Planning & Evaluation Tool (PET)
 As school teams, you will work together
on the Planning and Evaluation Tool
(Simmons & Kame’enui, 2000)
 The second section focuses on
Assessment.
 Complete this section based on the
information presented in today’s session
and your knowledge of your school’s
current assessment practices.
167
Day 2: PET Time
 Complete Element 2 of the Planning &
Evaluation Tool: Assessment.
 Review each item.
 Determine whether you will have individuals complete
items independently or as a group (e.g., grade-level
teams: all K teachers complete 1 PET, all Grade 1
teachers complete a separate PET).
 Report the score for each item and document the
information sources available to substantiate the score
reported.
 Allow approximately 15-30 minutes for completion.
168
Day 2: PET Time
Scoring: 0 = Not in place; 1 = Partially in place; 2 = Fully in place

II. Assessment – Instruments and procedures for assessing reading achievement are clearly
specified, measure important skills, provide reliable and valid information about student
performance, and inform instruction in important, meaningful, and maintainable ways.

Evaluation criteria (record documentation of evidence for each item):
1. A schoolwide assessment system and database are established and maintained for
documenting student performance and monitoring progress (x 2).
2. Measures assess student performance on prioritized goals and objectives.
3. Measures are technically adequate (i.e., have high reliability and validity) as
documented by research.
4. All users receive training and follow-up on measurement administration, scoring,
and data interpretation.
5. At the beginning of the year, measures identify students' level of performance
and are used to determine instructional needs.
6. Measures are administered formatively throughout the year to document and
monitor student reading performance (i.e., quarterly for all students; every 4 weeks
for students at risk).
7. Student performance data are analyzed and summarized in meaningful formats
and routinely used by grade-level teams to evaluate and adjust instruction (x 2).
8. The building has a “resident” expert or experts to maintain the assessment system
and ensure measures are collected reliably, data are scored and entered accurately,
and feedback is provided in a timely fashion.

____ / 20 Total Points
Percent of Implementation: 10 = 50%; 16 = 80%; 20 = 100%
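The percent-of-implementation line is plain arithmetic over the rubric's 20 possible points (the double-weighted items are already reflected in that maximum). A one-line Python sketch, with an illustrative function name:

```python
# Percent of implementation on the PET rubric: earned points out of
# 20 possible (items 1 and 7 count double, which is already built
# into the 20-point maximum).

def percent_implementation(total_points, max_points=20):
    return 100.0 * total_points / max_points

print(percent_implementation(10))  # 50.0
print(percent_implementation(16))  # 80.0
```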
169
Reflections and Reports
 After schools complete Element II, review items
individually and ask schools to volunteer their current
status with respect to Assessment.
 Ask schools to identify particular items in which they
scored full points and ones in which there is room for
improvement.
 This information will be used to formulate a school-specific Reading Action Plan (RAP) on Day 4 of the IBR.
170