Student Learning Objectives Pilot Test Overview and Identifying Learning Goals Aurora Public Schools Fall 2013 Introductions Center for Transforming Learning and Teaching Catalyzing and co-creating the transformation of learning environments through the use of assessment so that all are engaged in learning and empowered to positively contribute in a global society. www.ctlt.org Facilitator/Trainer: Julie Oxenford O’Brian Coach/Trainer: Mary Beth Romke Julie@ctlt.org Mary@ctlt.org Purpose Introduce the major components of and context for the Student Learning Objective (SLO) Pilot Test. Begin to identify SLO Learning Goals. Introductions Capture on a sticky note: one burning question you have about our work together. Discuss at your table group: name, role, teaching assignment; why you agreed to participate in this pilot test; your question about our work together. Group share out: reasons for participating. Materials Norms The standards of behavior by which we agree to operate while we are engaged in learning together. Overview Tools, p. 1 Learning Targets Engage in learning activity during this session. Complete the online self-assessment. Complete follow-up tasks. Describe the purpose of Student Learning Objectives (SLOs) as part of teacher evaluation. Understand the purpose and major components of the student learning objective pilot test. Describe the process involved in identifying an SLO Learning Goal. Explain cognitive complexity using the Webb Depth of Knowledge framework. Evaluate the cognitive complexity of learning goals/objectives. Describe and exemplify success criteria for SLO Learning Goals. Activity: Monitoring your learning Turn to Progress Monitoring (Note catcher, p. 3-4). Rewrite today’s learning targets in language that has meaning for you. Create a bar graph that shows where you currently believe you are in relation to each learning target. Leave the “reflections” column blank for now. 
Learning Target Describe the purpose of Student Learning Objectives (SLOs) as part of teacher evaluation. In my words: I know how student learning objectives will be incorporated into my evaluation. I don’t know what this is / I need more practice / I’ve got it / I can apply it in a new way. Reflections SLO Session One Agenda Background and Purpose of SLOs Purpose and Components of Pilot Test SLO Template and Process SLO Learning Goals Determining Depth of Knowledge Success Criteria Colorado State Context SB 10-191: Ensuring Quality Instruction through Educator Effectiveness Evaluating the effectiveness of educators is crucial to improving the quality of education in Colorado. Educators are evaluated in significant part based on the impact they have on the learning growth of their students. SB 10-191 Teacher Evaluation Requirements 50% Professional Practices (Observational Rubric) 50% measures of teacher contribution to student learning growth: At least one individually attributable measure. At least one collectively attributable measure. When available, state summative assessment results (TCAP). When available, Colorado Growth Model results. Other local measures (e.g. pre-/post-assessment, student learning objectives). Student Learning Objectives One approach districts may use to develop measures of student academic growth attributable to individual educators. Definition: A participatory method of setting measurable goals, or objectives, for a specific assignment or class, in a manner aligned with the subject matter taught, and in a manner that allows for the evaluation of the baseline performance of students and the measurable gain in student performance during the course of instruction. 
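The SB 10-191 split above (50% professional practices, 50% growth measures) is simple weighted-average arithmetic. The sketch below is purely illustrative: the component names, sub-weights, and scores are hypothetical, not an adopted APS or state formula; only the idea that weights must total 100% comes from the requirements above.

```python
# Illustrative only: combining weighted evaluation components into one score.
# Component names, sub-weights, and scores are hypothetical examples.

def overall_rating(scores, weights):
    """Weighted average of component scores (each on a 0-100 scale);
    weights are fractions that must sum to 1 (i.e., 100%)."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("component weights must sum to 100%")
    return sum(scores[name] * weights[name] for name in weights)

# Hypothetical teacher: professional practices at 50%, with the growth half
# split between an individual and a collective measure.
weights = {"professional_practices": 0.50,
           "individual_growth": 0.25,
           "collective_growth": 0.25}
scores = {"professional_practices": 84,
          "individual_growth": 76,
          "collective_growth": 90}
print(overall_rating(scores, weights))  # 42.0 + 19.0 + 22.5 = 83.5
```

The point of the check on the weight total is that every district scheme discussed later in this session (the district pie charts and the TERC suggestions) is a different assignment of these fractions, always summing to 100%.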
Teacher Evaluation Components Student Growth (individual attribution) Student Growth (collective attribution) Professional Practice (observation) Academy District 20 Collective Attribute TCAP Administrative/Teacher-defined assessments of individual attribution (38% for those without an individual TCAP attribution) Professional Practices 12% 13% 50% 25% Harrison School District Collective Attribute District Assessment TCAP/MGP National Assessments Professional Practices 5% 5% 25% 50% 15% *Harrison has identified 97 different roles in the district. Each role’s pie chart is weighted differently. All the scales are based on the grade 4-5 core teacher shown above. APS Individually Attributable Measures Teacher Evaluation Revision Committee (TERC) Members Process Options Pre-Post SLOs Why SLOs? Suggestion 1 from 8/27/13 TERC subcommittee meeting Collective Attribute TCAP SLOs Rubric (Professional Practices) 30% 50% 10% 10% Suggestion 2 from 8/27/13 TERC subcommittee meeting Collective Attribute TCAP SLOs Rubric (Professional Practices) 10% 20% 50% 20% Suggestion 3 from 8/27/13 TERC subcommittee meeting Collective School/District SLOs Rubric (Professional Practices) 0% 15% 50% 35% Suggestion 4 from 8/27/13 TERC subcommittee meeting Collective Attribute SLOs Rubric (Professional Practices) 10% 50% 40% Where are SLOs being used? Districts and states across the US: Denver Public Schools, New Hampshire, Austin, Hawaii, Rhode Island, North Carolina, New York, Ohio, Georgia, Massachusetts, Utah, Pennsylvania, Wyoming, Maine. Why take an SLO approach? Documented to: Have high levels of credibility with educators – expectations situated directly within the classroom context and space. Be adaptable to new assessments. Be adaptable to all teaching assignments. Have face validity – developed by teachers. Most of the qualities above do not hold for other common approaches to evaluating teacher impact on student learning growth. What SLOs are not... 
Simple computation of pre- and post-assessment score differences Representative of output measures Results based on one measure/assessment instrument An “add-on” for teachers engaged in effective teaching practices SLO Session One Agenda Background and Purpose of SLOs SLO Learning Goals Purpose and Components of Pilot Test SLO Template and Process Determining Depth of Knowledge Success Criteria Purpose of the SLO Pilot Test Develop an approach that can be used in Aurora to measure student learning growth that can be attributed to an individual educator. Ensure this approach focuses educators on instructional practices that are likely to improve student learning results at the same time. Learn from “trying it out” in real schools in APS. Determine how to integrate SLOs with related APS initiatives. Build the plane while flying it! Project Basics Select a partner and take out the SLO Pilot Test Overview (Overview Tools, p. 3) Discuss these questions with your partner: What will be the focus of this pilot test? What does participating in the project mean to me? What am I concerned about? (capture on sticky notes) Integration with other Initiatives SLOs are context dependent. Critical outcomes of the pilot test: Determine how SLO efforts can be integrated with other district initiatives. Determine how SLO efforts can support and accelerate related reforms. Ensure relevant district resources are utilized in support of SLOs. Design Today: Project Overview and SLO Learning Goals Follow-up: Identify “candidate” SLO Learning Goals Four additional learning sessions (in person, about four weeks apart): Introduce/explore practices (identifying and tracking student progress towards SLOs). Debrief how it went “trying out” developing SLOs and associated practices in your classroom since the last learning session. 
Between Learning Sessions: Follow-up Try out new practices in your classroom On-site support (local instructional leaders) SLO Session One Agenda Background and Purpose of SLOs Purpose and Components of Pilot Test SLO Template and Process SLO Learning Goals Determining Depth of Knowledge Success Criteria SLO Process 1. Identify the SLO learning goal 2. Select measures of student learning in relationship to the Learning Goal 3. Establish performance targets using baseline data 4. Plan instruction 5. Receive approval of SLO 6. Implement SLO-related instruction 7. Monitor student progress towards SLO Learning Goal 8. Revise SLO if necessary 9. Analyze assessment results 10. Determine Teacher Rating SLO Terminology Take out Student Learning Objectives Terminology (Overview Tools, p. 5). Work with a partner to answer the following questions: How long is an “instructional interval” in the context of SLOs? What is the difference between a “Big Idea” and a “Learning Goal” (for an SLO)? Is there a difference between a “learning goal” and a “learning objective”? If so, what is the difference? What are “success criteria”? How are they related to “learning goals or objectives”? What is the relationship between an “assessment instrument” and a “measure”? Student Learning Objectives Form Take out the Student Learning Objectives Form (Overview Tools, p. 9) How are Student Population and Instructional Interval determined? SLO components: 1. Student Learning Goal 2. Measures (of student learning growth in relationship to the student learning goal) 3. Performance Targets (for student performance groups) 4. Progress Monitoring 5. 
Results (student performance and teacher performance) SLO Components, with the session day on which each is addressed: Learning Goal: Learning Goal, Standards Reference, Rationale (Day One). Measures: Success Criteria, Evidence Sources, Alignment of Evidence, Collection and Scoring, Baseline Data (Day Two). Performance Targets: Performance Groups, Performance Targets, Rationale for Targets (Days Three and Four). Progress Monitoring: Check Points, Progress Monitoring Evidence Sources, Instructional Strategies (Days Three and Four). SLO Results: Student Performance Results, Targets Met, Teacher Performance (Day Five). Activity: Exploring SLO Components SLO Component Descriptions (Tools, p. 11). Consider the components of an SLO Learning Goal. Independently (silently) read the first row. When you have finished reading, look at your partner and “say something”. The something could be: A summary A connection A new idea Continue until you have considered all of the components of a Student Learning Objective Full group considers questions... SLO Session One Agenda Background and Purpose of SLOs SLO Learning Goals Purpose and Components of Pilot Test SLO Template and Process Determining Depth of Knowledge Success Criteria SLO Learning Goals A description of what students will be able to do at the end of the instructional period. Based on the intended standards and curriculum that are being taught and learned. Reflective of the most critical content taught and learned during the instructional interval (the Big Ideas). Levels of Objectives: Global: broad scope; two or more years (often many) to learn; purpose or function: provide vision; example of use: plan a multi-year curriculum (e.g., elementary reading). Educational: moderate scope; weeks, months, or an academic year; purpose or function: design curriculum; example of use: plan units of instruction. Instructional: narrow scope; hours or days; purpose or function: prepare lesson plans; example of use: plan daily lessons, activities, experiences, and exercises. A Taxonomy for Learning, Teaching, and Assessing: A revision of Bloom’s taxonomy of educational objectives, 2001 Overview Tools, p. 
1 Statements of Intended Learning Learning Goals, Learning Objectives, Learning Targets include: A verb that describes a cognitive process or processes. The content or knowledge to which the cognitive process(es) applies. SLO Learning Goal Components Learning Goal Standards/Benchmarks Rationale Success Criteria How SLO Learning Goal Components are Related SLO Learning Goal is based on: The big ideas for the content area/grade level, and Relevant content standards. The Learning Goal Rationale explains why the learning goal is appropriate. Success Criteria are guidelines, rules, or principles by which student performance is evaluated. They describe what level of performance constitutes having met the Learning Goal. SLO Learning Goal Process 1. Identify the “big ideas” for the grade level and content area. 2. Identify learning goals associated with at least one “big idea” that would be achieved across several units, and/or which have related goals in prior or subsequent grade levels. These become candidates to be an SLO Learning Goal. 3. Determine which standards are associated with each candidate SLO Learning Goal. 4. Prioritize possible Learning Goals based on the learning needs of the student population (identifying two or three top priorities). 5. Determine the cognitive complexity (depth of knowledge) of the priority SLO Learning Goal candidates. 6. Eliminate candidate SLO learning goals with a depth of knowledge less than 2 for primary (grades K-2) and less than 3 for upper elementary and secondary. 7. Select the SLO Learning Goal. Describe the rationale for your selection. Process for determining SLO Learning Goals Turn to the Process for Determining SLO Learning Goals (Tools, p. 3). Independently, review each step and make notes about your concerns regarding each step. Discuss at your table: How is this similar to other work in which you have engaged? How is it different? What concerns do you have about the process? The “Big Ideas” What are “big ideas”? 
Declarative statements that describe concepts that transcend grade levels. Frame or context for identifying the SLO learning goal. Should not be the “focus” of learning goal development. Learning Goals vs. Big Ideas Different format: Learning goals describe what students should know, understand, and be able to do. Big ideas are declarative statements. Different purpose: Big Ideas create context for learning goals. SLO Learning Goals focus instruction and learning activity. Resources for determining “Big Ideas”: APS Pacing Guides. CCSS ELA: Anchor Standards, portraits of students who are college ready. CCSS Math. Colorado Academic Standards: Graduate Competencies, Grade Level Expectations (that cut across grade levels). CDE Sample Curriculum Unit Overviews: Grade Level Descriptions, Generalizations. Your ideas? Content Standards Learning Goals must reference statements of intended learning from relevant standards documents. Statements of intended learning, for example: Colorado Academic Standards (CAS): Evidence Outcomes. Common Core State Standards: Standard. CAS: Identify place value from ten-thousandths to millions. Common Core State Standard: Use place value understanding to round multi-digit whole numbers to any place. Capture the full language from the standard statement (not just a numerical reference). Identifying a Learning Goal (example) Big idea: Writing from sources requires using evidence from texts to present careful analysis, well defended claims, and clear information. Example Standards Write arguments to support claims with clear reasons and relevant evidence: Introduce claims, acknowledge alternative or opposing claims, and organize the reasons and evidence logically. Support claim(s) with logical reasoning and relevant evidence, using accurate, credible sources and demonstrating an understanding of the topic or text. Use words, phrases, and clauses to create cohesion and clarify the relationships among claim(s), reasons, and evidence. 
Establish and maintain a formal style. Provide a concluding statement or section that follows from and supports the argument presented. Example Learning Goal Students gather relevant evidence from multiple sources to make and support strong written arguments. SLO Learning Goal Process 1. Identify the “big ideas” for the grade level and content area. 2. Identify learning goals associated with at least one “big idea” that would be achieved across several units, and/or which have related goals in prior or subsequent grade levels. These become candidates to be the SLO Learning Goal. 3. Determine which standards are associated with each candidate SLO Learning Goal. 4. Prioritize possible Learning Goals based on the learning needs of the student population (identifying two or three top priorities). 5. Determine the cognitive complexity (depth of knowledge) of the priority SLO Learning Goals. 6. Eliminate candidate SLO learning goals with a depth of knowledge less than 2 for primary (K-2) and less than 3 for upper elementary and secondary. 7. Select the SLO Learning Goal. Describe the rationale for your selection. Practice Developing an SLO Learning Goal: Steps one through three Find a partner (or partners) with a similar content area focus (stay with the same partners). Turn to the note catcher, p. 10. Step one: Describe the “big ideas” for your focus grade level and content area. Step two: Identify several “candidate” Learning Goals for at least two different “big ideas”. Step three: Identify standards associated with your big idea and candidate learning goals. Step four: Prioritize based on learning data Review student performance data in your content area/grade level. Identify challenges for your student population. Prioritize learning goals based on these areas of challenge. Your follow-up from today... we won’t practice this now. 
SLO Session One Agenda Background and Purpose of SLOs SLO Learning Goals Purpose and Components of Pilot Test SLO Template and Process Determining Depth of Knowledge Success Criteria SLO Learning Goal Process 1. Identify the “big ideas” for the grade level and content area. 2. Identify learning goals associated with at least one “big idea” that would be achieved across several units, and/or which have related goals in prior or subsequent grade levels. These become candidates to be the SLO Learning Goal. 3. Determine which standards are associated with each candidate SLO Learning Goal. 4. Prioritize possible Learning Goals based on the learning needs of the student population (identifying two or three top priorities). 5. Determine the cognitive complexity (depth of knowledge, or DOK) of the candidate SLO Learning Goals. 6. Eliminate candidate SLO learning goals with a depth of knowledge less than 2 for primary and less than 3 for grades 3-12. 7. Select the SLO Learning Goal. Describe the rationale for your selection. Depth of Knowledge (DOK) and SLOs The DOK level reflected in the SLO Learning Goal should target the DOK level reflected in the associated standards. The DOK level reflected in the learning goal sets an expectation for how you want students to demonstrate learning. Assessments used for SLOs will also need to target the same DOK level reflected in the learning goal. What is cognitive rigor? The kind and level of thinking required of students to successfully engage with and solve a task. Ways in which students interact with content. Focuses on the complexity of content standards and assessment items or tasks. “Measures the degree to which the knowledge elicited from students on assessments and performance indicators or through questioning is as complex as what students are expected to know and do as stated in the state standards.” (Norman Webb) Cognitive Rigor Models Different states/schools/teachers use different models to describe cognitive rigor. 
Each addresses something different. Bloom – What type of thinking (cognitive process) is needed to complete a task? Webb – How deeply do you have to understand the content to successfully interact with it? How complex is the content? Bloom’s Taxonomy: Original vs. Revised Cognitive Dimension - Tools, p. 5 Original / Revised Cognitive Dimension: Knowledge -- Define, duplicate, label, list, name, order, recognize, relate, recall / Remember -- Retrieve knowledge from long-term memory, recognize, recall, locate, identify. Comprehension -- Classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, review, select, translate / Understand -- Construct meaning, clarify, paraphrase, represent, translate, illustrate, give examples, classify, categorize, summarize, generalize, predict… Application -- Apply, choose, demonstrate, dramatize, employ, illustrate, interpret, practice, write / Apply -- Carry out or use a procedure in a given situation; carry out or use/apply to an unfamiliar task. Analysis -- Analyze, appraise, explain, calculate, categorize, compare, criticize, discriminate, examine / Analyze -- Break into constituent parts, determine how parts relate. Synthesis -- Rearrange, assemble, collect, compose, create, design, develop, formulate, manage, write / Evaluate -- Make judgments based on criteria, check, detect inconsistencies/fallacies, critique. Evaluation -- Appraise, argue, assess, choose, compare, defend, estimate, explain, judge, predict, rate, score, select, support, value / Create -- Put elements together to form a coherent whole, reorganize elements into new patterns/structures. Explore Revised Bloom’s Taxonomy Work with a partner to review the Cognitive Dimension of the Revised Bloom’s Taxonomy (Tools, p. 7). Consider: How is the Cognitive dimension organized? What is the difference between different types of thinking or cognitive processes? How do the examples clarify the taxonomy? Identify questions for full group discussion. 
Explore DOK Levels Take out: Webb’s Depth of Knowledge Framework Level Definitions (Tools, p. 9) and Depth of Knowledge Levels Chart (Tools, p. 11). Work with a partner to consider: How are DOK levels described? What is the difference between each DOK level? How do the examples clarify the levels? Identify questions for full group discussion. DOK Level 1 Examples Locate or recall facts found in text Apply a well-known formula Orally read words in connected text with fluency and accuracy State an opinion without support Name the notes of the C Major scale Represent math relationships in words, pictures, or symbols Perform a simple science process or a set of procedures DOK Level 2 Examples Identify and summarize the major events, problem, solution, conflicts in literary text Explain the cause-effect of historical events Retrieve information from a table, graph, or figure and use it to solve a problem requiring multiple steps Develop a brief text that may be limited to one paragraph Make a puzzle or game about the topic Create a questionnaire or survey to answer a question Write a diary/blog entry for a character or historical figure DOK Level 3 Examples Compare consumer actions and analyze how these actions impact the environment Analyze or evaluate the effectiveness of literary elements Solve a multi-step problem and provide support with a mathematical explanation that justifies the answer Write a letter to the editor after evaluating a product Use reasoning and evidence to generate criteria for making and supporting an argument or judgment Prepare a speech to support your perspective about global climate change Make a booklet or brochure about a topic or an organization DOK Level 4 Examples Gather, analyze, organize, and synthesize information from multiple sources to draft a reasoned report Analyze and explain multiple perspectives or issues with or across time periods, events, or cultures Conduct a project that specifies a problem, identify solution paths, 
solve the problem, and report the results Write and produce an original play Critique the historical impact of policy, writings, and discoveries Illustrate how multiple themes (historical, geographic, social) may be interrelated Relate mathematical or scientific concepts to other content areas, other domains, or other concepts Some things to consider... Extended time alone is not the distinguishing factor for a learning goal with a DOK Level 4. DOK is not hierarchical. DOK and Bloom’s Taxonomy are different - the DOK Level is NOT determined by the verb, but rather by the context in which the verb is used and the depth of thinking required. DOK is about complexity, not difficulty! The intended student learning outcome determines the DOK level. What mental processing must occur? While verbs may appear to point to a DOK level, it is what comes after the verb that is the best indicator of the rigor/DOK level. DOK 1 - Describe three characteristics of metamorphic rocks. (Simple recall) DOK 2 - Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types) DOK 3 - Describe a model that you might use to represent the relationships that exist within the rock cycle. Provide evidence to support your decision. (Requires deep understanding of the rock cycle and a determination of how best to represent it by providing evidence) Combining Bloom’s and DOK The Cognitive Rigor Matrix combines Bloom’s Revised Taxonomy and Depth of Knowledge in one table. Not a crosswalk between these two classification tools. Not to be used as a rubric. 
Available in a variety of content areas: English Language Arts Social Studies Math Science Writing The Cognitive Rigor Matrix: Applies Webb’s DOK to Bloom’s Cognitive Process Dimensions. Column headings: Level 1 Recall & Reproduction; Level 2 Skills & Concepts; Level 3 Strategic Thinking/Reasoning; Level 4 Extended Thinking. Excerpted cells: Remember - Level 1: Recall, locate basic facts, details, events. Understand - Level 1: Select appropriate words to use when intended meaning is clearly evident. Level 2: Specify, explain relationships; summarize; identify main ideas. Level 3: Explain, generalize, or connect ideas using supporting evidence (quote, example…). Level 4: Explain how concepts or ideas specifically relate to other content domains or concepts. Apply - Level 1: Use language structure (pre/suffix) or word relationships (synonym/antonym) to determine meaning. Level 2: Use context to identify meaning of a word; obtain and interpret information using text features. Level 3: Use concepts to solve non-routine problems. Level 4: Devise an approach among many alternatives to research a novel problem. Analyze - Level 1: Identify whether information is contained in a graph, table, etc. Level 2: Compare literary elements, terms, facts, events; analyze format, organization, & text structures. Level 3: Analyze or interpret author’s craft (literary devices, viewpoint, or potential bias) to critique a text; analyze multiple sources. Level 4: Analyze complex/abstract themes. Evaluate - Level 3: Cite evidence and develop a logical argument for conjectures. Level 4: Evaluate relevancy, accuracy, & completeness of information. Create - Level 1: Brainstorm ideas about a topic. Level 2: Generate conjectures based on observations or prior knowledge. Level 3: Synthesize information within one source or text. Level 4: Synthesize information across multiple sources or texts. Cognitive Rigor Matrices Considering both classification systems together deepens understanding of learning goals. Examples: English Language Arts and Social Studies (Tools, p. 13) Math and Science (Tools, p. 15) Writing (Tools, p. 17) Select one to explore further. 
Definition of “deconstructing” When we deconstruct a statement of intended learning, we break it into its component parts. Stiggins, 2004 Deconstructing means to determine: What do students need to know (the knowledge, concepts, or content)? In what cognitive processes do students need to engage? © CTLT 2008 We deconstruct to… Clarify the type of thinking (or skills) required of learners to meet the learning goal; Determine what knowledge students will require to meet the learning goal; and Classify the cognitive rigor of the learning goal. How to deconstruct Think of the verbs in our learning goal as the cognitive processes and/or types of thinking that are being required. Think of the nouns in our learning goal as clues to the knowledge (concepts or content). Use Bloom’s Taxonomy and Webb’s DOK framework to classify the rigor of the learning goal. How to Deconstruct 1. Circle the cognitive processes or skills (the verbs). 2. Underline the key concepts or knowledge (the important nouns and noun phrases). 3. Use the Cognitive Rigor Matrix to: Characterize the type of thinking based on Revised Bloom’s Taxonomy. Assign a DOK level. Example Deconstruct the instructional objective: Recognize that different forms of writing have different patterns of organization. Cognitive Process (Bloom’s): Remember Depth of Knowledge Level: One Practice Turn to Deconstruction Practice (note catcher, p. 14). Work with a partner to deconstruct and categorize the example instructional objectives. Share your ideas with the full group. Applied to your Learning Goals Go back to your candidate learning goals. Deconstruct them, identifying the cognitive process (Bloom’s) and Depth of Knowledge; capture in your note catcher. What is the cognitive rigor of your candidate learning goals? A good SLO learning goal is cognitively complex (as measured by Depth of Knowledge) with a DOK >= 3 for secondary and upper elementary and a DOK >= 2 for primary (K-2). 
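The rigor screen just stated is mechanical enough to express precisely. The sketch below is illustrative only: the goal texts, DOK levels, and grade-band labels are hypothetical, and only the thresholds (DOK >= 2 for primary, DOK >= 3 for upper elementary and secondary) come from the session materials.

```python
# Illustrative sketch of the cognitive-rigor screen for candidate SLO
# learning goals. Candidate data below is made up; the thresholds match
# the criterion given in the session: primary (K-2) goals need DOK >= 2,
# upper elementary and secondary goals need DOK >= 3.

def meets_rigor(dok_level, grade_band):
    """Return True if a goal's DOK clears the threshold for its grade band."""
    minimum = 2 if grade_band == "primary" else 3
    return dok_level >= minimum

candidates = [
    ("Recall characteristics of metamorphic rocks", 1, "secondary"),
    ("Gather evidence from multiple sources to support written arguments", 3, "secondary"),
    ("Retell key details of a story", 2, "primary"),
]
kept = [goal for goal, dok, band in candidates if meets_rigor(dok, band)]
print(kept)  # the DOK 1 secondary goal is eliminated; the other two remain
```

This is step 6 of the SLO Learning Goal process in code form: deconstruct each candidate, assign a DOK level, then eliminate any candidate that falls below the threshold for its grade band.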
Have you identified an SLO learning goal with sufficient cognitive rigor? Deconstructing A tool for your “toolkit.” Used to get at the meaning of statements of intended learning: Used in determining the cognitive rigor of SLO Learning Goals. Collaborative process used to determine the meaning of learning objectives (including grade level expectations and evidence outcomes). First step towards ensuring accuracy in assessment. Used in the process of critiquing/selecting assessment items. SLO Session One Agenda Background and Purpose of SLOs SLO Learning Goals Purpose and Components of Pilot Test SLO Template and Process Determining Depth of Knowledge Success Criteria Success Criteria Guidelines, rules, or principles by which student responses, products, or performances are evaluated. They describe what to look for in student products or performances to judge quality. They are also used to determine if the learning goal(s)/target(s) was/were met. Success Criteria We use success criteria to: Define student competency, mastery, performance, knowledge on tasks, assessments, courses, etc. We also need to use success criteria to: Help define and justify what each performance level means on the SLO learning goal. How good is “good enough”? Talk to your neighbor about: What evidence do teachers use to decide students are learning? How have we traditionally answered that question? Why is this question important? Why is it important for students to understand how good is good enough? What is good enough (traditionally)? Behavior: Attendance Participation Compliance Grades: In terms of the learning, what is the difference between an F and an A, or between a B+ and an A-? Percentages: In terms of the learning, what is the difference between an 89 and a 90? In a standards-based system Success Criteria: Define the nature of quality, how good is good enough, as the basis for judging student learning progress. 
Define levels of performance on the SLO Learning Goals and instructional level targets. May also be called: Proficiency Descriptions Performance Criteria Background Reading Discussion Work with a partner. Each partner reads one of the following: Clarke (2001) Developing a “learning culture” in the school (Toolkit, p. 21). Moss & Brookhart (2009). What does it mean to share learning targets and criteria for success? (Toolkit, p. 23). Discuss: What are success criteria/performance criteria/proficiency descriptors? Why is it important to identify success criteria? What are good success criteria? Success Criteria… Reflect what “quality” is in the product, performance, or task. Do not… Leave important things out Include the trivial (e.g. neatness) Video: Student Benefits John McKinney, Grade 8, Science What did you hear students say about the benefits of success criteria, or knowing what proficiency looks like? Rubrics A tool for: Clarifying instructional goals (for teachers). Communicating success criteria (to students). Scoring assignments and assessments. Includes descriptions of more than one level of performance. More on rubrics later today. Posting Exemplars Engaging Learners in Describing Success Criteria Hearing the learner perspective Video: Emily’s Story Take a few minutes to read Emily’s papers (Tools, p. 35). Take notes as you watch the video: How were learners involved in defining and interpreting success criteria? How did this enhance student motivation and learning? Stiggins, R. J., Arter, J. A., Chappuis, J., & Chappuis, S. (2004). Classroom assessment for student learning: Doing it right – using it well. Commonly used methods of sharing success criteria with students include: 1. Provide students with lists of attributes. 2. Provide students with a scoring guide or rubric. 3. Provide examples of work that demonstrates mastery and engage students in determining why the work demonstrates mastery. 4. 
Share work that does not demonstrate mastery with students and have them describe what it would take for the work to demonstrate mastery. 5. Facilitate student evaluation of different work examples using rubrics or scoring guides. 6. Facilitate student development of rubrics or scoring guides after analyzing examples of work. 7. Other? (Tools, p. 39) Success Criteria Teacher Considerations for the SLO learning goal What performance label would you use to describe student success on (having met) the SLO learning goal? What expectations do you have to define “success” on the learning goal? Which assessments, tasks, or data sources would give you the best information about student performance on the learning goal? How would you weigh the body of evidence collected about student performance on the learning goal? Taking it into practice Practice developing an SLO Learning Goal: Identify candidate SLO Learning Goals (starting with “big ideas”). Specify associated standards. Determine which represent learning needs for your students. Determine the cognitive complexity of the candidate SLO Learning Goals. Select one; write it as an “objective statement”. Describe success criteria for your SLO Learning Goal. Practice engaging learners with success criteria. Bring a draft SLO Learning Goal (note catcher) and artifacts from your experience engaging learners in understanding success criteria to our next in-person learning session. Give us Feedback!! Written: Use sticky notes: + the aspects of this session that you liked or that worked for you; the things you will change in your practice or that you would change about this session; ? a question that you still have about the topics we addressed today; ideas, a-has, innovations. Oral: Share out one a-ha! References Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York, NY: Addison Wesley Longman, Inc. Bloom, B. (1984). 
The search for methods of group instruction as effective as one-to-one tutoring. Educational Leadership, 41(8): 4-17. Bransford, J., Brown, A., & Cocking, R. (Eds.). (1999). How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy Press. Hess, K., Carlock, D., Jones, B., & Walkup, J. (2011). What exactly do “fewer, clearer, and higher standards” really look like in the classroom? Using a cognitive rigor matrix to analyze curriculum, plan lessons, and implement assessments. National Center for the Improvement of Educational Assessment. Accessed online at: http://www.nciea.org/publication_PDFs/cognitiverigorpaper_KH12.pdf Stiggins, R. J., Arter, J. A., Chappuis, J., & Chappuis, S. (2004). Classroom assessment for student learning: Doing it right – using it well. Webb, N. L. (1997). Criteria for alignment of expectations and assessments in mathematics and science education. Council of Chief State School Officers and National Institute for Science Education Research Monograph No. 6. Madison, WI: University of Wisconsin.