Building a Sustainable Coordinated
Access System Through the Use of Best
Practices and Critical Data Elements
Presentation to the CCEH Annual Training Institute
May 8, 2014
Coordinated Access
 Design a coordinated system of triage, assessment,
and entry that effectively moves people out of
homelessness quickly by providing increased access
to appropriate housing interventions.
Connecticut Coordinated Access Network (CAN) Template
A client or family in housing crisis calls 2-1-1.
1. First effort: shelter diversion. 2-1-1 attempts diversion to non-housing resources. If it is a DV case, refer to a DV provider for DV services.
2. If the client's needs are not met with non-housing resources and diversion is not possible, complete a basic screen, refer the client for CAN intake, complete an assessment (VI-SPDAT or similar), and refer to shelter.
3. If shelter is needed but not available, refer to the HOT team.
4. The shelter or HOT team refers the client to the appropriate housing resource (RRH, TH, or PSH).
“One of the great mistakes is to
judge policies and programs by
their intentions rather than their
results”
Milton Friedman
During This Workshop We Will:
• Discuss principles of evaluation research to document
coordinated access activities and outcomes
• Learn how to utilize data to inform systems change
• Examine preliminary process and outcome data from
coordinated access efforts in CT
• Introduce tools used to evaluate effectiveness of
coordinated access systems
• Take a preliminary look at what our new HMIS can do
to help us track and evaluate our coordinated access
efforts
4 Types of Program Evaluation
1) Process Evaluation: assesses whether a program or
system is operating as it was intended to
2) Outcome Evaluation: assesses whether a program is
achieving the goals it intended to achieve
3) Impact Evaluation: assesses program outcomes against
what would have happened if the program had not been in
place (e.g., a randomized controlled trial)
4) Cost-Benefit Analysis: identifies all relevant program
costs and benefits (in financial terms) to determine
whether the program is cost effective
Evaluation Cycle
 Identify research questions
 Choose performance indicators and critical data elements
 Collect valid and reliable data
 Analyze data
 Communicate findings to stakeholders
The cycle then returns to identifying new research questions.
Planning the Evaluation
Who Should Be Involved?
 Housing providers (leaders and front line staff)
 HMIS lead
 CCEH
 211
 Health care providers
 Behavioral health care providers
 Representatives from the school system
 Consumers
 Others?
Other Considerations:
 Who should manage and oversee the evaluation process?
 How can we get program buy-in?
 Who is responsible for analyzing and reporting out on the data?
 How are results funneled back to the system?
Identifying Our Research Questions for Coordinated Access
 What does a successful coordinated access system look like?
 How many people will access the system?
 How many will be diverted from shelter?
 Number referred for shelter
 Number enrolled in shelter
 Number who access the system multiple times in a specified timeframe
 How will clients benefit?
 Shorter time to shelter entry
 Declining rates of homelessness
 Shorter periods of time homeless
 Less chronic homelessness
 Percentage diverted who come back into the system
 How will agencies benefit (efficiency, collaboration)?
 How does coordinated access affect staff time spent answering
phones, conducting intakes, etc.?
 How well are agencies collaborating?
Results-Based Accountability: research questions should address three
domains: How much? How well? Is anyone better off?
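As a minimal sketch, the three Results-Based Accountability questions can be operationalized as simple counts and rates over call records. The record fields below (`diverted`, `returned`) are illustrative assumptions, not actual HMIS element names.

```python
# Illustrative sketch: Results-Based Accountability measures computed
# from hypothetical coordinated-access call records. Field names are
# assumptions for illustration, not a real HMIS schema.

def rba_summary(calls):
    """How much, how well, is anyone better off -- as simple rates."""
    total = len(calls)                                   # How much: volume served
    diverted = sum(1 for c in calls if c["diverted"])    # How well: diversion count
    returned = sum(1 for c in calls if c["diverted"] and c["returned"])
    return {
        "how_much_total_calls": total,
        "how_well_pct_diverted": round(100 * diverted / total, 1),
        # Better off: diverted households that did NOT come back into the system
        "better_off_pct_diverted_no_return": round(100 * (diverted - returned) / diverted, 1),
    }

calls = [
    {"diverted": True,  "returned": False},
    {"diverted": True,  "returned": True},
    {"diverted": False, "returned": False},
    {"diverted": True,  "returned": False},
]
print(rba_summary(calls))  # → how much: 4, how well: 75.0, better off: 66.7
```

In practice the same three rates would be pulled for a whole reporting period rather than a handful of calls.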
Choosing performance indicators and critical data elements
 How do we operationalize our research questions?
 Clear and consistent definitions (ex. what is an episode of
homelessness, categories for exit destination, chronic
homelessness)
 Assess what data we already collect through HMIS
 HEARTH indicators
 Shelter length of stay
 New entries into homelessness
 Repeat episodes of homelessness
 Job and income growth
 VI-SPDAT or other universal assessment tool
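To show what operationalizing two of the HEARTH indicators above might look like, here is a hedged sketch that computes shelter length of stay and repeat episodes from hypothetical enrollment records; the record layout (`client_id`, `entry`, `exit`) is an assumption for illustration, not the HMIS export format.

```python
# Sketch only: two HEARTH-style indicators (shelter length of stay,
# repeat episodes of homelessness) from hypothetical enrollment records.
from datetime import date

def length_of_stay_days(enrollments):
    """Length of stay in days for each shelter enrollment."""
    return [(e["exit"] - e["entry"]).days for e in enrollments]

def repeat_clients(enrollments):
    """Client IDs with more than one shelter episode in the data pulled."""
    seen, repeats = set(), set()
    for e in enrollments:
        cid = e["client_id"]
        if cid in seen:
            repeats.add(cid)
        seen.add(cid)
    return repeats

enrollments = [
    {"client_id": 1, "entry": date(2013, 1, 5), "exit": date(2013, 2, 20)},
    {"client_id": 2, "entry": date(2013, 3, 1), "exit": date(2013, 3, 31)},
    {"client_id": 1, "entry": date(2013, 6, 1), "exit": date(2013, 6, 15)},
]
print(length_of_stay_days(enrollments))  # → [46, 30, 14]
print(repeat_clients(enrollments))       # → {1}
```

The value of agreeing on clear definitions up front is visible even here: whether client 1 counts as a "repeat episode" depends entirely on how an episode is defined.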
Choosing performance indicators and critical data elements
 What additional data should we collect?
 How do we gather enough data to show meaningful outcomes, but not so
much as to waste client and staff time?
 What data are realistic to collect given our timeframe and resources?
 Consumer surveys
 Can qualitative data be collected and analyzed?
 Interviews, focus groups, observations
 This process can be very labor and cost intensive
 How will data collection be coordinated?
Collecting valid and reliable data
 Data quality
 Train, train, train
 Ongoing monitoring
 What if what we’re looking to collect is not in HMIS?
 Use of common database/ forms
 HMIS
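Ongoing monitoring of data quality can be as simple as counting missing or null values in required fields. The sketch below assumes a generic record layout; the field names and the "Data not collected" sentinel are illustrative, not the actual HMIS element names.

```python
# Sketch: a minimal data-quality check of the kind ongoing monitoring
# implies -- flag records with missing values in required fields.
# Field names and sentinels are illustrative assumptions.

REQUIRED = ["client_id", "entry_date", "prior_residence", "destination"]

def quality_report(records):
    """Return {field: count of records missing that field}."""
    missing = {f: 0 for f in REQUIRED}
    for r in records:
        for f in REQUIRED:
            if r.get(f) in (None, "", "Data not collected"):
                missing[f] += 1
    return missing

records = [
    {"client_id": 1, "entry_date": "2014-01-03",
     "prior_residence": "Place not meant for habitation", "destination": None},
    {"client_id": 2, "entry_date": "2014-02-11",
     "prior_residence": "", "destination": "Rental by client"},
]
print(quality_report(records))
```

A report like this, run on a schedule and shared with providers, turns "train, train, train" into a feedback loop: training targets whichever fields are most often missing.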
ECM Reporting - SUR
ECM Reporting
 Other ECM reports:
 Data Quality
 Data Timeliness
 APR
 CAPER
 PATH
 SSVF
 HOPWA
 Services by Program
 Clients Served
Reporting - VITALS
 VITALS (Valuable Information to Assess Local Systems)
 Vision: Dashboard-style report
 Relevant HEARTH measures
 Web-based reporting portal
 Aggregated information for programs and communities
 Dynamic, drill-down capability
 Level of Analysis
 Program
 Community (CoC / Sub CoC)
 State
 Currently under development
Analyze data
 Are there any comparison data available?
 The same system before coordinated access was implemented (e.g., PIT counts)
 A different but similar community
 What timeframes are data pulled for?
 Keep in mind that some outcomes are short term and some are long
term
 For process and outcome measures, break out results by subgroup to
identify any trends
 Gender
 Race
 Geographic area
 Age
 Household type (single vs. family)
 Chronic homelessness
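Subgroup breakouts like these reduce to grouping records and computing a rate per group. As a sketch using only the standard library (the `household_type` and `diverted` fields are illustrative assumptions):

```python
# Sketch: breaking out an outcome (here, diversion rate) by subgroup.
# Field names and values are illustrative, not a real HMIS schema.
from collections import defaultdict

def rate_by_group(records, group_field, flag_field):
    """Percent of records with flag_field true, broken out by group_field."""
    counts = defaultdict(lambda: [0, 0])  # group -> [flagged, total]
    for r in records:
        counts[r[group_field]][1] += 1
        if r[flag_field]:
            counts[r[group_field]][0] += 1
    return {g: round(100 * flagged / total, 1)
            for g, (flagged, total) in counts.items()}

records = [
    {"household_type": "single", "diverted": True},
    {"household_type": "single", "diverted": False},
    {"household_type": "family", "diverted": True},
    {"household_type": "family", "diverted": True},
]
print(rate_by_group(records, "household_type", "diverted"))
# → {'single': 50.0, 'family': 100.0}
```

The same function can be reused for any of the subgroups listed above by passing a different `group_field`.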
Communicate findings to stakeholders
 Communication should be ongoing and involve staff at all levels of the
program or system
 Preliminary findings should be discussed along the way and not just
at the end (this allows for changes to be made if things are not going
in the desired direction)
 All findings should be discussed- both positive and negative, intended
and unintended
 Limitations of the data must be discussed so that everyone is clear on
what the data do and do not tell us
 Create a system by which program changes can or will be made based
on evaluation results
 Discuss which results will be presented to the community at large
What Have Data on
Coordinated Access
Efforts in CT Shown Us
So Far?
New London Coordinated Access: Families
 Coordinated Access system has been in place for about 2.5 years
 2-1-1 is the primary point of contact
 Process Data
 In 2013, there were 948 calls to 2-1-1
 54% were diverted from shelter
 46% were scheduled for shelter appointments
 About a 30% no-show rate
 163 were diverted at the time of shelter intake
 Outcome Data
 50% Reduction in family shelter units/beds since 2011
 In 2013, 44 families were rapidly rehoused
 Average length of time in the shelter was 45.5 days
New London Coordinated Access: Singles
 System for single adults started in November 2013 with 2-1-1 as the first,
centralized contact
 Process Measures
 An average of 20 new intakes each week
 Diversion rates of 16% in New London and 46% in Norwich
 Of the 743 calls that came in between 7/1/13 and 4/30/14: 66% enrolled
in shelter, 16% were diverted, 12% were wait-listed, 4% were admitted to
Covenant, and 1% had no need for shelter
 Shelter length of stay: 60% stayed 30 days or less
Preliminary Outcome Data
 47% of shelter exits were positive (permanent housing,
treatment, or other stable housing situation)
Bridgeport Preliminary Data
This provides an example of the way data can be used to track population need over a
period of several years.
Decrease in Chronic Homelessness
Total chronically homeless in Greater Bridgeport (chart data):
 VI Registry change, 2012-2014: first count 171, most recent count 83
 PIT change, 2011-2013: first count 137, most recent count 81
 89 housed through the CABHI Program
Questions?
Additional Resources
What Gets Measured, Gets Done: A Toolkit on Performance
Measurement for Ending Homelessness - Abt Associates
http://www.endhomelessness.org/library/entry/what-gets-measured-gets-done-a-toolkit-on-performance-measurement-for
Coordinated Assessment Toolkit: Evaluation - National Alliance
to End Homelessness
http://www.endhomelessness.org/library/entry/coordinated-assessment-toolkit-evaluation
Presenter Contact Information
Meredith Damboise
Director of Quality Assurance, New Haven Home Recovery
[email protected]
Kelley Traister
Quality Assurance and Compliance Specialist, New Haven Home Recovery
[email protected]
Brian Roccapriore
Director of HMIS and Strategic Analysis, Connecticut Coalition to End
Homelessness
[email protected]
