Collaborating with a university to enhance your program evaluation efforts
Cathy Chambless, PhD, MPA, CRC
Center for Public Policy & Administration
University of Utah
September 21, 2010
Advantages of external evaluation
• Objectivity
• Expertise
• Credibility
• Outsourcing intermittent tasks
Identify potential evaluation resources
• Logical partners
  - UCEDDR, TACE, RCE
• Look at their products: what have they done that is similar to what you want?
• Are they familiar with:
  - populations served (people with disabilities)?
  - program structures (federal-state partnerships)?
Models
• Internal employees
• External contract
• Hybrid
• Collaborative model
Finding mutual benefits
• DVR needs:
  - Evaluation expertise
  - Professional products
  - Timely service
  - Collaborative team
Universities have:
• A mission to do community service
• A high level of expertise and quality, at rates below the private market
• Faculty
  - with research or teaching interests
• Students
  - service-learning projects/practica
• Staff
  - looking for "soft money" projects
Issues
• Indirect costs (F&A rates):
  - may be reduced for government agencies or NGOs
• Procurement:
  - public agencies may not have to get formal "bids" from public universities for contracts
• Institutional Review Board:
  - ensures protection of clients and staff
Create successful collaboration: SRC, VR, and external evaluator
• VR: Prepare a description of what you need
  - Comprehensive statewide needs assessment
  - Consumer satisfaction evaluation
  - Administrative data analysis
• SRC: Involve members in defining the project
  - Substance
  - Transparency
• Evaluator: Prepare a written proposal
  - to define the project as the evaluator sees it
After the contractor has been chosen
• Meet to discuss the written proposal and ask questions
• Clarify each other's jargon
• Help the external evaluator understand your processes
• Learn what's involved in the "evaluation method"
Define your role
• As overseer of the project:
  - What are the lines of communication?
  - How often do you want updates?
  - Who should be involved?
Collaboration example – CSNA
Comprehensive Statewide Needs Assessment
• University evaluators presented to a full meeting of the SRC:
  - purposes of the project
  - how it relates to the functions of the SRC
• SRC appointed a CSNA subcommittee
  - to meet regularly with the evaluation team
• Formed a "Stakeholders Group":
  - VR administration staff (evaluation coordinator, director of support services, case services)
  - SRC subcommittee
  - University evaluators
Function of the stakeholders group
• Focus the research questions
• Refine the survey questions
• Help interpret administrative data (RSA-compiled data)
• Field-test surveys
• Review overall results to interpret meaning
• Post mortem:
  - What could improve the process next time?
Resources
• InfoUse (2008, November). Developing a Model Comprehensive Statewide Needs Assessment With Corresponding Training Materials for State VR Agency Staff and SRC Members: The Model VR Needs Assessment Technique. Berkeley: Author.
• Altschuld, James W., & Kumar, David (2010). Needs Assessment: An Overview. Los Angeles: Sage.