
Evaluation in Social Work: The Art and Science of Practice


ISBN-10: 0195308069

ISBN-13: 9780195308068

Edition: 4th 2006 (Revised)

Authors: Yvonne A. Unrau, Peter A. Gabor, Richard M. Grinnell


Social work practice is founded upon the linkage between the objectives and goals of clients, programs, and agencies, and the evaluation process is critical for making sure those links are strong. Building on its earlier editions with seven new chapters, complete revisions of the others, and a strong online companion website presence, this text is more relevant and user-friendly than ever. It provides a straightforward introduction to program evaluation couched within the quantitative and qualitative traditions, the two approaches most commonly used to generate relevant social work knowledge. The result gives students a sound conceptual understanding of how the ideas of…

Book details

List price: $60.00
Edition: 4th
Copyright year: 2006
Publisher: Oxford University Press, Incorporated
Publication date: 9/14/2006
Binding: Paperback
Pages: 560
Size: 9.80" wide x 7.72" long x 0.98" tall
Weight: 2.090 lbs
Language: English

Richard M. Grinnell, Jr. is a professor and holds the Clair and Clarice Platt Jones/Helen Frays Endowed Chair of Social Work Research within the School of Social Work at Western Michigan University. Peter A. Gabor is a professor within the Faculty of Social Work at the University of Calgary. Yvonne A. Unrau is a professor within the School of Social Work at Western Michigan University.

About the Authors
Preparing for an Evaluation
Becoming an Accountable Practitioner
Evaluation and Accountability
The Council on Social Work Education
The National Association of Social Workers
Quality Improvement in Social Service Programs
Case-Level Evaluation
Program-Level Evaluation
Case-Level and Program-Level Data for Quality Improvement
Evaluation and the Profession
Contributing to Evidence-Based Practice
Collaborating With Program Stakeholders
Policymakers
General Public
Funders
Administrators
Practitioners
Clients
A Word About Collaboration Among Stakeholder Groups
Integrated Accountability With Service Delivery
Client-Centered Practice
Evaluation From a Person-in-Environment Perspective
Evaluation From a Program-in-Environment Perspective
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Approaches to Accountability
The External Project Approach
Characteristics of the External Project Approach
Externally Driven
Resistant Social Workers
Intrusiveness
Periodic or No Feedback to Social Workers
Large Recommended Changes
Not Practical in Applied Settings
Difficult to Incorporate in Practice Settings
The Internal Monitoring Approach
Characteristics of the Internal Monitoring Approach
Internally Driven
Cooperative Social Workers
Integrated
Ongoing Continuous Feedback
Minor Recommended Change
Easy to Incorporate in Practice Settings
Advantages of the Internal Monitoring Approach
Increased Understanding of Programs
Relevant Feedback
Timely Feedback
Self-Protection
Practitioner and Client Satisfaction
Professionalism
Fine-Tuning Programs
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Designing Client-Centered Programs
Social Service Agencies
Agency Mission Statements
Agency Goals
Requirements for Goals
Agency Objectives
Social Service Programs
An Agency Versus a Program
Program Designs
Program Goals
Unintended Program Results
Program Goals Versus Agency Goals
Types of Program Objectives
Knowledge-Based Objectives
Affect-Based Objectives
Behavioral-Based Objectives
Qualities of Program Objectives
Meaningful
Specific
Measurable
Directional
Program Versus Practice Objectives
Bob's Self-Sufficiency
Jane's Job Dissatisfaction
Program Activities
Program Logic Models
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Getting Ready for an Evaluation
Program Scope and Evaluation
Planning With Stakeholders
Asking Evaluation Questions
Mapping Concepts
Reviewing the Literature
Developing Schedules
Tasks
Roles
Timelines
Producing Documentation
Identifying Data Needs
Focusing Evaluation Efforts
Client Demographics
Service Statistics
Quality Standards
Feedback
Client Outcomes
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Doing an Evaluation
Doing a Needs Assessment
What Are Needs Assessments?
Social Problems
Social Needs
Perceived Needs
Normative Needs
Relative Needs
Expressed Needs
Program Solutions
Steps in Doing a Needs Assessment
Focusing the Problem
Developing Needs Assessment Questions
Identifying Targets for Intervention (Unit of Analysis)
Establishing Target Parameters
Sampling (Data Sources)
Developing a Data Collection Plan
Reviewing Existing Reports
Secondary Data Analyses
Individual Interviews
Group Interviews
Telephone and Mail Surveys
Analyzing and Displaying Data
Collecting Quantitative Data
Collecting Qualitative Data
Disseminating and Communicating Findings
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Doing a Process Evaluation
Purposes of Process Evaluations
To Improve a Program's Operations
To Generate Knowledge for Our Profession
To Estimate Cost Efficiency
Steps in a Process Evaluation
Deciding What Questions to Ask
What Is the Program's Background?
What Is the Program's Client Profile?
What Is the Program's Staff Profile?
What Is the Amount of Service Provided to Clients?
What Are the Program's Interventions and Activities?
What Administrative Supports Are in Place to Support Client Service Delivery?
How Satisfied Are the Program's Stakeholders?
How Efficient Is the Program?
Developing Data Collection Instruments
Ease of Use
Appropriateness to the Flow of a Program's Operations
Design With User Input
Developing a Data Collection Monitoring System
Number of Cases to Include (Unit of Analysis)
Times to Collect the Data
Methods for Collecting the Data
Scoring and Analyzing Data
Developing a Feedback System
Disseminating and Communicating Results
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Doing an Outcome Evaluation
Purpose of Outcome Evaluations
Uses of Outcome Evaluations
Improving Program Services to Clients
Influencing Decisions
Generating Knowledge for the Profession
Steps in Outcome Evaluations
Conceptualizing and Operationalizing Program Objectives
Operationalizing Variables and Stating the Outcomes
Designing a Monitoring System
Deciding the Number of Clients to Include (Unit of Analysis)
Deciding When Data Will Be Collected
Deciding How Data Will Be Collected
Analyzing and Displaying Data
Developing a Feedback System
Disseminating and Communicating Results
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Doing an Efficiency Evaluation
Cost Effectiveness Versus Cost Benefit
When to Evaluate for Efficiency
Steps in Conducting a Cost-Benefit Evaluation
Deciding on an Accounting Perspective
The Individual Program's Participants' Perspective
The Funding Source's Perspective
Applying the Procedure
Specifying the Cost-Benefit Model
Looking at Costs
Looking at Benefits
Applying the Procedure
Determining Costs
Direct Costs
Indirect Costs
Applying the Procedure
Determining Benefits
Applying the Procedure
Adjusting for Present Value
Applying the Procedure
Completing the Cost-Benefit Analysis
Applying the Procedure
Cost-Effectiveness Analyses
Applying the Procedure
A Few Words About Efficiency-Focused Evaluations
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Gathering Data and Making Decisions
Measuring Practice and Program Objectives
Why Measurement Is Necessary
Objectivity
Precision
Types of Measuring Instruments
Rating Scales
Graphic Rating Scales
Self-Anchored Rating Scales
Summated Scales
Goal Attainment Scaling (GAS)
Creating Practice Objectives From Program Objectives
Measurement by the Numbers
Standardized Measuring Instruments
Is the Measurement Useful?
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Data Sources, Sampling, and Data Collection Methods
Data Sources
Sampling
Probability Sampling
Nonprobability Sampling
Data Collection Methods
Obtaining Existing Data
Documents and Reports
Data Sets
Obtaining New Data
Face-to-Face Individual Interviews
Surveys
Group Interviews
Observation
Fitting Data Collection to the Program
Ease of Use
Appropriateness to the Flow of Program Operations
Design With User Input
Developing a Data Collection Monitoring System
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Developing a Data Information System
Staff Members' Roles in Developing a Data Information System
Establishing an Organizational Plan
Case-Level Data Collection
Program-Level Data Collection
Data Collection at Intake
Data Collection at Client Contact
Data Collection at Termination
Data Collection to Obtain Feedback
Data Management
Manual Data Management
Computer-Assisted Data Management
Reporting
A Look to the Future
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Using Graphics to Report Evaluation Data
Characteristics of an Effective Graphic
Bar Charts
Pie Charts
Line Graphs
Illustrations
Photographs
Checklist
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Analyzing Qualitative Data
Narrative Data
The Analysis Process
Get to Know Your Data
Focus the Analysis
Focus by Question or Topic, Time Period, or Event
Focus by Case, Individual, or Group
Categorize Information
Preset Categories
Emergent Categories
Identify Patterns and Connections Within and Between Categories
Within Category Description
Larger Categories
Relative Importance
Relationships
Interpretation: Bringing It All Together
"Nuts and Bolts" of Narrative Analysis
Data Management Tips
Check Your Data
Add ID Numbers
Prepare Data for Analysis
Make Copies
Identify the Source of All Data
Mark Key Themes
Define Categories
Cut and Sort
Make Connections
Enhancing the Process
Use Several Sources of Data
Track Your Choices
Involve Others
Pitfalls to Avoid
Avoid Generalizing
Choose Quotes Carefully
Address Limitations and Alternative Explanations
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Making Decisions With Data
Objective Data
Subjective Data
Case-Level Decision Making
The Engagement and Problem-Definition Phase
The Practice Objective Setting Phase
The Intervention Phase
Deterioration, or No Change
Insufficient, or Slow Change
Satisfactory Change
The Termination and Follow-Up Phase
Program-Level Decision Making
Process
Outcome
Problems and Cases
Program
Agency
Using Outcome Monitoring Data in Program-Level Decision Making
Acceptable Results
Mixed Results
Inadequate Results
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Knowing the Contexts of Evaluations
Evaluation Politics, Ethics, and Standards
Politics of Evaluation
Appropriate and Inappropriate Uses of Evaluation
Misuses of Evaluation
Justifying Decisions Already Made
Inappropriate Use of Public Relations
Used for Performance Appraisals
Fulfilling Funding Requirements
Proper Uses of Evaluation
Internal Decision Making
External Decision Making
Political Influences on the Evaluation Process
Manipulating the Evaluation Process
Misdirecting the Evaluation Process
Program Objectives
Sample Selection
Data Collection Methods
Interpretation of Findings
Professional Standards for Evaluation
Utility
Feasibility
Propriety
Accuracy
Principles of Evaluation Practice
Evaluation and Service Delivery Activities Should Be Integrated
Involve From the Beginning as Many Stakeholder Groups as Possible
Involve All Levels of Staff in the Evaluation Process
Make Explicit the Purpose of the Evaluation
Provide a Balanced Report and Disseminate Early and Regularly
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Culturally Appropriate Evaluations
The Impact of Culture
Bridging the Culture Gap
Cultural Awareness
Intercultural Communication
Nonverbal
Verbal
Cultural Frameworks
Orientation to Information
Decision Making
Individualism
Tradition
Pace of Life
Putting It Together: The Practice of Culturally Competent Evaluation
Cultural Awareness
Intercultural Communication Skills
Developing Specific Knowledge About the Culture
Adapting Evaluations
Working with Stakeholders
Adapting Processes
Providing Meaningful Products
Summing Up and Looking Ahead
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Writing Grant Proposals
Gathering Background Information
Program Concept
Your Program
Program Expenses
Components of a Proposal
Executive Summary
Statement of Need
Program Description
Objectives
Methods, or Interventions
Staffing/Administration
Evaluation
Sustainability
Budget
Expense Budget
Support and Revenue Statement
Budget Narrative
Organizational Information
Conclusion
Letter Proposal
What Happens Next?
Summing Up
Recap and Online Materials
Study Questions
References, Further Reading, and Resources
Credits
Index