Acknowledgments

Preface

The Author


Introduction to Evaluation


Foundations of Evaluation

A Brief Overview of Evaluation History

Evaluation: Purpose and Definition

Performance Improvement: A Conceptual Framework

Making Evaluation Happen: Ensuring Stakeholders' Buy-In

The Evaluator: A Job or a Role?

The Relationship to Other Investigative Processes

When Does Evaluation Occur?

General Evaluation Orientations

Challenges That Evaluators Face

Ensuring Commitment

Benefits of Evaluation

Basic Definitions


Principles of Performance-Based Evaluation


Evaluation Is Based on Asking the Right Questions


Evaluation of Process Is a Function of Obtained Results


Goals and Objectives of Organizations Should Be Based on Valid Needs


Derive Valid Needs Using a Top-Down Approach


Every Organization Should Aim for the Best That Society Can Attain


The Set of Evaluation Questions Drives the Evaluation Study


Models of Evaluation


Overview of Existing Evaluation Models

Overview of Classic Evaluation Models

Selected Evaluation Models

Selecting a Model

Conceptualizing a Useful Evaluation That Fits the Situation


Kirkpatrick's Four Levels of Evaluation

Kirkpatrick's Levels

Comments on the Model

Strengths and Limitations

Application Example: Wagner (1995)


Phillips's Return-on-Investment Methodology

Phillips's ROI Process

Comments on the Model

Strengths and Limitations

Application Example: Blake (1999)


Brinkerhoff's Success Case Method

The SCM Process

Strengths and Weaknesses

Application Example: Brinkerhoff (2005)


The Impact Evaluation Process

The Elements of the Process

Comments on the Model

Strengths and Limitations

Application Example


The CIPP Model

Stufflebeam's Four Types of Evaluation

Articulating Core Values of Programs and Solutions

Methods Used in CIPP Evaluations

Strengths and Limitations

Application Example: Filella-Guiu and Blanch-Pana (2002)


Evaluating Evaluations

Evaluation Standards

The American Evaluation Association Principles for Evaluators

Application Example: Lynch et al. (2003)


Tools and Techniques of Evaluation


Data

Characteristics of Data

Scales of Measurement

Defining Required Data from Performance Objectives

Deriving Measurable Indicators

Finding Data Sources

Follow-Up Questions and Data


Data Collection

Observation Methodology and the Purpose of Measurement

Designing the Experiment

Problems with Classic Experimental Studies in Applied Settings

Time-Series Studies

Simulations and Games

Document-Centered Methods

Conclusion


Analysis of Evaluation Data: Tools and Techniques

Analysis of Models and Patterns

Analysis Using Structured Discussion

Methods of Quantitative Analysis

Statistics

Graphical Representations of Data

Measures of Relationship

Inferential Statistics: Parametric and Nonparametric

Interpretation


Communicating the Findings

Recommendations

Considerations for Implementing Recommendations

Developing the Report

The Evaluator's Role After the Report


Continual Improvement


Common Errors in Evaluation

Errors of System Mapping

Errors of Logic

Errors of Procedure

Conclusion


Continual Improvement

What Is Continual Improvement?

Monitoring Performance

Adjusting Performance

The Role of Leadership


Contracting for Evaluation Services

The Contract

Contracting Controls

Ethics and Professionalism

Sample Statement of Work


Intelligence Gathering for Decision Making

Performance Measurement Systems

Issues in Performance Measurement Systems

Conclusion


The Future of Evaluation in Performance Improvement

Evaluation and Measurement in Performance Improvement Today

What Does the Future Hold?

Conclusion


References and Related Readings

Index