Table of contents for Program evaluation and performance measurement : an introduction to practice / James C. McDavid and Laura R. L. Hawthorn.

Bibliographic record and links to related information available from the Library of Congress catalog.

Note: Contents data are machine generated based on pre-publication information provided by the publisher. Contents may have variations from the printed book or be incomplete or contain other coding.


TABLE OF CONTENTS
Acknowledgements
Dedication
Preface	
CHAPTER 1: KEY CONCEPTS AND ISSUES IN PROGRAM EVALUATION AND PERFORMANCE MEASUREMENT
	Introduction
		Integrating Program Evaluation and Performance Measurement
		Connecting Evaluation and Performance Management
		The Practice of Program Evaluation: The Art and Craft of Fitting Round Pegs into Square Holes
	A Typical Program Evaluation: Assessing the Neighbourhood Integrated Service Team Program in Vancouver
		Implementation Concerns
		The Evaluation
		Connecting the NIST Evaluation to this Book
	What is a Program?	
	Key Concepts in Program Evaluation
	Key Evaluation Questions	
	Formative and Summative Program Evaluations
	Ex Ante and Ex Post Program Evaluations
	Analyzing Cause and Effect Linkages in Program Evaluations
	The Process of Conducting a Program Evaluation
		General Steps in Conducting a Program Evaluation
			The Evaluation Assessment
			The Evaluation Study	
	Summary	
	Discussion Questions for Chapter 1
	References for Chapter 1	
CHAPTER 2: UNDERSTANDING AND APPLYING PROGRAM LOGIC MODELS	
	Introduction		
	A Basic Logic Modelling Approach
	Logic Models That Categorize and Specify Intended Causal Linkages
	Applying the Framework for Constructing Logic Models
	Flow Charts
	Constructing Program Logics in Program Evaluations
		Specifying Program Objectives
	Program Technologies
	Program Objectives, Program Environments, and Organizational Objectives	
		Normative Goals and Behavioral Goals in Organizations
	Strengths and Limitations of Program Logics
	Summary
	Discussion Questions
	Appendix 1: Applying What You Have Learned in Chapter Two: Developing a Logic Model for the COMPASS Program
	Nova Scotia COMPASS Program: Program Description
	Answer Key
	References for Chapter 2
CHAPTER 3: RESEARCH DESIGNS FOR PROGRAM EVALUATIONS
	Introduction	
	What is Research Design?	
		The Origins of Experimental Design
	Why Pay Attention to Experimental Designs?	
	Using Experimental Designs to Evaluate Programs: The Elmira Nurse Home Visitation Program
		The Elmira Home Visitation Program	
		Random Assignment Procedures	
		The Findings	
		Policy Implications of the Home Visitation Research Program
	Establishing Validity in Research Designs	
	Defining and Working with the Four Kinds of Validity	
		Statistical Conclusions Validity	
		Working with Internal Validity	
			Threats to Internal Validity	
			Introducing Quasi-Experimental Designs: The Connecticut Crackdown on Speeding and the Neighbourhood Watch Evaluation in York, Pennsylvania
				The Connecticut Crackdown on Speeding	
				The York Neighbourhood Watch Program	
				Findings and Conclusions from the Neighbourhood Watch Evaluation
		Construct Validity	
		External Validity	
	Testing the Causal Linkages in Program Logic Models	
	Research Designs and Performance Measurement	
	Summary	
	Discussion Questions
	References for Chapter 3	
CHAPTER 4: MEASUREMENT IN PROGRAM EVALUATION
	Introduction	
	Measurement Procedures	
	Illustrating Measurement Terminology
	Measurement Validity
		Types of Measurement Validity	
			Validity Types That Relate a Single Measure to a Corresponding Construct
				Face Validity
				Content Validity	
				Response Process Validity	
			Internal Structure Validity	
			Validity Evidence Based on Relationships with Other Variables
				Concurrent Validity
				Predictive Validity
				Convergent Validity
				Discriminant Validity
	Levels of Measurement	
		Nominal Level of Measurement	
		Ordinal Level of Measurement	
		Interval and Ratio Levels of Measurement	
	Units of Analysis
	Sources of Data in Program Evaluations and Performance Measurement Systems	
		Existing Sources of Data	
		Sources of Data Collected by the Program Evaluator
			Surveys as a Data Source in Evaluations
	Using Surveys to Estimate the Incremental Effects of Programs	
	Survey Designs and Research Designs	
	Validity of Measures and the Validity of Causes and Effects	
	Summary	
	Discussion Questions	
	References for Chapter 4
CHAPTER 5: APPLYING QUALITATIVE EVALUATION METHODS
	Introduction	
	Comparing and Contrasting Different Approaches to Qualitative Evaluation
		Understanding the Issue of Paradigms
		The Pragmatic Approach	
	Qualitative Evaluation Methods: Some Basics	
		Key Differences Between Qualitative and Quantitative Evaluation Approaches
	Structuring Qualitative Program Evaluations	
		Identifying Evaluation Questions and Issues in Advance	
		Identifying Research Designs and Appropriate Comparisons	
		Identifying Appropriate Samples	
		Structuring Data Collection Instruments	
		Collecting and Coding Qualitative Data	
	The Credibility and Generalizability of Qualitative Findings
	Connecting Qualitative Evaluation Methods to Performance Measurement	
	The Power of Case Studies	
	Summary	
	Discussion Questions
	References for Chapter 5
CHAPTER 6: ASSESSING THE NEED FOR PROGRAMS	
	Introduction	
	What Are Needs?	
	Benchmarking Needs: Criteria for Establishing the Existence and Magnitude of Needs
		Theories, Models, or Frameworks as Benchmarks	
		Moral or Ethical Values as Benchmarks	
		Comparisons Within or Among Jurisdictions as Benchmarks	
		Service Providers as Benchmarks	
		Current or Prospective Clients as Benchmarks	
	Steps in Conducting Needs Assessments	
		Become Familiar with the Political Context	
		Identify the Users and Uses of the Needs Assessments	
		Identify the Target Population(s) Who Will Be or Are Currently Being Served
		Inventory Existing Services, and Identify Potential Gaps	
		Identify Needs, Using Complementary Strategies for Collecting and Recording Data
			Data Sources and Data Collection Methods
				Demographic Data
				Surveying Current and Prospective Clients	
				Qualitative Methods in a Needs Assessment
				Sampling Procedures	
				Sample Sizes	
			Validity Issues
		Prepare a Document that Integrates Evidence, Benchmarks, Conclusions, and Recommendations
		Communicate the Results of the Needs Assessment	
		Implement the Recommendations of the Needs Assessment	
	The St. Columba Collaboration Project: Needs Assessment in a Newark Neighborhood
	Summary	
	Discussion Questions	
	Appendix: Designing a Needs Assessment for a Small Non-Profit Organization	
	References for Chapter 6	
CHAPTER 7: CONCEPTS AND ISSUES IN ECONOMIC EVALUATION	
	Introduction	
		Why a Program Evaluator Needs to Know About Economic Evaluation
		Connecting Economic Evaluation with Program Evaluation: Program Technologies and Outcome Attribution
			High- and Low-Probability Program Technologies
			The Attribution Issue	
	Three Types of Economic Evaluation	
		The Choice of Economic Evaluation Method
	Economic Evaluation in the Performance Management Cycle	
	Historical Developments in Economic Evaluation	
		Distinguishing Operational Costs from Social Costs: Key to Developing Economic Evaluation Methods
	Cost-Effectiveness Analysis	
		Steps for Cost-Effectiveness Analysis	
			Specify the set of alternative projects	
			Decide whose benefits and costs count (standing)	
			Catalogue the costs and benefits and select measurement indicators (units)
			Predict the impacts quantitatively over the life of the project	
			Monetize (attach dollar values to) all costs [and benefits]
			Discount costs [and benefits] to obtain present values	
			Compute the net present value (NPV) of each alternative
			Perform sensitivity analysis	
			Make a recommendation based on the NPV and sensitivity analysis
	Cost-Utility Analysis	
	Cost-Benefit Analysis	
		Key Concepts in Cost-Benefit Analysis	
		Estimating Willingness-to-Pay	
		Internal Rate of Return	
		The Marginal Value of Money	
	Cost-Effectiveness Analysis Example: A Study of Falls Prevention Among the Elderly
	Strengths and Limitations of Economic Evaluation	
		Strengths of Economic Analysis	
		Limitations of Economic Evaluation
	Summary	
	Discussion Questions	
	References for Chapter 7	
CHAPTER 8: PERFORMANCE MEASUREMENT AS AN APPROACH TO EVALUATION	
	Introduction	
	Growth of Performance Measurement	
		Government Deficits and the Transformation of Public Expectations for Governments
	Metaphors That Support and Sustain Performance Measurement	
		Government as a Business	
		Organizations as Open Systems	
		Organizations as Machines	
	Comparing Program Evaluation and Performance Measurement	
	Summary	
	Discussion Questions	
	References for Chapter 8	
CHAPTER 9: DESIGN AND IMPLEMENTATION OF PERFORMANCE MEASUREMENT SYSTEMS
	Introduction
	Key Steps in Designing and Implementing a Performance Measurement System	
		Identify the organizational champions of this change	
		Understand what performance measurement systems can and cannot do	
		Establish multi-channel ways of communicating that facilitate top-down, bottom-up, and horizontal sharing of information, problem identification and problem solving
		Clarify the expectations for the organizational uses of the performance information that is created
		Identify the resources available for designing, implementing, and maintaining the performance measurement system
		Take the time to understand organizational history around similar initiatives
		Develop logic models for the programs for which performance measures are being developed, and identify the key constructs to be measured
		Identify any constructs that apply beyond single programs
		Involve prospective users in reviewing logic models and constructs in the proposed performance measurement system
		Measure the constructs that have been identified as parts of the performance measurement system
		Record, analyze, interpret and report the performance data	
		Regularly review feedback from the users and, if needed, make changes to the performance measurement system
	Summary	
	Discussion Questions	
	References for Chapter 9	
CHAPTER 10: USING AND SUSTAINING PERFORMANCE MEASUREMENT SYSTEMS	
	Introduction	
	Comparing Ideal and Real Organizations: The Role of Organizational Politics in Designing and Implementing Performance Measurement Systems
		The Rational/Technocratic Framework	
		The Political/Cultural Framework	
	Intended Uses of Performance Measurement Systems	
		Elected Officials as Intended Users of Performance Information	
			Findings from a Study of Legislators' Expected Uses of Performance Reports
			Promoting Usage by Auditing Performance Reports	
	Actual Uses of Performance Measurement	
		American Studies of Actual Uses of Performance Measures	
			US Federal Government Uses of Performance Measures	
			Uses of Performance Information in American States	
				Performance Measurement Use in Texas	
			Local Government Uses of Performance Measures	
		Summary of Actual Uses of Performance Reports	
	Problems and Issues in Implementing and Sustaining Performance Measurement Systems
		Lack of Fit Between Organizational Needs and Performance Measurement Solutions
		The Complexity of Accountability Relationships	
			Accountability for Results	
				Accountability in Systems Where Third Parties Deliver the Programs
			Input and Process-Focused Accountability: The Hierarchical Model
			Accountability Where Organizations or Governments Share Program Responsibilities
		Attributing Outcomes to Programs
		The Levels of Analysis Problem	
		The Problem of Gaming Performance Measures	
			Examples of Performance Measures That Have Created Unintended Behaviors
			Understanding the Incentives for Gaming	
	Summary
	Discussion Questions
	References for Chapter 10	
CHAPTER 11: PROGRAM EVALUATION AND PROGRAM MANAGEMENT: JOINING THEORY AND PRACTICE	
	Introduction	
	Can Management and Evaluation Be Joined?	
	Prospects for Building Cultures That Support Evaluation	
	Learning Organizations as Self-Evaluating Organizations	
		Empowerment Evaluation and the Learning Organization
	Can Program Managers Evaluate Their Own Programs?
	How Should Evaluators Relate to Managers: Striving for Objectivity in Program Evaluations
	Can Program Evaluators Be Objective?	
		Looking for a Defensible Definition of Objectivity	
			A Natural Science Definition of Objectivity
			Implications for Evaluation Practice	
	Criteria for Best Practices in Program Evaluation: Assuring Stakeholders that Evaluations Are High Quality
	Ethics and Evaluation Practice
	Summary	
	Discussion Questions	
	FIONA'S CHOICE: AN ETHICAL DILEMMA FOR A PROGRAM EVALUATOR
	YOUR TASK	
	References for Chapter 11
CHAPTER 12: THE NATURE AND PRACTICE OF PROFESSIONAL JUDGMENT IN PROGRAM EVALUATION	
	Introduction	
	The Nature of the Evaluation Enterprise	
		What Is Good Evaluation Practice? Methodological Considerations
			Problems with Experimentation as a Criterion for Good Methodologies
		The Importance of Determining Causality: The Core of the Evaluation Enterprise
		Alternative Perspectives on the Evaluation Enterprise	
		Reconciling Evaluation Theory with the Diversity of Practice	
		Working in the Swamp: The Real World of Evaluation Practice	
		Common Ground Between Program Evaluators and Program Managers
		Situating Professional Judgment in Program Evaluation Practice
	Professionalism and Evaluation Practice	
		Professional Knowledge as Applied Theory	
		Professional Knowledge as Practical Know-How	
		Balancing Theoretical and Practical Knowledge in Professional Practice	
	Understanding Professional Judgment	
		Modeling the Professional Judgment Process	
			The Decision Environment
			Values, Beliefs and Expectations	
		Acquiring Professional Knowledge	
	Improving Professional Judgment in Evaluation Practice	
		Guidelines for the Practitioner	
		Implications for Training Evaluation Practitioners	
		Training Evaluators: Ways of Inculcating the Capacity to Render Sound Professional Judgments
	Evaluation as a Craft: Implications for Professionalizing Evaluation
		Teamwork and Professional Judgment
	Summary	
	Discussion Questions	
	References for Chapter 12	

Library of Congress Subject Headings for this publication:

Organizational effectiveness -- Measurement.
Performance -- Measurement.
Project management -- Evaluation.