Evaluation

Series: Prentice Hall
Author: Carol H. Weiss
Publisher: Pearson
Cover: Softcover
Edition: 2
Language: English
Total pages: 372
Publication date: December 1997
ISBN-13: 9780133097252
ISBN-10: 0133097250

Description

Suitable for undergraduate Evaluation Research courses in a variety of departments, including psychology, sociology, public health, public policy/affairs, education, and criminal justice.

Acquainting students with the complexities inherent in evaluation research, this timely and accessible guide explores how we apply research methods in evaluating social programs, illustrating its points with reference to a variety of fields, including education, social services, and criminal justice. Written in intelligent and graceful prose, it offers practical advice on understanding the reasons for the study, identifying the key questions to be answered, and planning and implementing the overall design of the study, including measurement, qualitative methods of inquiry, data collection, analysis, reporting, and dissemination.

Features

  • Bases the whole text on practicality and realism under real-world evaluation conditions, helping student evaluators do the best study possible with a sophisticated awareness of the range of possibilities and the factors that need to be taken into account.
  • Integrates a theory-based approach into study design, stressing that understanding the underlying theory of the program is essential to developing the most appropriate evaluation.
  • Emphasizes the need to take ethical considerations into account throughout the course of the study.
  • Devotes a chapter to the concerns of stakeholders and examines a variety of ways to address those concerns.
  • Highlights the importance of judgment calls in evaluation design and provides sound advice on the basis on which those judgments should be made.
  • Explains both quantitative and qualitative methods and discusses times and ways to profitably combine them.
  • Covers meta-analysis and cost-benefit analysis, and includes a non-technical discussion of the logic of data analysis.
  • Emphasizes the importance of getting evaluation results used to improve programs and policy, but helps students recognize that such use is often indirect and delayed, and is more akin to organizational learning than to immediate implementation.
  • Provides a glossary with clear definitions of research and technical terms.
  • Includes references to further sources on measurement, existing longitudinal data sets, statistics, and qualitative analysis.

New to this Edition

  • NEW: Presents a fundamentally changed text throughout, with an extensive overhaul that includes close to 95% new material and seven new chapters.

Table of Contents

 1. Setting the Scene.
 2. Purposes of Evaluation.
 3. Understanding the Program.
 4. Planning the Evaluation.
 5. Roles for the Evaluator.
 6. Developing Measures.
 7. Collecting Data.
 8. Design of the Evaluation.
 9. The Randomized Experiment.
10. Extensions of Good Design.
11. Qualitative Methods.
12. Analyzing and Interpreting the Data.
13. Writing the Report and Disseminating Results.
14. Evaluating with Integrity.

Glossary.
References.
Index.