Sensitivity Analysis in Outcome Evaluations: A Research and Practice Note

Authors: Ken Watson
Pages: 113-122
Issue: Fall

Every evaluation study uses data that are uncertain to some degree. The analyst and the decision-maker therefore need to know how much the outcome of the evaluation varies given plausible variation in the uncertain data inputs; that is, how sensitive is the outcome of the analysis to a particular input variable? This note discusses what characteristics make an outcome sensitive to an input, what techniques (especially graphic techniques) can be used to clarify sensitivity, and how to interpret the results.
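
As a rough illustration of the kind of one-at-a-time sensitivity check the note describes (a generic sketch, not taken from the article), the following Python fragment varies each uncertain input across a plausible range while holding the others at their base values and reports the resulting swing in a hypothetical net-benefit outcome. The outcome model, input names, base values, and ranges are all invented for the example.

# Hypothetical one-at-a-time sensitivity check (illustrative only).
# The outcome model, base values, and ranges below are assumptions,
# not figures from the article.

def net_benefit(participants, benefit_per_person, cost_per_person):
    """Toy outcome model: total benefits minus total costs."""
    return participants * (benefit_per_person - cost_per_person)

base = {"participants": 1000, "benefit_per_person": 450.0, "cost_per_person": 300.0}
ranges = {
    "participants": (800, 1200),
    "benefit_per_person": (350.0, 550.0),
    "cost_per_person": (250.0, 350.0),
}

base_outcome = net_benefit(**base)
print(f"Base-case net benefit: {base_outcome:,.0f}")

# Vary one input at a time across its plausible range; the widest swing
# flags the variable the outcome is most sensitive to.
for name, (low, high) in ranges.items():
    outcomes = []
    for value in (low, high):
        inputs = dict(base, **{name: value})
        outcomes.append(net_benefit(**inputs))
    swing = max(outcomes) - min(outcomes)
    print(f"{name}: outcome ranges {min(outcomes):,.0f} to {max(outcomes):,.0f} (swing {swing:,.0f})")

Sorting the swings from largest to smallest gives the data behind one common graphic presentation of such results, in which each input is drawn as a horizontal bar spanning the outcome range it produces.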

Special Issue

Cost-benefit analysis in the nineties

Authors:
Pages: 1-4

Risk Analysis And Program Evaluation

Authors:
Pages: 35-48
Issue: Fall

Evaluation studies by the Government of Canada follow standard procedural guidelines but have been retrospective and idiosyncratic in their methodologies. Therefore, they do not enable comparisons of the likely future effectiveness of competing programs. In contrast, the World Bank uses a combination of benefit/cost analysis and risk analysis to produce standard "bottom line" measures of program effectiveness. The latter approach makes evaluation studies more useful in decision making.
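
To make the combination of benefit/cost and risk analysis concrete, the following Python sketch draws uncertain benefits and costs from assumed probability distributions and summarizes the resulting distribution of the benefit-cost ratio and net present value. This is a generic Monte Carlo illustration, not the World Bank's own procedure; the discount rate, time horizon, and distributions are invented.

# Generic Monte Carlo risk analysis of a benefit/cost "bottom line"
# (illustrative; distributions and figures are assumptions).
import random

DISCOUNT_RATE = 0.08
YEARS = 10
TRIALS = 10_000

def present_value(annual_amount, rate, years):
    """Discount a constant annual flow back to present value."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

ratios, npvs = [], []
for _ in range(TRIALS):
    annual_benefit = random.triangular(0.8e6, 1.6e6, 1.1e6)  # low, high, mode
    annual_cost = random.triangular(0.6e6, 1.0e6, 0.8e6)
    pv_benefits = present_value(annual_benefit, DISCOUNT_RATE, YEARS)
    pv_costs = present_value(annual_cost, DISCOUNT_RATE, YEARS)
    ratios.append(pv_benefits / pv_costs)
    npvs.append(pv_benefits - pv_costs)

ratios.sort()
npvs.sort()
print(f"Median benefit-cost ratio: {ratios[TRIALS // 2]:.2f}")
print(f"90% interval for NPV: {npvs[int(0.05 * TRIALS)]:,.0f} to {npvs[int(0.95 * TRIALS)]:,.0f}")
print(f"Probability NPV < 0: {sum(v < 0 for v in npvs) / TRIALS:.1%}")

Reporting a probability of a negative net present value, rather than a single point estimate, is the sense in which such "bottom line" measures carry risk information into the comparison of competing programs.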

Selecting And Ranking Issues In Program Evaluations And Value-For-Money Audits

Authors:
Pages: 15-27
Issue: Fall

This article compares methods of selecting and ranking issues in program evaluations and value-for-money audits. It considers the following questions: Who should select the issues? What are the appropriate criteria for selecting issues, and can these criteria be collapsed into a single index of "issue importance" by which issues can be ranked?
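
One form such a single index could take is a weighted score across the selection criteria. The Python sketch below is a hypothetical illustration of that idea; the criteria, weights, candidate issues, and ratings are invented and are not prescribed by the article.

# Hypothetical weighted "issue importance" index for ranking candidate
# issues (criteria, weights, and scores are invented for illustration).

# Relative weights for the selection criteria; they sum to 1.0.
WEIGHTS = {"materiality": 0.4, "risk": 0.3, "decision_relevance": 0.3}

# Each candidate issue is scored 1-5 against every criterion.
issues = {
    "Program reach among target population": {"materiality": 4, "risk": 3, "decision_relevance": 5},
    "Unit cost of service delivery": {"materiality": 5, "risk": 2, "decision_relevance": 3},
    "Compliance with eligibility rules": {"materiality": 2, "risk": 5, "decision_relevance": 2},
}

def importance(scores):
    """Collapse criterion scores into a single weighted index."""
    return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

for name, scores in sorted(issues.items(), key=lambda item: importance(item[1]), reverse=True):
    print(f"{importance(scores):.2f}  {name}")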