BOOK REVIEWS: D. Kahneman. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux. 499 pages.
BOOK REVIEWS: S. Donaldson, C.A. Christie, and M.M. Mark. (2009). What Counts as Credible Evidence in Applied Research and Evaluation Practice? Thousand Oaks, CA: Sage. 265 pages.
A cost-benefit evaluation requires precise data on program outcomes. However, such data are unavailable when the analysis is prospective, and expensive and time-consuming to collect when the analysis is retrospective. This problem of uncertain data is partly solved by the revised version of the Treasury Board Benefit-Cost Analysis Guide (Watson & Mallory, 1997), which allows probabilistic estimates of program results to be used in the analysis. There are not yet many examples of this technique in practice.
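The probabilistic approach the abstract describes can be illustrated with a minimal Monte Carlo sketch. All figures, distribution choices, and function names here are hypothetical illustrations, not taken from the Guide: uncertain program benefits are drawn from a triangular distribution (a common choice when only low, most-likely, and high estimates exist), and the simulation reports the expected net benefit and the probability that the program breaks even.

```python
import random

def monte_carlo_net_benefit(cost, benefit_low, benefit_mode, benefit_high,
                            n_draws=10_000, seed=42):
    """Estimate expected net benefit when the program outcome is uncertain.

    Benefits are drawn from a triangular distribution defined by
    low / most-likely / high estimates; cost is treated as known.
    """
    rng = random.Random(seed)
    draws = [rng.triangular(benefit_low, benefit_high, benefit_mode)
             for _ in range(n_draws)]
    net = [b - cost for b in draws]
    expected = sum(net) / n_draws
    prob_positive = sum(1 for x in net if x > 0) / n_draws
    return expected, prob_positive

# Hypothetical program: $1.0M cost, benefits somewhere in $0.6M-$1.8M.
expected, p_pos = monte_carlo_net_benefit(cost=1_000_000,
                                          benefit_low=600_000,
                                          benefit_mode=1_100_000,
                                          benefit_high=1_800_000)
print(f"Expected net benefit: ${expected:,.0f}; P(net > 0) = {p_pos:.2f}")
```

Reporting the break-even probability alongside the point estimate is what distinguishes this style of analysis from a deterministic cost-benefit calculation.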
Program design can make outcome evaluation impossible: a review of four studies of community economic development programs
Between 1981 and 1990, Employment and Immigration Canada evaluated three community economic development programs: the Community Employment Strategy, the Local Employment Assistance and Development program, and the Community Futures program. In retrospect, one can see that these evaluations were hindered by two problems of program design: there was no replicable treatment, and the broad, shallow interventions were unlikely to have measurable effects in an environment "noisy" with uncontrolled factors.
Recently the discount rate recommended by Treasury Board Canada for use in program evaluations and project assessments has been challenged. This article reviews the theory and evidence supporting various estimates of the Canadian discount rate, and includes a comparison of rates used by the U.S. government and the World Bank. The article was written with the support of Training and Development Canada.
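The practical stakes of the discount-rate debate are easy to demonstrate: the same benefit stream can look very different under alternative rates. The sketch below is a generic present-value calculation with an invented benefit stream; the rates shown (3%, 8%, 10%) are illustrative values of the kind contrasted in such debates, not figures taken from the article.

```python
def present_value(cash_flows, rate):
    """Discount a stream of annual cash flows (received at end of
    years 1..n) back to present value at a given annual rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical program benefit: $100 per year for 20 years.
stream = [100.0] * 20
for r in (0.03, 0.08, 0.10):
    print(f"rate {r:.0%}: PV = {present_value(stream, r):,.0f}")
```

Because distant benefits shrink geometrically with the rate, moving from a low to a high discount rate can cut the present value of a long-lived program's benefits nearly in half, which is why the choice of rate is contested.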