
What Evaluation Has to Say on Serving the Needs of Decision-makers

25 December 2014

Tags: Needs of Decision-makers, Talking Data, Social Impact Analyst Association, CES President

Speaking notes for Benoît Gauthier, President of the Canadian Evaluation Society, at the Social Impact Analyst Association 2014 conference, Talking Data: Measurement with a Message, November 3, 2014.

Thank you for inviting me today. I am glad to represent the Canadian Evaluation Society and its 1,800 members. I hope to offer a small contribution to your reflections based on my experience as an evaluator and on the collective experiences and lessons learned by evaluators over the last several decades.

The focus of this conference is on "measurements that deliver the right message". The conference program cites examples like writing for funders, communicating to internal decision-makers, and telling stories to stakeholders. Ultimately, this conference's theme reflects a concern with bringing information to life and making it usable and used.

Utilization is a long-standing preoccupation of evaluators – the community that I represent here today. The seminal work by Michael Quinn Patton dates back to 1978, almost 40 years ago. "Utilization-Focussed Evaluation" was the title of his book [1] which has seen four editions plus a shorter version published in 2012 [2]. This topic is alive and well in evaluation. The expression "Utilization-Focussed Evaluation" returns about 400,000 results on Google, not to mention more than 6,000 scholarly references to Patton's work.

Four decades of research on evaluation utilization have led to a number of firm observations, which I will attempt to summarize now.

First, for evaluation, or more generally for knowledge, to be used in decision-making, the organization must be ready to accept the challenge of questioning itself. There has to be permeability to new ideas and data. An organization founded on ideology and unwilling to put its premises to the test will simply not use performance information that contradicts its foundations.

Second, actionable knowledge is a derivative product: it is produced by a process of engagement that brings staff and managers into the evaluation process. Evaluation conducted in a totally arm's-length manner, without input from the organization itself, is much less likely to be used, unless the findings happen to correspond with the interests or expectations of a particularly influential group. Ideally, engagement is found at every step of the way: in defining the approach, in identifying methodologies and yardsticks, in designing the logic of the demonstration, in interpreting the observations, and in drawing conclusions.

Third, the universe of possible evaluation issues or social outcomes is always much larger than what is allowed by the practical limitations imposed on any feedback process (money, time, expertise, etc.). Therefore, it is crucial to identify the priority purposes for the feedback – and those are the concerns that staff, management and other stakeholders rank highest. Here again, the priorities for knowledge development must be identified through education and engagement.

Fourth, scattered collections of facts and figures are unlikely to get much use. Knowledge must be organized into stories that make sense to stakeholders and that respect the complexity of the world in which the organization evolves. It is all too easy for those at whom the knowledge is aimed to dismiss a counter-intuitive finding by pointing to a particular case that appears to lead to the contrary conclusion. To organize their work and to tell their stories, evaluators have used logic models, program theories, systemic maps, and other such tools, all of which aim to simplify the world without being swayed by particular cases, while still allowing important real-life interactions to be portrayed in their conclusions.

Fifth, communication to and with decision-makers is clearly a preponderant factor in the utilization of knowledge. There are several facets to communication. Timing is essential: communication of results must start early and be staged over time. Method is pivotal: while the final report remains a preferred vehicle, other methods are gaining ground, such as thematic summaries, newsletter articles, brochures, press releases, oral briefings, discussion groups, retreats, storyboarding sessions, creative presentations, debates, and so on. Adapting the content is also key: some audiences couldn't care less about measurement methods and want to focus on lessons learned, while others need to be convinced that the measurement is reliable before paying attention to the results.

Patton identified five principles of utilization-focussed reporting:

  1. Be intentional about reporting, that is, determine what you want the audience to take away and stay true to that purpose.
  2. Stay user-focussed: aim for the priorities of the intended user and answer their questions.
  3. Use graphics and other visuals to communicate succinctly and powerfully (something I have clearly not mastered).
  4. Prepare users for counter-intuitive findings.
  5. Distinguish dissemination from use: evaluators would say that dissemination is an output whereas use is an outcome.

Sixth, utilization must be facilitated with after-sale service or support: decision-makers may well use carefully crafted new knowledge produced through close engagement, but they may also need to be reminded of the implications of this knowledge for the conduct of their business. The evaluator or the social impact analyst does not own the knowledge, but having contributed profoundly to developing it, they are in a pivotal position to help apply it to various situations within the organization.

This last utilization factor posits a position of influence for the analyst. That stance can conflict with the level of independence we typically want to adopt. So how does one reconcile independence, influence, and responsiveness to organizational needs? Rakesh Mohan and Kathleen Sullivan [3] have proposed that these concepts are not mutually exclusive and that strategies are available to maximize both independence and responsiveness. These strategies include:

  1. identification of key stakeholders and their interactions;
  2. extensive consultation on needs and tailoring of analysis for a variety of stakeholders;
  3. appreciation and response to timing constraints;
  4. judicious use of authority;
  5. use of methods that produce information considered reliable from a broad range of perspectives;
  6. balanced reporting;
  7. use of professional standards to guide the work.

The last of these strategies is quite important. Evaluators as well as social impact analysts are open to criticism if they don't adhere to a clear set of well-established professional standards. It has taken evaluators decades to develop standards that are now officially accepted in North America, Europe, and Africa [4]. Essential elements of these standards relate to stakeholder engagement, independence, credibility of evidence, protection of the confidentiality of certain information, quality assurance, balance in analysis and reporting, and record keeping. Such standards constitute one of the three pillars of the Canadian Evaluation Society Professional Credentialing program, along with ethics guidelines and professional competencies.

So, what does that mean in relation to the theme of this conference? The presentation of data is a fundamental ingredient in the recipe for influencing decision-making and knowledge utilization. However, it is only one of several: I would contend that great presentation of poorly targeted evidence is less efficacious than poor presentation of well-chosen evidence. But I wholeheartedly agree that optimal utilization of evidence occurs when all the utilization drivers are aligned: openness, engagement, prioritization, storytelling, communication, and influence.

I want to leave you with one final thought. Utilization comes in a number of forms. Kusters and her associates identified seven types of uses [5]. They are:

  • Direct or instrumental use which has an immediate impact on decisions and actions. This may be the type of use that you had in mind when you entered this conference.
  • Conceptual use affects how one thinks about certain issues and approaches them.
  • Symbolic use affects how one justifies one's actions.

    These three uses are the ones most often cited, but four more types of use were also identified.
  • Process use stems from being involved in the knowledge creation process; it may mean adopting a new approach to decision-making.
  • Relational use changes how roles and functions are shared within the organization.
  • Value use means that fundamental goals, aspirations, or motivations are modified.
  • External use affects how individuals outside the organization adapt and the positions they adopt.

In considering how best to influence decision-making, it is important to remember that utilization comes in a multitude of forms. Management science also tells us that decision-makers are not looking for certainty; they strive for a reduction of uncertainty. And managers need information that is credible in their eyes and actionable in their lives.

Thank you for giving me this opportunity to talk to you today about how evaluators see the use of the knowledge they generate. I hope we get an opportunity to discuss how this applies to the world of social impact analysis.

SOURCES

[1] Patton, Michael Quinn (1978). Utilization-Focussed Evaluation, 1st edition. Sage Publications.

[2] Patton, Michael Quinn (2012). Essentials of Utilization-Focussed Evaluation. Sage Publications.

[3] Mohan, Rakesh and Kathleen Sullivan (2006). "Managing the Politics of Evaluation to Achieve Impact," New Directions for Evaluation, no. 112.

[4] Yarbrough, D. B., L. M. Shulha, R. K. Hopson, and F. A. Caruthers (2011). The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users, 3rd edition. Sage Publications.

[5] Kusters, Cecile, et al. (2011). Making Evaluations Matter: A Practical Guide for Evaluators. Centre for Development Innovation, Wageningen University & Research Centre, Wageningen, The Netherlands.
