Tips and tools in Program Evaluation
Larraine J Larri

                          … from my experience as an evaluator and educationist
                          over 30 years supporting public sector agencies and NGOs
                          to plan, scope, commission and undertake high-quality
                          program evaluation.

                          In this handout you’ll find:

                               links to useful websites;
                               book titles; and
                               some of my favourite diagrams.

OCTOBER 2012 © LJ LARRI, RENSHAW-HITCHEN & ASSOCIATES
1. Websites

         Better Evaluation - An international collaboration to improve evaluation practice
         and theory by sharing information about options (methods or tools) and approaches.

                http://betterevaluation.org/

         Australasian Evaluation Society – the professional association for evaluators in
         Australia and New Zealand; includes resources, links, a listing of evaluation
         consultants, and papers and presentations from past annual conferences.

           Home Page: http://aes.asn.au/
           Useful links to resources:
               http://aes.asn.au/resources/useful-links-to-evaluation-resources/other-useful-websites/38-other-useful-websites.html

         RMIT CIRCLE - Collaboration for Interdisciplinary Research, Consulting and
         Learning in Evaluation (based at RMIT University) links those interested in improving
         the theory and practice of evaluation.

                http://www.rmit.edu.au/browse;ID=1yztov13rke9z

   2. Some books

                                                 Funnell, Sue and Rogers, Patricia J.

                                                 Purposeful Program Theory: Effective Use of
                                                 Theories of Change and Logic Models

                                                 Jossey-Bass Wiley, 2011

                                             Preskill, Hallie and Catsambas, Tessie Tzavaras

                                             Reframing Evaluation Through Appreciative
                                             Inquiry

                                             Sage Publications, 2006

3. Program Theory – describing your program
              What are your program’s inputs, activities or through-puts & outputs, and outcomes?

                 Diagram: Efficiency, effectiveness and value for money

                 INPUTS → (through-puts) → OUTPUTS → OUTCOMES
                                                     • Immediate
                                                     • Intermediate
                                                     • Long-term

                 • Efficiency = maximising output for input
                 • Effectiveness = maximising outcome for output
                 • Value for money = maximising outcome for input
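These three ratios can be illustrated with a small worked example. All of the figures below are invented purely for illustration (100 units of input, 80 outputs, 60 outcomes are assumptions, not drawn from any real program):

```python
# Illustrative only: made-up figures for one hypothetical program.
inputs = 100.0    # resources invested (e.g. staff hours, dollars)
outputs = 80.0    # units of service delivered
outcomes = 60.0   # units of measured change achieved

efficiency = outputs / inputs        # maximising output for input
effectiveness = outcomes / outputs   # maximising outcome for output
value_for_money = outcomes / inputs  # maximising outcome for input

# Note that value for money is the product of the other two ratios.
assert abs(value_for_money - efficiency * effectiveness) < 1e-9

print(f"Efficiency:      {efficiency:.2f}")
print(f"Effectiveness:   {effectiveness:.2f}")
print(f"Value for money: {value_for_money:.2f}")
```

The point of the decomposition is that a program can be efficient (lots of output per input) yet still offer poor value for money if those outputs do not convert into outcomes.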

   Logic models provide systematic, visual ways of showing how a program is meant to work.
   They explicitly describe the underlying assumptions of cause and effect (if this, then
   that): the way lower-level results lead to each successive level to achieve the ultimate
   results. All the program activities, and the ways in which they contribute to achieving
   the desired results, are clearly and concisely shown.

   Evaluators use logic models to understand a program’s theory of action during the initial
   ‘focusing’ stages of an evaluation. There are often enduring benefits when a logic model is
   developed for a program. Program managers find that they have a tool that supports
   program implementation and re-design.

Understanding your program’s theory of action helps to:

        bring your strategy (or program) concepts and dreams to life
        describe the way you hope your strategy (or program) will work
         show the sequence of events (or activities) and the inter-relationships, i.e. how it all
          fits together
        give everyone a picture of how the strategy (or program) would function

A way of showing the logic by using a table … it’s called an ‘Outcomes Hierarchy’

               A map that shows the short-term, mid-term and longer-term results that you hope to
               get. This map shows how you intend to link each of the results and what needs to
               happen first before something else can be achieved, i.e. this, therefore that (cause-
               and-effect relationships, causal links). This is called an Outcomes Hierarchy.

                   Why does your program exist? What situation or need is it addressing?
                    Where did you get this knowledge from? – put it at the bottom of the
                    hierarchy to show that it underpins everything.
                   What is the great hope / vision / ultimate / big-picture outcome? How does
                    this fit into any strategic directions? – put this at the top to show what you’re
                    aiming for.
                   Where does your program start? And then what happens? – the bits in
                    between show how the program or project is meant to be implemented.

                             Column 1 – the Outcomes Hierarchy

   LONG-TERM / ULTIMATE GOALS     Systemic change; big-picture stuff

   INTERMEDIATE OUTCOMES          Observable changes in people or systems as a
                                  result of activities

   SHORT-TERM OUTPUTS             Activities that involve people in the program,
                                  the ‘busy’ work

   NEED FOR PROGRAM               Situational analysis; background information

If you want to spend more time planning and analysing how your project or program
      will work, you can add these columns to each outcome.

              What should this result look like if we are successful? See Column 2.
              What might affect this result, either positively or negatively, and how much control
               do we have over these factors? See Columns 3 & 4.
              What will be done to achieve this result? See Column 5.
              What are the most important aspects of this result that need to be measured so that
               we can be efficient and effective in achieving the result? See Column 6.
              Are there any comparisons that help us to better understand this result so that we can
               learn more? E.g. before and after; how the same thing operates in 5 different locations;
               how different target groups have responded. See Column 7.

     Column 2: SUCCESS CRITERIA, DEFINITIONS AND EXPLANATIONS
     Column 3: FACTORS THAT AFFECT SUCCESS THAT the Program / Unit CAN INFLUENCE
     Column 4: FACTORS THAT AFFECT SUCCESS THAT the Program / Unit CAN’T INFLUENCE
     Column 5: ACTIVITIES TO ADDRESS FACTORS THAT AFFECT SUCCESS
     Column 6: PERFORMANCE INFORMATION
     Column 7: COMPARISONS

      Example:

                                     CarbonKids Program Pilot Evaluation
      See http://www.csiro.au/resources/CK-final-evaluation-report.html
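For anyone who prefers to keep this planning table in a structured, machine-readable form rather than a document, one row of the table could be sketched as a simple record. This is purely a hypothetical illustration: the field names mirror Columns 2–7 above, and the example values are invented, not taken from any actual evaluation:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class OutcomeRow:
    """One row of the outcomes-hierarchy planning table (Columns 2-7)."""
    outcome: str                         # the result itself (from Column 1)
    success_criteria: str                # Column 2
    factors_can_influence: List[str]     # Column 3
    factors_cannot_influence: List[str]  # Column 4
    activities: List[str]                # Column 5
    performance_information: List[str]   # Column 6
    comparisons: List[str]               # Column 7

# Hypothetical example row for an intermediate outcome.
row = OutcomeRow(
    outcome="Teachers embed climate science in lesson plans",
    success_criteria="Half of participating teachers use the materials each term",
    factors_can_influence=["quality of training workshops"],
    factors_cannot_influence=["school staffing changes"],
    activities=["run workshops", "provide curriculum resources"],
    performance_information=["count of lesson plans using the materials"],
    comparisons=["before vs after training", "across pilot schools"],
)
print(row.outcome)
```

Keeping one record per outcome makes it easy to check that every result has success criteria, activities and performance information attached before data collection begins.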

4. The evaluation approach you use will depend on the
      questions you are asking about your program
            Questions you would like answered about your program
            Where do they fit in relation to the types of evaluation listed below?

 TABLE 1: TYPES OF EVALUATION APPROACHES, KEY QUESTIONS AND FOCUS

    1.     Design Evaluation
           Key evaluation question: Is this the right approach?
           Evaluation Focus:
           To assess the feasibility of new programs by drawing on feasibility studies,
           literature reviews, economic appraisals, relevant research findings as
           supporting evidence for new program proposals.
     2.    Process or Implementation Evaluation
           Key evaluation question: How well are we going?
           Evaluation Focus:
           To determine how well a program is being implemented and operates,
           whether it is efficient and has provided the service/s as intended. This
           includes monitoring, routine review, or ongoing evaluation, i.e. regular data
           collection on a program or service’s activities and outputs.
     3.    Result, Impact or Outcome Evaluation
           Key evaluation question: Did we make a difference?
           Evaluation Focus:
           To determine the effectiveness of the program and the ways in which it is
           achieving the intended results for the client group. Often impacts are both
           quantitative and qualitative.
     4.    Economic Evaluation
           Key evaluation question: Are we getting / did we get value for money?
           Evaluation Focus:
           To determine the value (benefits over costs), and cost effectiveness of a
           program. This may include using methods that include: cost-effectiveness,
           cost-utility, cost-benefit analysis, and social return on investment.
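The core arithmetic behind a cost-benefit analysis is a benefit-cost ratio and a net present value. The minimal sketch below uses an assumed 5% discount rate and invented cash flows; none of the figures come from this handout, and a real economic evaluation would involve far more than this:

```python
def npv(rate: float, cash_flows: list) -> float:
    """Net present value of cash flows, one per year starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Invented figures: a program costing 100 up front that returns
# benefits of 40 per year for the following three years.
costs = [100.0, 0.0, 0.0, 0.0]
benefits = [0.0, 40.0, 40.0, 40.0]
rate = 0.05  # assumed discount rate

pv_costs = npv(rate, costs)
pv_benefits = npv(rate, benefits)

bcr = pv_benefits / pv_costs  # benefit-cost ratio: > 1 means benefits exceed costs
net = pv_benefits - pv_costs  # net present value of the program

print(f"Benefit-cost ratio: {bcr:.2f}")
print(f"Net present value:  {net:.2f}")
```

Discounting matters because a dollar of benefit in year 3 is worth less than a dollar of cost today; with these invented numbers the program still comes out slightly ahead.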

5. Typical steps in planning and implementing an evaluation

             Below are two explanations of ‘Participatory Evaluation’
            What are the typical steps in planning an evaluation? … list

               Excerpt from Participatory Evaluation: What is it? Why do it? What are the challenges?,
               Community-Based Public Health Policy and Practice, Issue 5, April 2002, see
               http://depts.washington.edu/ccph/pdf_files/Evaluation.pdf

               Excerpt from Performance Monitoring and Evaluation TIPS, USAID Center for Development
               Information and Evaluation (1996, Number 1), see http://pdf.usaid.gov/pdf_docs/PNABS539.pdf

            It’s important to have an evaluation that responds to key stakeholder needs.
             Who are the different stakeholder groups involved? …use these table
             headings

                 Table headings:
                 • Key Stakeholder Group
                 • Examples from my program / project
                 • What data they need to know in relation to the program and how they would use it

 How much involvement / participation of different stakeholders do you want?
             For each of the typical steps in planning an evaluation, now consider
             possible strategies you could use for participation, the benefits and
             challenges. Here are some ‘Guiding Principles’ to help you.

              Participatory evaluation approaches can be empowering, educational tools for
              community partnerships that can be used to ensure that evaluations address locally
              relevant questions, contribute to improving program performance, and support the
              development of sustainable partnerships. More importantly, the approach is focused
              on building the capacity of individuals and teams to carry out all steps in an
              evaluation process. In this respect, participatory evaluation can contribute to
              empowering communities to act and create change within their neighbourhoods,
              community organizations and local governmental institutions.

                                   Excerpt from Participatory Evaluation: What is it? Why do it? What are the challenges?,
                                                Community-Based Public Health Policy and Practice, Issue 5, April 2002,
                                                            see http://depts.washington.edu/ccph/pdf_files/Evaluation.pdf

What to put into an Evaluation Plan (or Terms of reference for an
          evaluation) – start by working on your own, or with others if they are
          part of your team at work …
                Below are typical headings for the parts of an Evaluation Plan,

             Have a go at putting some of your information against each of these
              headings
              If you don’t know what to put in, perhaps you know who to ask, or it’s likely
               there is a document that gives you the words you need, e.g. a business or
               operational plan.

    Purpose and Objectives
       What do you hope to achieve by doing this evaluation?
   Description of the program ‘model’
      How is your program meant to work? An outcomes hierarchy is useful here.
                  Program Logic
   Key evaluation questions and relationship to type of evaluation
      Questions focus the evaluation in a range of ways. They are a way of clarifying
      everyone’s expectations.
   Scope and limitations
      Helps to put boundaries around the evaluation project - what’s in and what’s out. Could
      include time period, geographical area, dimensions of the stakeholder participation,
      factors that may constrain or limit the project.
                Participatory and Empowerment Evaluation
   Key stakeholders
      A list of people to consider or consult who have a role and/or opinion about the program.
   Data sources and collection plan
      An overview of data that the program routinely collects, any additional data that the
      evaluation may need, how the data will be collected, how it will contribute to answering
      the questions.
             Data Matrix, qualitative and quantitative data collection strategies;
                  Appreciative Enquiry; Most Significant Change
   Ethical considerations
      e.g. confidentiality, privacy, security and data management, informed consent
   Products and reporting plan – planned dissemination, disclosure and use of results
      Consider who will be reading the report/s – the audience; and what their information
      needs are.
   Budget and resources
      Financial and human resources required and the source of the budget.
   Risk analysis
      A list of potential risks and mitigation strategies.
   Evaluation project management and governance
      A project plan with stages / phases, deliverables, person responsible, who gives the final
      sign-off, timeline / due date.
    Learning and Development Plan
       Organisational learning and development strategies that build capacity in evaluation
       through supporting staff to implement the M&E Plan, including data analysis and
       evaluation reporting skills.
    Communications plan
       Who needs to know, what they need to know, when they need to know it, how the
       information will be communicated, and who is responsible for making this happen.

What you need to know to develop a Monitoring and Evaluation Plan
          What is your organisation’s current capability in evaluation? …

               A Monitoring and Evaluation plan is similar to an Evaluation plan. Its
               purpose is to describe how you collect and use your own program’s
               performance data for monitoring, reporting and managing; and to
               explain how evaluation events will be managed.

                The aim of monitoring and evaluation is to provide data in a systematic
                way so that it contributes to ongoing quality assurance and innovation.
                The diagram below shows how organisations can use data feedback
                loops to inform organisational learning.

             When you have heard the explanation about this diagram, decide where you
              would rate your organisation on the continuum.

                  Diagram: Towards a systemic approach to
                      quality performance management

     A continuum from haphazard / ad hoc to systematic:

     • Serendipity
     • Quality Control – achieving outputs
     • Quality Assurance – achieving outputs with feedback for continuous improvement
     • Continuous Innovation – continuous improvement with critical reflection and
       breakthroughs

     Adapted from Laurie Field’s diagram (1997), “Becoming more like a learning organisation”

More steps in planning an evaluation …

             Here are some more steps to consider when planning an evaluation. Can
              you fit them with your previous steps? Where should they go?
             Is there enough information in your draft Evaluation Plan for you to be able to
              understand what is involved in these steps?

   1. Propose your program for evaluation and gain approval from your
      supervisor.
   2. Identify who is responsible for the evaluation.
   3. Form a steering committee and/or a program evaluation team to oversee
      the project.
   4. Use Program Logic to get an overview of the program.
   5. Draft the Evaluation Plan which may also be a ‘terms of reference’.
   6. Decide who should conduct the program evaluation and if an external
      evaluator is needed.
   7. Conduct the program evaluation.
   8. Communicate results.
   9. Use the results to improve the program and contribute to evidence-based
      decision-making.

          Deciding who should conduct the Program Evaluation …

               Read the information in the box below and …

             Discuss each of the 3 factors listed. How would you determine level of skill;
              capacity to undertake evaluation; and degree of independence?
             Consider whether you need to engage an external evaluator, and whether it would
              be for the whole evaluation or only for some components.

The decision about who conducts the evaluation involves discussing the
following factors:

   ■   Level of skill required for all elements of the evaluation
   ■   Capacity of the program staff to undertake evaluations
   ■   Need for level of independence from the organisation

If you decide to engage an external evaluator, your organisation will have its
own procurement procedures that you are required to follow. These
procedures describe the steps that need to be followed when purchasing
services from contractors and consultants and should be read before making
any purchasing decisions. Legal advice is generally also available in relation to
drafting a contract.
