Welch, T. (March 2000) 'SAIDE evaluates its evaluations' in SAIDE Open Learning Through Distance Education, Vol. 6, No. 1. Johannesburg: SAIDE

SAIDE evaluates its evaluations

SAIDE has conducted ten major evaluations in the last two years, and is likely to conduct as many again in the next two years. As a result, we felt the need to reflect in a structured way on our experience and deepen our understanding of evaluation research, so that we can do our work in a more effective way in the future, writes Tessa Welch.

We decided to conduct a ‘meta-evaluation’. We identified a sample of our evaluation reports and asked experienced external evaluators to evaluate them in terms of:

We then synthesised the comments and held an internal workshop in which we discussed the findings and received input from an evaluation expert, Ray Basson, from the University of the Witwatersrand.

What our meta-evaluators think

Input from Ray Basson
Ray Basson shared with us some insights from a paper entitled Curriculum Evaluation - Possibilities Proffered by Qualitative Approaches (available in the SAIDE Resource Centre). In this paper, he outlines five general approaches, showing their basic similarities, and offers a set of guidelines to apply to evaluation approaches.

In response to a question asking for a description of alternative evaluation methods, he mentioned the following:

In response to a question about the evaluator’s voice, he said:

There are two ways of approaching the writing. In the first, the focus is on what the data compels the evaluator to see, and conclusions are driven by the data. In the second (in which the evaluator's voice is more pronounced), the evaluator feels compelled to make additional remarks or disclaimers about what is emerging from the data. There may be inconsistencies or anomalies in the data, and the evaluator articulates these, offering alternative viewpoints and/or disclaimers.

In response to a question about how we can ensure sufficient rigour in our analysis, he said:

This depends on two things: the volume of data and how the data is coded. Some software packages, like NUD.IST, automate some of the search functions of manual coding, and large volumes of data make such automated techniques more viable. Manual coding techniques are outlined in Miles and Huberman and involve coding on a line-by-line basis. One can either start from the evaluation question, examining what the data reveals in order to develop codes from which criteria emerge, or one can start with developed criteria and use these to code and hence analyse the data.
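The second, criteria-first route Basson describes can be sketched in a few lines of code: each line of data is tagged with every predefined criterion whose indicators it matches. The criteria, keywords, and transcript below are hypothetical illustrations only, not drawn from SAIDE's instruments, and real qualitative coding is of course interpretive rather than mechanical keyword matching.

```python
# Minimal sketch of criteria-driven, line-by-line coding.
# All criteria, keywords, and data are hypothetical examples.
from collections import defaultdict

# Predefined criteria, each with illustrative indicator keywords.
CRITERIA = {
    "learner_support": ["tutor", "feedback", "support"],
    "materials_quality": ["materials", "module", "workbook"],
    "access": ["cost", "distance", "access"],
}

def code_lines(lines):
    """Tag each line with every criterion whose keywords appear in it."""
    codes = defaultdict(list)
    for number, line in enumerate(lines, start=1):
        lowered = line.lower()
        for criterion, keywords in CRITERIA.items():
            if any(word in lowered for word in keywords):
                codes[criterion].append(number)
    return dict(codes)

transcript = [
    "Tutors gave detailed feedback on every assignment.",
    "The workbook arrived late and the modules were dense.",
    "Travel costs made contact sessions hard to attend.",
]
print(code_lines(transcript))
# → {'learner_support': [1], 'materials_quality': [2], 'access': [3]}
```

Packages like NUD.IST essentially automate this kind of searching and retrieval over much larger bodies of data, leaving the interpretive work of defining and refining criteria to the evaluator.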

SAIDE’s response

In the discussion that followed, SAIDE staff took up a number of these issues.

The first question raised was whether methodological diversity is possible within budgetary constraints. Most alternative approaches demand more time and money than SAIDE’s clients are able or willing to pay. In addition, the length of the report has implications for project planning, which must be taken seriously.

While it is clear that we have the knowledge and practice base to make informed judgements about programmes, we need to become more aware of evaluation methods and theoretical approaches.

Other questions raised more briefly in the discussion included:

  1. Where there are multiple authors, how can we best impose a report structure that takes account of themes emerging in analysis?
  2. How do we avoid using contested terms like 'open learning' or 'outcomes-based education' unproblematically, as if they were assumed universals? How does this relate to engaging in greater depth with the broader literature?

Conclusion

This meta-evaluation process is ongoing within SAIDE. It is supported by the work of the Resource Centre in cataloguing and classifying our evaluation reports, as well as by the team in collecting examples of project proposals, evaluation instruments, and descriptions of evaluation approaches for use as we embark on new evaluation research.

A focus for this work in 2000 will be developing mechanisms for obtaining systematic client feedback, so that we can incorporate these insights into a project as it evolves, or use them to inform future projects. We would also like to develop ways of tracking how clients implement the recommendations made in our evaluations after project completion.



South African Institute for Distance Education (SAIDE)
Uploaded on: 5 July 2000
www.saide.org.za/worldbank/Default.htm