Welch, T. (March 2000) 'SAIDE evaluates its evaluations' in SAIDE Open Learning Through Distance Education,
Vol. 6, No. 1, SAIDE: Johannesburg
SAIDE has conducted ten major evaluations in the last two years, and is likely to conduct as many again in the next two years. As a result, we felt the need to reflect in a structured way on our experience and deepen our understanding of evaluation research, so that we can do our work more effectively in the future, writes Tessa Welch.
We decided to conduct a meta-evaluation. We identified a sample of our evaluation reports and asked experienced external evaluators to assess them against a set of criteria.
We then synthesized the comments, and held an internal workshop in which we discussed the findings and received input from an evaluation expert, Ray Basson, from the University of the Witwatersrand.
What our meta-evaluators think
- Generally, reviewers were impressed with the contextualization of the evaluation reports, both locally and nationally. The SAIDE evaluators had reviewed the necessary literature and practice to enable them to make informed judgements about the programmes they were evaluating. Reviewers said that this, together with the professionalism of the reports in terms of comprehensiveness, coherence, and fairness, inspired confidence.
- In the main, readers felt that the purposes of the evaluations were explicit, except for one instance in which SAIDE had not been clear whether the work was meant to inform the improvement of the programme or result in generalizable statements of interest to researchers or policy makers in general. It was recognized that the very nature of evaluations means they evolve as they progress. At a certain point in the process, however, a formal contract with an agreed scope of work, closely tied to the evaluation proposal, is necessary to avoid confusion about the purpose of the evaluation.
- The impression gained by reviewers was that SAIDE has an open and professional relationship with the programmes being evaluated. This impression was reinforced by the apparent willingness of staff and students to participate in the evaluation. Although it was seen as positive that, when evaluating, SAIDE played a facilitative, supportive, and collegial role, one reader said that careful consideration needs to be given to balancing this role and attitude with being sufficiently critical. If SAIDE is too helpful and too friendly to the projects it evaluates, it detracts from the hermeneutic of suspicion within which evaluations should operate.
- Readers commented that generally a good mix of methods was used in evaluations, enabling triangulation of findings. They also commented on honesty in reporting data and describing the limitations of the research. However, some readers felt that a weakness was lack of discussion of possible alternative approaches. This had the effect of making the chosen approach seem self-evident, sensible, and natural.
- Although it was felt that the reports were likely to facilitate and encourage changes, reviewers said that SAIDE needs to be clearer about the type of decisions that may result from an evaluation. A common thread in the evaluations was the promise/hope that they would lead to improvements in the programme, but it was not always clear how evaluations would actually help practitioners and/or decision-makers.
- Most readers commented that, although there was evidence of informal quality assurance in the evaluation process (such as ongoing discussions between the programme director and evaluator, or getting interviewees to read interviews for accuracy), there could have been more carefully planned opportunities for stakeholder feedback.
Input from Ray Basson
In response to a question asking for a description of alternative evaluation methods, he outlined a range of approaches.
In response to a question about the evaluator's voice, he said:
There are two ways of approaching the writing. The first focuses on what the data is compelling the evaluator to see, and conclusions are driven by the data. The second (in which the evaluator's voice is more pronounced) is when the evaluator feels compelled to make additional remarks or disclaimers about what is emerging from the data. There may be inconsistencies or anomalies in the data, and the evaluator articulates these with alternative viewpoints and/or disclaimers.
In response to a question about how we can ensure sufficient rigour in our analysis, he said:
This depends on two things: the volume of data and how the data is coded. Some software packages, like NUD*IST, automate some of the search functions of manual coding, and large volumes of data make such automated techniques more viable. Manual coding techniques are outlined in Miles and Huberman and involve coding on a line-by-line basis. One can use the evaluation question as the starting point, examining what the data tells you in order to develop codes from which the criteria emerge. Alternatively, one can start with developed criteria and use these to code, and hence analyse, the data.
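To make the contrast between manual and automated coding concrete, the following is a minimal sketch, in Python, of the kind of keyword-based search-and-code pass that packages such as NUD*IST automate. The criteria, keywords, and transcript extracts are hypothetical illustrations, not SAIDE's actual codes, and real coding involves iterative judgement rather than simple keyword matching.

```python
# Minimal illustration of coding interview data line by line.
# The criteria and keyword lists below are hypothetical examples.
from collections import defaultdict

CRITERIA_KEYWORDS = {
    "learner_support": ["tutor", "feedback", "contact session"],
    "course_materials": ["module", "study guide", "materials"],
    "assessment": ["assignment", "exam", "marking"],
}

def code_transcript(lines):
    """Attach a criterion code to every line that mentions one of its keywords."""
    coded = defaultdict(list)
    for number, line in enumerate(lines, start=1):
        text = line.lower()
        for code, keywords in CRITERIA_KEYWORDS.items():
            if any(keyword in text for keyword in keywords):
                coded[code].append((number, line.strip()))
    return coded

if __name__ == "__main__":
    transcript = [
        "The tutor gave useful feedback on my first assignment.",
        "The study guide arrived late, so I fell behind.",
        "Contact sessions helped me prepare for the exam.",
    ]
    for code, extracts in code_transcript(transcript).items():
        print(code)
        for number, extract in extracts:
            print(f"  line {number}: {extract}")
```

Working in the other direction, one would start from the evaluation questions and let the codes emerge from the data, rather than fixing keyword lists in advance as this sketch does.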
SAIDE's response
In the discussion that followed, SAIDE staff took up a number of these issues.
The first question raised was whether methodological diversity is possible within budgetary constraints. Most alternative approaches demand more time and money than SAIDE's clients are able or willing to pay. In addition, the length of the report has implications for project planning, and these must be taken seriously.
While it is clear that we have the knowledge and practice base to make informed judgements about programmes, we need to become more aware of evaluation methods and theoretical approaches.
A number of other questions were raised more briefly in the discussion.
Conclusion
This meta-evaluation process is ongoing within SAIDE. It is supported by the work of the Resource Centre in cataloguing and classifying our evaluation reports, as well as by the team in collecting examples of project proposals, evaluation instruments, and descriptions of evaluation approaches, for use as we embark on new evaluation research.
A focus for this work in 2000 will be developing mechanisms for obtaining systematic client feedback, so that we can incorporate these insights into a project as it evolves, or use them to inform future projects. We would also like to develop ways of tracking how clients implement the recommendations made in our evaluations after a project is completed.