International Network for the Availability of Scientific Publications
Health Information Forum: Working together to improve access to reliable information for healthcare workers in developing and transitional countries

Workshop 9: Monitoring and evaluation of health information activities

British Medical Association, London, Tuesday 18 January 2000, 4-6pm

Chair: Dr Muir Gray (Director, Institute of Health Sciences, Oxford)
The meeting opened with three items of Health Information Forum (HIF) business:
Dr Muir Gray opened the meeting by linking the Euston Road area with the social revolutions brought about by the railways in the 1850s and now by the knowledge industry. The Internet is a powerful instrument of social change, and the information it carries therefore needs to be carefully assessed. Monitoring and evaluation, the theme for the day, are thus of great importance.

Speaker 1: Barbara Stilwell (Behavioural Scientist, Organization of Health Service Delivery, WHO)

Dissemination, adaptation and use of information from WHO: a systemic approach

Most programmes in WHO have two ways of disseminating information: (1) producing guidelines and sending them to extensive mailing lists, or (2) setting up training courses to introduce new information. A variety of training approaches has been developed and used in WHO, although not systematically evaluated for effectiveness. Most of the evidence of effectiveness in WHO training is anecdotal; while this has its own value, more rigorous evaluation methods would provide firm evidence on which to build future policies and methods.

All programmes agree that health workers' clinical practice should be changed by their training interventions and by the dissemination of new knowledge, but evaluations of outcomes do not routinely include practice-based assessments. Research from others indicates that information alone will not change practice (Haines and Donald, 1998). WHO provides many examples of best-practice guidelines, with guidance on implementation (usually through training). Given the interest in evidence-based practice and policy, this is likely to become more widespread, and information about the effectiveness of the dissemination and use of these materials is urgently needed.

There is a consensus among programmes that it is essential to know the learning needs of those who are to be taught, in relation to their practice, and needs assessments are routinely done before training. Again, however, there is no consistency in the methods of needs assessment; evaluations of training outcomes are carried out, but generally only of knowledge gained and satisfaction with the course.

Systemic approaches to developing capacity

Given what is known about success in changing practice and influencing decision making, and the case-based reporting in WHO and other agencies, a systemic approach to education and training is likely to be the one that most influences change in practice. The systemic approach recognizes that change in one part of a system will influence, and be influenced by, the other parts of that system (Senge, 1990). It builds needs assessment and evaluation into a continuous cycle (see diagram), so that evaluation re-informs the processes of cause identification and change. In this framework the management of change processes features clearly both when needs are analyzed and when system changes are implemented. This means that the context of change is managed, with 'buy-in' from those who must implement and sustain changes.
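As an illustrative aside, the continuous cycle just described can be sketched as a simple feedback loop. This sketch is not part of WHO's framework, which is a conceptual diagram rather than software; every name below is hypothetical and chosen only to make the feedback structure concrete.

    # A minimal, hypothetical sketch of the systemic cycle described above:
    # needs assessment -> change implementation -> evaluation, with the
    # evaluation findings feeding back into the next needs assessment.
    # Every name here is invented for illustration.

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class CycleState:
        """Carries findings forward so evaluation re-informs needs analysis."""
        needs: List[str] = field(default_factory=list)
        changes: List[str] = field(default_factory=list)
        findings: List[str] = field(default_factory=list)


    def assess_needs(state: CycleState) -> None:
        # Needs analysis draws on the previous round's evaluation findings,
        # which is what makes the cycle systemic rather than linear.
        state.needs = ["practice gap: injection safety"] + state.findings


    def implement_change(state: CycleState) -> None:
        # Training or dissemination is designed around the assessed needs,
        # with buy-in from those who must implement and sustain the change.
        state.changes = [f"intervention addressing '{n}'" for n in state.needs]


    def evaluate(state: CycleState) -> None:
        # Evaluation looks at practice-based outcomes; its findings become
        # input to the next needs assessment, closing the loop.
        state.findings = [f"residual gap after {c}" for c in state.changes]


    state = CycleState()
    for _ in range(2):  # each pass of the loop re-informs the next
        assess_needs(state)
        implement_change(state)
        evaluate(state)
        print(state.findings)

The only point the sketch makes is structural: evaluation output is an input to the next needs assessment, not a terminal report.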
Application

This systemic model of implementing change in a system or an individual is currently being tested within WHO programmes. Figure 2 (available on request to < [email protected] >) shows the design of a project to increase injection safety through the use of information gathered in a 'toolkit'. The toolkit will be developed in partnership with the countries in which it will be used over a year, and feedback processes will be built into the work of the project, so that each country where it is field-tested will be able to suggest changes and new information. This work and other similar projects will take place from now until the end of 2001, and it is anticipated that findings will be published to stimulate discussion and further input in this important area.

References

Haines A, Donald A (1998). Getting Research Findings into Practice. BMJ Books, BMJ Publishing Group, London.
Senge P (1990). The Fifth Discipline. Random House, London.

Speaker 2: Andrew Chetley (Communications and Information Manager, Healthlink Worldwide)

Experiences of evaluation

Healthlink Worldwide sees monitoring and evaluation as fundamental to our ability to learn, to develop, to grow, to improve and to increase the impact of the work we do. We have a long history of evaluating project activities and monitoring the use, and more recently the impact, of health information. This whole process gives us valuable insight into how people use the materials we produce, how they interact with the information services we support, and what information needs are not being met. I am going to summarise briefly a few of the evaluation exercises we have been involved with in recent years. Three of them look at ways of measuring the effectiveness of work with resource centres or information centres, and two look at specific information materials.

Tanzania: creating demand

In Tanzania, Healthlink Worldwide has worked in partnership with the Centre for Educational Development in Health, Arusha (CEDHA) and the Continuing Education Programme of the Tanzanian Ministry of Health for more than eight years. Work started with an information needs assessment, included training and support for infrastructure development, and began to improve access to health information for health workers at district level through the development of zonal and, ultimately, district-level resource centres.

When the project was evaluated, most of the objectives set out in the project documents had been achieved. What we said we would try to do was, for the most part, done. But there was one problem: we had helped establish a group of resource centres, we had helped train staff to run them, and they were well supplied, but they were under-used. Why? The evaluation identified that prevailing concepts and styles of teaching and learning were acting as a barrier to use of the centres: there was a strong dependency on lectures and little encouragement of independent learning. To really have impact, we needed to explore a closer integration of information provision and training. Some of that is now being attempted in Tanzania. The lesson being learned here is that having a clear understanding of information needs, and being prepared to create a demand for information and to support the capacity to respond to that demand, is as important as simply supplying information. The lesson is also about being engaged in a continual learning exercise, so that experience influences the next stage of the process.

Namibia: it takes time

The lessons from Tanzania have already been incorporated into work with a pilot project in Namibia, the Communication for Integrated Learning Project with the Ministry of Health.
Again, a careful needs assessment was carried out, resource centres were established and promoted as part of a national training programme to build health worker capacity in the country, and support and training were given for the development of national health learning materials. The evaluation of the first three-year period found considerable success, again meeting most of the objectives set out in the project plans. It attributed much of the success to the alignment of the project's objectives with national priorities. However, there were difficulties related to a high turnover of national staff and to the slow provision of basic infrastructure within the country. The evaluation called for the project work to be extended for at least two more years, to provide more time to test its impact properly. An extension is currently being discussed. The lesson being identified here is that health information work requires time to be effective, and that it cannot function independently of other activities within the health care system, or sometimes of other sectors such as transport and communications.

Kenya: increasing participation

In Kenya, Healthlink Worldwide's AIDS and Sexual Health Programme has been working with the Kenya AIDS NGO Consortium (KANCO) for more than five years. The partnership has involved dissemination of information through the development of an East African edition of AIDS Action, and through support for developing and strengthening a central resource centre and eight district resource centres. Two evaluations in the past three years have contributed much to learning about how to increase the impact of this work. The first was a review of the work of the main centre in 1997; the second was an external evaluation of the AIDS and Sexual Health Programme in 1999, which used KANCO as a case study. One of the recommendations of the first study was to increase the participation of people at district level in the development and production of information materials, and to make extensive use of the resource centre as a networking tool. This approach was confirmed by the 1999 review, which stressed the importance of focusing on how to improve communication strategies with partners, rather than simply improving knowledge or information transfer. The lesson being learned here is the importance of seeing the information work of Healthlink Worldwide (and of its partners) in the context of an education or communication process. The tools that are developed, and the vehicles used to deliver information, will only be effective if they operate within a well-defined communication strategy and where the lines of communication within the society are well understood.

Focus on end users

One of Healthlink Worldwide's major activities, indeed the one for which most people probably know the organisation best, is the production of a range of practical publications, including four regular newsletters in 10 languages that reach an estimated 2 million readers. Over the years there have been continual surveys and evaluations of these publications to learn more about who is using them and how they are being used. Readers are surveyed every two to three years through mail questionnaires (which achieve a return rate of 20% or better), through focus groups, through key informant interviews, and as part of any information needs assessments. Over the years, we have discovered that our newsletters:
This last point, training, is perhaps the most interesting one. When the newsletters were first developed, they were seen as providing isolated health and community workers with a link to practical information. As we learned that up to 75% of readers surveyed use the newsletters for training or education purposes, we began to change the content, style and format of the newsletters so that they could be used more effectively for training. In doing this, we were paying attention to the lesson highlighted in the recent AIDS and Sexual Health Programme evaluation, which is to
Increasing impact

This was also a lesson we identified in a study undertaken for the World Health Organisation in 1998/99, in which we explored the use and impact of materials aimed at strengthening the development of health systems. A key finding of that study was that, even where information was available,
Where end users were involved in the preparation of the material, through a review process, a writing or adaptation workshop, or a training workshop to explain the contents in more detail and in a practical manner, people found the materials more useful, were more likely to use them, and were more likely to adapt or recommend them.

Discussion groups

Three separate discussion groups took place, each with a specific question to address.

1. How can we do better at sharing the lessons learned from M & E? How are the results of M & E shared at the moment, and how might this be improved?

Results tend to be shared within the organization concerned, but not externally. This may be partly because the potential value of the results to outside organizations is not recognized by the investigators, and partly because of the lack of a culture of sharing both positive and negative experience with others. Although some M & E is shared externally, the data shared tend to be just the end results rather than the methodology of the learning process. Positive results may well have a wide circulation, but negative results or non-results do not, presumably because, in the current culture, admission of negative results is seen as 'failure'. It was suggested that M & E results might be anonymized.

At a practical level there is a lack of documentation on M & E. Such documents are often part of the 'grey literature' (unpublished literature) and may not be recorded, so the profile of experience in this area is low. One of the problems associated with M & E is that it is frequently tagged on at the end of a project, rather than being an integral part of it from the start. This tendency is exacerbated in projects where no one person has overall responsibility for the whole project or process, including the dissemination of results and their evaluation. Project implementation may also be hurried, and M & E therefore ignored. Both qualitative and quantitative aspects are important, and both are difficult to collect. There is also evidence that good quality alone cannot change practice, although, as pointed out by the Chair, access to reliable information is a prerequisite. Further ideas and recommendations might be found in:
In conclusion, there is a need to share the results of projects with users as well as with other practitioners, and HIF's recently launched Evaluation Action Group (EVAG) and e-mail discussion list are one step towards these objectives.

2. What kind of health information activities need M & E most, and how should this be done?

Some health information initiatives are designed to support changes in behaviour, others to inform; the evaluation of all health information activities is important, though some may currently be more neglected than others. Information is presented in a number of different formats, e.g. training courses, conferences, publications and web sites, so the process as well as the outcome needs to be evaluated. Some quantitative information is relatively easy to collect, e.g. the number of publications sent out or the number of attendees, but this type of feedback bears little relation to the objectives of most information projects. Qualitative as well as quantitative information (where possible) is important, and three levels of outcome can be identified: how good the process or information is, how it is used, and how effective it is. Relevance, continuity and feedback should therefore be integral to the M & E process. Impact may be difficult to measure, but account should be taken of unexpected impacts (both positive and negative) as well as the desired ones. Most importantly, honesty in reporting results, particularly negative outcomes, is essential if overall progress is to be made. This can be difficult, since funding is so often tied to successful projects and programmes.

3. What do we most need to learn from M & E?

The main outcome of the discussion was that M & E can be far more proactive in influencing changes to practice than is often recognised. This is especially the case in understanding the role of catalysts in the process of change. Two broad categories of question were identified: is the work or publication actually making a difference? And why do things happen or not happen - how can we understand the function of catalysts in making things happen? M & E gives us the capacity to understand what happens when we move from existing practice to new practices and technologies. Within this it is important to consider the role of the human being, not just the new technologies. If this holistic approach is adopted, a gap may be spotted and filled, and the impact of the new situation can be evaluated. It is important that M & E encourages people to develop their own ideas on how shared objectives might be achieved. For instance, M & E may provide indications about where the motivation for change comes from; if this can be identified, it should feed back into improving the system. What we learn from M & E should better enable us to create a learning environment. There will be different types of impact, e.g. the impacts of programmes and of information resources, and M & E should enable us to explore and learn from these different impacts and to apply the results. It can thus indicate the best ways of moving information resource material into practical application and implementation. Monitoring sales or the reach of dissemination does not provide the deeper information needed to understand the impact of a resource, but the development of realistic indicators of impact remains a problem (a minimal illustrative sketch of one possible indicator record follows below).
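Purely as an illustration of what such an indicator record might look like (this sketch was not presented at the meeting, and every name in it is hypothetical), the three levels of outcome identified by the second group, together with the call for honesty about negative and unexpected results, could be made explicit in a small data structure:

    # Hypothetical sketch: recording M & E observations at the three levels
    # of outcome identified in the discussion, alongside qualitative and
    # quantitative evidence and flags for unexpected or negative findings.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional


    class OutcomeLevel(Enum):
        PROCESS = "how good the process/information is"
        USE = "how the information is used"
        EFFECTIVENESS = "how effective it is"


    @dataclass
    class Observation:
        level: OutcomeLevel
        indicator: str                 # what was measured or observed
        quantitative: Optional[float]  # a number, where one exists
        qualitative: Optional[str]     # narrative evidence
        expected: bool                 # unexpected impacts count too
        positive: bool                 # negative results must also be reported


    # Example entries, loosely based on findings reported earlier in this meeting.
    observations = [
        Observation(OutcomeLevel.USE, "readers using newsletters for training",
                    0.75, None, expected=False, positive=True),
        Observation(OutcomeLevel.EFFECTIVENESS, "resource centres under-used",
                    None, "dependency on lectures limited independent learning",
                    expected=False, positive=False),
    ]

    for obs in observations:
        status = "positive" if obs.positive else "negative"
        print(f"{obs.level.name}: {obs.indicator} ({status})")

The point of such a structure is simply that qualitative evidence, negative findings and unexpected impacts are recorded on the same footing as headline numbers.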
M & E can help us understand the cost-benefits of developing peripheral dissemination and implementation support for information resources, which can help when seeking funds for resource development and programme implementation. In this context, and for the benefit of all concerned, visual representations of M & E systems would make them more easily understood, allowing all parties to see where they fit in and how the M & E information is processed.

Conclusions and recommendations

Commenting on project management, the Chair said that the process needed the type of quality assurance approach identified by the various groups. He emphasized the importance of negative results: it is important to know why, even after careful planning, things went wrong and ended in failure. He suggested that short 'case reports' might be mounted on a web site, indicating what had been tried and what had happened, with commentary on the results, for 'fast learning'. Information and knowledge should be made acceptable and accessible, but it is the responsibility of management to make use of them.

Andrew Chetley announced that the M & E Action Group had started with about 12 participants, who would take the discussions further and report back to HIF. Neil Pakenham-Walsh said he looked forward to an increasing flow of information among all members of the Health Information Forum and the Evaluation Action Group; particular areas to address would include methods of documenting information on M & E and sharing that information. The Chair ended the meeting by emphasizing the importance of evaluating those information activities that can be evaluated, while suggesting there may be a danger in being over-concerned with evaluating activities that cannot be. Perhaps one of the most important factors in bringing about change is to advocate the right of access to information - the right to know and to make informed choices.