Performance improvement – reproductive health organizations
Performance Improvement (PI), a process pioneered in industry, is now helping to strengthen reproductive health organizations. PI focuses on meeting the needs of service providers and other staff members. When programs enable and inspire staff to do their best, the quality of care improves.
People need the right knowledge and skills to do a good job, but they also need to know what is expected of them and whether they are meeting expectations. They need to have good working conditions, strong support from their organization, and incentives to perform well.
When people do not perform well, there usually are a number of reasons. The PI approach can help organizations identify and address them all. Performance Improvement is useful in resource-poor settings because it focuses attention on often-neglected causes of performance problems, such as unclear expectations or infrequent feedback, that need not be costly to correct. Performance Improvement is used primarily to solve problems, but it can also help to set up a new job or help staff members take on new tasks or adjust to new standards.
Reproductive health care organizations apply Performance Improvement in a process that is carried out by stakeholders–the staff members, clients, managers, and others who are affected by a performance problem or are interested in solving it. In turn, stakeholders usually need help from facilitators–staff members or consultants who have training or experience with Performance Improvement. The PI process is comprehensive, beginning with research and ending with evaluation of solutions:
1. Consider the institutional context of the performance problem and foster agreement on the objectives of the PI process.
2. Define desired performance.
3. Describe actual performance.
4. Measure or describe the performance gap.
5. Find the root causes of the performance gap and link them to performance factors, such as incentives or knowledge and skills.
6. Select interventions that address the root causes.
7. Implement interventions.
8. Monitor and evaluate performance.
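Steps 2 through 4 of the process above amount to comparing desired and actual performance on the same indicators. A minimal sketch, using invented indicator names and values (nothing here comes from the PI literature itself), might look like this:

```python
# Illustrative sketch of PI steps 2-4: define desired performance,
# describe actual performance, and measure the gap on shared indicators.
# Indicator names and values are hypothetical.

def performance_gap(desired: dict, actual: dict) -> dict:
    """Return the shortfall (desired minus actual) for each desired indicator."""
    return {name: desired[name] - actual.get(name, 0.0) for name in desired}

desired = {"clients_counseled_pct": 75.0, "hands_washed_pct": 100.0}
actual = {"clients_counseled_pct": 48.0, "hands_washed_pct": 90.0}

# Largest gaps first, so root-cause analysis (step 5) starts with them
for name, gap in sorted(performance_gap(desired, actual).items(),
                        key=lambda kv: -kv[1]):
    print(f"{name}: gap of {gap:.0f} percentage points")
```

The point of the sketch is the discipline the text describes: desired performance, actual performance, and the gap are all expressed with the same indicators.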
Performance Improvement encourages use of evidence-based “best practices.” In place of trial and error, it offers a systematic approach. Instead of guessing or jumping to conclusions about the reasons for poor performance, managers can use analytical techniques. In place of the tendency to fall back on familiar solutions, the PI process substitutes closely reasoned links between root causes, performance factors, and solutions.
Beginning with a pilot project in 1998, reproductive health organizations have used the PI process to:
* Respond to demands by clients for improved reproductive health services (Dominican Republic);
* Learn why providers are not following guidelines for infection prevention despite their training (Ghana);
* Perform national needs assessments for reproductive health care, examine organizational performance problems, and decide on priorities (Armenia, Burkina Faso, Nigeria, and Tanzania);
* Establish standards of care and help clinics meet the standards for licensing or accreditation (Guatemala and Honduras);
* Help decentralize health services (Tanzania);
* Identify barriers faced by community midwives (Yemen); and
* Design incentives for private providers to counsel clients better about family planning and to provide services (India).
Performance Improvement is inclusive. It empowers and encourages people to look beyond causes of job problems that they can do little or nothing about and to take into their own hands the task of improving services. Staff members, supervisors, clients, and community members work together to assess needs and find solutions. When necessary, they can seek help from experts in communication, logistics, management, and training.
Performance Improvement promises to be a powerful addition to the quality improvement methods available to reproductive health programs. It can help solve performance problems with well-conceived solutions that lead to more productive and satisfied workers providing better reproductive health care for more satisfied clients.
Most people feel that they could do their jobs better. They could work harder, produce better work at a faster pace, and make fewer mistakes. Training can help, but not always and not alone, because lack of knowledge and skills may not be the problem or the only problem. In health care, as in other fields, employees need support from their organizations in other areas besides knowledge and skills. For example, many people are unsure about what is expected of them, or they need adequate workspace, up-to-date equipment, or a reliable source of supplies. Some people need rewards for producing excellent work.
Performance Improvement (PI) is a process that helps organizations create the conditions for high employee productivity. Used in industry since the 1960s, Performance Improvement is now being adapted in developing countries by organizations that provide reproductive health care and general primary health care. In this introductory phase Performance Improvement has shown promising results. It has helped to enhance quality of care, encourage collaboration among reproductive health organizations, and identify priorities for program development. Practitioners continue to test and refine the process, and reproductive health organizations are adapting the principles of Performance Improvement to their specific needs.
Performance Improvement is helpful in resource-poor settings because it offers low-cost solutions to performance problems. Several factors typically neglected by organizations need not be expensive to correct, for example, providing staff members with clear expectations and frequent assessments of their performance.
The PI process expands the choices of reproductive health organizations seeking to improve services (see Population Reports, Family Planning Programs: Improving Quality, Series J, No. 47, November 1998). Other approaches include Operations Research (OR), which has been used in reproductive health since the 1970s, and initiatives introduced in the 1980s and 1990s such as Quality Improvement and COPE (Client-Oriented, Provider-Efficient). These approaches vary somewhat in theory and in the tools they use to analyze performance. All, however, offer the important benefit of a systematic process for investigating the causes of problems and finding solutions.
The PI Process
Performance Improvement encourages an understanding of the organization as a system of interdependent functions and people. The system responds to influences from the environment–particularly the needs of its clients–and turns resources into products or services. In a well-run organization there is alignment of structure, goals, and strategies with the processes through which work gets done and the performance of staff members (142).
The focus on job performance is essential. Performance is not behavior or knowledge but rather the results of behavior and knowledge. In most cases performance can be measured (48).
Performance problems usually indicate weaknesses in the support that organizations provide to their staff members, rather than problems with staff members themselves (48, 142). Performance Improvement guides organizations in viewing problems systemically and addressing all the areas that enhance performance.
Performance Improvement is inclusive. Everyone participates who is affected by the performance problem or has an interest in solving it. These participants are called stakeholders, and chief among them are the staff members themselves and the clients they serve. Other stakeholders often include top managers of the organization, supervisors of the staff members, community representatives, government officials, and donors. Stakeholders usually need help from facilitators, people who have had training or experience with Performance Improvement.
In reproductive health care PI facilitators use a step-by-step process (see Figure 1, next page). Performance Improvement has a variety of benefits (see box, p. 5). Many organizational problems have causes that would not be uncovered without the systematic and comprehensive thinking encouraged by the PI process. The step-by-step process of Performance Improvement helps stakeholders to organize and analyze information before deciding what to do. It discourages guessing about the causes of performance problems or choosing solutions prematurely. Without such a process, managers may unfairly blame staff members for performance problems, suggest an ineffective solution, or suggest one solution when several are necessary (4, 63, 101, 141).
[FIGURE 1 OMITTED]
Applying the PI Process
Since 1998 reproductive health care organizations in over a dozen countries have used Performance Improvement to address job-related problems involving service providers, supervisors, support staff, logistics staff, and trainers (see Table 1) (46, 79, 91, 130, 131, 135). The PI process helped these organizations:
* Respond to requests from clients for improved services–and particularly more considerate treatment–in the Dominican Republic. The Dominican Social Security Institute (IDSS) is one of the few organizations that has used the PI process from start to finish to improve reproductive health services (see box, p. 7).
* Find out why providers in Ghana were not following guidelines for infection prevention despite their training.
* Decentralize health services in Tanzania by strengthening Zonal Training Centres.
* Explore why providers in Kenya, who were trained to offer postabortion care, were not using their skills.
* Perform national needs assessments for reproductive health care, examine organizational performance problems, and decide on priorities in Armenia, Burkina Faso (see box, p. 10), Nigeria, and Tanzania.
* Determine qualifications and organizational support for new community-based distributors in Burkina Faso.
* Improve preservice clinical training at schools of midwifery in Ghana.
* Identify barriers to provision of services by community midwives in Yemen.
* Design incentives for private practitioners in India to counsel clients about their needs for family planning and to provide services (see box, p. 15).
The PI process has also encouraged cooperating agencies of the US Agency for International Development (USAID) to collaborate in analyzing performance gaps and root causes and in generating and carrying out solutions in their areas of expertise, for example, communication, logistics, management, and training (46, 91, 120, 135, 136).
Managing the PI Process
Top-level managers are in a good position to initiate the PI process because they have a comprehensive view of the organization (143). If others initiate the process, however, top managers must at least endorse and support their efforts, and they should participate at key stages, such as defining desired performance.
Managing change–and especially the resistance that often accompanies change–is also a responsibility of management. Resistance may come from employees who fear that they will have to do more work without receiving more pay. Managers need to communicate a strong vision of the organization and a sense of urgency for change. In making the decision to use the PI process, top managers also need to consider the qualifications of facilitators, staff time required, and cost.
Facilitating the PI process requires a thorough understanding of the methodology and good project management skills. In general, facilitators need to communicate well, build trust in the process, inspire people to participate, run meetings, negotiate, forge consensus, and mobilize staff and resources (129). They need to listen well and encourage civil discussion of often contentious issues. Facilitators need tact to dissuade stakeholders from prematurely assigning causes of performance problems and selecting solutions. They also need tact to persuade managers to give up some authority and allow decisions to be made by stakeholders.
One or two people can facilitate a small PI project, but a team may be required for larger projects. If possible, at least two people should facilitate the process so that they can compare insights and observations and share the work. The facilitators may change during the process as the skills needed for specific steps change. In the beginning facilitators are strong listeners, negotiators, and consensus-builders. At the end, to help organization staff members carry out and evaluate the solutions, facilitators may be asked to hire experts in such fields as performance appraisal procedures or performance incentives (129). In a program in Honduras to strengthen family planning and prenatal services, for example, stakeholders found that weak incentives caused a performance gap, and the facilitators selected a motivation and incentives team to address the problem (38).
Managers can appoint staff members or hire consultants to facilitate the PI process. A combination of staff members and consultants may work well in some organizations. On one hand, staff members know their organization and can suggest problems that could be addressed with the PI process (140). Also, stakeholders may prefer to work with people they know and who are always available, rather than with consultants who move on to their next job after a few weeks (154). On the other hand, consultants have expertise that stakeholders respect. They can more easily insist on carrying out each step of the PI process and resist pressure from stakeholders who want to rush through the process (121). They bring knowledge of performance problems at other organizations and solutions that have worked (101).
Moreover, with a fresh perspective, consultants sometimes can see problems that staff members have become accustomed to and no longer notice. As outsiders, consultants generally are less fearful than staff members about describing problems frankly (121), and they do not have relationships with stakeholders that might impede staff members’ work as facilitators.
Time and cost. The time required for the PI process depends mainly on the scale of the performance problem, the availability of stakeholders, and the priority they assign to solving the problem. In general, staff members need one to two weeks to learn to facilitate the process, and they need from one day to several weeks to observe actual performance (88). To strengthen the Zonal Training Centres in Tanzania, for example, the PI facilitators developed checklists and interview guides in meetings over five days, and they collected information on actual performance in four weeks during visits to the centers (135).
In the performance needs assessment, the main cost is time off the job for the stakeholders and staff facilitators. Meetings and observation also often entail travel and per diem costs. The better work, higher efficiency, and improved morale resulting from the PI process, however, can more than make up for the time spent.
Stakeholders often attend several meetings to discuss the performance problem, define desired performance, analyze root causes, and decide what to do. In a program in Yemen working with community midwives, for example, the PI facilitators took about one month to help stakeholders measure the performance gap, analyze root causes, and select solutions (141). In the Tanzania training center project the PI facilitator conducted stakeholder meetings over two months to agree on the performance problem, decide on the staff members whose improved performance could best solve the problem, define desired performance, select indicators and data sources, and design data collection tools. Stakeholders met again following collection and analysis of the data to discuss performance gaps and root causes and to decide what to do (135).
Used only for a performance needs assessment, the shortened PI process can be carried out in one meeting. In Ghana, for example, 22 stakeholders met for a half day to describe performance gaps in infection prevention, determine causes, and recommend solutions. The stakeholders represented the Ministry of Health, the Nursing and Midwives Council, medical schools, regional training centers, USAID, and development organizations working with USAID. Staff members of the ministry and JHPIEGO facilitated the meeting (20).
The time needed to carry out solutions depends on the scale of the project and the resources available. Procedures to communicate expectations or to assess performance in one department usually can be put in place in a few weeks. Writing policies or manuals often takes several months (26). In contrast, training for a national cadre of health care providers may require a year or more (131).
The cost of solutions depends on the root causes being addressed and the scale of the solutions. Strengthening knowledge and skills through training can be expensive, but providing a job aid, such as a chart or checklist, may be as effective and cost much less. Changing a policy, posting job expectations, or establishing an appraisal procedure for staff members need not be expensive. For example, the IDSS in the Dominican Republic set up a system of rating cards and suggestion boxes for clients to comment on their care at a cost of about US$1,700 for design, production, training, and distribution to 12 health care facilities in two provinces (63, 119).
Budgeting for a project using the PI process can be carried out in two steps, since the cost of closing performance gaps is not known at the start of the process. The initial budget estimates the cost of measuring the performance gap, finding the root causes, and selecting potential solutions. Stakeholders and the PI facilitators may roughly estimate the cost of the potential solutions at this point, since cost is one criterion used to select solutions. Once stakeholders have selected solutions, PI facilitators and program managers can estimate their costs more precisely and complete the second part of the budget (42).
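The two-stage budgeting described above can be illustrated with simple arithmetic. All line items and figures below are invented for illustration; only the structure (assessment costs first, solution costs after selection) comes from the text:

```python
# Hypothetical two-stage PI project budget. Stage 1 covers measuring the
# performance gap and finding root causes; stage 2 is completed only after
# stakeholders select solutions. All figures are invented.

assessment_budget = {
    "stakeholder_meetings": 800,
    "data_collection_travel": 1200,
    "facilitator_time": 1500,
}

# Rough cost estimates are one criterion for choosing among solutions
candidate_solutions = {"job_aids": 400, "training": 5000, "checklists": 250}
selected = ["job_aids", "checklists"]  # chosen partly on cost

stage1 = sum(assessment_budget.values())
stage2 = sum(candidate_solutions[s] for s in selected)
print(f"Stage 1 (needs assessment): US${stage1}")
print(f"Stage 2 (selected solutions): US${stage2}")
print(f"Total project budget: US${stage1 + stage2}")
```

The design point is that stage 2 cannot be budgeted precisely at the start, because the cost of closing the gap depends on which solutions the stakeholders eventually select.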
Performance Improvement, Quality Improvement, and MAQ
Performance Improvement is similar to Quality Improvement, which has been adapted for developing country health care organizations by the Quality Assurance Project (98). Both encourage organizations to compare their services with standards of care, seek the causes of substandard care, and identify and select solutions that will help staff members meet or exceed standards (122). Practitioners note that these two processes developed from different fields and thus often approach problems from different starting points.
Quality Improvement grew out of the fields of engineering, statistics, and management, while Performance Improvement grew out of behavioral psychology and instructional design–a field dealing with the analysis of gaps in knowledge and the development and evaluation of training (18, 39, 139, 151). As a result, practitioners of Quality Improvement often begin by analyzing systems and processes that affect individual performance (98). In contrast, PI practitioners often begin by analyzing the performance of individuals or groups of employees, such as nurse-midwives or supervisors, and then examine the systems and processes that support individual performance (102).
Also, Performance Improvement can be used to set up a new job or add a new skill to the responsibilities of an individual or a group of employees. Quality Improvement, in contrast, addresses performance problems but usually not a new job (85).
The PI process is similar to processes that guide the development of training or communication projects. Many training programs use the ADDIE model (analysis, design, development, implementation, evaluation), which was a forerunner of the PI process (139). Many communication programs use the P Process (analysis; strategic design; development, pretesting, and production; management, implementation, and monitoring; and impact evaluation) (126).
Performance Improvement complements another approach to improving quality, the Maximizing Access and Quality (MAQ) Initiative, which USAID began in the early 1990s. The MAQ initiative–through exchanges of information on best practices in reproductive health–has encouraged programs to develop and follow guidelines that set high standards. The MAQ list of approaches to improving access and quality–such as provider rewards, client and community engagement, and leadership development–can help stakeholders select solutions to performance problems (64, 122, 124, 141).
The Implementing Best Practices (IBP) Initiative also has promoted high standards of care in reproductive health. Formed in 1999, the IBP Initiative is currently carried out by a consortium comprising the World Health Organization (WHO), USAID, the United Nations Population Fund (UNFPA), and eight collaborating organizations. They use the experience of reproductive health programs worldwide to establish, disseminate, and apply evidence-based best practices with a process similar to the PI process (162).
Performance Improvement can be used any time the performance of an individual, a group of employees, or an organization could be better (102). The opportunity to use Performance Improvement often arises when supervisors or decision-makers request training for employees who are not performing well. Thus trainers are in a good position to introduce Performance Improvement into an organization and should be aware of the process, tools, and other resources.
The PI facilitator’s best response to a request for training is an invitation to discuss the problem further. If, instead, the PI facilitator immediately says that training alone may not solve the problem and recommends Performance Improvement, supervisors may look for someone else to do the training (102).
Some facilitators use the PI process without announcing it as a new way of solving problems. In organizations where staff members will be put off by a formal process that sounds time-consuming, or where other approaches have failed, simply carrying out the process has avoided initial objections (63, 88).
Responding to a request for help, facilitators begin by collecting preliminary information about the performance problem. Information gathering begins with the key decision-maker, the person who made the request or who will be responsible for the results. The decision-maker identifies the people, documents, and records that facilitators should consult.
Facilitators also examine the institutional context–organizational goals, strategies, and culture. For problems with service delivery, facilitators also need to understand the perspectives of clients and community groups.
The facilitators synthesize the information and draft a preliminary description of the performance problem and its context. In the full PI process the facilitators present the results first to the decision-maker and then to the other stakeholders in a project agreement meeting. In shortened or less formal applications of the PI process, facilitators can hold a single meeting with decision-makers and other stakeholders (20, 91, 102). Stakeholders need to agree on the group of staff members whose work needs to be improved and the scope of work–how large a project to conduct, how many people to involve, and how much money to allocate.
Involving all stakeholders is essential because all perspectives need to be included for the PI process to succeed. Also, omitting and thus offending stakeholders can make them resistant to change. Facilitators ask the decision-maker whom to include. Some draw a diagram showing all those connected to the staff members whose performance will be improved. In some countries kinship may have to be considered as well as organizational connections.
Getting and maintaining the agreement of stakeholders is one of the most important and difficult tasks of PI facilitators. Rarely will stakeholders agree on every aspect of a performance problem. They may disagree about the causes of the problem or how to measure desired performance, for example. The PI facilitators should make sure that the decision-maker is aware of any disagreements before holding the project agreement meeting with all of the stakeholders (102). In that meeting the PI facilitator or the decision-maker should point out the disagreements and attempt to resolve them. Some may not be resolved, but the process can continue anyway.
Maintaining the interest of stakeholders is especially difficult in projects that last a year or more. Stakeholders may lose interest if there are no quick and obvious improvements. Well-designed projects plan for some quick successes to maintain interest and decrease people’s resistance to change.
This first step ends with achievement of a consensus, if not complete agreement, among all stakeholders. The consensus can be formally stated in a letter of agreement or memorandum of understanding signed by the lead PI facilitator and the decision-maker. The letter summarizes the purpose of the project, the process of meetings and information gathering, and the next steps. It should also cover understandings about logistics, office space, travel, and funding (102).
The initial consensus may have to be reviewed several times during the process as facilitators learn more about the performance problem or as people change jobs. Facilitators may discover information that will resolve some of the disagreements, and they may discover other performance problems. Job changes among stakeholders require informational meetings for the person who takes over. At the IDSS in the Dominican Republic, for example, the director general changed twice. Facilitators had to brief the new directors general and obtain their permission to proceed (91).
Define Desired Performance
When stakeholders define desired performance, they are describing the type of reproductive health services they would like. The PI facilitators select indicators of desired performance based on international or national standards and guidelines and information gathered in meetings or interviews with staff members, exemplary performers, clients, community groups, and other stakeholders.
Defining desired performance is one of the most useful steps of the PI process but also one of the most difficult and contentious steps. Many organizations can benefit from a systematic and thoughtful discussion of the desired performance of their staff members. Such discussion should involve all stakeholders in selecting clear objectives that, if possible, are measurable (101). Defining desired performance gives some staff members their first opportunity to discuss what their job should be and how they contribute to their organization (36). The difficult part is persuading stakeholders to use observable and measurable indicators of performance. The facilitator usually needs to help with tactful questioning and clear examples of desired performance (63).
For jobs that involve clinical procedures with universally accepted standards, there is little room for debate on desired performance. For other jobs, however, stakeholders often disagree vehemently on desired performance, arguing, for example, that standards are being set too high or that achieving them will take too much time. Some stakeholders prefer realistic goals, while others favor ideal goals. Both approaches pose risks. Setting ideal goals can inspire staff members to try harder than they would with a realistic target, or else the higher target can be demoralizing because it seems unreachable. The choice between idealistic and realistic measures, or a mix of the two, is part of the consensus among stakeholders.
In the program working with private practitioners in India, for example, stakeholders first set desired performance at counseling 100% of women who might need family planning–women between the ages of 15 and 49 who were not using contraception. But when the PI facilitators found that providers actually were counseling fewer than half of such women, stakeholders decreased desired performance to counseling 75% of the women (88).
Self-assessment guides, such as those from the COPE process developed by EngenderHealth, can help define both desired and actual performance. Checklists for self-assessment cover all aspects of services, for example, quality of care, staffing, recordkeeping, and counseling (32-34, 92).
Indicators are objective measures of performance. They describe accomplishments that are observable, measurable, and under the control of the staff members whose performance is being measured. Desired performance, actual performance, and the performance gap should be defined with the same indicators.
Indicators are a key component of the PI process because they determine the amount and type of information that the PI facilitators must collect. Too many indicators, or indicators that require information that is difficult to find, will waste the facilitators’ time. For example, facilitators have found that some indicators require time-consuming travel and interviews, and they have replaced them with indicators that can be found more easily in clinic records (165). The PI facilitators consult with stakeholders to select an initial set of indicators. These may change as the facilitators collect more information.
Indicators for clinical skills, such as IUD insertion or infection prevention, are generally taken from international or national standards. For example, several indicators that a provider is prepared to insert an IUD are: washes hands with soap and clean water for at least 15 seconds, tells the woman what will happen and encourages questions, and conducts a pelvic exam (16, 103).
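An observation checklist built from indicators like those above can be scored directly. The sketch below is an assumption about how such scoring might be implemented; the three checklist items come from the text, but the scoring logic is illustrative:

```python
# Sketch of scoring an observation against the IUD-insertion readiness
# indicators quoted in the text. The scoring approach is an assumption,
# not a documented PI tool.

IUD_PREP_CHECKLIST = [
    "washes hands with soap and clean water for at least 15 seconds",
    "tells the woman what will happen and encourages questions",
    "conducts a pelvic exam",
]

def score_observation(observed_steps: set) -> float:
    """Fraction of checklist items the observer recorded as performed."""
    done = sum(1 for item in IUD_PREP_CHECKLIST if item in observed_steps)
    return done / len(IUD_PREP_CHECKLIST)

observed = {
    "washes hands with soap and clean water for at least 15 seconds",
    "conducts a pelvic exam",
}
print(f"Checklist score: {score_observation(observed):.0%}")  # prints "Checklist score: 67%"
```

Because each item is observable and under the provider's control, scores like this can serve as the indicators of desired and actual performance that the PI process compares.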
Studying the guidelines followed by other health care facilities or organizations, a practice known as benchmarking, is also useful for defining desired performance (98). Reviews of evidence-based best practices in reproductive health care, such as the WHO Reproductive Health Library distributed annually on diskette and CD-ROM, are also helpful (51, 116). Contact information for the library and for sources of information about the PI process, such as the International Society for Performance Improvement and the USAID-sponsored Performance Improvement Consultative Group, can be found on the Internet.
The PI facilitators discuss with staff members the services they would like to deliver, and they discuss with clients and community groups the services they would like to receive. Facilitators usually prepare questionnaires or guides to conduct meetings or interviews. In the needs assessment in Nigeria, for example, the PI facilitators drafted guides for stakeholder meetings that addressed desired performance of the health care facility and of health care providers, and they explicitly directed participants to include the perspective of clients (46). As a courtesy and to encourage thoughtful answers, PI facilitators sometimes give the questions to stakeholders before the meetings or interviews (102).
The most helpful information often follows from open-ended questions, which cannot be answered yes or no. Typical questions that have helped define desired performance are:
For service providers and managers:
* What would people do if they performed perfectly?
* What would they say to each client? How would they treat clients?
* Do you think it is possible to provide that level of service? If not, what level of service is reasonable to expect? How many providers perform at this reasonable level?
For managers specifically:
* What results do you expect from the reproductive health program?
* What resources are available to carry out the program?
* How do you imagine the ideal reproductive health program in terms of both service goals and resources?
If managers say that they want “good work,” they must be coaxed to define “good work” in measurable terms (102).
For clients and for community members not using reproductive health services:
* Imagine the ideal facility providing family planning services. What would it look like?
* How would the waiting area and exam rooms look?
* What services would be offered?
* How would providers behave?
* What is the first thing you would change about your health clinic to make it closer to the ideal (46, 136)?
To provide important details, stakeholders often need prompting with such follow-up questions as: Could you be more specific about …? How typical is what you just described? Is there anything else that you would like to add?
Among clinics, staff members, clients, and community members, a few may stand out as exemplary performers (also known as positive deviants). Exemplary performers overcome conditions that limit others. A study of exemplary public and private clinics in Kenya, for example, found that they had inspiring leaders, offered staff members a variety of incentives, involved the community, and controlled their finances locally (169). Also, in the needs assessment in Nigeria facilitators met a nurse-midwife in a rural clinic who had set up a revolving loan fund that kept supplies in stock when other clinics ran out, and who motivated her staff members to keep the clinic exceptionally clean (56). The PI facilitators should search for exemplary performers and observe and interview them. Their example helps to define desired performance, and their practices can help generate solutions to others’ job performance problems (48).
Experts in reproductive health or in the methodologies for improving quality contribute a knowledge of procedures and standards to the definition of desired performance (88). Experts also can help to adapt international or national standards to definitions of desired performance that take local conditions into account (102).
Examples of Desired Performance
Like indicators of performance, well-conceived statements of desired performance describe accomplishments that are specific, observable, measurable, and under the control of the staff member. For example:
* In the training program in Tanzania, the zonal training resource teams are expected to work with district health management teams, regional health management teams, and NGOs to identify training needs at least every three years (135).
* In a postabortion care (PAC) project of the Family Planning Association of Kenya, 80% of volunteers linked to PAC facilities are expected to receive an orientation from a PAC provider within 20 weeks after the provider completes central training (164).
Poorly phrased statements of desired performance often describe only knowledge or ability, are too vague to be measurable, or describe performance that is not under the control of the staff member (see Table 2).
Stakeholders sometimes wonder how much detail to include in statements of desired performance. For example, to meet the standard for infection prevention, do providers need to be told to wash hands with soap and lather for 15 seconds before rinsing in clean water, or is “Wash hands between each client” enough? Stakeholders can estimate the right amount of detail by asking the question, “How would the typical staff member carry out these instructions?”
Stakeholders decide on the appropriate level of detail by considering generally accepted standards, the importance of the task, and staff turnover, among other factors. If, for example, the cost of not doing a task is high or if turnover is high, the task should be described in detail to avoid costly omissions or to inform new staff members (123).
Describe Actual Performance
The description of actual performance is needed to define the performance gap. The sources of information for describing actual performance include:
* Interviews or meetings with stakeholders, particularly the staff members whose performance is being analyzed, supervisors, and clients; and
* Observation of staff members.
Before investigating actual performance, PI facilitators review existing surveys, operations research, or clinic observation studies. These sources may save some time in gathering information, but they rarely have all the information needed to assess actual performance or the performance factors (88, 141).
A flowchart can help the PI facilitators to understand actual performance and to visualize desired performance as well. With information supplied by the staff member, the flowchart maps a job as a series of tasks and decision points, and it can reveal the reasons for problems. Long waiting times for clients, for example, may result from tasks that could be carried out at another point in the process or that may be unnecessary (70). In logistics, flowcharts have helped identify redundant tasks, wasted time, decisions that required more people than necessary, or decisions made without reference to standards or best practices (36).
Clinic records contain information such as number of clients who have received services, the outcome of their visits, and orders for equipment and supplies. The PI facilitators should know how the data were collected and how current and reliable they are. To respect the confidentiality of clients, clinic records should be reviewed by PI facilitators who are staff members rather than by consultants (102).
The PI facilitators should review records before conducting interviews or meetings with stakeholders. Knowing what the records contain, facilitators can formulate useful questions and avoid asking for information that they could obtain from the records. Asking stakeholders about some information in the records, however, can verify its accuracy. If records are incomplete or inaccurate, stakeholder interviews can supply missing information.
In interviews and meetings PI facilitators ask staff members to assess their actual performance with questions such as: What do you do during a normal work day? What services do you provide? Roughly how much time do you spend on your main tasks?
Staff members may present an inaccurate impression of their performance, making it sound better or worse than it is. In a clinic accreditation project in Brazil, for example, staff at some clinics said everything was fine, while staff at other clinics said the opposite. Role playing or showing a videotape to demonstrate good and poor performance can help providers assess themselves objectively (16).
The PI facilitators can check providers’ perceptions by observing them and by interviewing clients and community members. In the Nigeria needs assessment, for example, questions for clients in focus groups included: How would you describe the clinic environment? Your meeting with the family planning provider? If you had a friend interested in family planning, would you recommend that he/she go to this facility? Why or why not? Community members can answer similar questions based on their impression from talking to friends, relatives, or neighbors who have used the clinic (46). The PI facilitators should also ask family planning clients if they felt that providers had given them enough information to choose a contraceptive method with confidence.
Information from clients is not always reliable, however. Some clients are reluctant to criticize staff members who have higher status, or clients think that criticism would be impolite. Interviewing clients where they live rather than at the clinic can reduce this courtesy bias (145).
Interviews with staff members may uncover obstacles to the PI process itself. Providers who have not been paid in months, for example, have rejected efforts to improve their performance (4, 16).
Probing the Performance Factors
The PI facilitators also ask for information about the performance factors that will help in the root cause analysis (see p. 16). Typical questions include (102):
* Job expectations: Can you explain what is expected of you? Have you been given a job description? How do you find out what is expected of you?
* Feedback: How do you know when you are meeting job expectations? Do you get feedback orally and/or in writing? How often? From whom?
* Workspace, equipment, supplies: Do you have all the equipment or supplies you need to do your work? Have you requested material and supplies that you have not received? Do you have all the space you need, particularly private space? Is equipment maintained?
* Incentives: What happens if you do an outstanding job on a particular day? In your area how are decisions made about promotions, invitations to external training, or other opportunities? How can recognition for good performance be improved?
* Organizational support: How does the structure of the organization help your work or make it more difficult? How are the goals and strategies of the organization communicated to you? How are important decisions made and communicated to you? Are you getting enough help and guidance from your supervisor?
* Knowledge and skills: How much of your training do you use on the job? Would on-the-job reminders help you with certain tasks? Would you do a better job if you knew you would receive an extraordinary reward or recognition?
These questions may yield a long list of causes from which stakeholders select the few vital root causes.
To encourage truthfulness, PI facilitators can question staff members and supervisors, or nurses and doctors, in separate groups. When answering questions about expectations or performance assessments, staff members may not feel free to criticize supervisors if they are in the room (165).
Observation of staff members at work is an indispensable source of information about actual performance. Observers need to be unobtrusive to avoid disturbing staff members, some of whom may never have been observed before (75).
To obtain a complete impression of actual performance, observers pay attention to the operation of the clinic or office as a whole (the organizational level of performance) and to the work of individual staff members. In the Nigeria needs assessment, for example, observers noted problems at the clinics in planning and goal setting, supervision, record keeping, and equipment and supplies. Problems among providers were in interpersonal skills, use of service statistics, and adherence to infection prevention procedures (46).
Checklists help observers attend to all the performance indicators. Checklists of clinic operation may cover equipment and supplies, the presence of guidelines, the quality of clinic records, information provided in counseling, and the attitude of providers and other staff members.
Observation has limitations. Some staff members feel anxious or threatened when they are observed and thus do not perform as usual. Obtaining permission from staff members and discussing the PI process and the project before the observation can help to reduce anxiety. Staff members are reassured if they have worked with someone on the observation team. In the training program in Tanzania, for example, the PI facilitators included a senior staff member from the human resources division of the Ministry of Health, who had met the staff of the Zonal Training Centres (165). Also helpful for observers is dressing like clinic staff to be inconspicuous (16), staying long enough that staff members become accustomed to being observed (106), and explaining to staff members that they are not being rated and that the observation will not affect their salaries.
Using simulated, or mystery, clients to collect information avoids some of these observation problems but can create other problems (60, 81, 93). Simulated clients need to be keen observers with a good memory and the ability to play a role. Training people to pose as clients can be time-consuming, sometimes requiring several weeks (93, 96). Also, using simulated clients raises ethical problems of deceiving providers and breaking down trust between staff members and management (93, 106).
Observer bias or disagreement may also be a problem. Two observers may differ in their interpretation of the same behavior (78, 81, 106). In a study in Peru, simulated clients were inconsistent in overall ratings of providers but were more reliable at observing specific behaviors and recording them on checklists after their appointments (81). Training in observation methods, using memory aids, or using tape recorders can improve the accuracy of information (93, 106).
Measure/Describe Performance Gaps
Using the definitions of desired performance and the information about actual performance, the PI facilitators: (1) measure or describe the performance gaps, (2) help stakeholders select the gaps that they would like to address, and (3) rank the selected gaps in order of importance. This preliminary selection avoids further analysis of gaps that stakeholders do not want to pursue.
The performance gap is the difference between desired and actual performance, often expressed as a difference of percentages. It can also be expressed as a ratio of the achievements of exemplary performers to those of typical staff members (48).
A common mistake at this stage is to list causes as performance gaps. For example, if providers are not counseling clients well, PI facilitators may mistakenly define the gap as inadequate knowledge and skills rather than the difference between the desired performance, which could be 100% of providers following the counseling protocol, and the actual performance, perhaps 20% following the protocol. Root cause analysis–the next step in the PI process–is not part of the description of the performance gap but rather explains it.
Stakeholders themselves select performance gaps for further attention based on criteria that they choose. For example, they may select gaps because they are large, because they are important to the organization or to top management, or because they can be solved quickly or their solution will have an obvious impact.
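As a minimal illustration, the counseling example can be put into a few lines of code. The figures are the hypothetical ones from the example above, and the function name is ours, not part of the PI literature:

```python
# Illustrative sketch only: a performance gap is the difference between
# desired and actual performance, here expressed in percentage points.
def performance_gap(desired_pct, actual_pct):
    """Return desired minus actual performance, in percentage points."""
    return desired_pct - actual_pct

# Desired: 100% of providers follow the counseling protocol.
# Actual: roughly 20% follow it.
gap = performance_gap(100.0, 20.0)
print(gap)  # 80.0
```

The same numbers can also be read as a ratio, as when comparing exemplary performers with typical staff members.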
In meetings and interviews the PI facilitators collect ranking information with questions such as: What is the impact of this typical [or unsatisfactory] performance on reproductive health services? and How does this performance problem compare with other performance problems we have discussed?
In general, the larger the performance gap, the greater the opportunity for performance improvement. In the Tanzania project, for example, the facilitators considered gaps of over 20% large enough to pursue with root cause analysis and solutions (135).
Ranking the selected gaps helps stakeholders decide the order in which they should be addressed. In the Nigeria needs assessment, for example, PI facilitators ranked the clinic performance gaps based on a consensus of the stakeholders. In order, the gaps dealt with problems in: (1) the supply of contraceptives, (2) clinic records, (3) treatment of clients, (4) infection prevention, and (5) accessibility in rural areas (46, 88). In some cases, however, the most important gaps have to wait until other, less important performance problems are solved. In the IDSS project, for example, gaps in counseling were ranked highest, but logistics problems, which were ranked fourth among five, had to be solved first so that providers would have contraceptives to give to clients (101).
Find the Root Causes
Root cause analysis is the main diagnostic step in the PI process. It is the transition between the description of the problem and the development of solutions.
Performance problems need to be attacked at their root, or they will persist. For example, a root cause of the gap in counseling among private providers in India was loss of income. Clients did not pay providers for counseling but only for products (90). The PI facilitators concluded that despite training, clear expectations, and supplies, the gap would not close as long as this root cause, related to the incentive performance factor, remained (see box, p. 15).
Faced with several root causes, stakeholders need to identify the ones that have the greatest effect on performance. The root causes are constraints or bottlenecks in the system or work process. Working on weaker constraints will not help if the most serious bottlenecks in the process remain (27).
The process also coaxes stakeholders to see beyond explanations that they feel they can do nothing about. Staff members often blame lack of funding, bad management, or corruption for problems when there are other causes that they can influence, such as unclear expectations or infrequent performance appraisals (63). An apparent lack of funding could instead be caused by misallocation, poor planning, or poor coordination, which could be corrected. PI facilitators need to encourage positive thinking about causes that can be addressed.
Root Cause Analysis Techniques
Stakeholders find root causes by discussing the information collected from records, site visits, interviews, and meetings, and by using analysis techniques. Two techniques that have proved useful for reproductive health programs are the Why Tree technique and the cause-and-effect diagram. The two techniques encourage careful inquiry into causes and discourage jumping to conclusions.
The Why Tree. Stakeholders identify chains of causes of a performance gap with the Why Tree technique, also known as the Why-Why-Why technique. When stakeholders can think of no more causes in one chain–no more answers to the question “Why?”–the PI facilitator asks if there are any other causes of the gap and begins another chain. Recorded on paper, the performance gap appears at the top of the page with a root system of causes (see Figure 3).
[FIGURE 3 OMITTED]
A project in Ghana to strengthen regional resource teams used the Why Tree technique to explore the lack of supervisory visits to a large proportion of providers. The stakeholders identified two main causes: the resource teams did not know how many supervision visits to make, and they had no transportation. The first cause, lack of knowledge, had three roots: no job description, no support system, and no information during training about frequency of supervision. The second cause, lack of transportation, had one root: no training in proposal writing to get funds for transportation.
Each root of the Why Tree describes a cause of the performance gap, and the lowest item in the root indicates how to address the cause, in this case by drafting a job description, establishing a support system for the Ghana teams, and training (87). The Why Tree technique helped stakeholders uncover an unexpected root cause–lack of training in proposal writing. Such training could help solve the transportation problem and other problems caused by lack of funding. If stakeholders end a chain of causes with health-sector or societal problems that they cannot control, then they address the next higher cause under their control (63).
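The Ghana example above can be sketched as nested data, with each chain of answers to “Why?” ending in the item that indicates what to address. This layout is our assumption for illustration, not something the project itself used:

```python
# Sketch of the Ghana Why Tree: keys are causes, lists hold the
# deepest causes ("roots") at the end of each chain of "Why?" questions.
why_tree = {
    "Many providers receive no supervisory visits": {
        "Teams did not know how many visits to make": [
            "No job description",
            "No support system",
            "No information during training about frequency of supervision",
        ],
        "Teams had no transportation": [
            "No training in proposal writing to get funds for transportation",
        ],
    },
}

def root_causes(tree):
    """Collect the lowest item of every chain in the tree."""
    for branches in tree.values():
        for leaves in branches.values():
            yield from leaves

print(list(root_causes(why_tree)))
```

Each item printed suggests a possible intervention, such as drafting a job description or adding proposal-writing training.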
Cause-and-effect diagrams. Sorting the root causes according to the performance factors suggests the type of solutions that would address the root causes. To help with the sorting, stakeholders can use a cause-and-effect diagram, also known as a fishbone diagram, or an Ishikawa diagram after its inventor, Kaoru Ishikawa (70, 108) (see Figure 4, p. 18). The spine of the fishbone diagram extends from the performance gap in a box on the right. The long bones extending from the spine stand for the performance factors. Causes are diagrammed on lines extending from each performance factor, and further explanations extend from each cause (98).
[FIGURE 4 OMITTED]
Since performance factors overlap, some causes may fit under more than one factor. In the Ghana example in Figure 4, “No supervisor” could be placed under expectations, feedback, or organizational support. Also, the explanation for a cause under one factor can connect that cause to another factor. Thus the lack of transportation, a cause classified under workspace/equipment/supplies, turned out to be related to knowledge/skills and expectations.
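The sorting that a fishbone diagram performs can likewise be sketched as a simple grouping of causes under the six performance factors. The filing choices below are illustrative assumptions; as the text notes, some causes fit under more than one factor:

```python
# Sketch: causes from the Ghana example sorted under performance factors,
# as a cause-and-effect (fishbone) diagram does.
fishbone = {
    "gap": "Many providers receive no supervisory visits",
    "factors": {
        "job expectations": ["No job description"],
        "feedback": [],
        "workspace/equipment/supplies": ["No transportation"],
        "incentives": [],
        "organizational support": ["No supervisor", "No support system"],
        "knowledge/skills": ["No training in proposal writing"],
    },
}

# Factors with at least one cause attached suggest where solutions belong.
needs_attention = [f for f, causes in fishbone["factors"].items() if causes]
print(needs_attention)
```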
Common Root Causes
Root causes of problems in reproductive health programs that have used the PI process range across all the performance factors. Providers variously lack knowledge and skills in counseling, logistics, integrated reproductive health services, estimating the cost of services, and infection prevention (20, 90, 91, 135, 136). They do not know what is expected of them because they have no written job descriptions, guidelines are out of date, or supervisors do not tell providers what they should do (46, 90, 91, 130, 135).
Clinics lack supplies to offer services requested by their clients, to practice infection prevention, or to distribute health education materials (46, 90, 91, 114, 131, 135). Without vehicles or fuel, supervisors cannot visit clinics (130). In some programs there is no incentive system, supervisors do not support their staff, and providers have no power to make decisions or else they feel helpless to solve problems and wait for instructions from a higher level (38, 46, 91, 114, 130).
Individually, these causes are well known, but emerging together from the PI process they indicate the systemic nature of performance problems. Thus the IDSS in the Dominican Republic worked on expectations, feedback, incentives, and knowledge and skills to encourage providers to treat clients more considerately (91). In Ghana, training the regional resource teams would not be effective unless expectations were reinforced through job descriptions and supervision, and transportation was provided (130). Organizations usually need to address several root causes to improve performance.
Having systematically defined the performance problems, stakeholders use the same care in selecting interventions. They propose solutions, assess the solutions according to effectiveness, feasibility, and other ranking criteria, and then make the choices. Stakeholders can draw from a range of approaches that address weaknesses in the performance factors.
The PI facilitators encourage the staff members whose performance is being analyzed to suggest solutions. The people doing the work have the best knowledge of their jobs and generally contribute the most practical ideas. If staff members themselves play an important role in developing solutions, they are less likely to feel that the solutions are imposed and less likely to resist changes (94).
Stakeholders use project design criteria to rank the potential solutions. With the help of the PI facilitators, stakeholders answer the following questions:
Will the proposed solution actually fix the problem? Using best practices from the reproductive health literature can give stakeholders confidence that their proposed solutions will be effective. Stakeholders can also adapt the experience of local programs that have solved similar problems. Experts in logistics, communication, or training, for example, can participate and summarize lessons learned.
Will the proposed solution provide the best results for the least resources? A simple, subjective, and quick cost-and-benefit assessment can answer this question. Stakeholders assign points on a scale of 1 to 10 to costs and benefits for each proposed solution. Costs include political, social, cultural, logistical, and technical factors as well as monetary costs. The benefit score is a consensus estimate of how well the proposed solution will solve the problem or how much of the problem it will solve. This quick analysis avoids a time-consuming cost-benefit analysis requiring special expertise (102).
Stakeholders can compare the ratios of costs to benefits for a variety of proposed solutions. In the IDSS program in the Dominican Republic the cost-and-benefit ratios for proposed solutions ranged from 1.4 to 4.0 (see Figure 2, p. 7).
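The quick assessment described above can be sketched as follows. The solution names and scores are hypothetical, not from the IDSS program; each solution gets a cost score and a benefit score on the 1-to-10 scale, and lower cost-to-benefit ratios rank as more attractive:

```python
# Sketch of the quick cost-and-benefit assessment: (cost, benefit) scores
# assigned by stakeholder consensus on a 1-to-10 scale.
solutions = {
    "Post job expectations in staff rooms": (2, 6),
    "Distribute updated counseling guidelines": (3, 8),
    "Build a new clinic wing": (9, 5),
}

# Rank by cost-to-benefit ratio, lowest (most attractive) first.
ranked = sorted(solutions, key=lambda s: solutions[s][0] / solutions[s][1])
for name in ranked:
    cost, benefit = solutions[name]
    print(f"{name}: cost-to-benefit ratio {cost / benefit:.2f}")
```

As the text cautions, such ratios guide rather than dictate selection; a high-ratio solution may still be carried out first if others depend on it.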
Is the proposed solution feasible? Can the solution fix the problem in time and with the funding and staff available? If not, can more time be allotted, more funding found, or more expert staff recruited?
Concern about feasibility should not discourage stakeholders from striving for goals that may at first seem impossible to reach. In the IDSS project in the Dominican Republic, for example, the PI facilitators thought that upgrading the status of reproductive health services from a special project to a department–with more status, a larger budget, and more space–was not possible. But it proved feasible because the IDSS was planning to restructure and because the improvements in reproductive health services as a result of the PI process motivated staff members at several levels to support the upgrade. Also, the PI facilitators produced a widely disseminated brochure that informed IDSS staff about the reproductive health services and persuaded them of the importance of the services (63, 82, 101).
Is the proposed solution acceptable to clients, community, and the staff members who will carry out the solution? Do the stakeholders representing these groups think that their constituents will welcome the solution? Is it culturally acceptable? People often assess changes by looking at advantages, simplicity, compatibility with what they have been doing, how easily they can adapt to the changes, and the effect of changes on their personal life (29). Stakeholders need to reach a consensus to decrease the resistance that change often brings. They also can discuss how to manage the change and anticipate the changes that might arouse the most resistance (see p. 25).
Is the proposed solution sustainable? How much help does the organization need from consultants or the ministry of health to carry out the solution? Will the solution be continued after a donor or cooperating agency leaves? Changes in organizational structure to accommodate the solution and involvement of high-level management increase the likelihood that the solution will be sustainable (77).
In general, a few well-executed solutions are more sustainable than many solutions hampered by limited resources. The ranking criteria help the stakeholders focus on the solutions that will do the most to improve performance.
The ranking criteria guide, rather than dictate, the selection of proposed solutions. Stakeholders may decide to carry out some solutions even though they have a high cost-to-benefit ratio, often because they must precede higher-ranking solutions. In the project working with private providers in India, for example, a highly ranked solution was encouraging the providers to buy supplies on the last day of their training program. They would be unlikely to buy supplies, however, if they had not set up a pricing strategy, established a regular source of supplies, and received training in sales and marketing–which were all lower-ranked solutions (90).
Some criteria may be binding, such as inflexible funding or a time limit, while others are less strict. In the Nigeria needs assessment, for example, stakeholders insisted on addressing any gap that directly affected the performance of providers in the clinic (88). Thus solutions addressing the availability of contraceptives, access to family planning services in rural areas, and counseling skills had higher priority than solutions addressing management, planning, and financial sustainability. Less crucial but still important criteria can be ranked by consensus or by a vote (89).
Stakeholders should consider upgrading low-ranked solutions that can be carried out quickly. If staff members can quickly improve health services themselves without waiting for the help of supervisors or ministries, the quick results demonstrate the value of the PI process, demonstrate that change is possible, and motivate staff members to improve their performance. Quick positive results also encourage staff members to attempt more difficult solutions (92), and they can persuade managers to provide the resources for more ambitious projects (1, 9, 41, 97, 161).
Stakeholders can draw from the worldwide experience of programs and research to solve performance problems. The Performance Improvement literature, the reproductive health literature, and the medical literature suggest ways to address weaknesses in the six performance factors.
Carrying out the solutions to performance problems requires good project management skills. The staff members who carry out the solutions–usually with help from facilitators–plan, schedule, budget, coordinate, and keep people informed. Managers often participate in implementation because they have project management skills and they are ultimately responsible for the outcome of the process. If necessary, managers or facilitators invite individuals or organizations with expertise in the interventions, for example, in training, communication, or logistics, to help with implementation (129). Implementers also plan the evaluation of the solutions and the organizational changes that will help to initiate and sustain the solutions.
A variety of approaches can help to clarify job expectations–for example, distribution of guidelines with training, accreditation programs, clear job descriptions, posters, prompting providers before a client visit, messages from management, discussions with respected peers, community involvement, and mass media promotion.
Guidelines with training. To clarify job expectations, organizations typically distribute guidelines and expect staff members to read and follow them. Distribution of guidelines alone is usually not enough, however. The materials must be reinforced through training or performance appraisal (24, 52, 59, 91, 118, 149).
One of the few programs that has measured the effectiveness and cost of disseminating guidelines with training was carried out in 1999 by the Kenya Ministry of Health. Two thousand providers received revised guidelines and training: 274 were trained directly, and about 1,700 were subsequently trained at their clinics by those 274 trainees. Materials to help providers train their coworkers increased scores slightly on 38 indicators at a cost of about US$12 per provider. Adding supervision for 54 providers to reinforce their training increased scores by a factor of nine over training alone, at a cost of about US$377 per provider (149).
Accreditation. An accreditation program clarifies expectations by specifying the changes that clinics need to make to satisfy accreditation standards. Such programs are being carried out in Brazil, Egypt, Guatemala, Honduras, Malawi, and other countries (16, 44, 69, 99, 126).
Job descriptions. Written with care, job descriptions specify the contribution that the job makes to organizational goals, the main product or service produced by the job (for example, community-based family planning services), the accomplishments of the job (helping clients choose and use contraceptives), the tasks that the employee must carry out (visiting clients in their homes), and rates or quantities (clients will be visited at least once every month) (102). Such job descriptions also help managers hire or promote employees who can fulfill job expectations (80).
Posters or brochures. Hanging posters where staff members work or distributing brochures helps to remind staff members of what is expected of them (91). The IPPF wall chart listing clients’ rights and providers’ needs, for example, is displayed in the offices of most IPPF affiliates (146).
Prompts for providers. Sheets attached to client files that list tests or procedures to perform have helped British and US doctors improve compliance with guidelines. Also helpful has been giving clients cards listing the services that they should receive, which they give to providers as a prompt during their visit. Doctors comply better when the prompts list instructions specific to a patient rather than general instructions (24, 31, 52, 53).
Messages from top management. In the Dominican Republic a letter from the central office of the IDSS informed the staff of health centers that they were expected to offer five reproductive health services: family planning, maternal and child health care, prevention and treatment of HIV/AIDS and other sexually transmitted infections, breastfeeding promotion, and detection of breast and uterine/cervical cancer and referral for treatment. The letter helped reduce a gap in providers’ knowledge of reproductive health services (63, 91).
Discussions with respected peers. When they discuss proper care in small groups or with individual providers, respected peers can be persuasive (6, 24). Personal visits from peers, known as educational detailing or academic detailing, have helped to improve US physicians’ prescribing practices (117).
Community involvement. A close relationship between health care providers and communities can lead to honest dialogue and better understanding of each other’s expectations and needs (8, 28, 34, 57, 99, 169). For example, in the “Building Bridges for Quality” project in Peru, begun in 1998 and carried out by the Peru Ministry of Health, providers and community groups produced videos portraying their ideal of health care and their impression of the care that is actually provided. Providers toured the communities they serve, community members toured the health center, and together they made plans to improve health services so that providers meet clients’ expectations and clients meet providers’ expectations (8, 57). The communities now feel that providers are more attentive to and respectful of clients, and providers say that community members know more about the health services and ask to be educated about health care (7).
Mass media. Skilled and attentive providers have been portrayed in the mass media to show providers the level of care that they are expected to offer and to show clients the care that they can expect to receive. This approach has been used in several countries, for example, Brazil, Egypt, Ghana, Indonesia, and Nepal (19, 67, 68, 126).
Performance Feedback
A number of studies and programs have tested ways to provide people with information about their job performance. To encourage more frequent performance appraisal, organizations have worked with supervisors to present quantitative feedback, encouraged comments from clients, or encouraged providers to assess themselves and their coworkers.
Quantitative feedback. Organizations have trained supervisors to evaluate staff members with checklists and to provide detailed and quantitative appraisals (15, 22). For example, a program in Burkina Faso, carried out in 1994 by the Programme Elargi de Vaccination (Expanded Program on Immunization) and the Ministry of Health, used quantitative feedback to promote vaccinations against measles. Six months after a workshop to train health workers in communication skills, supervisors visited clinics and observed the health workers, pointed out weaknesses, and helped with solutions. Supervisors prepared bar charts on transparencies that, when laid on top of each other, allowed a health worker to compare her current performance with her previous performance and the averages for coworkers and a control group. The health workers appreciated the quantitative feedback and they were motivated to improve skills that had declined since their training, such as providing information to mothers about caring for children with measles, arranging return visits for vaccinations, and responding to questions (15).
Observation, presentation, and discussion. A program in Niger introduced Integrated Management of Childhood Illness (IMCI) in 1997 and 1998 by training providers and then observing and discussing their performance with them. Observers presented their appraisals to providers in a workshop. Providers then discussed the appraisals in small groups with the help of a facilitator. After being appraised, providers were better at some of the tasks, such as recognizing symptoms of severe illness and malnutrition and finding out about vaccination history, but the improvements were not sustained after eight months. Also, counseling skills declined despite the feedback. The cost of the appraisal system was US$108 per provider. Adding an average of 11 days of training had a larger and more comprehensive impact on skills, but cost a total of about US$430 per provider (72).
Comments from clients. In the Dominican Republic the IDSS set up suggestion boxes and offered comment cards asking clients to rate their care on friendliness, privacy and confidentiality, communication, and problem-solving (see Figure 5). Each week the responses were collected, and the directors of the health centers discussed them in staff meetings or with any providers whom clients mentioned by name. Stakeholders said that the comments influenced providers to take better care of clients and, as a result, clients were more satisfied with services, and providers were happier in their work (91). The system was not sustained, however, because of administrative problems (119).
In Peru, Max Salud, a private, nonprofit health care organization, set up a system in 1998 and 1999 with six ways of collecting comments from clients: 10-minute exit interviews in the waiting room or just outside the clinic; follow-up visits to clients at home; focus-group discussions that were tape recorded with clients’ permission; household interviews of people who had stopped using services; suggestion boxes; and community meetings. Among the lessons learned were that clients were overly polite during exit interviews but more willing to be critical when they were interviewed at home. Also, comments from clients should be distributed to providers as soon as possible so that they can respond quickly, and comments should be collected frequently because clients’ expectations change. The study found that suggestion boxes were the least costly method of collecting clients’ comments (145).
Self-assessment. A study in Indonesia conducted by the State Ministry of Population/National Family Planning Coordinating Board (BKKBN) evaluated the effect of self-assessment and peer review on counseling skills following a training workshop. Providers used self-assessment forms to evaluate their counseling skills daily for 16 weeks. They also assessed clients’ behavior and their own influence on clients. In addition, some providers met weekly in groups of three or four to discuss their performance.
The assessments helped the providers remember what they learned in the workshop, clarify performance standards, and recognize and work on weaknesses. Four months after the training, providers who had training reinforced with self-assessment had better counseling skills than a control group. For example, they gave more information and built better rapport with clients, and their clients talked more and were more satisfied with the counseling. Discussion with peers further enhanced counseling skills but did not increase clients’ satisfaction (74).
Adequate Workspace, Equipment, and Supplies
Common environmental problems in reproductive health programs are lack of private space for counseling, stock-outs of contraceptives, and lack of equipment or supplies for disinfecting instruments. To solve these problems, organizations:
* Improve their logistics system and provide training (35) (see Population Reports, Family Planning Logistics: Strengthening the Supply Chain, Series J, No. 51, Winter 2002).
* Work with local government and communities to improve the workspace and provide supplies. Municipalities in Brazil funded improvements in clinics that participated in the PROQUALI accreditation program carried out by the Secretariats of Health in Bahia and Ceara states. The funding paid for repairs and remodeling, a computer, a car, and an autoclave. One city dug a well to provide the water needed to carry out infection prevention procedures (69).
* Ask for help from donors to buy equipment and set up a sustainable supply system or encourage public-private partnerships to supply contraceptives (46). The IDSS, for example, received help from the USAID-funded Family Planning Logistics Management program, which conducted two-day logistics management workshops and negotiated donations of contraceptives from USAID and the National Council for Population and the Family (CONAPOFA) in the Dominican Republic. The IDSS then began to buy contraceptives from UNFPA (91, 134).
Incentives
People work in health care programs for a variety of reasons. Some like to care for people and value clients’ appreciation, or they value the social status accorded health care providers and the respect of clients and communities (3, 23, 38, 92, 148). The equipment and training that come with a job and the opportunity to attend meetings are also attractions (10, 23, 111, 169). Some providers value their work enough that they stay on the job even when their pay is delayed (13). Of course, many providers work only because they need an income (5).
To encourage better performance, organizations have tried incentives such as more money, recognition for good work, and the opportunity to provide better care.
Monetary incentives. A base salary attracts people to jobs and keeps them coming to work, but it does not necessarily motivate them to perform well (47). Monetary incentives include increases in pay; allowances for clothing, housing, or training; time off with pay or extra vacation; free meals; or gifts, such as appliances or bicycles (9, 155). A community-based distribution program in Tanzania pays agents with income-generating equipment such as boats, tractors, or sewing machines (62).
Linking pay to performance can be controversial, however. In Zimbabwe, for example, civil servants went on strike in 1996 when the government proposed to tie salary increases, an annual bonus, and promotions to job performance. Health workers thought that it was unfair to set high performance objectives in the face of shortages of staff and resources. Already poorly paid, the health workers challenged any threat to their small salaries. The experience in Zimbabwe raised several other potential problems with such an incentive scheme: Linking pay to job performance can inspire mistrust or abuse of the appraisal process, some supervisors are reluctant to give poor reports, and supervisors may not know how to appraise employees or may be too busy (110).
Reproductive health organizations supported by USAID are not permitted to reward employees for meeting quotas or targets for the number of family planning acceptors (156, 158). They can reward them, however, for excelling in other ways that help organizations meet their goals.
Recognition. Organizations can recognize outstanding work by posting staff members’ pictures (163), by selecting an employee of the month (9), or by mentioning staff in a newsletter. They can also announce promotions and report them to the local news media, and they can declare special days for groups of employees, such as nurse-midwives’ day.
Recognition through positive feedback encourages employees by showing them how they are improving. For example, the self-assessment and peer review in the Indonesian counseling study motivated providers by allowing them to track their own improvement and recognize and work on their weaknesses (74). In the PROQUALI clinic accreditation project in Brazil, providers were motivated to improve performance by the feedback they received from state and city officials, supervisors, coworkers, and clients (69).
Accreditation programs, by recognizing good performance, have motivated staff members to work hard to meet standards. For example, a program in western Guatemala is improving maternal and neonatal health care by accrediting hospitals, health centers, and health posts. In baseline surveys conducted between March and August 2001, seven hospitals met an average of 11% of accreditation criteria; by a follow-up survey in December 2001, compliance had increased to 40% (113).
Providing better care. Giving staff members the opportunity to improve care is an incentive in itself. In the “Building Bridges for Quality” project in Peru, for example, MOH staff said that they had never considered improving relations between clients and providers to be part of their job. In their new role they learned facilitation skills such as asking open-ended questions, encouraging participation, and making summarizing statements. They felt that they were doing more to improve services and not just checking on providers, receiving reports, and arranging training (57).
When supervisors ask employees what motivates them, they avoid guesswork. In the IDSS project, for example, the PI facilitators conducted focus-group discussions with providers and interviews with hospital directors to explore incentives aside from salary increases that would motivate considerate treatment of clients. Asked what types of incentives they would like, employees listed rewards such as clean and well-ventilated offices, extra days off, travel for training, more sharing of information with staff, public recognition, and insurance for risks on the job. Providers who worked alone in clinics wanted the chance to work in teams (157). Offering a variety of incentives allows employees to choose the ones they like the best (12, 169).
Organizational Support
To strengthen organizational support for the performance of their employees, managers should attend to the other performance factors, such as expectations, equipment, and incentives. They can also:
* Clarify and communicate the organizational mission, develop a work strategy to fulfill the mission, and ensure that the organizational structure–the lines of authority and allocation of resources–supports the strategy (13, 95, 142).
* Involve office staff in efforts to increase adherence to guidelines. Patients of US doctors received better services when, for example, office staff provided information and supportive comments as part of a smoking cessation campaign (24).
* Set up a supportive supervision system that encourages suggestions and problem-solving by staff members at all levels of the organization (11, 22). For example, ASHONPLAFA, a private family planning organization in Honduras, strengthened its supervision system in 1999 and 2000 by combining more support by supervisors–standard setting, planning meetings, feedback and evaluation, field visits, and recognition for good performance–with encouragement for employees to monitor themselves (26).
Knowledge and Skills
Strengthening preservice education or conducting in-service training are the main approaches to improving knowledge and skills. Job aids such as checklists or flowcharts also help by providing information or guidance as people work, but training to use job aids is usually necessary (30, 76).
Training in reproductive health care emphasizes transfer of learning to the workplace and the demonstration of competency by trainees (65, 132, 152). Transfer of learning is difficult: in general, participants in training programs apply on the job only 10% to 20% of what they learn, either because the training was poorly designed or because they receive no support for changing the way they work (150). Trainers, subject-matter experts, and PI practitioners are working together to improve the effectiveness of education in reproductive health care.
Strengthening preservice education can have a larger and more lasting effect than in-service training. Preservice education influences more people, and the knowledge and skills learned in professional schools determine the practices of many students throughout their careers. Lessons learned from programs to strengthen preservice education in the Philippines and Turkey, for example, include the importance of recruiting a strong advocate for change in the schools and forging a close relationship between the professional schools and the clinical practice sites (171).
In-service training refreshes knowledge and skills or introduces new information and techniques. In-service training is carried out either on-the-job or away from the workplace. On-the-job training can be informal or structured (61, 150). Among the advantages of structured on-the-job training reported by a PAC program in Kenya, for example, were that the training met each clinic’s specific needs, providers who could best use the training were selected, and there was little disruption of clinic services (168).
Training design includes format, methods, and materials. Training can take place through individual learning, self-assessment, paired learning, peer review, or group learning (74, 105, 150). Among training methods are coaching, mentoring, analyzing case studies, and role-playing (109, 150, 166). Micro-skills training–in which a skill is broken down into its elements and trainees receive lessons on each of the elements–has improved providers’ counseling (170). Combinations of approaches often give the best results (25).
Print manuals are being supplemented by CD-ROM, instruction via the World Wide Web, and coaching by e-mail. For example, PROCOSI, a network of Bolivian health care NGOs, uses CD-ROM and e-mail to train staff members in leadership and management (167).
Transfer of learning to the workplace requires cooperation among supervisors, trainers, trainees, and coworkers. Each has a role to play before, during, and after training. For example, before training, supervisors help select trainees, work with trainers on training objectives, tell trainees what will be expected of them once they are trained, and assign trainees’ work among coworkers. After the training, supervisors and trainers should visit trainees on the job to monitor, support, and coach them as they use their new knowledge and skills (132). The goal is a closer link between training and performance (21, 152).
Monitor and Evaluate Performance
Staff members or consultants monitor the solutions to ensure that they are carried out as planned, and they evaluate the solutions to assess results. Monitoring allows staff members to respond to unexpected problems or take advantage of unexpected opportunities. Among the monitoring tasks are checking that all stakeholders are involved, that top management is publicly supportive, and that the staff members whose performance is being analyzed are participating and accepting the solutions.
The program monitors notify other team members of problems or changes in schedule that affect other deadlines (102). If results fall short, midcourse adjustments can be made. In a training-of-trainers program, for example, monitors can observe classes taught by trainers, take note of any weaknesses, and suggest changes to the curriculum (126).
To evaluate solutions, staff members or consultants measure actual performance after the solutions take effect and compare it with the desired performance agreed to by stakeholders. The evaluators use the same performance indicators that were used to measure the initial performance gap. Data come from observations, interviews or surveys of staff and clients, self-assessment questionnaires, or clinic records.
Few reproductive health care organizations have evaluated their use of the PI process. Only the pilot project carried out by the Dominican Social Security Institute (IDSS) has documented its evaluation. For the IDSS, consultants measured actual performance in three provinces, San Cristobal, La Romana, and La Vega. They carried out a baseline survey in March/April 1999 and follow-up surveys in August 1999, six weeks after the solutions were carried out, and in July/August 2000. Three questions were addressed:
* Did the project close performance gaps? The evaluation team analyzed performance over time in one province, San Cristobal, where the IDSS carried out a full set of solutions addressing expectations, performance appraisal, and knowledge and skills.
* Did provinces differ? The evaluation team compared results in San Cristobal with those in La Romana, where providers worked on expectations and feedback but were not specially trained, and in La Vega, the control province where no solutions were carried out.
* Did facilities differ? The evaluation team compared performance gaps for staff members in the three types of health care facilities that participated in the project: hospitals, clinics, and doctors’ offices.
Did the Project Close Performance Gaps?
The IDSS evaluation measured performance gaps in considerate treatment of clients and providers’ knowledge of reproductive health services offered by the IDSS.
Considerate treatment of clients. The evaluation had two parts: clients were interviewed after they used reproductive health services, and observers watched providers as they cared for clients. Interviewers and observers filled out a questionnaire that measured indicators of considerate treatment of clients. The questionnaires assessed the four areas of counseling: courtesy (Did the provider greet you and call you by your name?); privacy (Did the provider ensure that the consultation would be as comfortable and private as possible?); information (Did the provider give information that answered your questions or needs?); and problem-solving (Did the provider help you to reach a decision that resolved a problem?). The maximum score for desired performance based on the questionnaire was 12.
In San Cristobal the performance gap closed significantly according to both clients and observers. According to clients, the gap decreased from 5.2 at baseline to 4.7 (10% difference from baseline) at the first evaluation survey and to 3.9 (25% difference) at the second survey. According to observers, the performance gap decreased from 7.9 to 4.3 (46%) at the first survey and then increased to 5.6 (29%) at the second survey (120).
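The percentage changes quoted above are each follow-up gap expressed against the baseline gap. As an illustrative sketch (the function name is ours, not from the evaluation), the figures can be reproduced as follows:

```python
def gap_closure(baseline: float, followup: float) -> int:
    """Percent of the baseline performance gap closed at a follow-up survey."""
    return round(100 * (baseline - followup) / baseline)

# Client ratings in San Cristobal (gap measured on the 12-point scale)
print(gap_closure(5.2, 4.7))  # 10 -> first follow-up survey
print(gap_closure(5.2, 3.9))  # 25 -> second follow-up survey

# Observer ratings
print(gap_closure(7.9, 4.3))  # 46 -> first follow-up survey
print(gap_closure(7.9, 5.6))  # 29 -> second follow-up survey
```

A smaller gap means performance moved closer to the desired standard, so the observers’ drop from 46% closure to 29% at the second survey reflects a partial relapse, not continued improvement.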
Knowledge of reproductive health services. PI facilitators interviewed approximately 80 providers to assess their knowledge of the full range of reproductive health services offered by the IDSS. Facilitators asked providers three questions and graded them on the number of services they mentioned in their answers: (1) What do you understand by reproductive health services? (2) What are the reproductive health services offered at this facility? and (3) For which reproductive health services can you refer clients? In San Cristobal the performance gap decreased by 32% at the first follow-up survey, but at the second survey it was 4% larger than at baseline, probably because of staff turnover (120).
Did Provinces Differ?
The differences between San Cristobal and La Romana indicate the relative strengths of the solutions to the performance problems. In La Romana the performance gaps for considerate treatment of clients either increased or did not change significantly, and the gap in providers’ knowledge of reproductive health services increased at the first follow-up survey. Compared with the decrease in performance gaps in San Cristobal, the results in La Romana indicate that providers lacked knowledge and skills–not only clear expectations and feedback–and needed training and follow-up, which were not offered in La Romana (37, 101). Also, managers may not have communicated the new expectations clearly and forcefully enough to decrease performance gaps in La Romana (37).
Higher expectations of clients may also explain the increase in the performance gap in treatment of clients at the first follow-up survey. Responding to the posters describing the quality of reproductive health services, clients may have expected better quality of care than providers in La Romana could deliver (63).
The results from La Vega, the control province, indicate the overall effectiveness of the pilot project. The small changes in the performance gaps in La Vega show that the improved performance in San Cristobal was the result of the pilot project rather than a general improvement in performance in all provinces.
Did Facilities Differ?
The facilities differed significantly in their response to the pilot project. The doctors’ offices improved performance most. For example, the performance gap for considerate treatment of clients, as rated by clients, decreased significantly in doctors’ offices between baseline and the first evaluation survey from 5.8 to 5.1 (12%) on the 12-point scale. At hospitals and clinics, in contrast, the gap increased (120).
Bureaucracy and staff turnover may explain the differences between the facilities. Procedures at hospitals and other large institutions are difficult to change, particularly as a result of short-term projects. The organizational changes required to improve performance take more time in a large institution than in an office (37). Also, staff turnover at hospitals probably prevented improved performance because new staff would not have participated in the project (120).
Managing Change
Improving performance requires people and organizations to learn and change. For example, providers learn new procedures for sterilizing equipment or change attitudes toward clients. In a decentralizing organization employees learn to handle more authority and to make decisions that their managers previously made for them. Carrying out the PI process itself involves managing change.
Change is stressful. It provokes fear, anxiety, and resentment in many people. Without a compelling reason to change, people resist change because they fear that they will have to adopt unfamiliar routines (148), be forced to do more work without more pay, or lose their jobs because they will be judged by a higher standard that they cannot meet (22). Some people are unwilling to take on more responsibility (119, 148). Others dislike change imposed by outsiders (29), or they dislike their working conditions and resist change in protest (4, 22, 112, 144, 147).
Leaders of an organization using the PI process need to take into account and plan for the varying responses of staff members to change. Most people change slowly and in stages. One theory of behavior change identifies a five-step process: precontemplation, contemplation, preparation, action, and maintenance (160). People vary in their response to change, falling into groups of innovators, early adopters, early majority, late majority, and late adopters. The rate of adoption depends on the perceived advantages of the change, how difficult it is to adopt, and the skill with which it is introduced, among other factors (77, 138).
Creative Leadership Needed
Starting and sustaining institutional change requires strong and creative leaders. They need to inspire and persuade employees to complete a sometimes difficult and lengthy process. Leaders committed to change can emerge at any level of an organization, not only from top management. To start the change process, leaders:
* Articulate and communicate an urgent reason to change. Urgency usually comes from a change outside the organization, such as a funding cut or a change in clientele (104). In the Dominican Republic, for example, as more women took jobs in the early 1990s, the Social Security Institute (IDSS) began serving more women than men, and the women were dissatisfied with the reproductive health services offered by the institute (91).
* Include a broad spectrum of employees in planning the changes. In the PI process, involving all stakeholders creates a nucleus of people who support the changes and reduces the likelihood of resistance to change. Experience in US industry suggests that organizations can change when at least one-quarter of employees are committed to change (77).
* Create a vision of the organization. Leaders communicate a vision for the organization and link the changes to the vision so that employees see the reason for change.
Communicating the vision demands persistence and creativity. Leaders set examples of the new ways of working. They must emphasize the vision repeatedly in many forms–presentations, informal discussions, letters, memos, and newsletters–to make it the guiding principle for employees (77). Changes become permanent when employees change the way they think about and do their work (77, 84).
Performance Improvement promises to do for reproductive health organizations and programs in developing countries what it has done for corporations around the world: improve services with well-designed solutions to performance problems. The PI process helps organizations inspire, guide, equip, and enable employees to fulfill the mission of their organization and perform at their highest level. The result can be more productive employees, more effective reproductive health programs, and more satisfied clients.
A Step-by-Step Process to Strengthen Performance
1. Consider the institutional context of the performance problem and get stakeholder agreement. Facilitators examine the mission, goals, strategies, and culture of the organization, and the perspectives of clients and communities. They foster and maintain stakeholder agreement on the objective of the PI process and the plans for addressing the performance problem.
2. Define desired performance in measurable terms if possible. Desired performance takes into account international or national standards and the perspective of stakeholders. The description of desired performance creates a manageable set of objectives for the process.
3. Describe actual performance. The description of actual performance is based on observations and interviews of staff members and clients and on reviews of clinic records and other documents.
4. Measure or describe the performance gap. The difference between desired and actual performance is the performance gap.
5. Find the root causes of the performance gap. Stakeholders discuss the reasons for the gap and identify the most basic reasons, or root causes. Most root causes can be linked to factors that help people do their work: job expectations; performance feedback (including formal performance appraisals, comments from supervisors, coworkers, or clients, or self-assessments); workspace, supplies, or equipment; incentives; organizational support; and knowledge or skills (see box, p. 8). Reproductive health organizations have identified weaknesses in all the performance factors, but most often in knowledge and skills, expectations, and supplies and equipment (128). Linking the root causes of performance gaps to specific factors helps stakeholders generate solutions that address the root causes.
6. Select interventions. Stakeholders generate ideas for solutions that address the root causes of performance gaps and the related performance factors. These solutions can be drawn from reviews of best practices. Then stakeholders rank and select these interventions according to cost, benefit, or other criteria.
7. Implement interventions. The staff members or consultants who carry out the solutions need good project management skills–planning, scheduling, budgeting, hiring, supervising, and reporting (49, 159).
8. Monitor and evaluate performance. Staff members or consultants keep the solutions on track and guide the organizational changes required to support and sustain the solutions, usually with the help of top management. To evaluate performance, they observe actual performance again and remeasure the performance gap to see the effect of the solutions.
The PI process can be used in cycles. The performance observed and evaluated at the end of the first cycle becomes the actual performance of the next cycle.
Benefits of Performance Improvement
Performance Improvement offers a number of advantages for organizations seeking to improve reproductive health services. Performance Improvement is:
Participatory
* Involves everyone who has a stake in improving performance, including clients and communities. The stakeholders play the central role in Performance Improvement (102, 124).
* Directs staff members to articulate what their job is, what it should be, and how they contribute to the goals of their organization (142).
* Encourages staff members and supervisors to agree on measures of performance (102, 142).
* Encourages organizations and government agencies to pool expertise and work together to analyze and solve performance problems (46, 91, 165).
Logical and systematic
* Begins with discussion among stakeholders to describe the problem and agree on desired performance (102).
* Proceeds step by step from analysis of performance gaps and causes to design and selection of solutions, implementation, and evaluation (102, 124, 159).
* Discourages jumping to conclusions about the causes of performance gaps and possible solutions (48, 94, 142, 159).
* Guides stakeholders to look for causes in all facets of an organization–structure, goals, management, resource allocation, and work processes–and not only in the performance of staff members (1, 94, 142, 159).
* Requires observation and research to understand performance problems and measure performance gaps (46, 91, 102, 140).
* Guides stakeholders to solutions based on experience and best practices (162).
* Focuses on results rather than behavior or effort (46).
* Offers an objective, measurable way to evaluate interventions by comparing results with the stakeholders’ original desired performance (46, 91, 102).
* Directs stakeholders to look beneath the surface and dig for the root causes of performance problems (102, 124, 159).
* Encourages managers to consider other solutions to performance problems besides training (41, 94, 102, 142).
* Encourages staff members to look beyond causes that they can do nothing about, find causes that they can address, and take improvement of services into their own hands (63).
A PI Case Study: The Dominican Social Security Institute
The Dominican Social Security Institute (IDSS) carried out a pilot project using Performance Improvement in 1998 and 1999 to respond to clients’ requests for improved reproductive health services. To describe the performance gaps, facilitators interviewed health center directors, service providers, and managers in the IDSS and conducted focus-group discussions with clients. The facilitators, members of the PRIME project, identified six performance gaps. A group of 26 stakeholders–providers, directors, regional supervisors, and others–met to rank the gaps, analyze causes, and decide what to do.
The stakeholders decided that a gap in considerate treatment of clients had highest priority. To quantify the gap, facilitators developed a questionnaire with 12 indicators of considerate treatment and carried out a baseline survey in which clients and observers rated providers. Clients found that providers did not perform an average of 5 of the 12 indicators, and observers found that providers did not perform an average of 8 of the indicators.
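The gap arithmetic behind such a baseline survey is simple enough to sketch. The Python fragment below (with made-up rating data for illustration, not the IDSS survey records) shows how an average score across raters and the gap against a 12-indicator standard would be computed:

```python
# Minimal sketch of quantifying a performance gap: each rater scores
# how many of the 12 indicators of considerate treatment a provider
# performed; the gap is desired performance minus actual performance.

DESIRED = 12  # desired performance: all 12 indicators performed

def performance_gap(scores, desired=DESIRED):
    """Return the average score across raters and the gap to desired."""
    actual = sum(scores) / len(scores)
    return actual, desired - actual

# Hypothetical client ratings, for illustration only.
client_scores = [7, 6, 8, 7, 6]
actual, gap = performance_gap(client_scores)
print(f"actual {actual:.1f}, gap {gap:.1f}")  # actual 6.8, gap 5.2
```

Remeasuring with the same questionnaire after the interventions gives a directly comparable gap, which is how the PI process evaluates its solutions.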
Investigating the root causes, stakeholders decided that providers were not evaluated on their treatment of clients (through job expectations and performance appraisals), were not rewarded for treating clients considerately (incentives), and did not know how to treat clients considerately (knowledge and skills). Stakeholders selected six possible ways to address the root causes and estimated cost-and-benefit ratios for each (91, 120). The stakeholders’ work can be summarized in a Performance Improvement specification form (see Figure 2).
Closing the Gap
To clarify expectations, a 10-member stakeholder committee developed guidelines for considerate treatment. Using findings from focus-group discussions with clients, the committee identified four components of considerate treatment: friendliness, privacy and confidentiality, providing adequate information, and problem-solving.
Approximately 50 providers reviewed and approved the guidelines (91). The IDSS produced a poster of the guidelines to inform both providers and clients about the new expectations for considerate treatment. The guidelines were also used in a training curriculum and on a card for clients to comment on their treatment by providers (55, 63, 91).
To encourage comments from clients, a consultant distributed suggestion boxes and rating cards to each health facility and provided instruction in their use. Also, a letter to clients from the general director of the IDSS placed next to the suggestion boxes or handed to clients along with the rating cards described the intent to improve treatment of clients and invited clients to comment (63).
To improve providers’ knowledge and skills, an instructional designer and an expert in reproductive health designed a five-day training-of-trainers workshop and a 2 1/2-day workshop for providers. The training strengthened expectations by showing providers good and bad examples of counseling. Providers were asked to assess their own counseling in comparison, and they had an opportunity to practice counseling skills (63).
The performance gap decreased significantly in one province, San Cristobal, where all of the solutions were carried out (see p. 24). Stakeholders thought that the training had the largest impact on the performance gap but that clients’ comments led to important changes in the way providers viewed clients: Providers understood clients better and were concerned that clients be satisfied with services. One hospital director said that his hospital’s clientele nearly quadrupled because of improvements inspired by clients’ comments (91).
Assessing Organizational Change
The PI facilitators asked IDSS staff members to assess organizational changes and institutional capacity to support the changes in provider performance. Among the 20 indicators were support for reproductive health from top managers; up-to-date training materials, supplies, and equipment; and community involvement in decisions about reproductive health services. Ranking the 20 indicators from 1 (no capacity) to 4 (full capacity), IDSS staff members concluded that institutional capacity had increased an average of one full point, from 1.3 before the project to 2.3 after the project (91).
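A capacity assessment of this kind reduces to averaging indicator ratings. As a sketch (the ratings below are illustrative, not the IDSS data), the mean capacity score and its change could be computed as:

```python
# Sketch of the capacity scoring: 20 indicators each rated from
# 1 (no capacity) to 4 (full capacity); institutional capacity is
# summarized as the mean rating across all indicators.

def mean_capacity(ratings):
    """Mean of indicator ratings, each on the 1-4 capacity scale."""
    assert all(1 <= r <= 4 for r in ratings), "ratings must be 1-4"
    return sum(ratings) / len(ratings)

# Hypothetical before/after ratings for 20 indicators.
before = [1] * 14 + [2] * 6
after  = [2] * 14 + [3] * 6
change = mean_capacity(after) - mean_capacity(before)
print(round(change, 1))  # change in mean capacity rating
```

Reporting the change in the mean, as the IDSS assessment did, gives a single summary figure, though looking at individual indicators shows where capacity actually grew.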
Both the personal qualities an individual brings to the job and the working environment of the organization determine performance. Personal qualities comprise knowledge, skills, capacity, and motives. Environmental factors comprise job expectations, performance feedback, workspace and equipment, and incentives (48).
Facilitators in reproductive health programs in developing countries have linked root causes of performance problems to six performance factors (102):
* Job expectations,
* Performance feedback,
* Workspace, equipment, and supplies,
* Incentives,
* Organizational support, and
* Knowledge and skills.
These factors are similar to the needs of providers identified by the International Planned Parenthood Federation (IPPF), for example, guidance, feedback, infrastructure, supplies, encouragement, and training (58).
The order of the factors indicates how difficult they are to correct. For example, fixing unclear job expectations is usually easier and less costly than training (94, 102, 150, 153).
PI practitioners debate the list of performance factors. Capacity–which refers to intelligence, talent, and physical ability (48)–is not included because in developing countries the solution to a capacity problem, telling or encouraging people to leave jobs, is difficult (84, 101). Some practitioners include capacity, however, arguing that it can be taken into account in hiring or in moving people to jobs that suit them better (2, 17).
Incentives, culture, and organizational support are also debated. One prominent PI practitioner leaves incentives off the list, arguing that an employee in a job with all the other factors in place cannot fail to be motivated (142). Another would include cultural practices that affect performance (115). Some leave organizational support off the list, arguing that organizations support performance by attending to the other five performance factors (127).
Job expectations. To perform well, employees need to know what is expected of them and how they will be evaluated. Expectations comprise the objective of their job, the tasks they must carry out–with measurable quantities and rates–and where, when, and with whom they must work.
Many employees are unsure about what is expected of them (45, 73, 137). Some may not be able to state the goals of their organization and how their job contributes to the goals or what their responsibilities are (107). Many employees work without formal job descriptions (5, 38, 46), instead learning by watching or talking to colleagues. Some have unclear or imprecise job descriptions.
Performance feedback. Employees need to know how they are doing in comparison with the expectations for their job. Employees find out whether they are meeting or falling short of expectations through oral or written information from supervisors, coworkers, or clients.
Personal, cultural, and organizational factors can prevent employees from receiving useful appraisals of their performance. Afraid of offending, supervisors typically praise employees and tell them to “keep up the good work,” without going into specifics. The culture of an organization or national customs sometimes preclude confronting employees directly about performance problems (100, 101, 144). In some Asian and Latin American countries, for example, employees are judged on their personal characteristics such as integrity and loyalty and would be offended by judgments based solely on their performance (144). Some supervisors have trouble giving feedback because they have never done the work of the people they are supervising. Encouraging self-assessment can help in this case (101).
Performance appraisal is a skill that needs to be learned and practiced. Ideally, appraisals should be honest and timely, precise and specific, private, provided with an opportunity for self-evaluation, and delivered without interruption. Employees should receive information about their performance often–weekly or even daily for new employees and once a month for long-term employees (125). For many organizations, however, any systematic performance appraisal would be an improvement. Infrequent feedback and unclear job expectations together are the most common causes of performance problems in US corporations (48, 150).
Workspace, equipment, and supplies. The space in which employees work and the equipment and supplies they need to do their jobs comprise the physical environment. The workspace should be easy and safe to work in. Distractions and inconveniences–for example, noise or inaccessible supplies–require staff to adapt. Some adaptation can be challenging and motivating, but, if employees spend too much time and energy overcoming inconveniences, performance inevitably suffers (71).
Incentives. Motivation results from both external incentives and a person’s internal motives for doing a job (48). Typical causes of low motivation are poor pay, poor working conditions, and no opportunity for advancement. Lacking incentives, many people do not give full effort. In US surveys of worker productivity, only about 25% of people say that they work as hard as they could. Most say they work at about two-thirds of their potential or only hard enough to hold onto their jobs (14, 40).
Tradition and culture influence the use of incentives. Many organizations reward employees for time on the job rather than for good performance, and the reward is often the opportunity to attend training programs. Also, offering more money as an incentive for individuals may not be effective in countries, such as Denmark and Japan, where work in teams is encouraged by equal pay among team members (144).
Organizational support. To help employees do their best work, managers are responsible for setting up supportive organizational structures, strategies, and work processes. For example, managers create and communicate a clear mission and goals for the organization, provide inspiring and effective leadership, design job roles that align with the organization’s goals, develop clear lines of authority, and encourage open communication up and down the hierarchy (27, 77, 95, 142, 169).
Knowledge and skills. People acquire knowledge and skills for reproductive health care in preservice education and in-service training. They attend professional schools of management, public health, or nursing and midwifery, for example, or they learn on the job.
Employees lack the knowledge or skills to do their jobs well for a variety of reasons. They may have been hired for or promoted into a job they were not trained for, may be unaware of changes in protocols or guidelines, may have had poor training in professional schools, or may have forgotten information or skills from lack of use.
Performance Needs Assessment: Burkina Faso
A district management team (DMT) in Koupela, Burkina Faso, conducted a five-day workshop that used Performance Improvement to address problems in the team’s support of maternal and neonatal health care. The workshop also prepared DMT members to introduce the PI process to providers at the district health centers. The DMT is responsible for planning, supervising, and reporting on health care activities in the district, including in-service training, provision of equipment, and financial and personnel management. Seven members of the DMT, an instructor at the National School of Midwifery, and a trainer attended the workshop in December 2000.
Workshop participants reviewed the PI process and carried out its steps through the analysis of root causes and the generation of solutions. They identified five roles for the DMT, indicators for each role, and desired performance for each indicator. The five areas and sample indicators included:
* Identify problems in maternal and neonatal care. Assemble members of the DMT and any experts needed, present problems, and analyze causes.
* Carry out projects on schedule. Create the schedule of projects, write briefing notes at least two weeks before each project, and deposit funds at least 72 working hours before the project.
* Conduct quarterly supervision visits to the health centers. Hold an introductory meeting with staff of the health center, check that the recommendations of previous supervisory visits have been carried out, discuss problems and solutions with staff and community members, and encourage and thank the staff.
* Write a report summarizing the supervisory visit. Discuss objectives, methodology, activities, results, and recommendations. Distribute the report to the regional directorate.
The team described actual performance and identified performance gaps. For example, the team did not write briefing notes two weeks in advance of projects; deposit funds at least 72 hours before a project; conduct supervisory visits four times a year, or even twice a year; invite members of the community to discuss the visit; or write a summary report.
Analyzing the root causes of the performance gaps, the PI team found that many were linked to lack of organizational support. For example, the gap in supervision had several root causes. The few supervisory teams could not visit the large number of health centers every quarter, and the schedule for supervision did not always take into account the availability of supervisors (organizational support); teams could not travel to some centers during the rainy season because roads and vehicles were in bad condition (equipment); supervision was not a priority for the DMT, it was not well organized, and members lacked skills (expectations, skills/knowledge).
To address these root causes, the workshop participants focused on training to improve skills and strengthen expectations. The participants recommended: (1) evaluating the needs of members of the DMT for training in supervision, (2) planning and developing a training program, and (3) following up the members who were trained (66).
Performance Improvement in the Private Sector: India
Stakeholders in Uttar Pradesh, India, used Performance Improvement to find ways to encourage private providers to offer better family planning services and to identify more clients who need family planning services. Indigenous Systems of Medicine (ISM) practitioners use a combination of traditional and modern medicine and provide most curative services in rural areas of Uttar Pradesh (90, 133). They charge clients for medicine and other supplies but not for time spent counseling.
From 1995 to 1999 the State Innovations in Family Planning Services Agency (SIFPSA) and local district organizations trained ISM practitioners to counsel about family planning and to provide oral contraceptives and condoms. Stakeholders felt that despite the training, ISM practitioners were not counseling as many women as they could about family planning.
PI facilitators then carried out a performance needs assessment in 1999. They developed indicators for the quantity and quality of family planning services offered by the practitioners. The main quantitative indicator was the proportion of clients possibly needing family planning services whom the practitioner identifies and counsels. The main qualitative indicator was the adherence of practitioners to an observation checklist, which included items about clinic settings and counseling skills. The PI team found that practitioners counseled less than half of eligible clients. Stakeholders set desired performance at a realistic goal: counseling for 75% of eligible clients. The performance gap was the 25% or more of eligible clients whom the practitioners did not identify and counsel.
The main root cause of the performance gap was the loss of income by practitioners when they counseled clients–on average for 10 minutes per visit (incentive). Other root causes were the absence of a reliable source of condoms and oral contraceptives (supplies) and lack of awareness that they were expected to counsel every eligible client (expectations). Some did not know how to counsel or how to identify eligible clients (knowledge/skills). Communities did not know that the practitioners offered family planning services because practitioners did not promote or market their services (expectations, knowledge, skills).
To solve the counseling problem, stakeholders suggested several initiatives to make selling contraceptives more profitable. They ranked them on a 10-point cost-and-benefit scale. Among the highest ranked solutions were:
1. Make sure during the training program that practitioners know they should provide family planning counseling.
2. Give an initial supply of contraceptives at the end of the training program.
3. Make training more selective to increase the status of ISM practitioners who provide family planning services.
4. Promote services in the community, particularly to elderly women who have great influence on family decisions.
5. Identify wholesalers, distributors, and other sources of contraceptives for the practitioners.
6. Train the ISM practitioners in marketing (90).
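Ranking solutions on a cost-and-benefit scale, as both the Indian and Dominican stakeholders did, amounts to sorting by the benefit-to-cost ratio. A minimal sketch (the intervention names and scores below are hypothetical, not the stakeholders' actual ratings):

```python
# Sketch of ranking candidate interventions by benefit-to-cost ratio.
# Stakeholders rate benefit and cost on a 1-10 scale; higher ratio
# means more benefit per unit of cost.

# Hypothetical (benefit, cost) ratings for illustration only.
interventions = {
    "initial contraceptive supply": (9, 3),
    "community promotion": (8, 4),
    "marketing training": (7, 5),
}

# Sort interventions from highest to lowest benefit/cost ratio.
ranked = sorted(interventions.items(),
                key=lambda kv: kv[1][0] / kv[1][1],
                reverse=True)

for name, (benefit, cost) in ranked:
    print(f"{name}: ratio {benefit / cost:.1f}")
```

The ratio is only one possible criterion; as noted earlier in the report, stakeholders may also rank interventions by absolute benefit, feasibility, or other criteria.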
Figures for the current percentage of eligible clients being counseled are not available. Quality of care seems to be high: Simulated clients and self-reporting show that 80% of the practitioners are meeting the criteria for good counseling (88, 133).
Note to readers: This report serves two audiences. The first chapter is an overview for managers who will make the decision to use Performance Improvement and need to know the fundamentals, costs, and expected results. The rest of the report details each step of the process, tools, and techniques for readers who may become PI facilitators.
Table 1. Performance Improvement in Reproductive Health Care
Burkina Faso; Koupela district management team (DMT) and health care facilities; 2000- (Ref. 66)
Goal: Improve planning and supervision of the DMT; improve skills of providers in maternal and neonatal health; introduce Performance Improvement.
Result/Status: Found gaps in DMT planning of projects, frequency of supervision, community involvement in problem solving, and dissemination of results; for the supervision problem, recommended training to strengthen supervisory skills and clarify expectations.

Burkina Faso; Directorate of Family Health, Ministry of Health; 1998 (Ref. 114)
Goal: Identify needs of community-based distributors in order to add reproductive health services to their duties.
Result/Status: Found performance hampered by lack of financial incentives, supplies, knowledge and skills, and supervision; recommended work on incentives and supervision before training to improve knowledge and skills.

Ghana; Family Health Division, Ministry of Health; 2001- (Ref. 130)
Goal: Strengthen supervisory skills of Regional Resource Teams in 3 regions.
Result/Status: Strengthening organizational support for supervision.

Ghana; Ministry of Health; 2000 (Ref. 20)
Goal: Encourage MOH providers to follow infection prevention practices.
Result/Status: Facilitated half-day meeting; participants found 7 areas that needed strengthening, including supervision, training of managers and administrators, and standardizing procurement of bleach.

Ghana; Ministry of Health; 2001- (Ref. 43)
Goal: Improve clinical skills of staff at training sites for nursing and midwifery preservice education.
Result/Status: Defined measurable desired performance, observed actual performance, and analyzed root causes of performance gaps; selecting and implementing interventions.

Kenya; Family Planning Association of Kenya; 2001- (Ref. 164)
Goal: Strengthen providers’ postabortion care (PAC) skills.
Result/Status: Conducting performance needs assessment; defined desired performance and performance indicators.

Malawi; Ministry of Health and Population (Ref. 44)
Goal: Improve staff members’ infection prevention practices.
Result/Status: Assessing actual performance.

Nigeria; USAID Mission; 2000 (Ref. 46)
Goal: Assess public and NGO family planning clinics and providers in 3 states; help formulate strategy for strengthening reproductive health services.
Result/Status: Found gaps in availability of services, supplies, clinic cleanliness, counseling skills, infection prevention, and record keeping.

Senegal; Ministry of Health; 2001- (Ref. 44)
Goal: Improve PAC services of providers at Roi Baudouin Hospital in Dakar.
Result/Status: Analyzed root causes; selecting interventions.

Tanzania; Reproductive and Child Health Section, Ministry of Health; 2001- (Refs. 135, 136)
Goal: Assess community perceptions and expectations of health care services; work with staff of Zonal Training Centres to decentralize training and improve quality of reproductive and child health services.
Result/Status: Conducted performance needs assessment and made recommendations regarding access, environment, and quality of services; defined desired performance in eight areas to strengthen Zonal Training Centres.

Armenia; Ministry of Health; 2001- (Ref. 86)
Goal: Work with policy makers on standards of care and with physicians and nurse-midwives on quality of services; inform and involve clients and communities.
Result/Status: Carried out performance needs assessment; drafting policies and standards for reproductive health services.

India; State Innovations in Family Planning Services Agency (SIFPSA); 1999 (Ref. 90)
Goal: Help Indigenous Systems of Medicine and rural practitioners in Uttar Pradesh offer family planning services.
Result/Status: Identified root causes of practitioners’ reluctance to offer family planning services despite training; recommended ways to address the root causes, especially the lack of financial incentive to spend time counseling.

Yemen; Ministry of Public Health; 1999- (Ref. 131)
Goal: Strengthen reproductive health care skills of community midwives (CMWs).
Result/Status: Carried out performance needs assessment; strengthening supervision of CMWs and opportunities for self-directed learning; establishing licensing program.

Dominican Republic; Dominican Social Security Institute (IDSS); 1998-1999 (Refs. 91, 120)
Goal: Strengthen reproductive health services offered by providers in IDSS facilities in 2 provinces.
Result/Status: Reduced performance gaps in counseling, knowledge of reproductive health services, and provision of services.

Guatemala; Ministry of Health; 2000- (Refs. 99, 113)
Goal: Improve maternal and neonatal health care through accreditation of district hospitals, health centers, and posts in seven districts.
Result/Status: Carrying out solutions, such as WHO-recommended practices for management of labor in hospitals; compliance of hospitals with 77 criteria increased between baseline and first follow-up.

Honduras; Ministry of Health; 2001- (Ref. 38)
Goal: Help MOH to license 200 public and private health facilities in Olancho province.
Result/Status: Found problems with most performance factors; generated solutions and estimated costs and benefits; recommended strengthening supervision, organizational support, and incentives.
Table 2. Defining Desired Performance:
Correcting Common Mistakes
Poor phrasing: “The provider knows the guidelines for IUD insertion.”
Problem: Describes ability or knowledge, which cannot be observed.
Better phrasing: “The provider carries out all the steps in the IUD protocol.”

Poor phrasing: “The provider spends enough time with each client.”
Problem: Vague.
Better phrasing: “The provider spends at least 10 minutes with each client.”

Poor phrasing: “The provider sees at least 10 clients each day.”
Problem: The provider does not control the number of clients who come to the clinic.
Better phrasing: “When clients are waiting, the provider takes no more than 15 minutes between clients.”
Source: McCaffery, 2000 (102)
Figure 2. Performance Improvement Specification Form for Counseling in
the Dominican Social Security Institute, 1998-1999
Desired Performance: Providers treat all clients with consideration and respect (“trato humano”); score 12 out of 12 on a questionnaire filled out by clients and observers.
Actual Performance: Average score by clients 6.8; average score by observers 4.1.
Performance Gap: 5.2 for clients; 7.9 for observers.
Root Causes: Providers do not know that considerate treatment is expected of them; no feedback from clients or supervisors on counseling; no incentives to counsel.

Interventions (Benefit / Cost / Ratio)*:
* Develop and disseminate norms for treating clients (10 / 4 / 2.5)
* Disseminate information to ensure that providers know they are expected to treat clients well: posters, letters (9 / 4 / 2.3)
* Set up suggestion boxes and cards for clients’ feedback (8 / 2 / 4.0)
* Recognize providers for showing consideration and respect for clients (8 / 4 / 2.0)
* Training (10 / 7 / 1.4)

* Stakeholders estimated benefits and costs on a scale of 1 (least favorable) to 10 (most favorable).
Source: Luoma, 2000 (91); McCaffery, 2000 (102); Padilla, 2001 (120)
Figure 5. Clients’ Feedback Card Evaluating
Reproductive Health Services at Dominican
Social Security Clinics
Let us know!
We are making every effort to offer you the highest quality health
services. Help us to provide better care for you by telling us how you
liked your visit today to this health center. Please fill out this card
and put it in the box.
Each question is answered Yes, More or Less, or No:
* Did the health care provider ask you the reason for your visit?
* Did the health care provider speak about your concerns in a private place?
* Did the health care provider give you information that responded to your questions or needs?
* Did the health care provider help you to make a decision to resolve a problem?
If you would like to tell us more, write here–
Source: PRIME Project
Translated from Spanish.
An asterisk (*) denotes an item that was particularly useful in the preparation of this issue of Population Reports.
(1.) ADDISON, R.M., and HAIG, C. Human performance technology in action. In: Stolovitch, H., and Keeps, E. Handbook of human performance technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 298-318.
(2.) ADETUNJI, A. (EngenderHealth) [Performance Improvement in Nigeria] Personal communication, May 10, 2001.
(3.) AHMED, A.M., MUNG’ONG’O, E., and MASSAWE, E. Tackling obstacles to health care delivery at district level. World Health Forum 12: 483-489. 1991.
(4.) AINSLIE, R. (Johns Hopkins University/Center for Communication Programs) [Performance Improvement in Brazil (PROQUALI) and Guatemala (CaliRed)] Personal communication, June 9, Oct. 16, 2001; May 2, 2002.
(5.) AITKEN, J.-M. Voices from the inside: Managing district health services in Nepal. International Journal of Health Planning and Management 9(4): 309-340. 1994.
(6.) ALLERY, L.A., OWEN, P.A., and ROBLING, M.R. Why general practitioners and consultants change their clinical practice: A critical incident study. British Medical Journal 314: 870. Mar. 22, 1997.
(7.) ANONYMOUS. Puentes hacia la calidad de atencion en salud [Bridges toward the quality of care in health]. [manuscript]. Save the Children, Oct. 15, 2001. 4 p.
(8.) ANONYMOUS. “Building bridges for quality”: A community mobilization project to improve quality. [draft]. no date. 3 p.
(9.) ASKOV, K., MACAULAY, C., MILLER FRANCO, L., SILIMPERI, D., and VAN ZANTEN VELDHUYZEN, T. Institutionalization of quality assurance. Bethesda, Maryland, Center for Human Services, No date. (Project Report) 38 p.
(10.) BAYA, B., GUIELLA, G., OUEDRAOGO, C., and PICTET, G. Evaluation de la strategie de distribution a base communautaire [Evaluation of the strategy of community-based distribution], [CD-ROM, “Electronic Library, 1990-1999,” from Population Council, Frontiers in Reproductive Health]. Laboratoire de Sante Communautaire du Bazega, Dec. 1998. 123 p.
(11.) BEN SALEM, B., and BEATTIE, K.J. Facilitative supervision: A vital link in quality reproductive health service delivery. New York, AVSC International, Aug. 1996. (Working Paper No. 10) 19 p.
(12.) BENNETT, S., and MILLER FRANCO, L. Summary proceedings: Workshop on health worker motivation and health sector reform. Bethesda, Maryland, Oct. 1998. Partnerships for Health Reform/Abt Associates, 37 p.
(13.) BENNETT, S., and MILLER FRANCO, L. Public sector health worker motivation and health sector reform: A conceptual framework. Bethesda, Maryland, Abt Associates/ Partnerships for Health Reform, Jan. 1999. (Major Applied Research 5, Technical paper 1) 45 p.
(14.) BENNIS, W.G., and NANUS, B. Leaders: Strategies for taking charge. New York, Harper & Row, 1985. 244 p.
(15.) BHATTACHARYYA, K., SHAFRITZ, L., and GRAEFF, J.A. Sustaining health worker performance in Burkina Faso. Arlington, Virginia, BASICS, 1997. 45 p.
(16.) BLAKE, S.M., NECOCHEA, E., BOSSEMEYER, D., GRIFFEY BRECHIN, S.J., LEMOS DA SILVA, B., and MAFALDA ILDEFONSO DA SILVEIRA, D. PROQUALI: Development and dissemination of a primary care center accreditation model for performance and quality improvement in reproductive health services in northern Brazil. Baltimore, JHPIEGO, July 1999. (JHP No. 03) 77 p.
(17.) BORNSTEIN, T. (Center for Human Services/Quality Assurance Project) [Performance improvement and quality assurance] Personal communication, Apr. 17, 2001.
(18.) BORNSTEIN, T. Quality Improvement and Performance Improvement: Different means to the same end? QA Brief, Vol. 9 No. 1, Spring 2001. p. 4-12.
(19.) BOULAY, M. (Johns Hopkins University/Center for Communication Programs) [The client as performer in Nepal] Personal communication, Apr. 3, 2001.
(20.) CAIOLA, N. Application of the PI process in Ghana. Pi-L listserv. Posted to the Pi-L e-mail listserv Apr. 27, 2000. (Available: , Accessed May 17, 2001).
(21.) CAIOLA, N., and SULLIVAN, R.L. Performance Improvement: Developing a strategy for reproductive health services. Baltimore, JHPIEGO, May 2000. (Strategy paper No. 9) 12 p.
(22.) CAIOLA, N., SULLIVAN, R.L., LYNAM, P., and TRANGSRUD, R. Supervising health services: Improving the performance of people. [draft]. Baltimore, JHPIEGO, Jan. 11, 2001. 122 p. [available on CD-ROM, May 2002].
(23.) CHEGE, J., SANOGO, D., ASKEW, I., BANNERMAN, A., GREY, S., GLOVER, E.K., YANKEY, F., and NERQUAYE-TETTEH, J. An assessment of the community based distribution programmes in Ghana. Nairobi, Kenya, Population Council, Planned Parenthood Association of Ghana, Nov. 2000. 36 p.
(24.) COHEN, S.J., HALVORSON, H.W., and GOSSELINK, C.A. Changing physician behavior to improve disease prevention. Preventive Medicine 23: 284-291. 1994.
(25.) DAVIS, D.A., THOMSON, M.A., OXMAN, A.D., and HAYNES, B. Changing physician performance: A systematic review of the effect of continuing medical education strategies. Journal of the American Medical Association 274(9): 700-705. Sep. 6, 1995.
(26.) DE LA PEZA, L., and ELLIS, A. Case: PI work with ASHONPLAFA in Honduras. Pi-L listserv. Posted to the Pi-L e-mail listserv May 29, 2001. (Available: http://community.jhpiego.jhu.edu/archives/pi-l.html, Accessed May 29, 2001)
(27.) DETTMER, H.W. Goldratt’s theory of constraints: A systems approach to continuous improvement. Milwaukee, Wisconsin, American Society for Quality, 1997. 378 p.
(28.) DOHLIE, M., MIELKE, E., BWIRE, T., ADRIANCE, D., and MUMBA, F. COPE: A model for building community partnerships that improve care in East Africa. Journal for Healthcare Quality 22(5). Sep./Oct. 2000.
(29.) DORMANT, D. Implementing human performance technology in organizations. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 237-259.
(30.) ELLIOTT, P.H. Job aids. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 430-441.
(31.) EMSLIE, C., GRIMSHAW, J.M., and TEMPLETON, A. Do clinical guidelines improve general practice management and referral of infertile couples? British Medical Journal 306: 1728-1731. 1993.
(32.) ENGENDERHEALTH. COPE: Self-assessment guides for reproductive health services. New York, EngenderHealth, 1999. (No. SM-15). 71 p.
(33.) ENGENDERHEALTH. What Is COPE? EngenderHealth, Jan. 3, 2002.
(34.) ENGENDERHEALTH. Community COPE. New York, EngenderHealth, 2002. (No. SM-23).
(35.) FAMILY PLANNING LOGISTICS MANAGEMENT/JOHN SNOW, INC. (FPLM/JSI). Programs that deliver: Logistics contributions to better health in developing countries. Arlington, Virginia, FPLM/JSI, 2000. 117 p.
(36.) FELLING, B. (John Snow) [Performance Improvement in the Deliver project] Personal communication, Mar. 28, 2001; Feb. 26, 2002.
(37.) FORT, A. (Intrah/PRIME) [Evaluation of the Dominican Republic pilot project] Personal communication, Aug. 21, 2001.
(38.) FORT, A., CALIX, M., CARIAS, D., CORDERO, M., ESCOTO, H., ESPADA, S.L., JASKIEWICZ, W., KILLIAN, R., LUOMA, M., and VALLEJO, F. Baseline survey on licensing and the performance of primary health care providers in Region 7-Olancho, Honduras: Primary provider performance, its factors and client perception. Chapel Hill, North Carolina, PRIME II, Feb. 2002. (PRIME II Technical Report No. 29A) 58 p.
(39.) FOSHAY, W.R., MOLLER, L., SCHWEN, T.M., KALMAN, H.K., and HANEY, D.S. Research in Human Performance Technology. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 895-915.
(40.) FOX, D., BYRNE, V., and ROUAULT, F. Performance improvement: What to keep in mind. Alexandria, Virginia, American Society for Training & Development (ASTD), 2001. (Available: )
(41.) FULLER, J. From training to performance. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 281-297.
(42.) GARRISON, K. From training to performance: Issues in planning. Proceedings of the Performance Improvement: Orientation, programming & skill building course [CD-ROM], Alexandria, Virginia, Oct. 15-19, 2001. Available from JHPIEGO, Baltimore.
(43.) GARRISON, K. Performance Improvement in preservice. Pi-L listserv. Posted to the Pi-L e-mail listserv Oct. 10, 2001. (Available: , Accessed Oct. 10, 2001)
(44.) GARRISON, K. (JHPIEGO) [JHPIEGO projects using the Performance Improvement approach] Personal communication, Jan. 4, 2002.
(45.) GARRISON, K. Supervision in Kenya. Pi-L listserv. Posted to the Pi-L e-mail listserv May 6, 2002. (Available: , Accessed May 6, 2002)
* (46.) GAYE, P., SIDHOM, Y., ADENIRAN, B., OJEDIRAN, M., LUOMA, M., HEEREY, M., AWOSIYAN, M., ADETUNJI, A., DOSUMU, B., CORDERO, C., JOHNSON, S., and ANYANWU, L. Assessing the performance of family planning services at the primary care level in Nigerian local government area health centers and NGO clinics: Final report. Chapel Hill, North Carolina, PRIME II, Jan. 2001. 70 p.
(47.) GELLERMAN, S.W. Motivation in the real world: The art of getting extra effort from everyone. New York, Penguin, 1992. 324 p.
* (48.) GILBERT, T.F. Human competence: Engineering worthy performance. Washington, D.C., International Society for Performance Improvement, 1996. 376 p.
(49.) GREER, M. Planning and managing human performance technology projects. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 96-121.
(50.) GRIFFIN, J. (US Agency for International Development) [Performance improvement] Personal communication, Dec. 6, 2000; Oct. 16, 2001.
(51.) GRIMES, D.A. The need for systematic reviews in family planning. WHO Reproductive Health Library (3): 2. 2000.
(52.) GRIMSHAW, J.M., and RUSSELL, I.T. Effect of clinical guidelines on medical practice: A systematic review of rigorous evaluation. Lancet 342: 1317-1322. 1993.
(53.) GRIMSHAW, J.M., and RUSSELL, I.T. Achieving health gain through clinical guidelines. II: Ensuring guidelines change medical practice. Quality in Health Care 3: 45-52. 1994.
(54.) GROL, R. Implementing guidelines in general practice care. Quality in Health Care 1(3): 184-191. Sep. 1992.
(55.) HARBER, L. (Intrah/PRIME) [Material production for the PI project in the Dominican Republic] Personal communication, Aug. 23, 2001.
(56.) HEEREY, M. Bauchi State. Presented at the Nigeria Performance Needs Assessment Briefing, USAID, Washington, D.C., Jan. 8, 2001.
(57.) HOWARD-GRABMAN, L. (Save the Children) [Community mobilization in Peru] Personal communication, Aug. 2, 2001.
(58.) HUEZO, C., and DIAZ, S. Quality of care in family planning: Clients’ rights and providers’ needs. Advances in Contraception 9: 129-139. Jun. 1993.
(59.) HULSCHER, M.E.J.L., WENSING, M., GROL, R., VAN DER WEIJDEN, T., and VAN WEEL, C. Interventions to improve the delivery of preventive services in primary care. American Journal of Public Health 89(5): 737-746. May 1999.
(60.) HUNTINGTON, D., LETTENMAIER, C., and OBENG-QUAIDOO, I. User's perspective of counseling training in Ghana: The "mystery client" trial. Studies in Family Planning 21(3): 171-177. May/June 1990.
(61.) JACOBS, R.L. Structured on-the-job training. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 606-625.
(62.) JANOWITZ, B., CHEGE, J., THOMPSON, A., RUTENBERG, N., and HOMAN, R. Community-based distribution in Tanzania: Costs and impacts of alternative strategies to improve worker performance. International Family Planning Perspectives 26(4): 159-160, 193-195. Dec. 2000.
(63.) JASKIEWICZ, W. (PRIME) [Performance improvement in Latin America] Personal communications, July 10, 2001-May 24, 2002.
(64.) JHPIEGO. Defining and strengthening the MAQ/PI relationship at JHPIEGO. http://www.reproline.jhu.edu/english/6read/6pi/pi_maq.htm, JHPIEGO, April 23, 2001.
(65.) JHPIEGO. JHPIEGO's Instructional Design Process. JHPIEGO, Dec. 25, 2001.
(66.) JHPIEGO. Report of analysis of performance improvement under the MNH Koupela project. [draft translation from French]. Baltimore, JHPIEGO, Nov. 2001. 21 p.
(67.) JOHNS HOPKINS UNIVERSITY/CENTER FOR COMMUNICATION PROGRAMS. Distance education works. Communication Impact! (1). Jan. 1998. (Available: )
(68.) JOHNS HOPKINS UNIVERSITY/CENTER FOR COMMUNICATION PROGRAMS. PROQUALI improves health services in Brazil. Communication Impact! (10): 2. Aug. 2000. (Available: , Accessed Sep. 20, 2001)
(69.) JOHNS HOPKINS UNIVERSITY/POPULATION COMMUNICATION SERVICES (JHU/PCS). PROQUALI technical document. [draft]. Baltimore, JHU/PCS, May 9, 2000. 48 p.
(70.) JOINT COMMISSION ON ACCREDITATION OF HEALTHCARE ORGANIZATIONS (JCAHO). Performance improvement in ambulatory care. Oakbrook Terrace, Illinois, JCAHO, 1997. 212 p.
(71.) KEARNY, L., and SMITH, P. Workplace design for creative thinking. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 464-484.
(72.) KELLEY, E., GESLIN, C., DJIBRINA, S., and BOUCAR, M. The impact of QA methods on compliance with the Integrated Management of Childhood Illness Algorithm in Niger. Bethesda, Maryland, Center for Human Services/Quality Assurance Project, 2000. (Operations Research Results No. 1(2)) 16 p.
(73.) KEPNER-TREGOE. People and their jobs: What's real, what's rhetoric? http://www.kepner-tregoe.com/pdf/people_Jobs_KL445a.pdf. Kepner-Tregoe, Nov. 2, 2001.
(74.) KIM, Y.M., PUTJUK, F., KOLS, A.J., and BASUKI, E. Improving provider-client communication: Reinforcing IPC/C training in Indonesia with self-assessment and peer review. Bethesda, Maryland, Quality Assurance Project, 2000. (Operations Research Results No. 1(6)) 18 p.
(75.) KIM, Y.M., TAVROW, P., MALIANGA, L., SIMBA, S., PHIRI, A., and GUMBO, P. The quality of supervisor-provider interactions in Zimbabwe. Bethesda, Maryland, Center for Human Services, Quality Assurance Project, 2000. (Operations Research Results No. 1(5)) 16 p.
(76.) KNEBEL, E., LUNDAHL, S., EDWARD-RAJ, A., and ABDALLAH, H. The use of manual job aids by health care providers: What do we know? Bethesda, Maryland, Quality Assurance Project/Center for Human Services, Feb. 2000. (Operations Research Issues Paper No. 1) 24 p.
(77.) KOTTER, J.P. Leading change. Boston, Harvard Business School Press, 1996. 187 p.
(78.) KUMAR, K. Rapid, low-cost data collection methods for A.I.D. Washington, D.C., US Agency for International Development, Dec. 1987. (Program Design and Evaluation Methodology Report No. 10) 34 p.
(79.) KUMAR, V., RUDY, S., and SHAH, S. Improving the performance of community midwives in Yemen. [draft]. New Delhi, India, Intrah/PRIME II, 2000. 15 p.
(80.) LEIBLER, S.N., and PARKMAN, A.W. Human resources selection. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 351-372.
(81.) LEON, F.R., QUIROZ, G., and BRAZZODURO, A. The reliability of simulated clients’ quality-of-care ratings. Studies in Family Planning 25(3): 184-190. May-June 1994.
(82.) LION COLEMAN, A. (Intrah/PRIME) [Performance improvement in the Dominican Social Security Institute] Personal communication, May 17, 2001.
(83.) LOMAS, J., ENKIN, M., ANDERSON, G.M., HANNAH, W.J., VAYDA, E., and SINGER, J. Opinion leaders vs audit and feedback to implement practice guidelines. Journal of the American Medical Association 265(17): 2202-2207. May 1, 1991.
(84.) LOZARE, B. (Johns Hopkins Center for Communication Programs) [Leadership and change] Personal communication, Apr. 10, 2001.
(85.) LUOMA, M. Performance improvement and quality improvement. Pi-L listserv. Posted to the Pi-L e-mail listserv Jan. 6, 2000. (Available: )
(86.) LUOMA, M. Armenia debrief. Technical leadership area: Systems for improved performance. [PowerPoint presentation]. Chapel Hill, North Carolina, PRIME II, Aug. 24, 2001. 14 p.
(87.) LUOMA, M. Ghana Why Tree. [draft]. Chapel Hill, PRIME, June 19, 2001.
(88.) LUOMA, M. (Training Resources Group (TRG)/PRIME) [Performance improvement] Personal communications, June 28, 2001-Apr. 22, 2002.
(89.) LUOMA, M. Selecting interventions. [draft]. Chapel Hill, July 10, 2001. 1 p.
* (90.) LUOMA, M., GAUTHAM, M., and KUMAR, V. Performance assessment of Indigenous Systems of Medicine and rural practitioners in Uttar Pradesh, India. Chapel Hill, North Carolina, PRIME, Feb./Mar. 1999. 29 p.
* (91.) LUOMA, M., JASKIEWICZ, W., MCCAFFERY, J., and CATOTTI, D. Dominican Republic Performance Improvement project evaluation. Chapel Hill, NC, PRIME, Jan. 2000. (Technical report No. 19) 87 p.
(92.) LYNAM, P., RABINOVITZ, L.M., and SHOBOWALE, M. The use of self-assessment in improving the quality of family planning clinic operations: The experience with COPE in Africa. New York, EngenderHealth, Dec. 1992. (AVSC Working Paper No. 2) 15 p.
(93.) MADDEN, J.M., QUICK, J.D., ROSS-DEGNAN, D., and KAFLE, K.K. Undercover careseekers: Simulated clients in the study of health provider behavior in developing countries. Social Science and Medicine 45(10): 1465-1482. Nov. 1997.
(94.) MAGER, R.F., and PIPE, P. Analyzing performance problems: Or you really oughta wanna. 3rd ed. Atlanta, Center for Effective Performance, 1997. 183 p.
(95.) MANAGEMENT SCIENCES FOR HEALTH (MSH). Family Planning Management Development. MSH, May 13, 2002.
(96.) MARQUEZ, L. Helping healthcare providers perform according to standards. Bethesda, Maryland, Center for Human Services/Quality Assurance Project, 2001. (Operations Research Issues Paper No. 2(3)) 36 p.
(97.) MASSOUD, R. (Quality Assurance Project/Center for Human Services) [Quality assurance and performance improvement] Personal communication, July 25, 2001.
* (98.) MASSOUD, R., ASKOV, K., REINKE, J., MILLER FRANCO, L., BORNSTEIN, T., KNEBEL, E., and MACAULAY, C. A modern paradigm for improving healthcare quality. Bethesda, Maryland, Center for Human Services/Quality Assurance Project, 2001. (QA Monograph No. 1) 78 p.
(99.) MATERNAL AND NEONATAL HEALTH PROGRAM. Country Profile: Guatemala. [newsletter]. Baltimore, JHPIEGO, no date. 2 p. (Available: , Accessed Sep. 4, 2001)
(100.) MAU, K. (Kendall Philip Consulting) [Performance improvement in developing countries] Personal communication, June 26, 2001.
(101.) MCCAFFERY, J. (Training Resources Group (TRG)/PRIME II) [Performance Improvement: Overview and the pilot project in the Dominican Republic] Personal communications, Mar. 28, Aug. 22, 2001; Mar. 7, 2002.
* (102.) MCCAFFERY, J., LUOMA, M., NEWMAN, C., RUDY, S., FORT, A., and ROSENSWEIG, F. Performance Improvement: Stages, steps and tools. Chapel Hill, North Carolina, PRIME, 2000. 95 p.
(103.) MCINTOSH, N., KINZIE, B., and BLOUSE, A. IUD guidelines for family planning services: A problem-solving reference manual. Baltimore, JHPIEGO Corporation, 1993. 191 p.
(104.) MCNAMARA, C. Basic context for organizational change. Management Assistance Program for Nonprofits, Jan. 24, 2002.
(105.) MCNAMARA, C. Strong value of self-directed learning in the workplace: How supervisors and learners gain leaps in learning. Management Assistance Program for Nonprofits, Nov. 20, 2001.
(106.) MENSCH, B. Using data on client-provider interactions to assess the quality of family planning services. [unpublished]. Washington, D.C., Sep. 9-10, 1993. (Presented at the Expert meeting on information systems and measurement for assessing program effects, sponsored by the National Academy of Sciences Committee, on Population) 19 p.
(107.) MILLER FRANCO, L., KANFER, R., MILBURN, L., QARRAIN, R., and STUBBLEBINE, P. Determinants of health worker motivation in Jordan: A 360 degree assessment in two hospitals. Bethesda, Maryland, Partnerships for Health Reform/Abt Associates, July 2000. (Major Applied Research 5, Working Paper 7) 67 p.
(108.) MORGAN, C., and MURGATROYD, S. Leading thinkers for Total Quality Management. In: Total Quality Management in the public sector: An international perspective. Buckingham, England, Open University Press, 1994. p. 35-41.
(109.) MURRAY, M. Performance improvement with mentoring. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 545-563.
(110.) MUTIZWA-MANGIZA, D. The impact of health sector reform on public sector health worker motivation in Zimbabwe. Bethesda, Maryland, Abt Associates, Nov. 1998. (Major Applied Research 5, Working Paper 4) 28 p.
(111.) NAVRONGO HEALTH RESEARCH CENTRE and POPULATION COUNCIL. The Navrongo community health and family planning project: Lessons learned 1994-1998. [CD-ROM, “Electronic Library, 1990-1999,” from Population Council, Frontiers in Reproductive Health]. Navrongo Health Research Centre, Population Council, Apr. 1999. (Operations Research Technical Assistance, Africa Project II) 204 p.
(112.) NECOCHEA, E. (JHPIEGO) [Performance Improvement in Latin America] Personal communication, Aug. 3, 2001.
(113.) NECOCHEA, E., AINSLIE, R., BOSSEMEYER, D., CORDON, O., GARCIA COLINDRES, J., METCALFE, G., PEINADO, L., and POPPE, P. CaliRed: A performance and quality improvement model for maternal and neonatal health services in Guatemala. [abstract]. Baltimore, JHPIEGO, 2002. 1 p.
(114.) NEWMAN, C. PRIME’s Performance Improvement initiative underway in Burkina Faso. PRIME Perspectives, No. 3, Aug. 1998. p. 1-5.
(115.) NEWMAN, C. Following up performance: Lessons from the field. Performance Improvement 41 (1): 11-18. Jan. 2002.
(116.) O’BRIEN, P. The Cochrane Collaboration: Preparing, maintaining and disseminating systematic reviews in fertility regulation. British Journal of Family Planning 23: 37-38. July 1997.
(117.) O’BRIEN, T., OXMAN, A.D., DAVIS, D.A., HAYNES, R.B., FREEMANTLE, N., and HARVEY, E.L. Educational outreach visits: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev (2): CD000409. 2000.
(118.) OXMAN, A.D., THOMSON, M.A., DAVIS, D.A., and HAYNES, R.B. No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Canadian Medical Association Journal (153): 1423-1431. 1995.
(119.) PADILLA, M. Performance Improvement: Lessons from the experience with a client feedback system in the Dominican Republic. [PowerPoint presentation]. Chapel Hill, North Carolina, PRIME, May 1, 2001. 22 p.
(120.) PADILLA, M., and FORT, A. Addendum to Technical Report 19: Results from the second follow-up study of the Dominican Republic Performance Improvement project evaluation. Chapel Hill, North Carolina, Intrah, Nov. 2001. (PRIME II Technical Report No. 28) 23 p.
(121.) PELOQUIN, J. (Population Leadership Program) [Applying Performance Improvement in developing countries] Personal communication, Mar. 29, 2002.
* (122.) PERFORMANCE IMPROVEMENT CONSULTATIVE GROUP (PICG). Frequently asked questions about Performance Improvement. PICG, May 2, 2002.
(123.) PERFORMANCE IMPROVEMENT CONSULTATIVE GROUP. Descriptions of desired performance: How detailed? Proceedings of the Orientation, programming, and skill building course [CD-ROM], Alexandria, Virginia, Oct. 15-19, 2001. Available from JHPIEGO, Baltimore.
* (124.) PERFORMANCE IMPROVEMENT CONSULTATIVE GROUP. Improving performance to maximize access and quality for clients [PowerPoint presentation]. July 6, 2001. (Available: )
(125.) PETERS, P. Seven tips for delivering performance feedback. . Zigon Performance Group, May 1, 2001.
(126.) PIOTROW, P.T., KINCAID, D.L., RIMON, J.G., and RINEHART, W. Health communication: Lessons from family planning and reproductive health. Westport, Connecticut, Praeger, 1997. 307 p.
(127.) PRIME. Performance factors. [draft]. Chapel Hill, North Carolina, PRIME, July 9, 2001. 4 p.
(128.) PRIME. A performance improvement approach: Lessons learned [PowerPoint presentation]. Chapel Hill, North Carolina, PRIME II, 2001.
(129.) PRIME. The role of the PI facilitator. [draft]. Chapel Hill, North Carolina, PRIME, July 12, 2001. 4 p.
(130.) PRIME. RRT (regional resource team) performance improvement tables (Ghana). [draft]. Chapel Hill, North Carolina, PRIME, June 19, 2001. 21 p.
(131.) PRIME. Yemen performance improvement: Connecting midwives and communities. [draft]. Chapel Hill, North Carolina, PRIME, June 16, 2001. 2 p.
(132.) PRIME II and JHPIEGO. Transfer of learning: A guide for strengthening the performance of health care workers. Chapel Hill, North Carolina; Baltimore, PRIME II and JHPIEGO, Mar. 2002. 36 p. (Available: , Accessed June 4, 2002)
(133.) PRIME II. India: Multiple strategies in Uttar Pradesh. PRIME, April 5, 2002.
(134.) QUESADA, N. and NOGUERA, M. FPLM technical assistance record [Dominican Republic]. John Snow, Mar. 2000. 2 p.
* (135.) REPRODUCTIVE AND CHILD HEALTH SECTION, MINISTRY OF HEALTH, UNITED REPUBLIC OF TANZANIA. Report on performance needs assessment of Zonal Training Centre capacity to implement the reproductive and child health program in Tanzania. [draft]. Dar es Salaam, Ministry of Health, United Republic of Tanzania, Jan. 2001. 88 p.
(136.) REPRODUCTIVE AND CHILD HEALTH SECTION, MINISTRY OF HEALTH, UNITED REPUBLIC OF TANZANIA. Tanzania Performance Improvement initiative: Assessment of community perceptions of quality health services, Preliminary Report. Baltimore, Johns Hopkins University, Center for Communication Programs, Feb. 21, 2001. 19 p.
(137.) RIFKIN, H. Performance feedback: Getting past avoidance in the quest for excellence. Insights: Quarterly Newsletter of ACEC/MA (American Council of Engineering Companies of Massachusetts), May/Jun. 1999. (Available: Accessed June 5, 2001)
(138.) ROGERS, E.M. Diffusion of innovations. 4th ed. New York, Free Press, 1995. 519 p.
(139.) ROSENBERG, M.J. The origins and evolution of the field. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 24-46.
(140.) ROSSETT, A. Analysis for human performance technology. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 139-162.
(141.) RUDY, S. (Intrah/PRIME) [Advantages and applications of the PI process] Personal communication, Mar. 23, 2001; Feb. 25, 2002.
* (142.) RUMMLER, G.A., and BRACHE, A.P. Improving performance: How to manage the white space on the organization chart. 2nd ed. San Francisco, Jossey-Bass Publishers, 1995. 226 p.
(143.) SAFFITZ, G. (Johns Hopkins Center for Communication Programs) [Performance improvement] Personal communication, Mar. 19, 2001.
(144.) SANCHEZ, C.M. Performance improvement in international environments: Designing individual performance interventions to fit national cultures. Performance Improvement Quarterly, Vol. 13 No. 2, 2000. p. 56-70. (Available: Accessed Sep. 9, 2001)
(145.) SANTILLAN, D., and FIGUEROA, M.E. Implementing a client feedback system to improve the quality of NGO healthcare services in Peru. Bethesda, Maryland, Center for Human Services/Quality Assurance Project, 2001. (Operations Research Results No. 2) 15 p.
(146.) SENANAYAKE, P. (IPPF) [IPPF wall chart] Personal communication, Mar. 5, 2002.
(147.) SENGE, P., KLEINER, A., ROBERTS, C., ROSS, R., ROTH, G., and SMITH, B. The dance of change: The challenges to sustaining momentum in learning organizations. New York, Doubleday, 1999. 596 p.
(148.) SHELTON, J. The provider perspective: Human after all. International Family Planning Perspectives 27(3): 152-153, 161. Sep. 2001.
(149.) STANBACK, J., GRIFFEY BRECHIN, S.J., LYNAM, P., TOROITICH-RUTO, C., SMITH, T., KUYOH, M., WELSH, M., CUMMINGS, S., CUTHBERTSON, C., BRATT, J., HOMAN, R., TOROITICH, N., GACHUHI, N., CHIKAMATA, D., SUELLEN, M., HASSAN, D., MAKUMI, D., GITONGA, J., and QURESHI, Z. The effectiveness of national dissemination of updated reproductive health/family planning guidelines in Kenya. Durham, North Carolina, Family Health International, Aug. 2001. 19 p.
(150.) STOLOVITCH, H., and KEEPS, E. Implementation phase: Performance Improvement interventions. In: Robinson, D.G., and Robinson, J.C. Moving from training to performance: A practical guidebook. Alexandria, Virginia; San Francisco, American Society for Training and Development, Berrett-Koehler Publishers, Inc., 1998. p. 95-133.
* (151.) STOLOVITCH, H., and KEEPS, E. What Is Human Performance Technology? In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 3-23.
(152.) SULLIVAN, R.L. The competency-based approach to training. Baltimore, Maryland, JHPIEGO, Sep. 1995. (Strategy paper No. 1) 9 p.
(153.) SULLIVAN, R.L. Performance feedback to improve worker performance. Pi-L listserv. Posted to the Pi-L e-mail listserv May 30, 2001. (Available: Accessed May 30, 2001)
(154.) TAMBERG, A. (JHPIEGO) [Performance Improvement in the Central Asian Republics] Personal communication, May 18, 2001.
(155.) THIAGARAJAN, S., ESTES, F., and KEMMERER, F.N. Designing compensation systems to motivate performance improvement. In: Stolovitch, H., and Keeps, E. Handbook of Human Performance Technology. 2nd ed. San Francisco, Jossey-Bass/Pfeiffer, 1999. p. 411-429.
(156.) TIAHRT, T. Population Assistance Legislation: Omnibus Appropriations for FY 1999. H.R. 4328, U.S. House of Representatives, 105th Cong. 2nd Sess. Oct. 21, 1998.
(157.) URENA, D., and HERASME, L. Grupos focales y entrevistas a profundidad, IDSS: Incentivos No-Financieros (no incluye aumento de salario) [Focus groups and interviews in depth, IDSS: Nonfinancial incentives (not including salary increase)]. [manuscript]. Santo Domingo, Intrah/PRIME, Apr. 1999. 4 p.
(158.) US AGENCY FOR INTERNATIONAL DEVELOPMENT (USAID). Voluntary participation and informed choice in family planning. USAID, Sep. 11, 2001.
* (159.) VAN TIEM, D.M., MOSELEY, J.L., and DESSINGER, J.C. Fundamentals of performance technology: A guide to improving people, process, and performance. Washington, D.C., International Society for Performance Improvement, 228 p.
(160.) WILLEY, C., LAFORGE, R., BLAIS, L., PALLONEN, U., PROCHASKA, J., and BOTELHO, R. Public health and the science of behavior change. Current Issues in Public Health 2(1): 18-25. Jan. 1996.
(161.) WINTER, L., BOUCAR, M., STINSON, W., MASON, D., and MURPHY, G. Niger country report: Tahoua project. Bethesda, Maryland, Center for Human Services, Quality Assurance Project, 1997. 28 p.
(162.) WORLD HEALTH ORGANIZATION (WHO). Implementing best practices in reproductive health. [brochure]. Geneva, WHO, 2002. 8 p.
(163.) YOUNG, C.F. (Quality Assurance Project) [Saving mothers and newborns: Improving the quality of obstetric and neonatal care in rural Nicaragua] Personal communication, Feb. 22, 2001.
(164.) YUMKELLA, F. FPAK/PRIME Performance Improvement stakeholders’ meeting. [report]. Nairobi, Intrah/PRIME II, May 11, 2001. 9 p.
(165.) YUMKELLA, F. (Intrah/PRIME II) [Performance improvement in Kenya and Tanzania] Personal communication, Jun. 22, 2001, Jul. 10, 2001.
(166.) GORMLEY, W., and MCCAFFERY, J. Case studies and role plays: “Getting them right”. [PowerPoint presentation]. Presented at the Training: Best practices, lessons learned and future directions conference, Washington, D.C., May 22-23, 2002. 16 p.
(167.) GUSSIN, G. A comparison of four e-learning modalities. [PowerPoint presentation]. Presented at the Training: Best practices, lessons learned and future directions conference, Washington, D.C., May 22-23, 2002. 15 p.
(168.) LYNAM, P., and PLEAH, T. Using on-the-job training for expansion of postabortion care services [PowerPoint presentation]. Presented at the Training: Best practices, lessons learned and future directions conference, Washington, D.C., May 22-23, 2002. 16 p.
(169.) RAWLINS, B., GARRISON, K., LYNAM, P., SCHNELL, E., CAIOLA, N., and GRIFFEY BRECHIN, S.J. Focusing on what works: A study of high-performing healthcare facilities in Kenya. Baltimore, JHPIEGO, May 2001. 17 p.
(170.) RUDY, S. Designing counseling training that works. Presented at the Training: Best practices, lessons learned and future directions conference, Washington, D.C., May 22-23, 2002. 14 p.
(171.) SCHAEFER, L., WYSS, S., and OZEK, B. Strengthening preservice education: A systematic approach and lessons learned [PowerPoint presentation]. Presented at the Training: Best practices, lessons learned and future directions conference, Washington, D.C., May 22-23, 2002. 15 p.