This post describes some of the specifics of the evaluation process leading to our December 2014 recommendation update, and what we learned from the process. For more general thoughts about our recommendation process, see this page.
We began preparing for this recommendation update almost immediately after we posted our previous recommendations in mid-May. Our early preparations addressed broad issues rather than specific groups; we finalized our intervention evaluation procedure and conducted an evaluation of corporate outreach, because we knew that many groups we would be considering use that intervention and that it’s quite different from other interventions we’d studied. We also worked on mapping the international animal activism landscape, with interns searching for activist groups, especially those focusing on farm animal issues, and researching differences in laws and cultural norms that apply to animals.
New Organizations to Evaluate
In July, we developed our list of new organizations to evaluate, drawing from the list of organizations we’d considered but not evaluated in the past, recommendations made to us, and, most substantially, from the list of organizations working outside the United States that we had compiled. We ultimately produced a list of 29 new organizations to evaluate. We considered several aspects of groups at this stage in order to ensure that groups we evaluated had a good chance of moving forward in our process:
We decided to evaluate only groups with reasonably detailed English versions of their websites, so that we could complete shallow evaluations using the websites and so that we would likely be able to communicate effectively with them when we contacted them for permission to publish our evaluations or to interview them for medium-depth evaluations.
We chose to evaluate mainly groups focusing on farmed animals, because we think a focus on farmed animals offers unique opportunities for large-scale impact and cost-effectiveness. We also evaluated a few groups focused on general anti-speciesism, or working in several areas of which farmed animals was one. Among these groups, we were more likely to evaluate one if it seemed to have a larger overall impact on farmed animals or if it appeared to have values similar to our own, such as caring about the reduction of suffering regardless of its cause.
It is often not obvious from a brief visit to a group’s website how much energy and how many resources it devotes to each of its programs, so we found this aspect of deciding which groups to evaluate particularly difficult. A few groups we chose to evaluate this time were ones we had passed over in the round of evaluations ending in May because of our impressions of their program focus. Similarly, in the future we might evaluate some of the groups that we decided this time did not focus closely enough on farmed animals to be worth evaluating in this round.
We chose to evaluate groups active on a national or international scale. We did not have the resources to also evaluate groups active only in one or two cities. We did evaluate some groups with local-sounding names which are actually active in a larger area, but we may have missed other groups which are active in larger areas than is evident from their name or the front page of their website.
Shallow Reviews
We attempted to conduct shallow evaluations of all 29 of these organizations, as well as of one organization we had previously reviewed, whose review we wanted to update because of our increased understanding of animal advocacy work outside the US. These evaluations had three possible outcomes: we could not complete the shallow evaluation, we wrote a shallow review and requested permission to publish it, or we contacted the organization to begin the medium review process.
In our evaluation cycle ending in May, we found all the necessary information to complete a shallow evaluation for each organization we looked into. In the current evaluation cycle, we ran into problems finding one type of information we considered vital: basic financial information. Even though we had ensured all the organizations we were reviewing had substantial websites available, it is common for non-profits not to have financial information published on their website. In some countries, such as the United States and the United Kingdom, certain financial information must be made publicly available, and there is a standard place to look for it. In other countries, some animal advocacy groups cannot register as non-profits, or we could not find where, if anywhere, such information is published online.
If we couldn’t find basic financial information about a charity, such as revenues and expenditures for recent years, we wrote to the organization and requested that information in order to proceed with our evaluation. Of the 11 organizations we contacted in this way, 4 replied with enough information for us to proceed; the rest either did not reply at all or asked not to be evaluated. (On our list of organizations, we identify any group that we tried to review but which did not actively consent to our publishing a review as having declined to be reviewed, including groups which never responded to us.)
In our shallow reviews, we use publicly available information about an organization to understand what they do, what change that has achieved for animals, and how that relates to the overall size of the organization. We originally intended to write shallow reviews for almost all of the 29 groups, with exceptions only for those groups we decided to conduct medium-depth evaluations of after completing the shallow evaluation process. Because of difficulties obtaining financial information for some groups, we ended up writing 17 shallow reviews, including the one we were updating.
After writing the reviews, we initiated contact with each group to ask that they read the review and grant us permission to publish it. We seek permission before publishing any review both to ensure that our reviews present accurate portrayals of the organizations they describe and to maintain good relationships within the animal advocacy community. While some organizations allow us to publish our reviews in their original form, others make corrections, direct us to additional relevant information, or ask that we not publish any review at all. Even reviews which we altered after communicating with the organization involved represent our own understanding and opinions, which are not necessarily those of the group reviewed.
Eleven organizations, including the one whose review we were updating, ultimately gave us permission to publish our shallow reviews. Of the six groups that did not, not all had actually seen our review: for confidentiality reasons, we do not include the review itself in our first contact attempt unless we have contact information for a specific person with significant responsibility within the organization, and some groups did not respond to any contact from us.
A noticeably higher proportion of groups allowed us to publish shallow reviews in December than in May: 65% of the groups for which we wrote reviews gave permission this time, compared to 27% in May. This amounted to 46% of the groups for which we would have liked to write reviews in December; in May, we were able to write every review we wanted to. We don’t know whether this difference reflects our improved materials and reputation or merely a difference between our two samples, neither of which was intended to be random.
Medium Reviews
We conducted medium-depth evaluations, and medium-depth evaluation updates, of groups which we thought had a good chance of receiving highly positive reviews from us. While we hope in the future to conduct even more thorough reviews of some groups, these were the deepest investigations we were able to conduct this year.
We offered to update our reviews for each of the 6 groups we had previously written a medium review for (although as we accumulate more completed medium reviews, we won’t be able to continue offering this to every organization), as well as for one group which had decided early in our last review process that they would prefer we publish a shallow review because they did not feel ready for a higher level of scrutiny. Five of the groups we’d written medium reviews for in May took us up on this offer, providing us with varying amounts of new information about their activities and approving updates to our reviews based on this information and on other things we’d learned in the intervening time.
We decided that we had the capacity to conduct 6 more medium evaluations. We contacted the organizations we felt were most likely to receive a top recommendation or standout status based on our shallow evaluations, and all agreed to participate. We conducted at least one telephone conversation with each of these groups and requested additional materials from them, documenting their recent achievements and their finances in detail. We then wrote a detailed evaluation for each group, addressing each of our criteria for charity evaluation. Finally, we shared our evaluation with the group we’d evaluated so they could offer corrections, address confidentiality concerns, and decide whether to allow us to publish the review and supporting materials.
We were able to cast a somewhat wider net in choosing groups to review at this level than we had in May, and reviewed groups engaging in a wider variety of activities across more diverse areas. In some cases we deliberately chose to evaluate groups whose projects were very different from those of our previously recommended organizations: while we felt good about those organizations and their activities, we were aware that by looking into very different possibilities, we might find groups with projects dramatically more effective than any we had yet seen. We did not ultimately conclude that any group was much more effective than our previous top recommendations, but we do think we learned a lot about animal advocacy from these reviews, more than we would have if we’d evaluated only the groups most similar to our previous recommendations.
Unfortunately, only 4 of the 6 new groups agreed to allow us to publish our reviews, so we can’t publish everything we learned. This surprised us, because in the previous round every group that agreed to participate in our medium-depth evaluation process also agreed to our publishing the results. While we were disappointed about the reviews we couldn’t publish, we don’t think this affects the quality of our recommendations or reflects on the quality of the groups we have recommended, because the groups which decided against publication were not those we had decided to recommend. Before the next round of evaluations, we’ll be reviewing the way we communicate with organizations, in the hope that we can identify earlier in the process those groups that won’t want any review published. We’ll also be considering how best to weigh the value of the information to be gained by evaluating a particular organization against our impression of the likelihood that the evaluation will result in work we can publish.
Recommendations
As we’d planned, our top charities this round are, in our opinion, the 2 or 3 strongest charities we’ve evaluated so far. We drew the line where it felt most natural, which was at 3 top charities. Without any intentional limit on the number of standout charities, we found 4 groups that we wanted to recognize for their excellent programs but about which we had some reservations, or which we simply did not feel were quite as strong as our top recommendations. When we asked medium-reviewed groups to look over the reviews we prepared, we also informed them whether or not they had been selected as top or standout charities. We also requested that our top charities implement a way of keeping track of donations directed to them by ACE, so that we can track our impact on their fundraising, one of our major indicators of success for ourselves.
We are continually learning from our review process. We share as much of what we learn as possible, and even where we are unable to publish everything, we use our knowledge to better inform our next round of reviews. We plan to expand the recently created “Our Thinking” page with regular updates explaining more elements of our process, increasing our transparency for those who follow our work. We appreciate your feedback; contact us with any thoughts on what we could be doing better.