This post provides a brief discussion of the specific process that led to our 2019 charity recommendations. For a more detailed description of the process, please see our 2019 Evaluation Process page. For a general description of the process, please read our Evaluating Charities page.
This year, we made some updates to our charity evaluation process, including publishing overall ratings of each charity on each criterion, increasing the number of visual aids in each review, making changes to our cost-effectiveness models, making our organizational culture survey mandatory for charities receiving a recommendation, and hiring a fact-checker.
Our formal evaluation process took place from June to December. The general timeline of our evaluation process was as follows:
- Late June: selected charities to review
- Early July: sent out invitations to participate, along with the charity evaluation handbook, and began gathering information and conducting research for comprehensive reviews
- October: completed full drafts of the comprehensive reviews, solicited feedback from ACE’s board and Executive Director, and worked to incorporate it into the initial drafts
- Late October: finalized our recommendation decisions
- Early November: communicated recommendations to charities under review and sent them completed drafts of the reviews between November 1 and November 6, 2019
- November: addressed charities’ feedback on our drafts and solicited charities’ approval to publish
- Early December: published our recommendations on December 2, 2019
The process led to the publication of 11 new or updated comprehensive reviews and an update to our recommendations.
Our Selection Process
We began our 2019 evaluation process by compiling an internal list of 173 charities to consider evaluating. We then reviewed the list and attempted to identify the charities that seemed most valuable for us to evaluate. We initially selected a total of 16 charities for evaluation based on factors such as (i) how likely we thought each of the charities was to be recommended and (ii) how useful we thought the knowledge we would acquire and potentially publish from the comprehensive evaluation would be. We then sent out our charity evaluation handbook to each of these 16 charities and formally invited them to participate in the review process. Four of the 16 charities declined to be reviewed.1 We selected one more charity and invited them to participate, but they also declined in favor of being evaluated next year instead. We ended up with a total of 12 charities to evaluate. This included our four Top Charities from 2018, the four Standout Charities that were last evaluated in 2017, and four other charities that we hadn’t evaluated for at least the past three years.
At the start of the evaluation process, we asked each charity to provide information and documentation of their finances, accomplishments, and strategy, and we scheduled information-gathering conversations with their leadership. To assess workplace culture, we provided a survey for distribution among staff. After conducting interviews and gathering the requested documentation, we began drafting the reviews and sent follow-up questions.
We solicited feedback on the initial drafts of our reviews from ACE’s Executive Director and three board members. After editing our comprehensive reviews and making our recommendation decisions (described below), we sent the reviews to the corresponding charities for approval, along with our conversation summaries and other supporting documents we intended to publish. Charities had the opportunity to request edits, including requests to remove confidential information or to correct factual errors. Nonetheless, all reviews represent our own understanding and opinions, which are not necessarily those of the charities reviewed. This year, 11 of 12 charities for which we drafted comprehensive reviews agreed to have their reviews published.
After the majority of each comprehensive review was drafted, but before the reviews were entirely finished or sent to charities for approval, four members of the research team and ACE’s Executive Director met to discuss the selection of Top and Standout Charities. In preparation for this meeting, all participants indicated their individually prepared suggestions for Top and Standout Charities in a chart.2 There was substantial initial agreement on the status of some charities, but not all. We discussed the strengths and weaknesses of each charity and updated our individual opinions when necessary. At the conclusion of this meeting, we had reached unanimous agreement on the recommendation status of most charities.
The same four research team members and the Executive Director participated in several email threads to finalize our selection of 2019 recommended charities.3 A unanimous agreement was reached for most charities under review, and for the remainder, the Director of Research and Executive Director made final decisions.
Like last year, we selected four Top Charities in 2019. We think that, overall, each of our Top Charities performs outstandingly well on our criteria. They have clear plans for effectively using additional funding, strong organizational structures, and a demonstrated interest in and capacity for evaluating their own programs. There was unanimous consensus within the evaluation team that each of these four groups should be selected as Top Charities.
Our Standout Charities are those that we did not select for a top recommendation but nonetheless wanted to call to our readers' attention because we think they are quite promising; donations to these charities seem likely to have a relatively high expected value. Some of these charities performed very well on our criteria, but not quite as well as our Top Charities. It took time and discussion to reach consensus on standout recommendation decisions when charities performed very well on some criteria and less well on others. In these cases, we discussed the arguments for and against making a standout recommendation for the charities in question. Once these points were established, if we still could not reach a unanimous agreement, final decisions were made by the Director of Research and the Executive Director.
A number of important questions arose in our discussions about which charities to recommend. Discussion topics included the relative strengths and weaknesses of the charities we evaluated as well as Animal Charity Evaluators’ role in the animal advocacy movement.
Some of the questions we discussed are the same ones we tend to grapple with annually. For example:
- Is there an ideal number of Top and Standout Charities?
- How much money do we expect our recommendations to influence to our Top Charities next year, and how does that affect the number of Top Charities we should select?
- How should we account for types of impact that are especially difficult to estimate? For instance, we have relatively little—or in some cases, no—evidence estimating long-term impact and the impact of some relatively novel interventions.
Some other questions were more significant this year than in previous years. For example:
- Compared to the U.S., how valuable is farmed animal advocacy in other regions, such as Latin America, Eastern Europe, or Asia?
- How should we weigh more direct indicators of marginal impact (such as intervention type or cost-effectiveness estimates) against more indirect indicators (such as efforts to build alliances and grow the capacity of the movement) when the latter likely have significant, but more difficult to quantify, impact?
- How can we know if an organization is making progress when their goals are very long term?
- How should we weigh information from our culture surveys when evaluating an organization’s culture?
- How do we improve the readability of our reviews without compromising the depth of analysis?
We think that answering any of these questions is very difficult and that this activity is laden with uncertainty. Still, in line with our mission, we continually work to improve our understanding of effective animal advocacy.
For further information about our thinking, please watch our blog for upcoming posts on several topics, including aspects of our 2019 evaluation process that surprised us and an explanation of why we only write and publish reviews with charities’ full participation.
Of the charities that declined the opportunity to participate, most responded that they were too busy at the time or that they preferred to wait until 2020 because of a recent change in leadership.