Things That Surprised Us About the 2017 Evaluation Process
Each time we conduct our evaluation process, we gain new insights that allow us to improve our research. We feel it’s important to share these insights with our audience in order to solicit feedback, explain potential changes in future evaluations or our general research strategy, and increase the transparency of our decision process. In this post, we present some of the things that we found surprising after completing our 2017 evaluation process. Although some things surprised more than one of us, there was enough variation that most staff members active in the review process have included what surprised them individually.
Allison Smith, Director of Research
This was my fifth time working on a round of charity evaluations for ACE (two rounds in 2014 and one in every year since), so some aspects of the process have become routine to me. However, each evaluation is different, and each year we make some changes. Some of these changes lead to surprises.
The varied nature of the information gathered from confidential calls
One of the major changes we made to our comprehensive reviews this year was that we conducted confidential interviews with staff or volunteers1 at each charity to allow them to share their experiences working with the charity and their thoughts about the work environment. In some ways this was similar to the conversations we had with staff and volunteers at The Humane League in 2015, when we tested a “deep review” charity evaluation process. At that time we did gain some alternative viewpoints on the organization, and we were able to learn a bit more about how leaders’ perceptions of the organization matched (or didn’t match) other staff members’ perceptions. This year’s conversations, however, were more targeted to understanding the culture of the organization. For each organization undergoing comprehensive review, we selected staff members (or volunteers) at random2 and made it clear to these individuals from the start that we would keep our conversation notes confidential, rather than publishing them.
Perhaps as a result of these differences, there was much more variation in the information gathered from these calls than I had expected. In 2015, what we heard from staff, volunteers, and Board Members at THL closely matched what we heard from the Executive Director. I thought at the time that this reflected well on THL—but I also thought that this outcome might be expected at any organization. In particular, I assumed that staff members at every level of an organization would probably all have the same degree of investment in attaining a maximally positive review from ACE. This year, there were indeed some charities about which we heard very consistent reports in each conversation we had with them, confidential or not. However, there were others where some staff members or volunteers had distinctly differing viewpoints. These differences spanned many categories—from the professional development opportunities available to staff, to the reliability of the organization’s response to challenging situations, to their handling of issues related to diversity and inclusion. I was often surprised by and grateful for staff members’ candor in answering our questions, and I sometimes found even small points to be quite illuminating.
The number of organizations preferring to be reviewed next year
Another thing that changes each year is which charities we approach to evaluate for the first time. This year, more of these charities than usual told us that they would prefer to wait and be evaluated next year instead. This surprised me—generally I have expected that each year a larger fraction of the charities we approached for review would be interested in participating (given that each year we are known by a greater portion of the movement and that each year we have accrued more examples of our work). In the past this has been mostly true.
In hindsight, I think that perhaps last year we had already reached a point where most of the established charities that might see an evaluation as a way to gain exposure had already taken advantage of this. At the same time, ACE is gaining a better sense of which charities we’d be likely to recommend, and we’ve become much more consistent about picking these charities for review than we were a few years ago. That means that this year we may have had fewer new charities to choose from that were already strong candidates for an evaluation; we definitely did reach out to some that looked promising but are at an earlier stage of development than our recommended charities and therefore might be much better candidates for a recommendation next year or the year after.
Toni Adleberg, Researcher
The value of conducting confidential calls
As Allison mentioned, one of the major changes we made to our evaluation process this year was that we conducted confidential calls with staff or volunteers at each charity about their work environments. I agree with Allison that the reports from those staff and volunteers were surprisingly varied (both within and between charities). Due, in part, to the variation, I found those calls to be a valuable new component of our evaluation process. Any information about significant differences between charities can only help to better inform our recommendation decisions.
In addition to providing information about specific charities, our confidential calls provided general information about some of the struggles and rewards that people experience working in the animal advocacy movement. We learned about some ways in which charity leadership can either protect or fail to protect their employees from the struggles they face in the workplace. We also learned about some ways in which charity leadership can either heighten or diminish the rewards of animal advocacy work from the perspective of their employees. This information can help us provide advice to charities (and on our website) about how to promote a healthy work culture, which I hope we will do more of in the coming year.
The challenges of conducting confidential calls
While I found the confidential calls to be highly valuable, I also found them to pose a number of unexpected challenges. In order to ensure consistency between calls and to conceal interviewees’ identities from as many ACE staff as possible, we appointed a single ACE staff member to conduct all of these confidential calls. I was not that staff member, and I think this process worked well for most of us. However, I worry that processing such a large amount of confidential information—some of which may have been quite troubling in nature—is a lot of responsibility for a single person to bear.
Conducting confidential calls presented other challenges as well. It was not always clear how heavily to factor the information from the calls into our reasoning about charities’ effectiveness. We strongly believe that the character of staff experiences plays an important role in a charity’s effectiveness, but it was occasionally unclear how representative these interviews were of the overall staff experience at each charity. Sometimes, for instance, we received sharply differing reports of the work culture at a single charity.
Another challenge we faced was determining how much information from the calls we should include in our published reviews. Generally, we provide as much information for our readers as possible, especially when that information influenced our reasoning. However, the interviewees from our confidential calls spoke with us on the condition of anonymity. Given that we assured them of this protection, we made honoring it a top priority when determining what information to publish from these calls. This meant that we occasionally had to compromise some level of transparency about our reasoning, which was not ideal.
Lastly, I’ll mention one more challenge that took me by surprise. In our “things that surprised us” blog post last year, then-Researcher Jacy Reese noted that “some charities were more okay with [the publication of] critical content” than he expected, and I agreed with him. Usually, we find that charities are open with us and that they do not object to the publication of our honest opinions—even critical ones. This year, however, I was surprised that fewer charities were okay with critical content about their culture than in other areas. Perhaps that’s because culture can be a particularly sensitive topic. Perhaps it’s because—for the first time—we had access to information about charities’ cultures that was neither public nor shared with us intentionally by leadership (whereas most of the information pertaining to our other criteria comes directly from leadership). Either way, since we never publish information about a charity without the approval of their leadership, we may need to rethink some aspects of our culture evaluations next year.
Kieran Greig, Research Associate
This year was the first in which I was heavily involved in ACE’s charity evaluation process. Below are some things I found surprising that I also think are important.
The uncertain timeline for cost-competitiveness of cultured meat
Earlier this year, ACE completed a report on the timelines for cost-competitive cultured animal products. In that report, we discussed some reasons to be concerned about the potential for cultured meat to become cost-competitive as quickly as some organizations claim. In this year’s charity evaluation process, we reviewed two charities that allocate significant resources towards promoting and/or researching cultured alternatives to farmed animal products: The Good Food Institute (GFI) and New Harvest. As part of our review process, and partly because the outcome of our report had suggested some reason for concern, we asked each of those charities whether they were concerned about the possibility that cultured meat will never reach cost-competitiveness. In asking that question, we also noted that there are some reports—such as the Open Philanthropy Project’s Animal Product Alternatives report and van der Weele & Tramper (2014)—which suggest that it is unlikely that cultured meat will become cost-competitive with conventional meat.3,4
New Harvest’s response5 to our question seemed to be that they haven’t done enough research to conclude that cost-competitiveness is impossible, though they acknowledge that, even if it is possible, reaching it will be difficult. GFI’s response6 seemed to be that (i) their scientists have become more optimistic about this potential, (ii) they have influenced venture capital investments towards cultured meat companies,7 and (iii) they plan to release a white paper in the next few months that summarizes an analysis completed by some of their scientists.
I was surprised at how little my concerns about the potential for cultured meat to become cost-competitive were alleviated by the responses from those two charities. I think I was expecting more engagement on this topic, as well as more of an attempted rebuttal to van der Weele & Tramper’s (2014) claim that growth medium costs are such that cultured meat will not achieve cost-competitiveness. In the absence of such a rebuttal (which I haven’t encountered elsewhere either), I think there is significant room for reasonable skepticism about proponents’ estimated timelines for cost-competitive cultured meat.8 It is probably also worth noting, however, that even with pessimistic views about cultured meat’s potential to achieve cost-competitiveness, one can still think that the expected value of donations to that area is quite high.
Charities’ interest in allocating resources towards emerging issues
As part of this year’s comprehensive review process, we also asked five charities9 one or both of the following general questions:
- There are some who think that the scale of suffering in the wild is much greater than the scale of farmed animal suffering. What is your charity doing to address wild animal suffering?
- There are many more farmed fish than farmed animals of any other species. Has your charity considered allocating more of its resources towards farmed fish advocacy?
I was surprised by the extent to which charities’ responses to these questions indicated a potential for them to allocate significant resources towards reducing farmed fish and/or wild animal suffering in the not-too-distant future.10 For years I have thought that farmed fish and wild animal suffering are important areas to prioritize. A significant component of my thoughts on this matter was my belief that these two cause areas are currently severely neglected.11 I now wonder if I glossed over a crucial consideration: in cause prioritization, what may matter much more than the current level of neglect is the neglectedness in expectation of cause areas—in other words, how neglected the cause will be in the future.12 I think that neglectedness in expectation could be an important consideration in cause prioritization because it seems to be an essential part of estimating the extent to which some impact would not otherwise have occurred. I now seem to be updating towards thinking that farmed fish and wild animal suffering are significantly less neglected in expectation than I previously thought—several large groups have expressed promising interest in these issues and have indicated that they could allocate significant resources towards those two causes.
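To make that consideration concrete, here is a deliberately crude sketch in Python. It is not part of ACE's methodology; the `counterfactual_value` function and all of the numbers are hypothetical, chosen only to show the direction of the effect that expected future resources can have on the value of a marginal donation:

```python
# Toy model (hypothetical): the marginal value of a donation to a cause falls as the
# total resources we expect that cause to receive (current + anticipated future) rise.

def counterfactual_value(scale, current_funding, expected_future_funding, donation):
    """Crude stand-in for counterfactual impact: scale spread over expected resources."""
    total_expected_resources = current_funding + expected_future_funding + donation
    return scale * donation / total_expected_resources

# Two causes with identical scale and identical (low) current funding...
stays_neglected = counterfactual_value(scale=1000, current_funding=10,
                                       expected_future_funding=5, donation=1)
attracts_resources = counterfactual_value(scale=1000, current_funding=10,
                                          expected_future_funding=500, donation=1)

# ...but very different neglectedness in expectation. In this toy model, the donation
# to the cause that is expected to stay neglected looks far more valuable.
print(f"Expected to stay neglected:    {stays_neglected:.1f}")
print(f"Expected to attract resources: {attracts_resources:.1f}")
```

Under these made-up numbers, the first donation comes out roughly thirty times more valuable than the second, which is the intuition behind weighting neglectedness in expectation rather than current neglect alone.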
Jamie Spurgeon, Research Associate
I joined the ACE team in July 2017, so this was my first time experiencing our charity evaluation process. Not only that, but it was my first time working in the animal advocacy movement at all. I intentionally entered the process with as few expectations as possible, so I didn’t encounter too many significant surprises. I was, however, a bit surprised by the following two points.
The number of charities researching their own impact
When considering how charities measure the impact they’re having, I expected to find that they were attempting to assess this mostly through analyzing the outcomes of their programs, and using tools such as Key Performance Indicators. While this was mostly true, what surprised me was the number of charities engaging in their own independent research. Going into the evaluation process I was aware that Faunalytics did this (given that it’s their main area of focus), but I was not aware that THL, Animal Equality, Open Cages, and CIWF USA—as well as some of our Standouts reviewed in 2016, The Albert Schweitzer Foundation and Vegan Outreach—did so. I find it very promising that so many charities are keen to more formally measure the impact they’re having, and I hope that this lays the groundwork for a future in which we have much more certainty about the respective effectiveness of various interventions.
The positivity of my experience dealing with other organizations
Going into the review process, I was uncertain what it would be like to conduct calls and build relationships with other animal advocacy organizations from the perspective of being the one assessing them. I expected charities to be more guarded, or reluctant to answer some questions—especially questions about their weaknesses. However, I found that those I spoke to were very forthcoming and keen to discuss all aspects of their respective organizations. There seemed to be an underlying layer of openness and trust, even with organizations that we were reviewing for the first time.
Additionally, I was (and am) new to the movement as a whole, and I was quickly put in a position where I was interacting with movement leaders who often had decades of experience. My prior work experience outside of animal advocacy probably contributed to my expectation that I might not be taken seriously, or that organizations wouldn’t be happy dealing directly with a newcomer. On the contrary, I always felt like I was being treated as an equal, and I truly felt welcome in the movement. I think that the collaborative nature of the charity sector is probably largely to thank for this. At the end of the day we’re all working towards the same unified goal of helping to reduce the suffering of animals, no matter what approach we take to do so.
Ashwin Acharya, Research Associate
Like Jamie, I joined ACE this summer—making this my first experience with the charity evaluation process (and my first experience working in the animal advocacy movement). I share his pleasant surprise at charities’ self-evaluation efforts and at how welcoming the movement has been. Since he discussed those points already, I’ll share some things that surprised me about the process of writing charity reviews.
The amount of internal feedback built into the evaluation process
Before joining ACE, I didn’t have a strong sense of what the evaluation process looked like, but I would have guessed that each review was primarily written by a particular person. On the contrary, I found the process to be highly collaborative. This year, each research team member wrote drafts for one or two of our evaluation criteria, and was assigned the primary critic role for one or two other criteria. In addition to several rounds of editing and review from primary critics, each section of each review was also looked over by other members of the research team, who often shared thoughts or concerns that significantly improved the drafts.
Especially as someone who was new to the process, I really appreciated having my drafts looked over by team members with more experience evaluating charities on ACE’s criteria. I think this level of feedback makes sense for ensuring that each review can be taken to represent the opinion of ACE as a whole, rather than just the opinion of a particular individual. It is possible that the process can still be streamlined further, however—particularly when it comes to maintaining consistency across reviews, both in the reused ‘boilerplate’ language and in the way we address particular questions (such as the value of spreading animal advocacy to new countries). Maintaining that consistency was worth the time and effort we spent on it, and I look forward to finding ways to do it even more efficiently next year.
1. While our goal was to contact two staff members at each organization, some of the organizations we evaluated do not have staff, or have so few staff that it would be impossible to preserve anonymity if we restricted our selection of interviewees to those individuals. In cases where we were unable to select two staff members, we selected two volunteers, or one staff member and one volunteer. (In two special circumstances, we spoke with only one individual at the charity.)
2. Our selections were random in all dimensions except the following: we aimed to select staff members or volunteers who had been with the organization for at least one year (where possible), and we aimed to speak with at least one woman from each organization (where possible).
3. “We are highly uncertain about the eventual cost per kg of cultured meat, and have not closely examined the above cost estimates. However, none of these estimates suggest a cost competitive with that of conventional meat.”—The Open Philanthropy Project (2015). Animal Product Alternatives. The Open Philanthropy Project.
4. “From an economic point of view, however, competition with ‘normal’ meat is a big challenge; production cost emerges as the real problem. For cultured meat to become competitive, the price of conventional meat must increase greatly.”—van der Weele, C., & Tramper, J. (2014). Cultured meat: every village its own factory? Trends in Biotechnology, 32(6), 294-296.
5. “While it is possible that cultured meat may never reach cost-competitiveness, New Harvest says that this doesn’t make the research they are supporting worthless.29 New Harvest likens the situation to biofuels—biofuels aren’t cost-competitive today for many reasons but one largely being that they compete with an artificially-priced commodity: fossil fuels. While we aren’t currently using biofuels everyday, in the event of a catastrophe or growing fuel prices, that technology could be mobilized. They think this situation is similar to that of cultured meat. New Harvest also claims that they haven’t done enough research in the area to suggest that cost-competitiveness is an impossible task. They do note that cost-competitiveness is difficult, but they don’t yet know for sure that it’s impossible.”—ACE’s 2017 Review of New Harvest
6. “GFI is certainly aware of the possibility that cultured meats will not become cost-competitive, but reports that the more their scientists dive in, the more optimistic they become about the potential for clean meat to reach cost-competitiveness.100 GFI reports that when their scientists started working on this issue in June 2016, they were explicitly told that GFI does not need to promote clean meat; GFI reports that if their scientists thought it cannot become cost-competitive with the products of industrial animal agriculture then they would stop promoting it and would instead focus on plant-based meat.101 GFI reports that as their scientists investigate further, they have become more optimistic—because clean meat is so much more efficient than animal-based meat.102 One of their senior scientists, Dr. Liz Specht, has met with venture capital firms and other venture investors to present technology plans of specific clean meat companies and their pathways to commercialization.103 GFI further reports that, based partly on her analysis, many leading venture capital investors and firms have become much more interested in clean meat companies. GFI believes this has probably been key to their investments in the technology.104 GFI plans to release a white paper in the next few months that summarizes Specht’s analysis.105”—ACE’s 2017 Review of The Good Food Institute
7. In our 2017 review of GFI, we note that one of their biggest claimed successes over the previous year is that their presentations to various venture capital firms played a key role in the investment of more than $15 million into cultured meat by those firms. We are unsure what proportion of the responsibility for these investments should be attributed to GFI.
8. Even Mark Post, the chief scientific officer of MosaMeat, who played a key role in creating the first cultured burger in 2013, was recently quoted as saying that the timelines from most companies are “overly optimistic.”
9. Animal Equality, L214, Faunalytics, and The Humane League (THL) were asked the first question. Compassion in World Farming USA, L214, and THL were asked the second question. Most charities’ answers to these questions are linked in the “Documents” tab of our reviews.
10. For instance, consider Animal Equality’s response to the question about wild animal suffering.
11. For some further information about cause prioritization, please see ACE’s write-up on the topic.
12. I think that neglectedness is usually used by the effective altruism community to mean the current level of neglect. For example, in their cause prioritization framework, 80,000 Hours defines neglectedness as “how many resources are already going towards solving this problem?”
About Allison Smith
Allison studied mathematics at Carleton College and Northwestern University before joining ACE to help build its research program in their role as Director of Research from 2015–2018. Most recently, Allison joined ACE's Board of Directors, and is currently training to become a physical therapist assistant.
Hi.
One thing that has put me off donating to some charities is the seemingly enormous salaries that their CEOs are given. I have heard the argument that charities put forward, which says that they have to offer huge salaries to attract the best executives, but I am not convinced by this. I believe there are many capable people who would be willing to run a charity for a reasonable salary rather than an exorbitant one, because their hearts would be in their work, rather than money being the driving force. I realize these people may work very long hours, so I do not begrudge them a better than average salary, but I do object when the salary is way, way above what the average person earns. These CEOs are getting paid from donations from many ordinary people who often have very little to spare themselves, and yet they still give because they care so much about animal welfare. Therefore it is not right when senior executives take so much in salary – this is money that should go to helping animals, not lining people’s pockets.
I was looking at your assessment of Animals Australia because I’d like to support them, but could find nothing on your website about their CEO’s salary.
Hi Rebecca,
We agree that the best charities use their funding carefully, to create the strongest impact they can, rather than for the profit of individual employees—although paying staff well to attract great workers can be a good use of funds. For our purposes, we find that the best way to measure this is to consider the organization’s overall budget and the amount that they’ve been able to accomplish, rather than monitoring the specific ways that they spend money in pursuit of their goals. This is why you’ll find an assessment of Animals Australia’s overall cost-effectiveness in our review, but no details on how much specific employees are paid.
If, in addition to the budgetary information we provide, you would also like to know how much they pay their CEO, I would advise you to contact Animals Australia directly.