Verifying charities’ claims is one stage of ACE’s charity evaluation process. We have repeatedly improved our verification methodology over the past few years, and this year we began systematizing our procedure. We focused our verification on the claims that charities reported to us about the outputs (i.e., outcomes) of their programs. Information about charities’ program outputs from the past 18 months primarily informed our cost-effectiveness analysis, as well as other aspects of our charity evaluation criteria.
We focused on verifying charities’ reported program outputs from January 2020 through June 2021. We chose this period because it provides the most up-to-date indication of a charity’s main accomplishments relevant to our cost-effectiveness analysis.
Our verification team, which consisted of five members of ACE’s research team, followed a detailed, five-step process: i) prioritizing claims, ii) verifying prioritized claims, iii) asking charities follow-up questions, iv) analyzing and incorporating new information from charities’ responses, and v) finalizing verifications and applying them to our charity evaluation criteria, particularly the cost-effectiveness criterion. We spent approximately five weeks on the first three steps and three weeks on steps four and five.
Prioritizing claims
We prioritized which claims to verify based on: i) the potential impact of the claim, ii) its relevance to ACE’s work, iii) the number of animals that could be affected, and iv) the feasibility of verifying the claim. For claims about corporate commitments, we prioritized those involving larger corporations and with the potential to affect the most animals. After prioritizing claims, we looked for the best available evidence to support or refute each one. In general, we expect charities to provide clear evidence to support the claims they make about their accomplishments.
Verifying prioritized claims
We verified at least one major output per program. To verify charities’ claims about their program outputs, we drew on publicly available information and supporting evidence such as documents, media reports, and other materials provided by charities. We also consulted third-party sources to corroborate the validity of claims; in such cases, we relied as much as possible on primary and/or official sources of information from reputable, non-partisan authorities (e.g., peer-reviewed journals and government agency statistics). When we used secondary sources, we searched for multiple independent sources to corroborate information.
We were sometimes unable to verify charities’ claims, for one of several reasons: i) we did not find direct confirmation of the claim in the charity’s documentation or elsewhere, ii) we did not find third-party confirmation of the claim, iii) we found only partial or insufficient evidence for the claim, or iv) we found conflicting evidence about the claim.
Asking follow-up questions
We sent charities our Reporting Guidelines and requested links and/or other documentation to support their claims, asking them to be as specific as possible when describing their accomplishments. For claims that we were initially unable to verify, we sent charities a list of concrete follow-up questions based on our verification work, asking them to elaborate and provide supporting information.
Incorporating new information
Once we received new information from charities about claims we could not initially verify, we reviewed it to determine whether it helped us verify those claims. When a charity provided sufficient information to verify a claim, we incorporated the evidence into our documentation and considered the claim verified; we adjusted some reported claims based on this work. When a charity did not provide sufficient information, we considered the claim unverified and did not include it in our criteria assessment.
Claims that were successfully verified through our process were used to summarize the key results of charities’ programs and inform other aspects of our charity evaluation criteria.
Results and Limitations
Across all the charities we evaluated this year, we conservatively estimate that we prioritized and verified at least 67 major claims, each regarding a different recent output of a charity’s programs. Time constraints limited the number of claims we were able to verify. In total, we asked charities 137 follow-up questions. The information available to us was sometimes limited, which affected our degree of certainty about the validity of some claims. Throughout the verification process, we strove to remain as transparent as possible and to be mindful of potential biases that could affect our work.
During our verification process, we kept a log of improvements that we intend to implement next year. These improvements include: i) optimizing the verification process by using a template from this year’s follow-up questions, ii) having more than one final approver during the prioritization and verification stages, iii) requesting more clarifications from charities ahead of the verification process in order to decrease the number of follow-up questions, and iv) increasing the contextual information we use during our verification by investing more time in studying charities’ theories of change.
We welcome your feedback in the comments below. You can send any questions, ideas, or suggestions to email@example.com.