Read time: 6 minutes

Year-end fundraising season is just around the corner (😱), and if you’re like us, you’re finalizing those all-important campaign plans now. To help with your planning, we’ve rounded up findings from some of our favorite tests from the past few months. Happy planning!

Test 1: Check your required fields
Our first test couldn’t be simpler: People for the Ethical Treatment of Animals (PETA) tested the impact of removing two non-required fields from their “request a vegan starter kit” form. The streamlined form saw a 5% increase in kit order conversions (as with all test results we’ll mention in this post, the increase was statistically significant). 
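If you want to vet lifts like this yourself, a two-proportion z-test is a standard way to check whether a conversion difference is statistically significant. This is a minimal sketch; the visitor and conversion counts below are fabricated for illustration, not PETA's actual data.

```python
# Hypothetical significance check for a conversion-rate test.
# All counts are made up for illustration.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference between two rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the normal approximation
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control form vs. streamlined form (fabricated numbers)
z, p = two_proportion_z(conv_a=1000, n_a=20000, conv_b=1100, n_b=20000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 means the lift is unlikely to be noise
```

With these made-up numbers, a 5% relative lift on 20,000 visitors per arm clears the conventional p < 0.05 bar; with much smaller samples, the same lift would not.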

This echoes a well-known lesson from the conversion rate optimization field: Expedia removed a single non-required field from their booking form, and the change resulted in $12 million in additional revenue due to increased conversions.

Chances are, if you have non-required fields on your forms, they’re impacting your conversion rate. You should consider removing them—especially if you’re not actively using that additional data.  

Tests 2 & 3: GIVE. US. MOAR.
USA for UNHCR tested ordering their ask string from highest to lowest, against their standard low-to-high string. The test was conducted across the primary one-time and monthly donation forms that receive organic and ad-driven traffic.

They found that revenue per visitor increased by 25% among users who were exposed to the highest-to-lowest ask string, possibly because the users were anchored to a higher amount as soon as they landed on the page. 

Similarly, the Humane Society of the United States tested adding two higher amounts to their monthly ask string on their main donation form, and also tested pre-selecting $36 instead of their usual $19. 

We found that the higher ask strings and aggressive preselect did not harm one-time or monthly conversions on the form, but they did improve revenue per visitor and annualized revenue by 15%.

These two tests indicate that simply presenting higher options to a user can impact their decision on how much to give. A common assumption is that prospects are low-dollar donors who can be scared off by a big ask, but… are they really? Maybe we should give them a chance to tell us otherwise.
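Revenue per visitor is the metric to watch in tests like these, because it captures lifts that conversion rate alone misses: if conversions hold steady but the average gift rises, RPV goes up. A quick illustrative sketch (all numbers below are fabricated, not the Humane Society's actual data):

```python
# Hypothetical illustration of revenue per visitor (RPV): the made-up numbers
# mimic a higher-preselect test where average gift rises but conversions don't.
def revenue_per_visitor(visitors, donations, avg_gift):
    """Total revenue divided by total visitors to the form."""
    return donations * avg_gift / visitors

control = revenue_per_visitor(visitors=10_000, donations=300, avg_gift=19)
variant = revenue_per_visitor(visitors=10_000, donations=300, avg_gift=22)
lift = (variant - control) / control

print(f"control RPV ${control:.2f}, variant RPV ${variant:.2f}, lift {lift:.0%}")
```

Same number of donations on both arms, yet the variant wins on RPV, which is exactly the pattern the higher ask strings produced.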

Test 4: Don’t discount those small donations though
I know, I know—I just told you that prospects can give more than you thought they could. But sometimes a little goes a long way.

PETA tested adding six simple words to their post-action donation form and social share page. Are you ready to read the magic words? Here they are: “Even $3 can make a difference.” 

This addition—with no other changes!—was enough to improve conversions among their mobile audience by 69%. SIXTY-NINE PERCENT, with no detectable impact on average gift or revenue per visitor. Considering mobile users make up more than half of PETA’s action takers, this one is a keeper.

Previous testing to lower the ask string on this page did not yield any substantial results—and this goes to show that the ask string is not the only place to suggest norms for the user to follow.

Even 6 words can make a difference!

Test 5: Tell us about yourself
A few nonprofits have reported results showing that when the mission is highlighted in some format on the homepage, revenue per visitor increases. The Union of Concerned Scientists put this idea to the test by adding a mission feature call-out to their homepage during November and December 2017. 

Our hypothesis was that by quickly communicating to users who might not be familiar with the organization who the “Union of Concerned Scientists” is, we’d spark interest. That interest would lead them to content that could inform their decision to get involved and, ultimately, donate.  

That didn’t happen; immediate conversions and revenue were unmoved. But what did happen is just as interesting: email sign-ups went up by 17%, and traffic to the UCS “about us” page increased by 46%.

Understanding what elements impact users when they come to your website is complicated. So we repeated the test in the spring with a new design that more closely matched the aesthetic of the homepage and removed the call to action text “learn more about our work.” 

This time, we still didn’t see additional initial revenue, but we also didn’t see the boost in email sign-ups or clicks through to the “about us” page. 

This suggests the “learn more” call to action and contrasting design in the first test were the elements that made the design successful: they directed the user’s attention to a specific area of the homepage and gave them a single thing to do with the information in that space.

Test 6: A penny (dollar) for your thoughts (email)?
PETA tested differing incentives on an email sign-up lightbox, offering users either a $1 leveraged gift for every new sign-up, a sticker, or nothing at all (other than the opportunity to save cats, dogs, and other animals, which is a pretty great benefit).

The pledge to stop animal testing was the same on all three lightboxes, but users clearly preferred a bang for their buck. The version offering a $1 match for sign-ups generated 20% more email sign-ups than the no-incentive control, while the sticker premium did not produce any measurable increase in sign-ups.

Are you planning to run any of these tests this year? We want to hear about all your testing dreams (and nightmares!) @mrcampaigns 🤓

When Karen isn’t dreaming up tests to improve fundraising or playing with Google Analytics, she’s singing with 160 of her closest friends in the Choral Arts Society of Washington. You can reach her at

