Or: The Easier Your Website Makes It for People to Give You Money, the More Money They Will Give You
I’ll be honest with you: until recently, it had been a while since I’d had an experience in this line of work where a client carried out one of my recommendations and then reaped immediate and enormous success.
Don’t get me wrong: our clients here at UpBuild do better with our guidance overall than without it; we wouldn’t be much of a digital marketing agency if we weren’t even able to say that. It’s just that messy realities typically force our strategic ideas to be tried far outside the pure conditions demanded by the scientific method. For one thing, we and our clients both want changes to be implemented in a timely manner. This can lead to multiple recommendations being enacted at once, making for a noisy experiment where it’s hard to tell for certain which change produced which improvement. For another, bottlenecks in development processes, or siloed teams on the client side, often make for piecemeal implementations, so one can’t say with confidence “when” a given thing was actually done. The net effect is that you watch the search visibility of your client’s brand improve beginning a few months after you start your work, but are never entirely sure which of your many changes deserves the most credit for it.
Sometimes, though, you come up with one new idea in one moment, watch it take flight unaccompanied by other changes, and just know from the resulting data that it did what you wanted it to do. And on even rarer occasions, it did way, way more than you wanted it to do, blasting your predictions out of the water. And on the rarest occasions of all, the recommendation itself was simple and clear enough that the positive result actually generated a nugget of wisdom. In case it isn’t clear by now, I was fortunate enough to have this experience recently, and I’m going to start the story by distilling the nugget that it offered up: the fewer steps your customers have to complete in order to buy stuff, the more of them will actually buy stuff.
This might seem intuitive, but not all intuitive-seeming things are true, and not all truth is intuitive. Experimentation is the only way to be sure of the actual truth. Just ask Galileo.
My client in this case is a medical research non-profit, or more accurately the fundraising arm of a large medical research university, and the sole purpose of their website is to solicit donations. The primary conversion funnel is the one that leads to the donation form, which functions in exactly the same way as a checkout page would for an online retailer. It’s linked to from everywhere, with CTAs galore, a button in the navigation made prominent by color contrast, and numerous external links — particularly from social media postings — pointing directly at it. You can’t get near this site without seeing it.
The trouble is, when the site first launched in 2015, the donation form spanned four pages, each one terminating in a “Next” button which indicated plainly that there was more work ahead for the user (as did the four-piece progress arrow running across the top, highlighting the segment corresponding to the step you had reached):
Granted, the form had a lot of info to collect: what area of research the user wanted to contribute to, whether the donation was to be a one-time thing or a recurring direct debit, whether it was to be made in somebody else’s name, and plenty more. All the same, there was no doubt that users were dissuaded by the long checkout process, and the abandonment rate we tracked in our analytics setup bore this out. Each of those four “Next” button clicks offered a shining opportunity for a user to say “you know what, never mind, this is taking too long” and bail out.
The site’s initial conversion rates were far from terrible, but the success we enjoyed — as we knew even at the time — was primarily due to the visitors who flowed in through the university’s parent site with pre-existing intent to donate.
Those of us at UpBuild who serviced this account couldn’t help but wonder how much better our donation conversion rate could be among all other visitors — that is, among the general public — if that donation form were reduced to one page, and if that “Next” button instead said “Donate” or “Give Now”. Imagine if the signal indicating that you were definitely not done after this page could be replaced by a signal indicating that you definitely were.
We mentioned this to the client a few times in our reports and check-ins. We even commissioned a report from a UX colleague of ours comparing the CRO strength of various famous charities’ online donation forms, and were given unequivocal expert support for the idea of a one-page checkout: all the major charities did it, she said, and what little evidence they published suggested that a one-page checkout made for an effective donation experience. But for various reasons, the client couldn’t act on the recommendation right away. Finally, in October 2016, they called a joint meeting with us and their dev shop to talk about form optimization in depth, and we knew we had an opportunity to actually get the ball rolling.
We were all asked to come to the meeting with five ideas for improving the form, and I had “reduce it to one page” at the very top, knowing that if I could only drive home one idea, it had to be that one. Then, the meeting began with the client asking the head developer to present his top idea, and he went over to the whiteboard and wrote “single page”. I knew it was finally going to happen.
And it did. Behold, the future!
It took about a year for the design of the new one-page form to get approved, coded, and published. When the form finally did go live — in October 2017 — it broke our analytics tracking altogether due to development oversights, and we weren’t able to get that tracking restored until the last week of December. Consequently, the first whole quarter to elapse with this new form in place and tracking properly was Q1 of this year.
But when April hit and it was time to compile the charity’s Q1 performance report, nothing could have prepared me for the year-over-year growth I observed when I logged into GA and pulled up the donation numbers.
Traffic in Q1 increased year-over-year by 12.2%, but donation revenue increased 210%, number of donations increased 158%, average donation size increased 20%, and our donation conversion rate grew from 1.47% to 3.15%.
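The arithmetic behind numbers like these is simple: conversion rate is donations divided by sessions, and the YoY figures are ratios of this quarter to the same quarter last year. Here is a minimal sketch of that math; the session and donation counts below are illustrative placeholders chosen to reproduce the reported rates, not the client’s actual totals:

```python
def conversion_rate(donations, sessions):
    """Donation conversion rate as a percentage of sessions."""
    return 100.0 * donations / sessions

def yoy_change(new, old):
    """Year-over-year change as a percentage."""
    return 100.0 * (new - old) / old

# Hypothetical counts for illustration only.
q1_2017 = {"sessions": 100_000, "donations": 1_470}
q1_2018 = {"sessions": 112_200, "donations": 3_534}

cr_2017 = conversion_rate(q1_2017["donations"], q1_2017["sessions"])
cr_2018 = conversion_rate(q1_2018["donations"], q1_2018["sessions"])

print(f"CR 2017: {cr_2017:.2f}%")   # 1.47%
print(f"CR 2018: {cr_2018:.2f}%")   # 3.15%
print(f"Traffic YoY: {yoy_change(q1_2018['sessions'], q1_2017['sessions']):+.1f}%")  # +12.2%
```

Note that because traffic also grew, the raw donation count can rise much faster than the conversion rate does; the rate is the cleaner signal of whether the form itself improved.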
I rubbed my eyes about sixteen times looking at these numbers, but they never changed.
The mobile traffic numbers were arguably even more remarkable; due to a wildly successful Facebook campaign from Q1 2017, mobile traffic in Q1 2018 actually fell year-over-year by 10%, but donation revenue increased 196%, number of donations increased 109%, average donation size increased 42%, and conversion rate improved from 0.49% to 1.13%. All this from the traditionally conversion-averse device.
OK, I thought, let’s double-check things before we get too excited. Are there any complicating factors at play here? Any noise obscuring the signal?
Aha! It turned out one other major fact distinguished donation patterns in Q1 2018 from their counterparts in 2017: due to the addition of a new link in the navigation of the university parent site — i.e. the new reality that the “Donate” CTA in their navigation led directly to our donation form — far more of Q1 2018’s visitors entered the site by landing on the donation form than did Q1 2017’s. And the difference was not small: fully 28% more site visitors landed on the donation form this year than last. That had to have made a difference, right? How much more likely are you to see a donation through if you land straightaway on the form where you would go to complete it?
Well, hang on… all of the above can still be true without undermining the conclusion that the form itself carried more people all the way through it by removing the previous version’s many abandonment opportunities. What if we looked at the donation conversion rate purely among those visitors who arrived on the site via the donation form?
I created a segment to look that number up and my jaw dropped again. Year-over-year, the donation conversion rate associated with sessions landing on a donation page improved from 3.36% to 10.96%.
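The logic behind that segment is straightforward: keep only the sessions whose landing page is the donation form, then compute the conversion rate within that slice. A rough sketch of the same calculation outside of Google Analytics, where the field names and the `/donate` path are assumptions for illustration:

```python
# Each session record notes its landing page and whether it converted.
# The field names and the "/donate" path are illustrative assumptions.
sessions = [
    {"landing_page": "/donate", "donated": True},
    {"landing_page": "/donate", "donated": False},
    {"landing_page": "/research/cancer", "donated": False},
    {"landing_page": "/donate", "donated": False},
]

# The "segment": sessions that entered the site on the donation form.
landed_on_form = [s for s in sessions if s["landing_page"] == "/donate"]

cr = 100.0 * sum(s["donated"] for s in landed_on_form) / len(landed_on_form)
print(f"Conversion rate among donation-form landings: {cr:.2f}%")
```

In GA itself this is just a session-level segment conditioned on landing page, but spelling it out makes clear what the 3.36% → 10.96% comparison is actually measuring: conversions per session within the slice, not across all traffic.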
Wait, wait, I told myself, that still doesn’t prove magical properties on the part of the form. What if the reason that the new form converted so much better than the old among people landing on it was because so many more of the new people fitting that segment had the “pre-existing intent to donate” that I alluded to earlier, and deliberately followed an external link leading to the form because they had made up their minds to donate before they reached our site? The ol’ self-selection bias?
Not knowing what to do next, exactly, I decided to create a sub-segment of this one for mobile users only, hoping that would clarify the picture. I had no idea it would clarify it quite so much.
I had forgotten that in Q1 2017 — last year — there had been a huge rush of mobile visitors landing directly on the donation form because a temporary slider had gone up on the mobile version of the parent university site’s homepage during that quarter pointing straight to it, long before our ultimate and permanent nav link was added. Because that homepage slider was such a traffic magnet, mobile traffic landing on a donation page actually fell year-over-year by a staggering 48%. So the addition of our nav link in the time since hadn’t provided the same new rush of traffic to our donation form from the parent university site on mobile as it had on desktop. It had merely offset a loss, and by so little that the total loss was still nearly 50%.
This convenient bit of trivia was what ultimately cemented my confidence that the form really had done the trick. Despite this huge traffic loss, the donation conversion rate associated with mobile sessions landing on a donation page improved from 1.00% to 5.04%.
Pre-existing intent be damned; donors likely to fit that description dropped by half, yet number of donations doubled, donation revenue tripled, and our donation conversion rate quintupled. And this was all on the device that, according to conventional wisdom, nobody wants to perform checkout on. There is no way to review those facts and avoid the conclusion that the new form worked better.
I’ll end back where I started by saying that this wondrous discovery of mine might seem intuitive, but not all intuitive things are true, and not all truth is intuitive. When we were kids, we pretty much all naturally assumed that heavier things fell at a faster rate than light things, and needed to hear about Galileo’s legendary experiment several times before we could really believe it… and even then, most of us tried to reproduce it ourselves by standing on the roof of our house with a ping-pong ball, an orange, and a friend standing below with a stopwatch. And that’s what I was getting at earlier: as intuitive as something might seem, there is no substitute for generating your own firsthand evidence. Not only is it the only way of arriving at real proof, but it’s gratifying to take matters into your own hands like that. I can now say I have proof from personal experience that keeping checkout simple leads to more checkouts. And since the checkouts in question have served to drive more potentially life-saving medical research funding for an institution I believe in, my gratification is ten times greater. Standing here on the other side of this long and intensive process, I’m as proud of this as anything I’ve ever done in this line of work.