MarketingSherpa Founder Anne Holland Talks Landing Pages and WhichTestWon.com

By now, most online marketers understand that good landing pages are a key driver of business. I had the pleasure of speaking with Anne Holland, founder of MarketingSherpa, who is up to her old tricks, teaching us how to be better marketers with her awesome quiz site WhichTestWon.

Wondering whether testing landing pages is worthwhile? The math is quite simple: as a marketer, you buy 100 clicks. If you convert two of those visitors to customers and make $100 from each customer, you’ve just made $200. Now, if you do some testing, increase your landing page conversion rate by 100 percent, and get four customers from the same 100 clicks, you’ve just doubled your revenue. Every week, WhichTestWon posts a new A/B test on landing pages; you guess the winner, then see the true results of the test online.
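
Here is that arithmetic as a tiny sketch, using the same hypothetical numbers as above (100 paid clicks, $100 per customer, a 2% baseline conversion rate), just to show how a conversion-rate lift flows straight through to revenue.

```python
def revenue(clicks, conversion_rate, value_per_customer):
    """Revenue from a batch of paid clicks at a given conversion rate."""
    customers = clicks * conversion_rate
    return customers * value_per_customer

# Hypothetical figures from the example above: 100 clicks, $100 per customer.
baseline = revenue(clicks=100, conversion_rate=0.02, value_per_customer=100)    # $200
after_lift = revenue(clicks=100, conversion_rate=0.04, value_per_customer=100)  # $400

print(f"Baseline: ${baseline:.0f}, after a 100% conversion lift: ${after_lift:.0f}")
```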

Anne (pictured below) shared some best practices with me, then we talked about the future of landing page testing and online marketing.

WhichTestWon is about a year old now. Why did you decide to launch it and in this format?

I’ve always loved A/B and multivariate testing because it’s the easiest way to get big results and improvement for your landing pages, your lead generation forms, your eCommerce carts – as well as your email campaigns – without spending lots of money and without having to drive more traffic. It freaked me out when I discovered 73 percent of marketers aren’t doing any testing whatsoever these days. A friend of mine says that’s like driving with your eyes closed.

So, one day after I’d retired from MarketingSherpa, it just hit me: why not a fun site to show A/B testing? Sort of like Hot or Not, only what you see are two different creatives, and you vote on which one actually did better. Then we show you the real results data, and you learn how far off you were.

What are the three or four best practices that you can recommend to people?

1. Use bigger, more prominent buttons.
2. Test your headline copy and your button copy.
3. Test stripping off the navigation bar, extra columns, and all extraneous content from your key conversion pages.

It’s pretty simple really.  And when your IT team tells you the site’s “already been tested”, you have to educate them that usability testing has NOTHING to do with A/B testing. Usability testing is great, but it doesn’t tell you how to stop people from abandoning your site, your registration forms or your cart. Usability testing doesn’t help you learn how to convince people to convert.

What is most overlooked in landing page testing?

Aside from the fact that it’s not done?!

Obvious stuff like matching your headline to the headline of the ad or offer that drove the traffic, making your buttons bigger, getting rid of extraneous navigation, etc.

I also think mainstream marketers have overlooked the possibility of overlays; they look like a pop-up but are not blocked by pop-up blockers. It’s a great way to garner email opt-ins, among other things. Make them look classy and they can work for your brand. And probably not enough people have tested added video, let alone all the related permutations like sound off, sound on, auto-play, etc.

What are the most common mistakes?

Tracking only to the click on the page being tested. We see a lot of test data showing that when the marketer tracked beyond the initial click, the true winner of the test was revealed. What you want to encourage are qualified clicks. You can test to improve your qualified click rate, but you have to be able to track further down the funnel. It’s not impossible; you can even do it with Google Website Optimizer, which is free.
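
A minimal sketch of what tracking beyond the initial click can look like, assuming you log each visitor’s test variant alongside later funnel events such as a sale; the event log, variant names, and event names here are invented for illustration.

```python
# Hypothetical event log: (visitor_id, variant, event) rows, where events
# recorded farther down the funnel ("sale") matter more than raw clicks.
events = [
    ("v1", "A", "click"), ("v1", "A", "sale"),
    ("v2", "A", "click"),
    ("v3", "B", "click"), ("v3", "B", "lead_form"),
    ("v4", "B", "click"), ("v4", "B", "sale"),
    ("v5", "B", "click"), ("v5", "B", "sale"),
]

def down_funnel_rate(variant, event_name):
    """Share of a variant's clickers who later triggered the given event."""
    clickers = {v for v, var, e in events if var == variant and e == "click"}
    converters = {v for v, var, e in events if var == variant and e == event_name}
    return len(clickers & converters) / len(clickers) if clickers else 0.0

for variant in ("A", "B"):
    print(variant, "click-to-sale rate:", down_funnel_rate(variant, "sale"))
```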

The second biggest mistake is testing a really badly performing page. It’s a lot harder to get conclusive results if you have little conversion data to base the math on. Test a page that already has conversions, so if you raise them by, say, 20 percent, you’re going to look like a hero to the CEO. That’s what you want.
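
To make the “little conversion data” point concrete, here is a rough sample-size sketch (my own back-of-the-envelope numbers, not Anne’s, using the standard normal approximation for comparing two proportions): the lower the baseline conversion rate, the more visitors each variant needs before a 20 percent lift becomes conclusive.

```python
import math

def visitors_per_variant(p_baseline, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough visitors needed per variant to detect a relative lift with ~95%
    confidence and ~80% power (two-proportion normal approximation)."""
    p1 = p_baseline
    p2 = p_baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

print(visitors_per_variant(0.02, 0.20))  # ~21,000 per variant at a 2% baseline
print(visitors_per_variant(0.10, 0.20))  # ~3,800 per variant at a 10% baseline
```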

What are your favorite tests from the past year and why?

The ones where I got the answer wrong; there were probably a dozen I guessed totally wrong. The point is, no matter how big an expert you are, you’re going to guess wrong sometimes because you’re not truly representative of the marketplace the page or email was designed for. Your ideas don’t matter. What matters is the marketplace. They are the ones you’re trying to convert. You have to test. Period.

Here are some of my favorites.

#1 Which PPC Landing Page Increased Telephone Inquiries by 42 percent?
The thing I liked about this was that the marketers didn’t just measure clicks. They know that prospects who can be convinced to pick up the phone are much more likely to convert to buying their service. This is true of many pricier B2C offers. So the whole test was geared toward getting more qualified phone calls. Too many marketers forget to measure phone calls as one of their response media. Each of your test panels needs a different phone number!

#2 Online Video Voiceover A/B Test: Which Accent Convinced More Global Visitors to Click for a Free Download? (Brit vs. Yank)
The results of this test were really fun because the marketers measured conversion rates by country. It turned out people in Australia, for example, preferred a completely different voice-over accent than people in India did. It was also different for the US vs. the UK. If the marketers had picked one single “winner” for the entire world, responses in some countries would have been significantly depressed! This test proves that different demographics can and will respond very differently to the exact same creative. You have to test and measure separately by key demographics; a sketch of that kind of per-segment readout follows these examples.

#3 Profile-Pimp.net Tests Giant Button & Landing Page Designs. Which Version Increased Clickthroughs?
I love this test so much I actually invented an award category just for it in last year’s Testing Awards on our site. In this test, the marketers took a very nice fat button and expanded it into possibly the biggest button in history. It was a button that had wandered too near the nuclear power plant, if you know what I mean. You can see the creatives, and the results are eye-opening.
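
Here is the kind of per-segment readout test #2 argues for, sketched in a few lines; the country-level numbers below are invented purely to show the shape of the analysis, not the actual results of that test.

```python
# Hypothetical per-country results for two voice-over variants. The point is
# that the "winner" can flip by segment, so pick winners per segment, not globally.
results = {
    # country: {variant: (visitors, conversions)}
    "US":        {"brit": (1000, 52), "yank": (1000, 61)},
    "UK":        {"brit": (800, 47),  "yank": (800, 33)},
    "Australia": {"brit": (500, 21),  "yank": (500, 30)},
    "India":     {"brit": (600, 38),  "yank": (600, 25)},
}

for country, variants in results.items():
    rates = {v: conv / visitors for v, (visitors, conv) in variants.items()}
    winner = max(rates, key=rates.get)
    print(f"{country}: winner = {winner} ({rates[winner]:.1%})")
```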

What tests surprised you most and which were counter intuitive?

I’m a former copywriter, so I think I “know copy”, but the wording tests are often nearly impossible to guess at. There was one where a colon versus a dash was used in the subject line of an email, which really made a difference in responses, and I totally guessed wrong. Who knew such a tiny factor would make a difference?

Also, sometimes images can throw me. A happy smiling human: will that help or depress responses? It completely depends on the market and the product.

Lastly, I really hate the idea of auto-play video; you know, the ones that start blaring at you when you enter a site or landing page. I think it’s dreadful. But by golly, they can really work for some marketers.

How should we be thinking about the pre-click and post-click experience for the user and optimizing for conversion?

Start measuring further down the conversion path than the immediate click. The marketers who are able to measure farther get amazing data. An A/B test on one page can send reverberations through the rest of the conversion path! Then there’s relevancy. We’ve heard that word so many times in speeches and articles that it’s not really sinking in anymore. I think if you come to WhichTestWon.com and look at a few of the 65+ tests we have in the library, you’ll start to get it.

Let’s talk about cutting-edge practices. What is the future of landing pages?

What surprises me the most when I talk to marketers about landing page testing is that they seem to think there is a one-size-fits-all winner for landing pages. As we well know, not all traffic is equal, and different sources behave differently. Should we be segmenting our landing page testing? If so, how should we be thinking about it? And is there technology out there to help us?

Marketers who want to use a single landing page for everything are a perennial problem, as are marketers who want to use their homepage as their PPC landing page or who send traffic for a specific keyword to a general “category” page. I think these marketers know better, but are hamstrung by budgets and politics.

The CMOs of this world have some work to do on this front. They need to cut down the jungle of problems around getting new landing pages created. They need to enable their teams to build and launch landing pages on the fly… and to test them! This is a problem of internal company politics, nothing more. Cheap and easy technology has been here for more than a decade.

How does it compare to the segmentation done in search and other marketing channels?

In every channel, from search to email, audience segmentation is the golden arrow that can make a tremendous difference in conversions. To pick segments, I always say look at the current customer base. How do they segment out? Any segment that’s more than 10% of sales is probably worthy of its own campaigns, offers, and landing pages tested to appeal to that segment. This is true for B2B and B2C.
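
As a rough illustration of that 10-percent rule of thumb, here is a tiny sketch; the segment names and sales figures are hypothetical.

```python
# Hypothetical annual sales by customer segment. Flag any segment over ~10% of
# sales as a candidate for its own campaigns, offers, and tested landing pages.
sales_by_segment = {"enterprise": 420_000, "smb": 130_000, "education": 95_000, "other": 55_000}

total = sum(sales_by_segment.values())
for segment, sales in sorted(sales_by_segment.items(), key=lambda kv: -kv[1]):
    share = sales / total
    note = "  <- worth its own tested landing pages" if share > 0.10 else ""
    print(f"{segment}: {share:.0%}{note}")
```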

Plus, of course, there are the always popular recency and relevancy segments. What you mail your customers vs. hot prospects vs. not-so-hot prospects, etc. That’s all classic direct marketing stuff.

That’s stuff catalogers were doing in the 1980s and it still applies. The tactics are still powerful. It’s just that a lot of online marketers didn’t grow up in that world so they don’t know this type of marketing science already exists.

Finally, do we need to get to the point where landing pages are dynamically generated based on the user, for the best-converting experience by channel, time of day, keywords, etc.?

Depends on how famous and trusted your brand is, how much traffic you’re driving, etc. There’s no general answer to this. Remember, you need a certain level of conversions per month just to run a conclusive test.

We have algorithms powering so much of our online marketing, yet at this point very little dynamic content appears on landing pages. Do you think this is the future of landing page testing? Does landing page testing technology need to catch up with other online marketing technology?

I know some fairly dinky B2B marketers, folks with small budgets and an entire marketing department of just two people, who have been doing dynamic landing pages with headlines that change based on PPC keyword for five years. This isn’t rocket science. It doesn’t have to be enormously expensive. Again, I think the hold-up is office politics more than anything else.
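
A minimal sketch of that kind of keyword-driven headline swap, assuming the PPC ads append the keyword to the landing page URL as a query parameter; the parameter name, keywords, and headlines here are all hypothetical, not Anne’s examples.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping from a PPC keyword (passed as ?kw= by the ad platform)
# to a landing page headline; the default covers unmapped or direct traffic.
HEADLINES = {
    "crm software": "CRM Software Your Sales Team Will Actually Use",
    "email marketing": "Email Campaigns That Convert, Not Annoy",
}
DEFAULT_HEADLINE = "Marketing Software for Growing Teams"

def headline_for(url: str) -> str:
    """Pick the landing page headline based on the incoming PPC keyword."""
    params = parse_qs(urlparse(url).query)
    keyword = params.get("kw", [""])[0].lower()
    return HEADLINES.get(keyword, DEFAULT_HEADLINE)

print(headline_for("https://example.com/landing?kw=crm+software"))
print(headline_for("https://example.com/landing"))
```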

That’s why I wrote a white paper (pdf) all about how to overcome office politics and get your testing ideas and budget approved by the CEO, the IT department, etc. It’s posted on our site under ‘Free PDFs’.

About Beth Kirsch

You can find Beth on Twitter @bethkirsch

2 Responses to MarketingSherpa Founder Anne Holland Talks Landing Pages and WhichTestWon.com

  1. Glenn Bossik says:

    I was glad to hear that Anne is a former copywriter. In this interview, she shows that ad copy is important.

    Much of today's advertising is thoroughly devoid of ad copy. This is especially true in the affiliate marketing sector, where 99% of all creatives contain only a call to action and no ad copy whatsoever.

    If affiliate networks hope to generate revenue for advertisers, publishers, and themselves, they must pay attention to the written word.

  2. Pat Grady says:

    WhichTestWon.com is such a fun, stimulating and fantastic website!

    If you've ever wondered whether you really want to be an internet marketer as a profession, read Anne's site – if you just can't put it down, you'll know you've found your calling.

    If you keep guessing the results correctly, you'll also know you're well along in developing your gut instincts for what will likely work and what doesn't.

    She's presenting a ton of useful info there, in both an interesting and useful way.

    As someone who's equally prone to criticize as well as compliment, I don't normally go gaga for someone's online creation… but I'm now thinking of her site as Ms. Holland's Opus.