In affiliate marketing we’re often trying to find the most effective and targeted way possible to drive traffic to a particular offer. I don’t know about you, but I rarely stumble upon the best way to do that on the first try. Many affiliate marketers simply link directly to the offer; however, this isn’t always possible. Yahoo, for instance, won’t let you do that, meaning if you want to take advantage of that traffic you need to create your own landing page. So how do we get the best landing page possible?
Let me introduce a cool tool by Google: Website Optimizer. This is a great split testing tool (sometimes called A/B testing) for internet marketers. Google handles all the complicated code and statistical analysis; all you need to do is come up with a few versions of the page you’d like to try, and drop a few lines of code onto each page.
You can access Website Optimizer via your Google AdWords account; you’ll see it under the Campaign Management tab, on the far right. You don’t need to use it in conjunction with an AdWords campaign: it can be completely separate.
The concept behind split testing is that you take traffic coming to your site and randomly redirect each visitor to a different version of your landing page. This works best if you keep your different versions to a minimum, hence the term A/B testing – two versions. The whole point is to be as scientific about this as possible. If you remember your scientific method, the idea is that you isolate single variables and measure those alone, while measuring your control.
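To make the random assignment concrete, here’s a minimal sketch of how a split-testing tool might bucket visitors. The variant names and the `assign_variant` helper are hypothetical illustrations, not Website Optimizer’s actual code; hashing a visitor ID keeps the assignment sticky, so a returning visitor always sees the same version:

```python
import hashlib

# Hypothetical page versions; a tool like Website Optimizer manages these for you
VARIANTS = ["original", "combination_1", "combination_2"]

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID so each visitor is always shown the same version."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("visitor-123"))
```

Because the hash is effectively uniform, each version ends up with roughly an equal share of the traffic over time, which is the "sharing the traffic equally" behavior described above.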
Ideally, we’re talking about using traffic all from the same source. For instance, I’m only split testing one of my landing pages on Yahoo traffic. This means that your visitors are all coming from the same place, so you’re measuring results from a somewhat standard crowd. However, this isn’t by any means a make-or-break scenario – you can definitely split test pages getting traffic from multiple sources. The software will take care of sharing the traffic equally.
More importantly, when you make changes to your page, make them incrementally. If you come up with two completely different landing pages, one will definitely outperform the other, but you’ll have no idea why. This can be a good approach initially, as you’re trying to get into the right ballpark, but once you’ve got something that’s working fairly well you’ll want to split test on single variables.
For instance, I recently ran a split test on a landing page for a particular product (gotta keep some secrets, you know ;). Anyways, let’s just say I get more money for homeowner applicants versus renters. So my theory was: why not put a fairly prominent link on the page saying “Homeowners Apply Here,” or something to that effect, to encourage my primary targets onward to the offer?
So I started with my original page, which has just a standard “Apply Online Today!” text link, and added a “Home Owners Apply Here” text link, then created a graphic from some clipart that also said “Home Owners Apply Here.” So I was testing two things: does a Homeowner link perform better, and does a graphic outperform a text link? Therefore, I ended up with three different versions of my landing page, each with a single difference from the others, all while keeping my control page constant (I’d been running the control page for months and months).
In the test results (below) Combination 1 is the text link, and Combination 2 is the graphic link. Here are the results from the split test I’ve just described. If you click the picture, it will open a larger copy in a new window.
There are quite a few important things you can learn from this little table. First off, Website Optimizer is telling me that my trustworthy landing page that I’ve used for months is converting at 34% (plus or minus 6%). So this is my baseline. The longer a split test runs, the better this confidence interval gets. I happened to quit this test early, because I could see where it was heading and I wasn’t happy forfeiting the traffic for the sake of a statistically significant result.
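That “34% plus or minus 6%” readout is just a confidence interval on the conversion rate, and you can reproduce the arithmetic yourself. Here’s a rough sketch using the standard normal approximation; the counts of 82 conversions out of 240 visitors are hypothetical numbers I picked to land near that readout, not my actual data:

```python
import math

def conversion_ci(conversions: int, visitors: int, z: float = 1.96):
    """95% normal-approximation confidence interval for a conversion rate."""
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p, margin

# Hypothetical counts chosen to roughly match a 34% +/- 6% readout
rate, margin = conversion_ci(82, 240)
print(f"{rate:.0%} +/- {margin:.0%}")  # 34% +/- 6%
```

The margin shrinks roughly as one over the square root of the visitor count, which is exactly why letting the test run longer tightens the interval.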
Next you can see that Combination 2 (graphic link) underperformed my original page, but outperformed the text link. I would have to run the test longer to see which one truly worked better, as they are fairly close; however, I’ll save that for another time.
Walking quickly through the table: you have the estimated conversion rate range, shown as percentages with confidence levels, as well as the very cool bar graphic, which shows you at a glance how well or poorly your splits are faring. Next is the chance to beat the original: the probability each split actually has of beating your original page. The chance to beat all shows the probability of each split being #1 overall. Observed improvement is the improvement (or lack thereof) compared to the original. Finally, to keep things grounded in reality, the conversions over visitors column tells you exactly how many results went into each split.
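Google handles the statistics behind the “chance to beat original” column for you, but one common way to estimate that kind of probability (I’m not claiming it’s Website Optimizer’s exact method) is to sample from a Beta posterior for each page and count how often the split comes out ahead:

```python
import random

def chance_to_beat(conv_orig, vis_orig, conv_split, vis_split, samples=20000):
    """Estimate P(split's true rate > original's) by sampling Beta posteriors."""
    wins = 0
    for _ in range(samples):
        p_orig = random.betavariate(conv_orig + 1, vis_orig - conv_orig + 1)
        p_split = random.betavariate(conv_split + 1, vis_split - conv_split + 1)
        if p_split > p_orig:
            wins += 1
    return wins / samples

# Hypothetical counts: original at 82/240, a split at 70/240
print(chance_to_beat(82, 240, 70, 240))
```

With the split trailing the original like this, the estimate comes out well under 50%, which matches the intuition that the split is probably (but not certainly) worse.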
So what did I learn from this split test? Well first off, I realized afterwards that even though I get paid more per homeowner, I still get paid for renters, so why would I want to alienate them? So singling out the homeowner is perhaps a poor idea, and possibly was contributing to the lower conversion rate. Secondly, although the test didn’t run long enough to be truly statistically significant, at first glance it appears that the graphic link was outperforming the text link. This would be a good place to start with another split test – excluding the home owner part. Third, 34% conversion for a landing page isn’t bad, I guess, but it certainly isn’t stellar. Basically this means that I’m throwing away 66% of my advertising dollars before the visitor even sees the actual offer page. That’s kind of horrendous if you think about it.
As a side note, I’m currently split testing a different version of the page where, at least in preliminary results, the splits are outperforming the original by 26% and 55%. The implications of a successful split test can really make a big difference to the bottom line!
If I can get a landing page that converts above 65% (the current forerunner), that means that for the same advertising dollars, 30 percentage points more of my traffic makes it through to the offer – nearly double the number of people! Assuming final conversion remains constant, this should more than triple my profits.
Here’s a super rough example:
If I currently spend $100 in advertising at $1/click, then 35 visitors make it through to the final offer. Assume I get $150 revenue from those 35 visitors (about $4.28/visitor), which leaves $50 profit.
Now, if I spend the same but have a better landing page, 65 visitors go through. At the same $4.28 per visitor, that would give me revenue of about $278, or $178 profit!
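Those numbers are easy to sanity-check in a few lines. This sketch just replays the arithmetic above; `landing_page_profit` is my own throwaway helper, not anything from Website Optimizer:

```python
def landing_page_profit(ad_spend, cost_per_click, conversion_rate, revenue_per_offer_visitor):
    """Profit after ad spend, given how many paid visitors reach the offer page."""
    visitors = ad_spend / cost_per_click
    offer_visitors = visitors * conversion_rate
    return offer_visitors * revenue_per_offer_visitor - ad_spend

revenue_per_visitor = 150 / 35  # about $4.29 per visitor who reaches the offer

print(landing_page_profit(100, 1.00, 0.35, revenue_per_visitor))  # current page: $50
print(landing_page_profit(100, 1.00, 0.65, revenue_per_visitor))  # better page: about $178.57
```

The ad spend is a fixed cost, so every extra visitor the landing page pushes through is nearly pure margin – that’s why a conversion-rate gain moves profit so much more than it moves revenue.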
So you can see the tremendous value of improving your landing page!
Once I have some solid results from this current test, I’ll post them and discuss what I’ve learned. I might even have a valuable tip for you on how you can improve your page the same way!