Lately I’ve been playing around with Google’s Website Optimizer a bit more. One of my landing pages gets a good thousand-plus views per day, which gives me a nice platform to split test ideas on. I can run a split test inside a week and get relevant results.
Recently, however, I started to suspect that the autoresponder follow-up series for one of my products was a bit weak, or was contributing to poor conversions. So I decided to use Website Optimizer to split my traffic into two separate autoresponder streams.
The way I did this was very simple: I duplicated the landing page and simply swapped the opt-in code.
You wanna see the results? This is funny:
This is the tricky part with split testing… you gotta split test, because if you don’t, you just have no idea. But even when you do, we often take the results as a pretty sure thing, when in fact they’re not.
If two identical pages can get these sorts of results, then surely two similar pages can show up as winner/loser when in fact the truth is the opposite.
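You can see this effect for yourself without burning any real traffic. Here’s a quick sketch that simulates what I did: two identical pages with the exact same true opt-in rate, each getting half of the clicks. The ~3,700 clicks come from my test; the 25% opt-in rate is just a number I picked for illustration.

```python
import random

random.seed(1)  # fixed seed so the demo is repeatable

def simulate_aa_test(clicks_per_page=1850, true_rate=0.25):
    """Send the same true conversion rate to two 'identical' pages
    and report what each page *appears* to convert at."""
    a = sum(random.random() < true_rate for _ in range(clicks_per_page))
    b = sum(random.random() < true_rate for _ in range(clicks_per_page))
    return a / clicks_per_page, b / clicks_per_page

# Run the "test" five times -- the identical pages rarely tie
for trial in range(5):
    rate_a, rate_b = simulate_aa_test()
    print(f"page A: {rate_a:.1%}   page B: {rate_b:.1%}")
```

Run it a few times and you’ll regularly see the two identical pages a point or two apart, which is exactly the kind of gap that looks like a “winner” if you don’t know better.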
How do you get around this? As you can see, I ran about 3,700 clicks through this particular test and came up with this result. I’ve since realized (duh!) that I need to be measuring sales, not opt-in conversions, so I’m not going to continue this test. But probably the best thing to do is to run the test for a LOT longer than Google tells you you should.
In this particular example, truthfully, I would expect that if I ran 100,000 clicks through it, the results should stabilize. However, you never know…
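“A LOT longer” can actually be put into numbers. Here’s a rough sketch using the standard normal-approximation sample-size formula for comparing two proportions (95% confidence, 80% power). The 25% baseline opt-in rate and the 10% relative lift are my assumptions for illustration, not anything Optimizer reports.

```python
from math import ceil

def sample_size_per_arm(p, lift, z_alpha=1.96, z_beta=0.84):
    """Rough per-page sample size needed to reliably detect a relative
    `lift` over baseline conversion rate `p`, using the standard
    two-proportion normal approximation (~95% confidence, ~80% power)."""
    p2 = p * (1 + lift)          # conversion rate of the "better" page
    pbar = (p + p2) / 2          # pooled rate under the null hypothesis
    delta = abs(p2 - p)          # absolute difference we want to detect
    n = ((z_alpha * (2 * pbar * (1 - pbar)) ** 0.5
          + z_beta * (p * (1 - p) + p2 * (1 - p2)) ** 0.5) ** 2) / delta ** 2
    return ceil(n)

# Clicks needed PER PAGE to detect a 10% relative lift on a 25% baseline
print(sample_size_per_arm(0.25, 0.10))
```

For a modest lift like that, the formula asks for several thousand clicks per page, i.e., well past the ~3,700 total I ran. Smaller lifts blow the requirement up fast, which is why a test that “calls” a winner early is so often just noise.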
Any weird things you’ve run across with Optimizer? Leave a comment below and tell us about it.