A/B testing with clear results

Like me, you've probably heard various voices on the internet speaking of the importance of A/B testing. I've listened, agreed, and tried a few bits here and there, but I've never seen such an obvious result as with my latest AdWords campaign.

You may have noticed I recently launched RememberTheBin.com, a reminder service so you don't forget to put the bin out. Initially I set up a Google AdWords campaign with a single advert, which didn't get much interest at all, so I added a second, very similar ad.

Here are my two ads:

Initially these ads were served equally; fortunately, Google soon realised it wasn't making money from the second one and stopped serving it as often.

The “Free SMS, Twitter and email reminders to put the bin out” ad was my first shot, and am I glad I decided to A/B test it. Talk about a useless ad. No clicks whatsoever: zero, nothing, nada, zilch, and that's over about a month!

It's interesting to note how similar the text of the two ads is and how different the responses are; they target the same keywords, cost the same, and even use many of the same words!

My question to you is this: are you running AdWords, SEO, or specific landing pages? Have you tried A/B testing? If not, go, go now and try! I'll wait for you...

Which brings me nicely on to A/B testing for SEO. That's a lot more difficult and time-consuming, because you have to wait for the search engines to re-index each variation you want them to see. Instead, invest some cash in Google AdWords, run your A/B tests there to see what people click on and what gets served more, then feed that information back into your SEO campaign.

Comments (1)

  • Alastair Smith

    2/23/2010 1:52:00 AM

    Interesting post, thanks for sharing; your results are quite impressive!  

    The key to A/B testing, like any other statistics-based trial, is in the significance of the results, not the results themselves.  Clearly, as you've gone from 0 to n clicks, you have a highly significant result, but this is most often not the case.  You need to decide on your significance level, calculate the percentage change, and compare the two (see the sketch after this comment).  You also need to work out how to control for random elements: for example, how do you know the extra clicks on your new ad weren't obtained simply because it was a new ad?  The following post from 20bits.com seems to be a good guide to completing a thorough A/B test.  20bits.com/.../

    There's been a lot of noise recently about A/B testing via Carsonified/Think Vitamin et al., and most of it seems to skip over the statistical element of the tests, so I think much of the hype (as ever) needs to be taken with a large pinch of salt.  It's not as simple to do properly as the hype suggests.  It has its uses, but it's not the panacea that some people and organisations seem to think it is.

    Alastair
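
As a rough illustration of the significance calculation Alastair describes, here is a minimal Python sketch of a two-proportion z-test comparing the click-through rates of two ads. The impression and click counts below are made-up numbers for illustration, not figures from the post.

```python
import math

def two_proportion_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Return the z statistic and two-sided p-value for the difference
    between two click-through rates."""
    ctr_a = clicks_a / impressions_a
    ctr_b = clicks_b / impressions_b
    # Pooled CTR under the null hypothesis that both ads perform equally.
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(pooled * (1 - pooled)
                   * (1 / impressions_a + 1 / impressions_b))
    z = (ctr_a - ctr_b) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: ad A got 30 clicks from 2000 impressions,
# ad B got 2 clicks from 2000 impressions.
z, p = two_proportion_z_test(30, 2000, 2, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```

With counts like these the p-value is far below 0.05, so the difference is very unlikely to be chance; with only a handful of clicks on each ad, the same test would usually tell you to keep collecting data before declaring a winner.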
