
A/B testing is when a website randomly shows users two (or more) different versions of a web page and measures which achieves better results. The measured result might be click-throughs to other pages, newsletter sign-ups, registrations on the site, or purchases.
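
As a minimal sketch of the mechanics (the function names and figures here are my own illustration, not from the article): bucket each visitor into one of two versions, then compare conversion rates once the test has run.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into version A or B."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who took the desired action."""
    return conversions / visitors if visitors else 0.0

# After enough traffic, whichever version converts better "wins".
print(assign_variant("visitor-42"))
print(conversion_rate(120, 1000), conversion_rate(150, 1000))
```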

It is widely considered a big step towards “scientific marketing”: analytics could show what was wrong with a site, but A/B testing let the owner discover what actually worked better.

This article points out that we are in danger of being so pleased with ourselves for inventing A/B testing that we’ve stopped innovating. It proposes a better solution: 20 lines of code that will beat A/B testing every time. This is a bit more technical than most links I post, but should be comprehensible to anyone with an interest in websites – I certainly found it interesting, and it gave me some new ideas!
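
The idea behind that claim is a simple multi-armed bandit: keep showing the version that is converting best, but occasionally try the others. Below is a rough sketch of an epsilon-greedy version of that idea in Python; the dictionaries, the 10% exploration rate, and the toy conversion rates in the simulation are my own assumptions for illustration, not the article’s exact code.

```python
import random

EPSILON = 0.1  # fraction of requests used to explore other versions

shows = {"A": 0, "B": 0}      # how many times each version was displayed
rewards = {"A": 0, "B": 0}    # how many of those displays converted

def choose_version() -> str:
    """Usually show the best-performing version; occasionally explore."""
    if random.random() < EPSILON:
        return random.choice(list(shows))
    return max(shows, key=lambda v: rewards[v] / shows[v] if shows[v] else 0.0)

def record_result(version: str, converted: bool) -> None:
    """Update the running counts after a page view."""
    shows[version] += 1
    if converted:
        rewards[version] += 1

# Toy simulation: version B converts better (8% vs 5%), and the bandit
# gradually shifts most of the traffic towards it without a fixed test period.
for _ in range(10_000):
    v = choose_version()
    record_result(v, random.random() < (0.05 if v == "A" else 0.08))
print(shows, rewards)
```

Unlike a classic A/B test, there is no separate “testing phase”: the page keeps learning, so less traffic is wasted on the losing version.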

