You can make all the tweaks and changes in the world, but how do you know they're the best choice for the site you're working on? Without data to support your hypotheses, it's hard to say. In this week's edition of Whiteboard Friday, Will Critchlow explains a bit about what A/B testing for SEO entails and describes some of the surprising results he's seen that prove you can't always trust your instinct in our industry.

[Whiteboard image: SEO A/B testing]

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to another British Whiteboard Friday. My name is Will Critchlow. I'm the founder and CEO at Distilled. At Distilled, one of the things that we've been working on recently is building an SEO A/B testing platform. It's called the ODN, the Optimization Delivery Network. We're now deployed on a bunch of big sites, and we've been running these SEO A/B tests for a little while. I want to tell you about some of the surprising results that we've seen.

What is SEO A/B testing?


We're going to link to some resources that will show you more about what SEO A/B testing is. But very quickly, the general principle is that you take a site section, so a bunch of pages that have a similar structure and layout and template and so forth, and you split those pages into control and variant, so a group of A pages and a group of B pages.
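To make that split concrete, here is a minimal Python sketch of the bucketing step. This is not the ODN's actual implementation; the hashing scheme, the assign_bucket helper, and the example.com URLs are all assumptions for illustration. The key idea is that the assignment is deterministic, so every page stays in the same group across crawls.

```python
import hashlib

def assign_bucket(url: str) -> str:
    """Deterministically assign a page URL to 'control' or 'variant'.

    Hashing the URL (rather than choosing randomly at request time)
    keeps each page in the same bucket on every run, which matters for
    SEO tests because search engines must always see a stable version
    of any given page.
    """
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

# Example: split a hypothetical product-page section into A and B groups.
pages = [f"https://example.com/products/item-{i}" for i in range(6)]
groups = {"control": [], "variant": []}
for page in pages:
    groups[assign_bucket(page)].append(page)

print(groups)
```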

Then you make the change that you're hypothesizing is going to make a difference just to one of those groups of pages, and you leave the other set unchanged. Then, using your analytics data, you build a forecast of what would have happened to the variant pages if you hadn't made the change, and you compare that forecast to what actually happened. The difference between the two is the measured impact of your change.
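Here is an equally simplified sketch of that measurement step. A real platform would fit a proper forecasting model against the control group's data; the flat pre-period average below is just the simplest possible stand-in, and all of the session numbers are invented purely for illustration.

```python
# Hypothetical daily organic sessions for the variant group.
pre_change = [120, 118, 125, 122, 119, 121, 124]   # before the change
post_change = [131, 128, 135, 130, 133, 129, 136]  # after the change

# Naive counterfactual: assume the variant pages would have kept
# performing at their pre-change average had nothing been changed.
# (Production systems model this against the control group instead.)
baseline = sum(pre_change) / len(pre_change)
forecast = [baseline] * len(post_change)

# Compare the forecast against what actually happened.
daily_lift = [actual - expected for actual, expected in zip(post_change, forecast)]
cumulative_lift = sum(daily_lift)

print(f"Forecast baseline: {baseline:.1f} sessions/day")
print(f"Cumulative lift over {len(post_change)} days: {cumulative_lift:.1f} sessions")
```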
