
Google Website Optimizer Case Study: Daily Burn, 20%+ Improvement

This post will show exactly how one start-up improved its homepage conversion rate (visitor to sign-up flow) by more than 20%, then by another 16%, with a few simple changes and Google Website Optimizer.

After reading this, you will know more about split-testing than 90%+ of the consultants who get paid to do it…

There are a few advanced concepts, but don’t be intimidated; just use what you can and ignore the rest.


Along with Founders Fund (Dave McClure), Garrett Camp (CEO, StumbleUpon), and others, I am an investor in Daily Burn, one of the premier diet and exercise tracking sites.

After investing, my first priorities included introducing them to Jamie Siminoff, who taught them how to purchase the DailyBurn domain name (Jamie’s method is described here), and looking at their conversion rates for both the homepage (visitor to sign-up flow) and the sign-up process (sign-up flow to completed sign-up). This post will look at the former, since the latter cannot happen without the former.

The first step was simple: remove paradox of choice issues.

Below is the homepage prior to tweaking. The bottom of the screen (the “fold”) was right around the second user under the running calorie counter.


Offering two options instead of six, for example, can increase sales 300% or more, as seen in the print advertising example of Joe Sugarman from The 4-Hour Workweek. Joe was, at one time, the highest-paid copywriter in the world, and one of his tenets was: fewer options for the consumer.

DailyBurn (DB) was just two founders at that point, so instead of suggesting time-consuming redesigns, I proposed a few cuts of HTML, temporarily eliminating as much as possible that distracted from the most valuable click: the sign-up button.

Here is the homepage after reducing from 25 above-the-fold options to 5 options and raising the media credibility indicators. Note the removal of a horizontal navigation bar. The “fold” now ends just under the “Featured On”:

The results?

Test 1: Original 24.4% → Simplified 29.6% (21.1% observed improvement)

Test 2: Original 18.9% → Simplified 22.7% (19.8% observed improvement)

Conclusion: the simplified design improved conversion by an average of 20.45%.
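
For clarity, those “improvement” figures are relative changes, not percentage-point differences. A quick sketch of the arithmetic in Python:

```python
# Relative improvement = (new rate - old rate) / old rate
def relative_improvement(old_rate, new_rate):
    return (new_rate - old_rate) / old_rate

print(f"Test 1: {relative_improvement(0.244, 0.296):.1%}")  # ~21.3%
print(f"Test 2: {relative_improvement(0.189, 0.227):.1%}")  # ~20.1%
```

(The rounded rates above give ~21.3% and ~20.1%; the reported 21.1% and 19.8% were presumably computed from the unrounded data.)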

To further optimize the homepage, I then introduced them to Trevor Claiborne on the Google Website Optimizer (GWO) team, as I felt DB would make a compelling before-and-after example for the product. Trevor then introduced DB and me to David Booth at one of GWO’s top integration and testing firms, WebShare Design.

Why not just use Google Analytics?

David will address this in some detail at the end of this post, but here are the three benefits that Google Website Optimizer (GWO) offers over Google Analytics (GA):

– GWO offers integrated statistics – it tells you whether new version B is winning by chance or because it is genuinely better

– GWO splits traffic – half of the traffic goes to A and half to B (in an A/B test); it also uses cookies to ensure that a returning visitor sees the same variation

– GWO truly tracks visitors – GA works on the idea of a session (a person bounces around the site for a bit and leaves, which counts as one “session”; if they return, that is generally a new session). GWO uses unique visitors (no matter how many visits they make, they’re counted as one visitor, assuming they don’t delete cookies). On a fundamental level, it’s the difference between visits and visitors. This is critically important for determining whether your result is statistically valid, as ten visits by ten people and ten visits by one person are not the same.

GA can do a lot of what GWO does, but it takes a great deal of custom work and intricate number crunching to get there.
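
To make the “integrated statistics” point concrete, here is a minimal sketch of the kind of test such a tool runs under the hood: a two-proportion z-test on conversion counts. The visitor and conversion counts below are hypothetical, not Daily Burn’s actual numbers.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Is B's conversion rate different from A's by more than chance?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

# Hypothetical: 3,000 visitors per variation
p_a, p_b, z, p = two_proportion_z_test(conv_a=730, n_a=3000, conv_b=810, n_b=3000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

A small p-value (say, below 0.05) means the difference is unlikely to be luck; this is the question GWO answers for you automatically.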

Enter Google Website Optimizer

The following is a report of the WebShare / Gyminee Website Optimizer landing page test, and includes a description of the test that was run as well as analysis of the test results. This report was authored by David Booth, to whom, and to whose team, DB and I owe a debt of gratitude. I’ve included my (Tim’s) notes in brackets [ ]. Don’t be concerned if some of the graphics are hard to read, as the text explains the findings.

1. Test Description

The landing page for this test was:

http://www.gyminee.com

This A/B/C test included three distinct page versions: the original (control) homepage and two variations designed with conversion marketing best practices in mind:

Original (control)

[same as simplified version above]

Variation B

Variation C

2. Test Results and Analysis

During the first run of the experiment, the test saw ~7,500 unique visitors and just under 2,000 conversions over the course of about two weeks. When the experiment concluded, both Variations B and C had outperformed the original version; Variation B in particular left little statistical doubt that it substantially increased the likelihood that a visitor would convert, i.e., sign up for the Gyminee service.


We can see from the analysis of the data that Variation B had a large and significant effect on conversion rate. The winning version outperformed the original by 12.7%, with a statistical confidence level of better than 98%. [This means there is less than a 2% likelihood that these results would arise by chance, which can also be expressed as a p-value of <0.02]

It is interesting to note that Version B, which has neither a “take a tour” button nor a horizontal navigation bar, performed a few percentage points better than the current, more polished design, which offers both.

A follow-up experiment was then launched to provide more data and ensure that these results were repeatable. It was run as an A/B experiment between the original and Variation B for approximately one week, over which time almost 6,000 unique visitors and ~1,400 conversions were recorded.

The results of this follow-up experiment showed that Variation B outperformed the original by 16.2%, with a statistical confidence level of better than 99%.

Further analysis concludes the following:

* The absolute difference in conversion rates between Variation B and the original during the test was 3.7%.

* During the test, Variation B’s conversion rate was 16.17% greater than that of the Original design.

* The p-value used in these calculations was <0.01, corresponding to a confidence level of >99%.

The Bottom Line: The results of this experiment were extremely successful.

To put these test results into plain terms another way: there is a 98% chance that the true difference between the conversion rates of these versions is between 7.8% (1.8% raw) and 24.5% (5.6% raw).
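
[Tim: For the statistically curious, an interval like that comes from a standard confidence interval on the difference between two proportions. A rough sketch, with hypothetical counts in the spirit of the follow-up test:]

```python
from math import sqrt

def diff_in_rates_ci(conv_a, n_a, conv_b, n_b, z=2.326):
    """Wald interval for the raw difference p_b - p_a (z = 2.326 ~ 98% two-sided)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    lo, hi = (p_b - p_a) - z * se, (p_b - p_a) + z * se
    return (lo, hi), (lo / p_a, hi / p_a)  # raw bounds, and bounds relative to A

# Hypothetical counts (~3,000 visitors per arm, as in the follow-up test)
(raw_lo, raw_hi), (rel_lo, rel_hi) = diff_in_rates_ci(690, 3000, 800, 3000)
print(f"raw: {raw_lo:.1%} to {raw_hi:.1%}; relative: {rel_lo:.1%} to {rel_hi:.1%}")
```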

3. Supporting Analysis (A/B/C Test Only)

A Pearson Chi Square test answers the question: “Out of all the combinations, is any one combination better than another?”

The values here tell us that with >95% confidence, at least one variation was statistically better than another. This further validates the conclusions drawn by Google Website Optimizer.
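
[Tim: If you want to run this test yourself, it operates on a simple table of converted vs. not-converted counts per variation. A minimal sketch with hypothetical counts roughly matching the ~7,500 visitors and ~2,000 conversions reported above:]

```python
from scipy.stats import chi2_contingency

# Rows: Original, Variation B, Variation C; columns: converted, not converted.
# Hypothetical counts that roughly match the reported totals.
table = [
    [610, 1890],  # Original
    [720, 1780],  # Variation B
    [670, 1830],  # Variation C
]
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# p < 0.05 => at least one variation differs from another with >95% confidence
```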

Was Version C statistically better than the Original?

At an acceptable level of statistical confidence, it was not. However, had we run this test for a longer period, it is very likely that we would eventually have shown, with >95% statistical confidence, that it was indeed better than the original. The estimated sample size needed to prove this would have been an additional ~21,000 unique visitors (~7,000 for each variation).

The table below shows you the various sample sizes you would need at different confidence levels to show different relative improvements [Tim: this is my favorite table in this analysis]:
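
[Tim: Those sample sizes come from a standard power calculation: given a baseline conversion rate, the relative improvement you want to detect, and the confidence and power you require, you can solve for the visitors needed per variation. A rough sketch using the usual normal approximation; the baseline rate and lifts are hypothetical:]

```python
from math import ceil, sqrt

def visitors_per_variation(p_base, relative_lift, z_alpha=1.96, z_power=0.84):
    """Approximate n per arm for ~95% confidence and ~80% power."""
    p_new = p_base * (1 + relative_lift)
    numerator = (
        z_alpha * sqrt(2 * p_base * (1 - p_base))
        + z_power * sqrt(p_base * (1 - p_base) + p_new * (1 - p_new))
    )
    return ceil((numerator / (p_new - p_base)) ** 2)

# Hypothetical: 25% baseline rate; smaller lifts demand far more traffic
for lift in (0.05, 0.10, 0.20):
    print(f"{lift:.0%} relative lift: ~{visitors_per_variation(0.25, lift):,} per variation")
```

Note how the required sample size explodes as the improvement you are trying to detect shrinks; this is why small tweaks take far longer to validate than big ones.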

Was Version B statistically better than Version C?

We can be approximately 94.1% certain that Version B is also better than Version C. After applying a Bonferroni correction for the test set, we would still be >90% confident that Version B is better than Version C. The p-value for these calculations is 0.059.
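
[Tim: The Bonferroni correction guards against false positives when you make several comparisons at once: you simply divide your significance threshold by the number of comparisons, or, equivalently, multiply each raw p-value by it. A one-liner:]

```python
# With three variations there are three pairwise comparisons (A-B, A-C, B-C).
alpha = 0.05        # desired overall false-positive rate (hypothetical)
num_comparisons = 3
per_test_threshold = alpha / num_comparisons
print(f"each pairwise test must clear p < {per_test_threshold:.4f}")
```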

Recommendations:

As Version C did test well, and we believe it would eventually have proven itself better than the Original, it is very likely that certain elements of Version C resonated with visitors to the Gyminee website.

To continue down this path of testing, we would recommend using the winning Version B as a test page for a multivariate experiment. In this experiment, we would suggest testing certain page elements from Version C in the framework of Version B.

Additionally, as testing covered only the homepage, we strongly recommend testing the form found at:

https://www.dailyburn.com/signup

Many concepts such as calls to action, layout, design, contrast, point of action assurances, forms & error handling, and more could be used to increase the likelihood that a user enters information and submits the form.

Lastly, it may be beneficial to begin running tests where the conversion is measured as the paid upgrade. As this conversion rate is much lower than that of the free sign-up, it should be understood that, all other things being equal, these tests could take significantly longer to run to completion.

Google Website Optimizer vs. Google Analytics – Parting Thoughts

From David Booth, whose team performed and compiled the above:

1) GA has no capability for statistical analysis to compare two groups (and it’s not meant to), but it can collect all the data you would need with the best of them. GWO records data very differently and is not meant as (and should never be used as) an analytics package. It runs the stats for you and tells you when you have a statistically significant difference between variations/combinations, but it is limited to a single goal or test.

2) The real beauty is integrating GWO with GA – this gives you the best of both worlds by letting each tool do what it was built to do. You can use GWO to create the test, split traffic, and crunch the numbers for your primary goal, and you can then pull the data out of GA on anything you have configured and run the numbers in a stats package like JMP or Minitab. A very useful case is an ecommerce purchase: GWO can tell you whether one version/combination was more likely to produce a purchase (binary – they either purchase or they don’t), while GA can record things like revenue, and a different statistical analysis can tell you whether one version was more likely to make you more money.
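
[Tim: As a sketch of that last workflow, here is roughly what the follow-on revenue analysis might look like once per-visitor revenue has been exported; scipy’s Welch t-test stands in for JMP or Minitab, and the revenue figures are made up:]

```python
from scipy.stats import ttest_ind

# Hypothetical per-visitor revenue for two variations (0.0 = no purchase)
revenue_a = [0.0, 0.0, 19.0, 0.0, 0.0, 0.0, 19.0, 0.0, 0.0, 0.0]
revenue_b = [0.0, 19.0, 0.0, 19.0, 0.0, 0.0, 0.0, 19.0, 0.0, 0.0]

# Welch's t-test: does one variation earn more revenue per visitor on average?
t_stat, p_value = ttest_ind(revenue_b, revenue_a, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

In practice, revenue data is heavily zero-inflated, so a nonparametric test (such as Mann-Whitney U) and a much larger sample are usually better choices.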

###

Related and Recommended:

Daily Burn 90-Day Fitness Challenge – Starting August 17th! Lose fat and gain muscle with better data and accountability.

How to Tim Ferriss Your Love Life

