Google Website Optimizer Case Study: Daily Burn, 20%+ Improvement

This post will show exactly how one start-up improved their homepage conversion rate (visitor to sign-up flow) by more than 20%, then by another 16%, with a few simple changes and Google Website Optimizer.

After reading this, you will know more about split-testing than 90%+ of the consultants who get paid to do it…

There are a few advanced concepts, but don’t be intimidated; just use what you can and ignore the rest.

Along with Founders Fund (Dave McClure), Garrett Camp (CEO, StumbleUpon), and others, I am an investor in Daily Burn, one of the premier diet and exercise tracking sites.

After investing, my first priorities included introducing them to Jamie Siminoff, who taught them how to purchase the domain name for DailyBurn (Jamie’s method is described here), and looking at their conversion rates for the homepage and the sign-up process (sign-up flow to completion of sign-up). This post will look at the former, since the latter cannot happen without the former.

The first step was simple: remove paradox of choice issues.

Below is the homepage prior to tweaking. The bottom of the screen–the “fold”–was right around the second user under the running calorie counter.

Click here for larger version.

Offering two options instead of six, for example, can increase sales 300% or more, as seen in the print advertising example of Joe Sugarman from The 4-Hour Workweek. Joe was, at one time, the highest-paid copywriter in the world, and one of his tenets was: fewer options for the consumer.

DailyBurn (DB) was just two founders at the time of that conversation, so instead of suggesting time-consuming redesigns, I proposed a few cuts of HTML, temporarily eliminating as much as possible that distracted from the most valuable click: the sign-up button.

Here is the homepage after reducing from 25 above-the-fold options to 5 options and raising the media credibility indicators. Note the removal of a horizontal navigation bar. The “fold” now ends just under the “Featured On”:

The results?

Test 1 Conversion Rates: Original (24.4%), Simplified (29.6%), Observed Improvement (21.1%)

Test 2 Conversion Rates: Original (18.9%), Simplified (22.7%), Observed Improvement (19.8%)

Conclusion: Simplified design improved conversion by an average of 20.45%.
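The “Observed Improvement” figures are relative lifts, not percentage-point changes. Here is a minimal sketch of the arithmetic; the small differences from the published 21.1% and 19.8% presumably come from rounding in the displayed rates:

```python
# Relative lift: how much better the simplified page converts,
# expressed as a percentage of the original page's conversion rate.
def relative_lift(control_rate: float, variant_rate: float) -> float:
    return (variant_rate - control_rate) / control_rate * 100

test_1 = relative_lift(0.244, 0.296)   # ~21.3% (post reports 21.1%)
test_2 = relative_lift(0.189, 0.227)   # ~20.1% (post reports 19.8%)

print(f"Test 1 lift: {test_1:.1f}%")
print(f"Test 2 lift: {test_2:.1f}%")
print(f"Average lift: {(test_1 + test_2) / 2:.1f}%")
```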

To further optimize the homepage, I then introduced them to Trevor Claiborne on the Google Website Optimizer (GWO) team, as I felt DB would make a compelling before-and-after example for the product. Trevor then introduced DB and me to David Booth at one of GWO’s top integration and testing firms, WebShare Design.

Why not just use Google Analytics?

David will address this in some detail at the end of this post, but here are the three benefits that Google Website Optimizer (GWO) offers over Google Analytics (GA):

– GWO offers integrated statistics – is the new version B better by chance, or is it genuinely better?

– GWO splits traffic – half of the traffic goes to A, half to B (in an A/B test); it also ensures, using cookies, that a returning visitor will see the same variation

– GWO really tracks visitors – GA works on the idea of a session (a person bounces around on the site for a bit and leaves, which is considered a “session”; if they return, that is generally a new session); GWO uses unique visitors (no matter how many visits, they’re counted as one visitor, assuming they don’t delete cookies). On a fundamental level, it’s the difference between visits and visitors. This is critically important for determining if your result is statistically valid, as ten people and ten visits by one person are not the same.

GA can do a lot of what GWO does, but you need to do a lot of custom work and intricate number crunching to make it work.
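To make “integrated statistics” concrete: deciding whether B beats A is, at bottom, a significance test on two conversion proportions. GWO’s exact internals aren’t spelled out here, so the sketch below uses a standard two-proportion z-test with made-up visitor counts, purely to illustrate the kind of math the tool runs for you:

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """One-sided test: is B's conversion rate higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)             # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, norm.sf(z)                                 # p-value = P(Z > z)

# Counts are made up for illustration; the post reports rates, not raw counts.
z, p = two_proportion_z_test(conv_a=244, n_a=1000, conv_b=296, n_b=1000)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```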

Enter Google Website Optimizer

The following is a report of the WebShare / Gyminee Website Optimizer landing page test, and includes a description of the test that was run as well as analysis of the test results. This report was authored by David Booth, to whom, and to whose team, DB and I owe a debt of gratitude. I’ve included my (Tim’s) notes in brackets [ ]. Don’t be concerned if some of the graphics are hard to read, as the text explains the findings.

1. Test Description

The landing page for this test was identified as:

http://www.gyminee.com

This A/B/C test included three distinct page versions, including the original (control) homepage as well as two variations designed with conversion marketing best practices in mind:

Original (control)

[same as simplified version above]

Variation B

Variation C

2. Test Results and Analysis

During the first run of the experiment, the test saw ~7,500 unique visitors and just under 2,000 conversions over the course of about 2 weeks. When the experiment was concluded, both variations B and C had outperformed the original version, and specifically Version B left little statistical doubt that it had substantially increased the likelihood that a visitor would convert, or sign up for the Gyminee service.

Larger version here.

We can see from the analysis of the data that Variation B had a large and significant effect on improving conversion rate. The winning version outperformed the original by 12.7%, with a statistical confidence level of better than 98%. [This means there is less than a 2% likelihood that you would duplicate these results by chance, which can also be called a p-value of <0.02]

Interesting to note is that the B version, which has neither a “take a tour” button nor a horizontal navigation bar, performed a few percentage points better than their current, more polished design, which offers both.

A follow-up experiment was then launched in order to provide more data and ensure that these results were repeatable. The follow-up experiment was conducted as an A/B experiment between the original and Variation B, and ran for approximately 1 week, over which time almost 6,000 unique visitors and ~1,400 conversions were recorded.

The results of this follow-up experiment showed that Variation B outperformed the original by 16.2%, with a statistical confidence level of better than 99%.

Further analysis concludes the following:

* The absolute difference in conversion rates between Variation B and the original during the test was 3.7%.

* During the test, Variation B’s conversion rate was 16.17% greater than that of the Original design.

* The p-value used in these calculations was <0.01, corresponding to a confidence level of >99%.

The Bottom Line: The results of this experiment were extremely successful.

To put these test results into plain terms another way: there is a 98% chance that the true difference between the conversion rates of these versions is between 7.8% (1.8% raw) and 24.5% (5.6% raw).
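That statement is a confidence interval for the difference between two conversion rates. The raw per-variation counts aren’t published in the report, so the sketch below uses illustrative counts (chosen only to land near the follow-up test’s rates) and the standard Wald interval; it shows the shape of the calculation rather than reproducing WebShare’s exact numbers:

```python
from math import sqrt
from scipy.stats import norm

def conversion_diff_ci(conv_a, n_a, conv_b, n_b, confidence=0.98):
    """Wald confidence interval for the difference in conversion rates (B - A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = norm.ppf(1 - (1 - confidence) / 2)   # two-sided critical value
    diff = p_b - p_a
    return diff - z * se, diff + z * se

# Illustrative counts: ~3,000 visitors per arm, rates near the follow-up test's.
low, high = conversion_diff_ci(648, 3000, 753, 3000)
print(f"98% CI for the raw difference: {low:.3f} to {high:.3f}")
```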

3. Supporting Analysis (A/B/C Test Only)

A Pearson Chi Square test answers the question: “Out of all the combinations, is any one combination better than another?”

The values here tell us that with >95% confidence, at least one variation was statistically better than another. This further validates the conclusions drawn by Google Website Optimizer.
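The same omnibus question can be asked of a conversions-versus-non-conversions contingency table in a few lines of Python. The per-cell counts below are illustrative only; the report gives totals, not this breakdown:

```python
from scipy.stats import chi2_contingency

# Rows: Original, Variation B, Variation C; columns: converted, did not convert.
table = [
    [610, 1890],   # Original (~2,500 visitors)
    [705, 1795],   # Variation B
    [660, 1840],   # Variation C
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value says at least one variation differs from another, not which
# one; pairwise follow-up tests answer that.
```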

Was Version C statistically better than the Original?

At an acceptable level of statistical confidence, it was not. However, had we continued to run this test for a longer time period, it is very likely that we would have eventually proven that it was indeed better than the original with >95% statistical confidence. The estimated sample size needed to prove this would have been an additional ~21,000 unique visitors (~7,000 for each variation).

The table below shows you the various sample sizes you would need at different confidence levels to show different relative improvements [Tim: this is my favorite table in this analysis]:
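For a rough sense of where numbers like these come from, here is a sketch of a standard sample-size approximation for comparing two conversion rates. It is not the formula or tool WebShare used, and the baseline rate, confidence, and power below are illustrative assumptions:

```python
from math import ceil, sqrt
from scipy.stats import norm

def visitors_per_variation(base_rate, relative_lift, confidence=0.95, power=0.80):
    """Rough sample size per variation to detect a given relative lift
    on a baseline conversion rate (two-proportion approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = norm.ppf(1 - (1 - confidence) / 2)   # two-sided
    z_beta = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# e.g. visitors per variation to detect a 10% relative lift on a 25% baseline
print(visitors_per_variation(0.25, 0.10))
```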

Was Version B statistically better than Version C?

We can be approximately 94.1% certain that Version B is also better than Version C. After applying a Bonferroni correction for the test set, we would still be >90% confident that Version B is better than Version C. The p-value for these calculations is 0.059.
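A Bonferroni correction simply holds each pairwise comparison to a stricter bar when several comparisons share one test set. The report doesn’t state exactly how many comparisons were corrected for, so the sketch below shows the generic mechanics rather than reproducing the >90% figure above:

```python
def bonferroni_threshold(family_alpha: float, num_comparisons: int) -> float:
    """Per-comparison significance threshold under a Bonferroni correction."""
    return family_alpha / num_comparisons

# Three pages (Original, B, C) allow up to three pairwise comparisons.
print(f"each pairwise p-value must fall below {bonferroni_threshold(0.05, 3):.4f}")

# Equivalent view: inflate each raw p-value by the number of comparisons
# (capped at 1.0) and compare it against the original alpha.
raw_p = 0.059
print(f"adjusted p-value: {min(raw_p * 3, 1.0):.3f}")
```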

Recommendations:

As Version C did test well, and we believe it would have eventually proven itself better than the Original, it is very likely that certain elements of Version C resonated well with visitors to the Gyminee website.

To continue down this path of testing, we would recommend using the winning Version B as a test page for a multivariate experiment. In this experiment, we would suggest testing certain page elements from Version C in the framework of Version B.

Additionally, as testing only covered the homepage, we would highly suggest performing testing on the form found at:

https://www.dailyburn.com/signup

Many concepts such as calls to action, layout, design, contrast, point of action assurances, forms & error handling, and more could be used to increase the likelihood that a user enters information and submits the form.

Lastly, it may be beneficial to begin running tests where the conversion is measured as the paid upgrade. As this conversion rate is much lower than the free sign-up, it should be understood that, all other things held equal, these tests could take significantly longer to run to completion.

Google Website Optimizer vs. Google Analytics – Parting Thoughts

From David Booth, whose team performed and compiled the above:

1) GA doesn’t have any capability of doing statistical analysis to compare two groups (and it’s not meant to), but it can collect all the data you would need with the best of them. GWO records data very differently and is not meant as (and should never be used as) an analytics package. It runs the stats for you and tells you when you have a statistically significant difference between variations/combinations, but is limited to a single goal or test.

2) The real beauty is to integrate GWO with GA – this gives you the best of both worlds by letting each tool do what it was built to do. You can use GWO to create the test, split traffic, and crunch the numbers for your primary goal, and you can then pull the data out of GA on anything you have configured and run the numbers in a stats package like JMP or Minitab. A very useful case for this is an ecommerce purchase: GWO can tell you if one version / combination was more likely to get an ecommerce purchase (binary – they either purchase or they don’t), while GA data can record things like revenue, and running a different statistical analysis can tell you if one version was more likely to make you more money.
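David mentions running those revenue numbers in a stats package like JMP or Minitab; the same comparison can be sketched in a few lines of Python. The revenue arrays below are invented placeholders for per-visitor revenue exported from GA:

```python
from scipy import stats

# Per-visitor revenue for each variation (zeros for visitors who didn't purchase).
revenue_a = [0, 0, 0, 49.0, 0, 0, 99.0, 0, 0, 0, 0, 49.0]
revenue_b = [0, 49.0, 0, 0, 99.0, 0, 49.0, 0, 0, 49.0, 0, 0]

# Welch's t-test on mean revenue per visitor (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(revenue_a, revenue_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Revenue is heavily skewed by the zeros, so a non-parametric check such as
# Mann-Whitney is a sensible companion test.
u_stat, p_mw = stats.mannwhitneyu(revenue_a, revenue_b, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mw:.3f}")
```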

###

Related and Recommended:

Daily Burn 90-Day Fitness Challenge – Starting August 17th! Lose fat and gain muscle with better data and accountability.

How to Tim Ferriss Your Love Life


121 Comments
CoachDom
14 years ago

Damn, very cool and interesting study.

First time I’ve seen one this detailed on a blog. Thank you Tim 🙂

Emmet Gibney
14 years ago

I feel like I’m back in my management science classes in university, except here we’re learning stuff that’s specifically applicable to my work!

Sam Witteveen
14 years ago

Great post with detail on the art of testing for real results, rather than just for testing’s sake.

Did you go on to use mixpanel or crazyegg for testing this site as well? results?

thanks.

claus
14 years ago

Thank you for the post. Will have a look at GWO and Daily Burn as well. My next project will be tested a little bit more, I think.

Every time here is an inspiration.

David Turnbull
14 years ago

Playing around with GWO is definitely good fun, and every test I’ve done I’ve been quite surprised with the results.

One of the best articles I’ve found on the topic is at http://www.conversion-rate-experts.com/articles/101-google-website-optimizer-tips/

Brian Armstrong
14 years ago

GWO is certainly a valuable tool. Unfortunately it’s only useful for testing static HTML which I’ve often found limiting

To test dynamic content that is generated by code, I still usually have to code up my own solution. Here is an example I did recently on a Ruby on Rails app:

http://www.startbreakingfree.com/1003/results-of-universitytutor-com-price-testing/

If anyone knows a better way to test dynamic content in GWO I’d be curious to hear more.

Thanks,

Brian

Jared Goralnick
14 years ago

This was excellent, Tim and David. We use both GA and GWO at AwayFind but weren’t aware of the depth of statistical analysis that can be drawn from the results. Additionally, that table that you liked will be stapled to our wall during our next launch.

Much appreciated!

Mike
14 years ago

Thanks for posting such an in depth example of GWO. I checked it out after your video about getting web traffic to your blog without killing yourself. I’ve been wanting to use it, but haven’t had any changes to make to the site worth measuring yet.

Tyler
14 years ago

That is fascinating, it can be very difficult for any business, offline or online, to test how changing the product and design affects the consumer. This is a long way from the days that we had to do polls or just go by our gut. I anticipate being able to use this on my own site someday.

Phillip Scott
14 years ago

Ohh fantastic information, I didn’t even know how the confidence levels worked before now :p

I personally use the Conversion Chicken (www.conversionchicken.com) for my testing as it does full proper multivariate testing (where it tests how well variables interact with each other to get much more accurate results) while GWO does not.

But Conversion Chicken costs a little bit of money (only like $40 a month so still much cheaper than hiring consultants), so I guess GWO is fantastic for being free (and if you only want to do normal a/b split testing).

I can’t believe how many people don’t test, it’s like 20 – 50% more sales / signups in a week and takes an hour or so to set up.

To get 20 – 50% more traffic to your site could take anywhere from 3 – 6 months yet people focus on more traffic instead! really odd.

Name
14 years ago

Hello, Tim and David.

A little advanced compared to what we see in the current blogs. Thank you for this excellent article and keep staying ahead of the pack!

Could you give us other case studies, such as a website with a call to action: (pick up the phone?).

Liam McIvor Martin
14 years ago

Thanks for this post Tim. P values and Chi Squares! I thought I was the only one that was this anal about analytics! You simply talked about the stats here, but I’m wondering, simply based on a visual ‘hunch’ did you know option B for instance was going to convert higher? If so what do you think made the difference between them.

I ask this as I had a massive jump in conversions (from around 1.5-4.0) by just changing my buttons from (Order Now) to (Next Step). I of course used multivariate tracking but I already knew I was going to get a massive jump.

Alex
14 years ago

A/B Testing always seemed a little tricky for me to set up. I’m really glad you laid it out here in a clear and intelligent way.

Jose Castro
14 years ago

Awesome, I was waiting for this post ….. I recall you had hinted about this in previous posts. This is a great morning post to read with some coffee. Thanks for the sweet tips.

Jose

Robert
14 years ago

Awesome breakdown. Thanks Tim, this is where you shine. Your focus is on the details, but your method of conveying them has been honed for clear communication. I’ll be “evernoting” this for when I know that next big site I’m working on needs this type of breakdown.

Since you’re close to the matter…personally I wish I could fly to California (assuming the heads of Gyminee are there) and sit down with the CEO of DailyBurn and let them know why I’ve tried to use the site earnestly 3 different times now (once because of a igoogle app, twice on your recommendations on it), and each time go back to dailymile http://www.dailymile.com …. for 6 months consecutive I’ve been able to login into dailymile, log my workout and get out in under 2 minutes. I just can’t do that with DailyBurn. The excel export, tracking, and facebook/twitter connect are nice too. Thoughts?

Ralph
14 years ago

Ah man, Great stuff! I love posts like this!

I really like GA over the analytical tool provided by my hosting company. I have been looking for more in depth analysis lately because I’ve been playing with the layout of my site. I usually track visitors and not visits.

I can’t wait to try it out!!!

Name [see comment rules]
14 years ago

Multivariate testing is very cool, thanks for sharing your experience with this Tim 🙂 If clients were only patient and understanding enough to allow us to do this work for them. For the time being, we try to second guess what the best layout is for the best conversion – we found that adding any form of video and testimonials is a big help, not to mention reducing form fields.

Wayne
14 years ago

Excellent post! Most people get stuck on “traffic”. But traffic is secondary to conversion. The only way to increase conversion is through testing, testing and then testing again.

It doesn’t matter if it’s UGLY as long as it converts.

Eric
14 years ago

geek nitpicking: you might get better click-through rates for geeks if your page works with javascript disabled (ie. using noscript in firefox). I see a big blank box in the middle of the dailyburn front page.

Clifford
14 years ago

This is great. We’re in the process of moving our site from one hosting service to another and we wanted to remove all from the website that wasn’t working and enhance our conversion rate. We just weren’t sure how besides staring at GA.

Thanks so much! This is invaluable!

Eden Jaeger
14 years ago

Thanks for sharing so many details. Loved it!

I’ve been dabbling with affiliate sites lately and have one that is becoming mildly successful. All I’ve put into it so far is guesswork, so I think an A/B test should be my next move and now I know how I’m going to get started. Thanks again.

mike mallory
14 years ago

Great post Tim!!!

This kind of information is a great addition to the info in your book on the same topic… good for the revised edition?

Tobias Schelle
14 years ago

Great case study Tim.

I have waited for this post for some time now coz it helps spread the word of an analytical approach to website design and optimization.

I did a test on our own site where we removed our frontpage entirely so people were directed to our products on the first page. Observed improvement: 27.3%!

Tobias

stuartflatt
14 years ago

Great read and something I will look into when i redesign my blog, but! You started off with original, B and C then you start talking about option A? I assume you actually mean option B?

It needs rewording to make it clear.

Dynasty
14 years ago

Perfect timing w/the post. Look forward to all the comments on this post for more suggestions.

Thanks!

Danon
14 years ago

Very Well Done! Statistical Analysis is the only true way to prove optimization and you guys did it very well. Although complex and possibly time consuming this helped me understand the importance of the little things that a quality design can bring.

Thanks for all the great help!

Matthew
14 years ago

Great post Tim. This is very useful in grasping both the efficiencies and the potential of combining GA and GWO in this manner. I am in the middle of my own case study with various sites and your post has inspired me to look at even more alternative ways of testing conversion.

In time I hope you can post another case study on optimising the conversion funnel as a whole in a multipage environment; even more variables to think of 🙂

All the best,

Matthew

Michael
14 years ago

Tim, this is awesome stuff.

I’ve been looking at this stuff independently for far too long. Thanks for shedding some light on this without us having to go the consultant way.

The link by Dave Turnbull above is a good resource as well.

Mike

Jet Set Life
14 years ago

Hey Tim,

With multiple muses in place split testing has really been a challenge for me. This post was perfect. Thanks so much for your efforts.

Best,

Rob

matt brezina
14 years ago

yo tim – question – did you guys test larger buttons? or different button wording? I’ve found size of button and wording (like the previous commenter) to be the most influential changes you can make to a signup/download/reg flow.

glad to hear you and siminoff are chilling.

-matt

Tim Ferriss
14 years ago
Reply to  matt brezina

Hi Matt,

The wording remained the same, but the size, color, and location were all tested. Interesting to note is that the B version, which does not have a “take a tour” button, is performing a few percentage points better than their current, more attractive design which does offer that option.

Tim

Michael Porfirio Mason
14 years ago

Never heard of Google Website Optimizer before.

Thanks for the tip.

JamesT
14 years ago

Thanks for the post Tim, but I’m actually a bit disappointed. You promised in the beginning that we would know more than 90% of the consultants, and I really think that many consultants could point out that you need to eliminate the clutter on the homepage, without testing anything.

Could you give examples of more sites where the difference in the styling is not so clear, and the results were significant? Maybe because of the location of the buttons, colors, etc.

Tim Ferriss
14 years ago
Reply to  JamesT

Hi James,

Thanks for the comment. The portions that most consultants won’t get are the stats and significance that come later in the post.

Best,

Tim

Tom Nagle
14 years ago

Tim,

Great article, and great example.

We use GA and GWO with our clients, and small improvements in conversion – even as little as a few percent – lead to giant gains when you have thousands of new visitors each month.

Kudos to DailyBurn for 1) the name change, and 2) the redesign.

I think WebShare could benefit from a redesign a la DailyBurn. It’s like they’re trying to put all the items from the menu on a single plate. I am sure they are lights out at what they do; I just found it a little ironic that they are one of “GWO’s top integration and testing firms.”

trackback

[…] Originally posted here:  Google Website Optimizer Case Study: Daily Burn, 20%+ Improvement […]

DaveinHackensack
14 years ago

Interesting post, but if I may request a topic for a future post, it would be about how to get more visitors to come to your website in the first place.

JW
14 years ago

I love GWO because it gives split testing to the masses. The trick though is that your site needs to have a critical mass of visitors, or else it takes a ridiculously long time to get to meaningful data (whether or not it’s statistically significant).

Gabriel
14 years ago

Wow, fascinating. I didn’t know it was possible to so simply set up an experiment that compared two designs!

Given the data you collected, can you really say, “However, had we continued to run this test for a longer time period, it is very likely that we would have eventually proven that [version C] was indeed better than the original with >95% statistical confidence.”?

Why do you think it’s “very likely”? Since it didn’t meet your criteria for statistical significance you established before the test was run, I would think you’d have to hold off judgment. It is tempting to go there, though, with your directionally correct results.

Tim Ferriss
14 years ago
Reply to  Gabriel

Hi Gabriel,

Good question. That’s definitely one for David. I’ll pass it on.

Thanks!

Tim

Matthew Calabresi
14 years ago

Thanks for this post, Tim. This is a great introduction for people who might be less than experts on the finer points of Web design when it comes to monetization/conversion, but it has enough teeth for the seasoned designer as well.

I’d just like to reiterate that conversions are just as – if not more – important than traffic. Once people get to your site, the site has to fulfill its purpose; it can’t be just a visit then leave thing.

Adam
14 years ago

Tim, nice post. Split testing is a must for any site that wants to truly prosper. What’s interesting about this post is it makes me want to split test even the URL before beginning a website. Then, as you did in this test, continue with each element one by one, based on the control and introducing a modified element of that control.

==> In the beginning of the post you mention that you are an investor in the site, and you have also mentioned angel investing a few times before.

Can you send me more information about this and how can others invest in tech sites like this?

thx

Have a great day!

Tim Ferriss
14 years ago
Reply to  Adam

Hi Adam,

To be honest, I wouldn’t recommend angel investing unless you want to tempt losing money. It’s very, very risky and I have an informational advantage being in SF with people who do tech 24/7. I’d suggest reading “E-Boys” to get a feel for how fast-paced but risky this all can be. “Dot-Bomb” is also a good read. These two focus on venture capital, but the hit ratio is similar: low. I haven’t lost an investment yet, but I haven’t recouped any either. Time will tell.

All the best,

Tim

Kijubi
14 years ago

Great post on testing. I believe this is the only way to improve a website’s conversion rate mathematically speaking. Great post!

Lucas
14 years ago

Hey Tim,

Great Article! It has been a while since you did something technical. I always love reading about how you go about the “everyday” sort of stuff that we all do (or should do) for our businesses.

Question: Why did you do away with the “Post Read Time” feature in recent posts?

Knowing you, I am sure there is some great reason related to conversions or readership but I am curious.

Thanks,

Lucas

Tim Ferriss
14 years ago
Reply to  Lucas

Hi Lucas,

LOL… I just forgot!

Pura vida 🙂

Tim

Doc Kane
14 years ago

Tim,

Curious if the photo of Biray was measured at all. . .as a daily user I was noticing her photo bouncing around a bit and notice now she’s been relegated to the bottom of the page. Was the pic ITSELF (position aside) tested at all? Also interested if there was ever a guy’s photo used instead of a gal’s.

By the way, I really like the changes to the site as of late. Particularly the ease of use when it comes to the fav foods section. I was bummed that there was a limit for a few days. . .then whammo!…I was back in action again.

The tool is absolutely fabulous and I put every single one of my recipes in there as well. Keep up the great work. . .look forward to the CEO challenge!

Cheers,

Doc

Busted Keys
14 years ago

thanks for the tips! i haven’t used Google Web Optimizer yet, but i’m wondering if it can be used in conjunction with Google Analytics?

Simon Chan
14 years ago

Great article. It is great timing since I just started using GWO and found it extremely useful; it saved a ton of money by making minor edits and increasing the conversion ratio. Your blog also reminded me of and re-emphasized some of the things that I have learned from my own experience. Thanks for sharing. A great read!

Schmidty
14 years ago

Great post Tim. You have certainly given me a lot of stuff to go away and learn.

I love how you leave nothing to chance and never assume anything. Zero emotion involved.

Have you ever tried counting cards?

Raina Gustafson
14 years ago

I’m a little surprised by how radically different each variation is. It seems very random initially – with wildly different colored buttons (orange, an expected ‘action’ color in B and light gray in C), drastically different layouts, etc. But I do feel that the current homepage is a ‘greater than the sum of its parts’ result, and of course numbers are hard to argue with.

Can you describe what instructions were given to the designer(s) before creating each test version? I’m very interested in how the process begins.

Tim Ferriss
14 years ago

Hi All,

Here are some answers from David, who supervised all the testing:

@Brian Armstrong:

Hi Brian – You can absolutely test dynamic content with GWO – there are a number of different techniques for this. One popular option is to use tokens and values (that are replaced when variation content is rendered client side) for the dynamic portions of the test page(s), and yet another is to render all of the different versions in div containers whose visibility is toggled between. There are some additional and more creative approaches to this as well, but the best test design is dependent upon what you’re trying to accomplish. Here’s a useful guide (PDF) that includes some more information:

http://www.google.com/websiteoptimizer/techieguide

@Jared Goralnick:

Note that the values in the table shown here were specific to this particular test… Here’s a link to the tool you can use to generate a sample size estimation for your unique experiments…happy testing!

http://www.websharedesign.com/conversion-tools/sample-size-estimator-tool

@Name (in response to the call for more case studies around “pick up the phone”):

Here’s a link to a case study from the GWO Blog about testing with phone calls as a goal:

http://websiteoptimizer.blogspot.com/2009/04/landing-page-testing-with-offline.html

@Gabriel:

Hi Gabriel, you’re correct, and the analysis revealed that “at an acceptable level of statistical confidence, it was not” better than the original. However, the more data you collect, the more certain you can be of a difference existing. Some quick analysis reveals that IF this test was left to run long enough for the variations to collect approximately 9,000 more impressions AND the observed conversion rates remained, the p-value would at that point drop below 0.05, which would meet the criteria for judging one to be better than another with >95% confidence. As a different variation had already crossed that threshold, the decision was made to move forward with the follow-up test for that version. It’s always a balance between how certain you want to be and how much data you want to spend time collecting. Hope this helps!

@Busted Keys:

Yep, you can integrate GA and GWO with both A/B and MVT tests. Here’s some more information from Eric Vasilik:

http://www.gwotricks.com/2009/02/poor-mans-gwoanalytics-integration.html

Peter K
14 years ago

great case study! I’ve always wanted to play around with the google site optimizer but have not had the chance. I am motivated now to start experimenting.

Mickel
14 years ago

Thanks for a nice and detailed article!

Coll, Rey
14 years ago

I know the “x” factor you were blogging was design, but what about other web elements, such as video overlays? When a person pops up on your screen and talks directly to you, I’ve seen where this increases conversion rates 50% and up! (see example of a video spokesmodel popping up on fourhourworkweek.com at this example link: http://www.ivotechnology.com/preview/sem_ivey.php?url=http://www.fourhourworkweek.com#did=http://firstimpression.straightedgemarketing.com/)

Also, as you know with relationships, more time with a person develops a stronger affiliation to them. And studies have also shown you are more likely to take a recommendation from a stranger as opposed to someone you know. So when a video overlay spokesmodel pops up on your screen and engages your web site visitor 30 seconds or more as she describes the benefits on your site, you have beat the 5-13 second attention span the average web site visitor has before they jump ship….this has a great effect of their interest to your web site and resulting conversion rates.


Brian Armstrong
14 years ago

Cool thanks for the follow up, will check it out!

Noel Wiggins
14 years ago

What a great case study on how to use google optimizer.

I want to clear my desk of what I had planned for the day and give this a closer look!

Thanks & Regards

Noel

Josh
14 years ago

For someone whose doing everything himself — website design, coding, marketing, advertising, copywriting, etc…

Time is very valuable…

Is it worth my time to learn how to use Google Website Optimizer? I see how useful it will be, but I’m very short on time. Should I put this at the top of my priority list? I believe I should… but just not sure.

trackback

[…] Tim Ferriss: Google Website Optimizer Case Study […]

Steve @ Freedom Education
14 years ago

Hi Tim,

I was absolutely shocked to see the impact on subscriber-conversions when the nav-bar was removed. It makes sense that fewer click-options would provide a higher subscriber rate and I see that you guys moved the nav-bar to the bottom of that web page.

I’m not so sure this strategy would work for a blog where one of the main functions of a blog is to create an easy way for readers to navigate. All that being said you make a good point about limiting the number of options a reader has in order to increase subscriptions. This is something I’ll look at more closely for my next squeeze page.

Thanks for this Tim,

Daniel
14 years ago

Hi Tim and David,

Many thanks for the excellent post. I found the integration between GA and GWO extremely frustrating. I have been running A/B tests for a while on my homepage; the problem is that when the visitor is redirected to the alternate page by the GWO script, in GA the source is recorded as being direct instead of the real source (search engine keywords, for example). Because I do this test on the home page, this means almost 50% of my traffic ends up being tracked as direct in GA!

I know some guys (including Erik mentioned in the link above) have posted some workarounds, but looking around the forums, there are different tweaks and different opinions, and it is difficult for someone like me who is a non-programmer and just wants to use the product and simply copy and paste a piece of code that “just works” w/o having to develop a javascript knowledge…Given that they are both Google products, I find it incredible that the big G does not provide a simple solution to use them in conjunction without messing things up.

Do you guys know of a solution to overcome the problem described above? I have posted several questions on google forums, but the answers are always quite full of technical jargon.

Many thanks !

Trevor Claiborne (Google)
14 years ago

Tim,

Thanks for posting this. DailyBurn and Webshare really did a phenomenal job both with the testing and the thorough write-up. I’m glad to have helped get this going.

Trevor

Tim Ferriss
14 years ago

You rock, Trevor!

Tim

Jordan
14 years ago

I’ve just got into GWO myself; it’s a great tool that everyone in the industry should learn about and useful posts like this can only help spread the knowledge 🙂

Bearing in mind I’m new to this, am I right in believing you can only track a single goal on any site? Many websites I work on don’t have a single goal or conversion page – and optimising a home page for one product will only show the effects on one goal. I’m finding the opportunities for GWO are very limited on non-ecommerce sites 🙁

Nik K
14 years ago

The best thing with a working GA and GWO, IMHO is the ability to see what pages viewers saw in the session depending on the A / B Page they initially stumbled upon.

Beyond looking at conversions, e-commerce, etc. you can start understanding what users are looking for, what they are interested in and how different messages affect their browsing habits.

While GA makes this tedious, it is possible and certainly valuable information. By easily plugging in an advanced filter, you can see how your test affected behavior 🙂

Kenneth Dreyer
14 years ago

Who’s the design team behind this wonderful website? Is it webshare?

Kimmoy
14 years ago

Ok, this might seem like a very simple observation compared to all the stats, but I immediately thought that because the image of a woman was more prominent on the page, that is what helped to increase the conversion rates for Variation C, compared to seeing a much larger chart in Variation B.

We feel more connected by seeing other people’s faces (right brain) vs. the chart (left brain). Yep, I’m reading Daniel Pink’s book, “A whole new mind”, have you read it?

Shareef Defrawi
14 years ago

There goes my idea for an online fitness log! Seriously. Very interesting information though. Just wish it were a little easier to convey the importance of metrics to business owners / website operators.

Matt S
14 years ago

I appreciate hard work being shared, I think I will start tinkering with GWO.

Thanks!

MsJ777
14 years ago

I’m also interested to hear about how the name change will affect the conversion rate. “Gyminee” sounds like something for kids–not for people serious about reaching their fitness goals. I can’t see many men logging into a site with that name or referring it to other men without getting some flak for the “cute” name. (Just out of curiosity, what were the demographics–men vs. women–before the switch, and are you seeing a difference since?)

“Daily Burn” is much easier to remember and share verbally–you’re not going to have to spell it out for anyone. It also has the added bonus of the “Grrr!” factor–it sounds like you’re going to get in there and burn some calories and get down to business with your fitness goals. 🙂

Looking forward to some followup on this one!

vinay
14 years ago

I never heard about google web site optimizer before. You always learn something new.

Jeremiah Smith
14 years ago

Ohhh Tim this is one of my favorite topics. I’m a bit of an analytics junkie and conversion nut myself. I’ve commented here about SEO and analytics before and it was well received, so I felt obligated to share a tool that I consider to be VERY effective for multivariate testing. Where GWO may pull in 8 variations of results in a week, the guys at Vertster use the Taguchi method to pull in 32 variations in the same amount of time, therefore presenting deeper, more effective results. Check them out here: http://www.vertster.com/

Their platform is fully white-labeled if you want to offer it to your clients. I recommend it.

Great post! Looking forward to libro numero dos mi amigo!

Jeremiah

Mike Gilliland
14 years ago

Hi Tim,

I’ve been a huge fan of your book for the last year, and I listen to the audio book version probably once a month. (It would have been pretty cool to have heard you read it yourself, maybe something for the next one!)

Question for you…

I want to get deeper into google analytics, testing, and website optimization, and I do have a product that I’m currently developing based on what I learned from 4HWW, but I’m sort of at a loss of where to start. I’m 23, I have no university degrees, I simply read a lot of books.

Can you suggest any good reads on the subject? Do you know how I might link up with a business mentor, or if there are any programs you can suggest to help out young entrepreneurs? Living in Canada sometimes gives me the impression that I’m s.o.l. for local e-commerce business mentors.

Thanks again for all your work on this blog,

looking forward to the revision of 4HWW.

Mike

trackback

[…] Google Website Optimizer Case Study:  David Booth of WebShare shares a case study on split testing. Specifically, he delves into Google Website Optimizer results that show a split test they ran for the Gyminee homepage, which resulted in a 20% conversion rate increase. For those looking to run tests themselves, Booth includes several actionable takeaways in the post. […]

solon
14 years ago

ff3.5

Your press links on the front page don’t work. Js errors. Your tab changes are jerky/blurry as well.

Hope it helps.

Matt S
14 years ago

MsJ777,

I totally agree with you on the name change – the owners of the company did too! 🙂

Roland
14 years ago

Hey Guys!

I’m interested: what do you miss in GWO?

If you could have Google implement one feature in GWO, what would it be?

thx

Roland

Casey
14 years ago

SHHHHHH! Tim, you’re letting go of some great secrets here!

I’ve used the Optimizer to increase my opt-in rate 40%. It’s huuuuge for me, and it’s all thanks to this simple tool.

Randy
14 years ago

I’m about to launch a site called golfpage411.com. Problem is, even my own brother can’t remember the web address after a couple of weeks.

Reading this article, and James pointing out how important a domain name is, got me thinking. So within 12 hours I tracked down the guy who owns golfpage.com (shorter and easier to remember), emailed and then spoke to him directly, told him my story, negotiated a price of $200, purchased the domain name, and had the nameservers pointed to my website, so now the URL is golfpage.com… all within 12 hours. Prior to reading this article I didn’t even think it would be possible to do something like this, let alone know how to do it.

Unbelievable what a little initiative can do. Thanks for passing along the ideas and the inspiration.

Alfred Hsing
14 years ago

I agree that less is usually more! – Especially when it comes to options. Keep the important options and big calls to action, but reduce the clutter which might distract users from going to those major options.

John Doyle
14 years ago

Wow, this is a really good example. I am going to use the same on my own site based on your findings. Many thanks.

Laurie
14 years ago

Hi Tim.

Great post and great info. Have you ever read Google’s analytics team member Avinash’s blog? (http://www.kaushik.net/avinash/) Great stuff that always addresses testing and “less is more” philosophies.

Morten
14 years ago

Thank you very much. One of the best blogs about google website optimizer.

Thanks from Denmark

Daryle
14 years ago

This article reminded me of my 1st semester in grad school. It was a marketing class, statistical analysis using SPSS 10 and I failed miserably on my presentation.

While “informational”, this article is about as educational as my presentation… the facts are great, but the real question is, why is version B better? That’s the real genius — GA and GWO are just a means to an end, HOW did you get the better response??

“Many concepts such as calls to action, layout, design, contrast, point of action assurances, forms & error handling, and more could be used to increase the likelihood that a user enters information and submits the form.” —— You figure out which concepts made version B better and remove those tables (contingency? no one should care about that, it’s a measuring tool and provides no real insight) and your article would’ve been a lot more educational.

trackback

[…] an example, check out how Tim Ferris of 4 hour workweek used the Google Website Optimizer to improve his […]

trackback

[…] and entrepreneur Tim Ferris had a great post about website optimization that you should definitely read if you’re not familiar with either […]

Claire Jarrett
14 years ago

Extremely interesting tests. I have used the Google website optimiser with some success, but very interesting to see a statistical test.

Mason
14 years ago

At what traffic levels would split-testing be relevant, any thoughts?

ForMason
14 years ago

Hey Mason.

A/B Testing works for up to 8000 uniques per week.

Multiple split testing works from about 8000+ uniques per week.

~Cheerio

Caroline
14 years ago

I’d make a version D with a different model looking less sexual. Run D against any of the others and it will do as well as B does.

Anyone want to wager? I’m building a web site as we speak for data wagers. I hope it’s legal but I’ll ask forgiveness later.

C

Nicole
14 years ago

Great Post. I especially like the step by step format that makes it easy for one to implement.

Tom
14 years ago

Thanks Tim for helping me understand how Google Website Optimizer works in a real life environment. I didn’t realize how powerful it was until I read your article. I will surely give it a try myself.

trackback

[…] writers than I. But we all know that a little A/B testing can go a long way. We’ve seen that a quick/dirty redesign of an already effective looking page can pump conversion by more than 20%. Hell, we’ve seen that a few iterations of Twitter language (leading to “you should follow me […]

Johnyou
14 years ago

Any tips on how to add a merchandise page to my site at Weebly that links to a payment page, so that I can test both and see if anyone is interested in the things I would sell? Also, where is the place to find a drop shipper specifically for pool (8-ball, 9-ball) equipment?

Thanks for any help

John

Olga
13 years ago

Question, Tim…

(yes, hi… lol)

Curious why you used an A/B test and not a multivariate test, as in my opinion it results in much more accurate data…

And yes, Google Analytics is a joke, considering that Google’s resources are behind the tool. The information that is provided is not sufficient to make any SEO decisions, really (from my eagle view lol). Market Samurai did a way better job in this department.

Thank you, captain.

David
13 years ago

Thanks for posting interesting experiments and study.

Cameron Benz
13 years ago

Interesting read now that I’m finally getting around to using analytics. Now I know how my site’s redesign is going to be tested. This project just keeps getting better.

Michael Sweigart
13 years ago

Without tracking, everything risks unknown failure. Analytics and tracking are among the most underutilized tools in marketing, and from Google they are free. Tim, have you ever used heatmap or click-path tracking on your sites?

trackback

[…] Gyminee homepage redesign: 20% increase in conversion rate  […]

HeungJun, Kim
13 years ago

It’s interesting info for my web life. About a year ago, I bought the 4-Hour Workweek book and read it 3 times, but it was difficult to adapt it to my working circumstances, even going step by step. So I decided to start with easy things not in the book, like blogging and building media power. Anyway, this post is helpful for me. I’ll keep following your site. 🙂

Craig
13 years ago

I have been reading more and more on the internet about using Google Website Optimizer to test. I will definitely be trying this out when I can put aside some time. Thanks.

Lisa
13 years ago

Really interesting read. I was recommended this post, and I have to agree these are becoming important skills to have in web development: not just producing a website, but monitoring and improving it! LT

iman
13 years ago

Simplicity in most cases is the key to better conversions. Most tend to think the busier the site looks, the better the upsell potential; that could hold true only if visitors don’t press the back button.

Thank you so much for these great tips. 2011 will be a year of split tests for my clients.

David
13 years ago

I use GA and will try GWO after reading your informative article. Thank you for all the details.

Bros
13 years ago

Hi there

Quite useful content and really interesting. Keep up your good work.

Cheers

Victoria Morgan
12 years ago

Outstanding Effort! The interest level of the visitor is maximized by matching the right visitor, the right place, and the right time. I recently read an interesting post by Codebaby about “Conversions, Not Just Eyeballs” http://codebaby.com/cbBlog/2011/04/20/conversions-not-just-eyeballs/ that I thought you would find interesting.