Google’s new “Content Experiments” testing tool

Last Friday, Google announced it plans to shut down Website Optimizer, its five-year-old standalone testing tool, on August 1st. Simultaneously, Google unveiled “Content Experiments”, a new testing tool that’s fully integrated into Google Analytics.

Because Google Website Optimizer was the only free, web-based landing page testing tool, non-profit organizations relied heavily upon it, and its closure is a significant development. If you’re a current or past user of GWO, be sure to read the final section of this post, which explains exactly how the closure will affect legacy users.

Based on what we know so far, Content Experiments contains some important differences from Optimizer. Below, we’ve outlined the key differences as well as the similarities.

What’s new:

  • Basic A/B testing (or A/B/n testing) is supported but multivariate testing is NOT. Each test page variation must be defined at its own page URL, so you cannot test combinations of elements on a single page as in multivariate testing.
    The only way to test combinations would be to code separate pages that manually combine the creative changes, which is much more time-consuming than the old approach of using a single page with JavaScript to change only the element(s) being tested (see the first sketch after this list).
    Google says it will add multivariate testing back to the tool at some point in the future, but for now those of us who used Optimizer for MVT are out of luck.
  • Tests will be easier to implement. Only the original page will need the experiment script, and the standard Google Analytics tracking code will be used to measure goals and variations (also illustrated in the first sketch after this list).
  • Test page variations are extremely limited. A maximum of 5 variations can be run per test. Compared to GWO, which allowed 8 variables per test and up to 10,000 page variations, this is a big step backwards.
  • Using Google Analytics is mandatory. Unlike with GWO, testers cannot use outside analytics programs because the tool is now accessible only through Google Analytics.
  • Users have a bit more flexibility in test goal selection. Besides a goal page URL, you can select an event goal already created in GA as your testing goal, e.g. email signups (this was not possible in GWO; see the event-goal sketch after this list). However, it’s still not possible to choose important conversion goals such as ecommerce transactions as your test goal.
  • Users can see segmentation data in their tests. If you’ve ever wondered which page variations worked best for specific sources of traffic on your website, the new tool will tell you. This should provide meaningful insight into which audience segment responds best to a particular page design, and shorten the learning curve for customizing the user experience on landing pages to achieve better results.
  • No test winners will be declared until an experiment has run for at least 2 weeks. This is intended to avoid misleading conclusions drawn from short-term data samples.
  • Tests cannot run longer than 3 months; they’ll automatically expire at that time. This change is intended to combat the SEO practice of cloaking, i.e. showing search engines a version of a web page that differs from the version shown to ordinary visitors in order to manipulate the page’s search ranking. It could hamper the ability of non-profits to test strategically important landing pages that receive lighter traffic and conversions (e.g. monthly giving pages), and it will require designing simpler tests for such pages to improve the odds that they reach statistical significance within 3 months.
  • Users are limited to 12 live tests at one time. This limitation will impact power-users, but it’s unlikely to impact non-profits, which typically have the resources to run only a small number of tests at one time.
  • Test traffic will be dynamically allocated, meaning that more visitors will be directed to winning page variations and fewer to losing variations as a test progresses. This feature is intended to limit the damage that losing variations can inflict, and it cannot be disabled (a rough sketch of the idea follows this list).
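To make the single-page versus separate-URL distinction concrete, here is a rough sketch. This is not Google’s actual experiment snippet; the page names, element IDs, and headline copy are invented for illustration. Under the old approach, a single page could swap just the element being tested; the snippet below illustrates that idea in plain JavaScript rather than GWO’s actual section-tagging code.

    <!-- donate.html: one page, only the headline element changes (illustrative only) -->
    <h1 id="ask-headline">Help us save the whales</h1>
    <script type="text/javascript">
      // Placed below the headline so the element already exists when this runs.
      var headlines = [
        "Help us save the whales",
        "Your gift protects whales today",
        "Join 10,000 whale defenders"
      ];
      var pick = Math.floor(Math.random() * headlines.length);
      document.getElementById("ask-headline").innerHTML = headlines[pick];
    </script>

Under Content Experiments, each variation lives at its own URL (for example donate.html, donate-b.html, donate-c.html, all hypothetical names), the experiment script goes only on the original page, and every page carries the standard asynchronous Google Analytics tracking code to measure visits and goals:

    <!-- On donate.html, donate-b.html, donate-c.html (hypothetical URLs) -->
    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // your GA property ID
      _gaq.push(['_trackPageview']);
      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
                 '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>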
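For the event-goal option, the event has to be fired somewhere on the site with GA’s standard event-tracking call so that a goal built on it exists for you to pick. A minimal sketch, assuming an e-mail signup form whose ID, category, and action names are all our own inventions:

    <!-- Placed below the (hypothetical) signup form so the element exists when this runs -->
    <script type="text/javascript">
      document.getElementById('signup-form').onsubmit = function () {
        // Category "Newsletter", action "Signup"; an event goal built on these
        // values in GA could then be chosen as the experiment's test goal.
        _gaq.push(['_trackEvent', 'Newsletter', 'Signup']);
      };
    </script>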
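Google hasn’t spelled out the math behind the dynamic allocation in the announcement, so the snippet below is only a conceptual sketch of the general idea (shifting future traffic toward variations with higher observed conversion rates), not how Content Experiments actually computes its weights; the numbers are invented.

    // Conceptual sketch only: re-weight traffic toward better-performing variations.
    var variations = [
      { name: 'Original',    visitors: 400, conversions: 40 },   // 10.0% conversion (invented data)
      { name: 'Variation B', visitors: 400, conversions: 52 },   // 13.0%
      { name: 'Variation C', visitors: 400, conversions: 31 }    //  7.8%
    ];

    var totalRate = 0;
    variations.forEach(function (v) {
      v.rate = v.conversions / v.visitors;
      totalRate += v.rate;
    });

    // Each variation's share of upcoming traffic is proportional to its observed
    // conversion rate, so losing variations see fewer and fewer visitors over time.
    variations.forEach(function (v) {
      console.log(v.name + ': ' + Math.round(100 * v.rate / totalRate) + '% of upcoming traffic');
    });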

What’s the same:

  • Content Experiments is also free
  • Testers cannot track multiple conversion goals in a single test
  • Testers cannot track revenues by page variation, though custom GA code can be added to accomplish this (see the sketch after this list)
  • Testers cannot set different confidence thresholds (we believe it’s still fixed at 95%) to determine a winning variation
  • The reporting interface is nearly identical to Optimizer’s (though metrics can now be viewed in daily, weekly or monthly intervals)
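The revenue workaround mentioned above is not a documented Content Experiments feature; it simply amounts to tagging each transaction with the variation the donor saw, using GA’s existing custom-variable and e-commerce calls. A sketch, in which the variation label, custom-variable slot, and order details are all placeholders:

    <!-- On the donation confirmation ("thank you") page -->
    <script type="text/javascript">
      var variationSeen = 'Variation B';   // however you identify it, e.g. from a cookie you set
      _gaq.push(['_setCustomVar', 1, 'Test Variation', variationSeen, 2]);    // slot 1, session scope
      _gaq.push(['_addTrans', '1234', 'Donations', '50.00', '', '', '', '', '']);  // order ID, affiliation, total
      _gaq.push(['_addItem', '1234', 'GIFT1', 'One-time gift', '', '50.00', '1']); // SKU, name, category, price, qty
      _gaq.push(['_trackTrans']);
    </script>

Segmenting the e-commerce reports by that custom variable then gives an approximate revenue figure per variation.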

How the change will affect current users

  • Nothing will be migrated from GWO to Content Experiments in GA. Tests currently running in GWO will expire on August 1st, so users need to wind them down by that date or recreate them in the new tool. Current or past GWO users must download reports on all tests or lose that data forever; you have until August 1, 2012 to retrieve historical testing data.

Final Thoughts

The new Content Experiments tool appears targeted at beginning and/or infrequent testers.

While it will be easier to implement than GWO, it takes away important functionality that more experienced users (like Donordigital and its clients) relied upon, such as multivariate testing and the flexibility to use the tool with analytics programs other than GA.

Google says it plans to build more functionality into Content Experiments over time, unlike GWO, which saw no improvements over its five-year run. But for now, its functionality is limited and somewhat disappointing.

If these shortcomings aren’t remedied fairly quickly, we suspect more organizations will begin experimenting with testing solutions that are more robust and flexible while still cost-effective, e.g. Optimizely and Visual Website Optimizer.

Dawn Stoner is Donordigital’s Director of Analytics & Testing and works with clients to help them increase online revenues with web usability best practices and landing page testing. Dawn speaks regularly about testing and optimization at industry conferences and publishes papers highlighting what’s working and not working with our testing clients.

Boost online response by optimizing your donation landing pages

Most organizations devote lots of time and energy to developing clever creative for their online campaigns and e-mails, whether it’s the Tck Tck Tck global warming campaign or the AmeriCares “Send your mother-in-law to Darfur” gift catalog.

But if you’re trying to raise more money online, the first thing you may want to try is persuading more of the people who already click “donate” on your Web site or in your e-mails to actually complete the gift on the donation landing page. Surprisingly, 80% or 90% of the people who reach the donation page typically don’t complete the transaction. And if you can improve your “conversion rate” by 10%, that’s 10% more donors (ka-ching!) without spending a nickel on more e-mails or Care2 names.

Over the last three years, Donordigital has been running landing page optimization tests that have increased conversion rates (the percentage of people who land on a donation page who actually make a donation) by 10%, 20%, or even more for some organizations and some pages. Of course, your mileage may vary, but if you optimize your landing pages, you’re pretty likely to increase your conversions and your revenue, even if you have to pay a consultant to help you.

So what makes a better donation page?  While every organization gets different results on different pages, these are some of the variables that seem to make a difference for many organizations:

  • Show at least some of the form fields on the donation page “above the fold” (what you can see without scrolling).
  • Cut out unnecessary fields, such as title (unless you will really use it) and how-did-you-hear-about-us, as well as ones that donors are reluctant to fill out (phone numbers).
  • Make the button say “Donate,” not “Submit”; make it larger and colored; and don’t include confusing and unnecessary buttons such as “back” or “cancel” (sometimes the default in your software).
  • Provide a clear and compelling “ask” headline (Donate to save the whales!).
  • Show the “secure transaction” symbol from VeriSign or another provider above the fold.

A recent test showed a 28% increase in conversions (over the currently used “control” donation page) on a page that featured the VeriSign secure page logo above the fold, apparently making donors feel more secure about giving to this well-known organization.

You may also want to create different landing pages for visitors from different sources. For example, visitors clicking on the main donate link on your home page may know more about you than some of the visitors coming from a search on Google or Yahoo!. E-mail landing pages may work better if they reference the e-mail message that brought visitors to the donation page, and you can assume those visitors need less information and less reassurance about your organization because they have read the e-mail and are probably signed up for your list. Conversions are naturally higher on most pages in December because year-end giving and tax deductibility motivate many visitors.

So how do you do it?  There are two choices: A/B testing and multivariate testing. In A/B testing, you test your current (“control”) donation page against an alternative, directing half your donation page traffic to the control page and the other half to the page you think will perform better. The more sophisticated technique, multivariate testing (MVT), enables you to test many variables at once. However, it’s more complicated to set up and requires more traffic to the test pages to get statistically valid results. While there are commercial multivariate testing platforms such as Interwoven Optimost, Google offers the excellent (and free) Website Optimizer product, which integrates nicely with the free Google Analytics (which you’re already using, right?).
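If you wanted to run the A/B split by hand rather than through a testing tool, the mechanics boil down to randomly assigning each new visitor to the control or the challenger and remembering that choice so they keep seeing the same page. A bare-bones sketch with made-up page and cookie names follows; Website Optimizer handles this assignment, plus the statistics, for you.

    <!-- On the control page, e.g. donate.html (hypothetical URL) -->
    <script type="text/javascript">
      function readCookie(name) {
        var match = document.cookie.match(new RegExp('(^|; )' + name + '=([^;]*)'));
        return match ? match[2] : null;
      }
      var bucket = readCookie('donate_test');
      if (!bucket) {
        // New visitor: flip a coin and remember the result
        bucket = Math.random() < 0.5 ? 'control' : 'variation';
        document.cookie = 'donate_test=' + bucket + '; path=/';
      }
      if (bucket === 'variation') {
        window.location.replace('/donate-b.html');   // hypothetical challenger URL
      }
    </script>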

Nick Allen is co-founder and chief strategy officer of Donordigital, the online fundraising, marketing, and advertising company. Contact: nick@donordigital.com or phone (510) 473-0366.