Is a website finished once the design and development are complete and the site gets launched? Of course not! It doesn’t matter how beautiful your site looks or how well it converts – there are always changes you can make that will improve conversions further and, more importantly, make you more money.

Unlike other facets of digital marketing, conversion rate optimization (CRO) capitalizes on the visitors you already have and can result in some very big, and very quick wins. I can’t emphasize enough how beneficial this practice can be.

Meet CRO and A/B Split Testing

But before we go any further, it’s important to clarify the distinction between CRO and A/B testing. Conversion rate optimization is the whole process of making alterations to a site in order to increase conversions. This can include a variety of different testing methods, as well as data analysis and the development activity needed to implement your tests and make changes.

A/B testing, on the other hand, is a specific method of testing used to help optimize your site for conversions.

It’s really a simple premise. To run an A/B test, you set up two different versions of a page that run concurrently. Once both versions are live, 50% of your visitors are sent to version one, while the other 50% are sent to version two. You then track how the behavior of your visitors differed between the two versions, and use this information to make permanent, positive changes to your website.
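
Under the hood, the 50/50 split is usually done by assigning each visitor to a bucket deterministically, so the same person sees the same version on every visit. Here’s a minimal sketch in Python, assuming visitors carry a stable ID (e.g. from a cookie) – the hashing scheme is illustrative, not any particular tool’s implementation:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-color") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID together with an experiment name gives a
    stable ~50/50 split: the same visitor always lands in the same
    bucket, and different experiments split traffic independently.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    # Treat the hash as a number; even -> A, odd -> B
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always gets the same variant on repeat visits
print(assign_variant("visitor-123"))
```

Because the assignment is a pure function of the visitor ID, you don’t need to store who saw what – the bucket can be recomputed on every request.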

For example, you might run an A/B split test to decide between using green or red call-to-action buttons. For the sake of argument, we’ll say that version one uses green call-to-action buttons and version two uses red call-to-action buttons.

If, when the test is complete, you can see that 6% of visitors clicked the call-to-action buttons on version one, but 12% of visitors clicked the call-to-action buttons on version two, you now know that red call-to-action buttons are more lucrative for your site than green buttons.
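
Before acting on a result like this, it’s worth checking the gap isn’t just noise. A quick sketch of a standard two-proportion z-test in Python (the visitor counts below are hypothetical, chosen only to match the 6% vs. 12% example):

```python
from math import sqrt, erf

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is the difference between two
    conversion rates statistically significant?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 6% of 1,000 visitors vs. 12% of 1,000 visitors (made-up traffic numbers)
z, p = two_proportion_z(60, 1000, 120, 1000)
print(z, p)  # a p-value below 0.05 suggests the difference is real
```

With a thousand visitors per variant, a 6-point gap is overwhelmingly significant; with only fifty visitors each, the same percentages could easily be chance.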

Of course, you probably won’t stop the test there. You might then repeat the test and pit red buttons against say, orange buttons, and so on. For best results, you’ll always want to have at least one test running on your website.

Getting It Right

The premise of an A/B test may be simple, but correctly executing the tests to get the most value possible out of the data is a little trickier. To do this topic justice would require a post of its own, but here are a few quick tips to help get you started on the right path:

  • Run tests for at least a month. The longer you run the test, the more confidence you can have in the accuracy of the results.
  • Never stop monitoring site performance. You’d want to know if you were running a test that had managed to halt all conversions, wouldn’t you?
  • Test at the right time. Don’t run your test during seasonal periods where the results may be skewed by natural changes in consumer behavior.
  • Track the right KPIs. It’s not just about conversions. If, for example, you run a test that involves cutting prices, your conversion rate might increase, but if your overall revenue drops, something’s off that needs fixing.
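
The “run for at least a month” rule of thumb really comes down to sample size. A rough sketch of the standard sample-size formula for comparing two proportions – the baseline rate and target lift here are placeholders, and the z-values correspond to 95% confidence and 80% power:

```python
from math import ceil, sqrt

def visitors_per_variant(baseline, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough visitors needed per variant to detect a relative lift
    over a baseline conversion rate (95% confidence, 80% power)."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# e.g. a 3% baseline conversion rate, hoping to detect a 20% relative lift
print(visitors_per_variant(0.03, 0.20))
```

Divide the result by your daily traffic per variant and you have a realistic test duration – which is often longer than a month for low-traffic sites, and why small lifts take so long to confirm.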

Ready to get started? Here are 50 A/B split tests you can run to help optimize your site to perfection.

Test #1 – The wording of your calls-to-action

You only have a few words to convince your customers to click the “Buy Now” button, so keep your call-to-action copy clear, concise, and persuasive. When split testing, start by changing one word at a time – such a small change can make a huge difference.

Test #2 – The color of your calls-to-action

Despite the color’s association with the word “stop,” tests executed by HubSpot found that a red call-to-action button outdid a green button by 21%. Not too shabby.

However, what worked for them won’t necessarily work for every website out there, so try playing around with a few different colors to find out what works best on your audience.

Test #3 – The size of your calls-to-action

Bigger isn’t always better. A call-to-action button should be large enough to stand out, but if you make it too large, it might look out of place. Or, even worse, it might not look enough like a button, leaving your visitors unsure what to do.

Still, there are no set rules in CRO (if there were, why would we keep testing?), so try increasing or decreasing the size of your calls-to-action until you hit on the best fit for your site.

Bonus tip: Try changing their shape, too.

Test #4 – The number of calls-to-action

Do you only have one call-to-action per page? Why? Try giving your visitors more opportunities to convert by breaking up your page copy with multiple calls-to-action.

Test #5 – The tone of your landing page copy

Will your copy work best if it comes across as professional and authoritative? Casual? Comical? This won’t just affect your conversion rate, but will significantly impact how people perceive your brand as well. This means it’s even more important to find something that’s just right.

Test #6 – The length of your landing page copy

I often hear people say that the best web copy should be short and sweet – but I don’t always agree.

No one can argue with the quality of this copy from Apple:

But this is really pretty great as well:

The copy in the second example works because it’s easy to read. Never be tempted to use industry buzzwords or technical jargon. It won’t make you look big or clever – it’ll just leave your customers confused. The copy also benefits from being all about the customer (notice how it uses words like “you” and “your” rather than “we”).

How do you know what’s going to work for you? Test, test, and test some more.

Test #7 – The size of your landing page copy

Too small and some visitors may struggle to read your copy. Too large and your copy could be taking up valuable real estate that’s better spent on other website elements. Testing to find the right balance is key.

Test #8 – Your headline copy

Your landing page copy is pretty damn important. Your headline copy? Even more so.

The harsh truth is that many of your visitors won’t even get as far as reading your main copy – they’ll decide whether or not to continue based on the images you’ve used, the general feel of your site and your headline copy. Keep testing until you get those vital few words just right.

Test #9 – The images on your landing pages

Are you still using stock images? Take a few minutes to capture some real images of your staff and working environment, and split test using them instead.

The New Jersey and New York moving firm, Harrington Movers, tested this out and saw their conversion rate increase by 45%. Another test on stock photography resulted in an increase of 34.7%.

If you ask me, stock photos have had their day. If you’re looking for a quick win, you won’t go wrong with this test.

Test #10 – Test professional images against casual images

Go one step further with those “real” images and pit professional shots against something a little more relaxed.

Marketer Carl Taylor tried this with his head shot. He replaced this:

With this:

The change is small, but it boosted responses by 76.62%.

Test #11 – The fonts you use

Fonts make a big impression. Unfortunately, this means that using the wrong font means making a big impression – of the wrong kind. If you don’t believe me, just try using Comic Sans and see what happens.

Readability is key here, and the general consensus is that sans serif fonts (those that don’t have “feet”) are easier to read on screen than their serif counterparts.

Arial is a safe bet, but if you want to be a little more ‘adventurous’ with typefaces, then always split test your different choices.

Test #12 – Test benefits-focused sales copy against copy focused on loss aversion

Benefits-focused sales copy is relatively self-explanatory – it works by outlining the benefits the consumer will gain by choosing that product or service. This is a good example of benefits copy in action, from Crazy Egg:

Loss aversion is a little more interesting. It plays on the fact that most of us are programmed to work harder to avoid potential losses than to acquire something new – even when it’s of equal value. For example, we tend to be far more distressed at the prospect of losing $100, than we are motivated by the chance to gain $100.

When writing sales copy that’s based on loss aversion, you need to focus on what will happen if a customer doesn’t buy your product or service. Something like this:

Certainly, you don’t want to go too far into the negative, as nobody responds well to threats. But when done effectively, loss aversion copy can be equally as effective – if not more so – than standard benefits-oriented fare.

Test #13 – Add “USPs” to your product listings in category pages

Your USPs (unique selling points) are short pieces of information that highlight why a customer should choose to buy from you and not your competitor. This might include free delivery, the quality of your products, the speed at which you can deliver, or the warranty you offer.

We often see USPs featured prominently on homepages:

Etsy Homepage

But they frequently disappear further down the sales funnel. Try adding USPs to the product images on your category pages and see if your conversions increase:

Category Page on Woman Within

Test #14 – Add trust signals or increase their prominence

Trust signals are visual snippets of information that reinforce your position as a genuine company. This might be your phone number, customer reviews, or social media icons, or it could be logos from your accepted payment methods or programs like Verisign and Norton.

Basically, it’s things like this:

Chances are you’re already using some of these signifiers, but you could be using more. Split test increasing the number of trust signals you use, and, where possible, their prominence on your site.

Most importantly – and this is a mistake I see a lot – ensure they appear consistently throughout the customer journey. Don’t let them disappear at the checkout – if anything, they’re even more critical at this stage.

Test #15 – Rephrase your email sign-up copy

How enticing is your copy asking visitors to sign up for your newsletter? Ask yourself – and be honest. Would you sign up if you saw your copy on another site?

All too often, I see email sign-up boxes boasting text along the lines of “Join our mailing list” or “Sign up to our newsletter.” Why should I? What’s in it for me? How do I know you’re not just going to spam me?

Tell your visitors why they want to receive your newsletter. Think features and benefits. And make it clear you have no interest in spamming them.

Unless you do. We’ll talk about why that’s a bad idea in another post.

Test #16 – Increase urgency to create a fear of loss

This tactic involves playing on the consumer’s fear of missing out on getting something they want.

A common way we see this technique employed today is to display the number of products available. Some sites take this one step further and show how many people have already bought the product:

Screenshot from eBay

Unfortunately, although the above technique is definitely worth testing, I do question how well it works. Claiming there are limited numbers available is an age-old sales tactic – do some consumers assume the remaining stock figures are fabricated? I think the savvier ones might at this point.

Instead, I really like how Amazon increases urgency. It feels far more genuine, as they display precisely how long a visitor has to order if they want to receive the item by a particular date:

It’s far less threatening (which, remember, we said earlier can backfire), but it still inspires a fear of loss that can boost conversions.

Test #17 – Pitch a rotating carousel against a static image (on your homepage)

Companies (and web designers) love rotating carousels because they mean more real estate for sales messaging. Unfortunately, for many people they’re a source of great frustration.

Don’t automatically assume rotating imagery is a good idea. Try both, and test thoroughly.

Test #18 – Test free delivery against charged delivery

This rarely fails. Consumers like the price they see to be the price they pay, so if they reach the checkout and there are no surprise extra charges, everybody wins.

Digital agency Avantec tried this, split testing sales of the same product over a month – one priced at £10 plus £3 delivery, and one priced at £14 with free delivery. Can you guess the result? Even though the second option was more expensive, it came out on top.

Of course, I’m not expecting you to eat the cost of free delivery. If you’re going to try this split test, make sure to absorb some or all of the cost into your products first.

Test #19 – Trial a special offer and/or a discount

Everybody loves a bargain. If you don’t already regularly promote special offers, try it out.

Test #20 – Implement offers to return customers who didn’t convert the first time around

Convert customers that are sitting on the fence by utilizing cookies to display offers to returning customers only. You may find that it’s much easier to focus your efforts on converting those who are already familiar with your brand than it is to convince new visitors to buy.
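
Mechanically, this is usually a cookie check: first-time visitors get a marker cookie set, and the offer only renders for visitors who arrive with that cookie already present. A framework-agnostic Python sketch – the cookie name and the offer text are made up for illustration:

```python
RETURN_COOKIE = "seen_before"  # hypothetical cookie name

def offer_for(cookies):
    """Decide what (if any) offer to show, and which cookies to set.

    Takes the request's cookies as a dict and returns a pair of
    (offer_text_or_None, cookies_to_set). A real implementation would
    also set an expiry and scope the cookie appropriately.
    """
    if cookies.get(RETURN_COOKIE):
        # Returning visitor who hasn't converted yet: show the incentive
        return "Welcome back! Here's 10% off your first order.", {}
    # First visit: no offer yet, but mark them for next time
    return None, {RETURN_COOKIE: "1"}

first_offer, to_set = offer_for({})                   # first visit: sets marker
return_offer, _ = offer_for({RETURN_COOKIE: "1"})     # return visit: shows offer
```

In practice you’d also clear or ignore the marker once the visitor converts, so paying customers aren’t shown discounts they no longer need.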

Test #21 – Try increasing your prices

Seriously? Yes, I’m dead serious. Did you ever hear the tale of the jewelry store employee who accidentally doubled the price of products that weren’t selling and quickly sold everything?

Consumers like to buy quality, and high prices say “high quality.” Give it a go – if it works and you sell more products for a higher price, that’s a huge win!

Test #22 – Try removing distractions from your sales funnel

Your website needs to include information that anticipates and answers your customers’ questions. But, at the same time, too much information, in too many places, could actually prevent your customers from making a purchase.

Test making the route to purchase as distraction-free as possible by relocating non-essential information to a different page. It worked for MoneYou.

Test #23 – And try removing the “discount code” box from your checkout page

Discount codes are a big win for customers. They were about to buy something anyway, but look – there’s a discount code box. Let’s go Google “retailer’s name”+”discount code” and see if we can save some money.

Studies have shown that 27% of shoppers will abandon their cart to do this.

What happens next? They either find a discount code and get a few dollars off a purchase they were probably going to make anyway, or they get distracted and forget about their purchase altogether.

Get rid of it and see what happens. Chances are you aren’t going to lose out on much – if anything – at all.

Test #24 – Test adding “featured products” to your homepage

Got a product that you’re struggling to sell, or want to sell even more of a super-profitable product? Try highlighting it on your homepage as a “featured product.”

Bonus tip: Play around with the wording too. Try things like “customer favorite,” “recommended product,” or “our top pick.”

Test #25 – Test the performance of a free trial versus a money back guarantee (or neither)

If you sell a service or a product that’s billed on a recurring basis, play around with offering free trials, money-back guarantees, a combination of the two, or nothing at all.

Neil Patel, for example, tried this on his Quicksprout software and found that free trials performed the best overall. In fact, revenue was 15% higher than with the money back guarantee he tried.

Don’t assume you’ll see similar results, though. Always take the time to perform your own tests to account for your unique industry, audience, and offer.

Test #26 – Test the size of the discount you offer annual subscribers

Annual discounts are a win all round – you get more money up-front, and your customers are happy because they’ve secured a nice discount.

However, you don’t want to cut too far into your profits by offering a bigger discount than you need to. Try changing the size of the discount to find a balance between maximizing conversions and profit.

Test #27 – Try adding more pricing options

Psychology and behavioral economics professor Dan Ariely noticed something curious in The Economist’s subscription model. Despite having three options, there were only two price points:

  • Web-only subscription – $59
  • Print-only subscription – $125
  • Web+print subscription – $125

Dan’s question was: why bother with both of the more expensive options?

To find out what was going on, Dan tested what would happen if the middle option was removed.

First, he asked a group of students to choose between all three options:

  • 16% chose the first (web-only) option ($59)
  • 84% chose the third (web+print) option ($125)
  • 0% chose the second (print-only) option ($125)

He then repeated the test, with the middle option removed:

  • 68% chose the web-only option ($59)
  • 32% chose the web+print option ($125)

By including the second option, which – understandably – nobody was interested in, the third option was perceived as far better value than the much cheaper web-only option.

This is clever stuff. If you’re not doing something similar already, give it a go for yourself.

Test #28 – Pitch round prices (e.g. $2) against odd prices (e.g. $1.99)

It’s widely accepted that consumers tend to perceive odd prices as cheaper than their rounded counterparts. That said, other studies have concluded that consumers actually prefer to see round prices.

Whichever pricing structure you decide to employ, it’s worth testing if an alternative structure would work in your favor.

Test #29 – Trial reducing customer choice

Earlier, in test #27, I looked at how cleverly increasing the options you offer customers can significantly boost revenue. That said, it’s also possible to offer too much of a good thing.

Present a customer with too many options and they may become overwhelmed and decide not to purchase anything.

Dutch gift voucher company Pluimen, for example, reworked their homepage to have just one clear option, from this:

To this:

As a result, they saw revenue increase by 19%.

Test #30 – Trial social media icons that include social proof

Social proof plays on the “sheep effect.” If we see someone else doing something, we’re more likely to follow suit.

If your site’s social media icons currently look something like this:

Try switching them for icons that show how many people have already become fans of your pages, like this:

As long as your buttons don’t read “0” followers, other visitors will see these numbers as social proof that your profiles are the place to be.

Test #31 – Test the size and position of your social media icons

That said, there’s little point enhancing your social icons with social proof if you hide them away.

Try moving your social icons to a prominent place on your site (the top header’s usually a good bet), as well as increasing their size for maximum visibility.

Test #32 – Test a single-page checkout versus a multi-stage checkout

Some shoppers find single-page checkouts overwhelming and abandon their carts as a result. Others prefer a single page to a checkout that spans multiple pages.

If you’re noticing unusually high dropout rates at this stage, test a move to a single-page checkout process or to one that’s split over two, three, or four pages. Kalio eCommerce tested this on one of their clients and saw revenue increase by 3.5%.

However, Elastic Path did a similar test and found that the single-page checkout performed better than the alternative, multi-paged option.

Lesson learned: Always test. Just because something works for one site doesn’t necessarily mean it’ll work for you.

Test #33 – Remove (or minimize) navigation bars within the checkout process

The fewer distractions that are presented to a customer who’s ready to buy, the better. However, removing distractions can affect usability, so approach this test with caution.

A good compromise is to try removing the navigation from these all-important pages, while including a clear button that a customer can use if they need to return to the main site.

Test #34 – Add an “impulse buy” to your checkout page

Ever notice how, in the supermarket checkout line, you’re surrounded by cheap and tasty treats? This is called in-queue merchandising, and the idea is to get customers to top up their shopping trip with impulse purchases. The same principle applies online.

Suggest an additional product (or products) to visitors just before they make their purchase and you could drastically increase their total spend.

That said, impulse buys act as a distraction, so there is a risk you could wind up losing the whole sale. What’s the solution? As always, test, test, test!

Test #35 – Reduce the number of boxes on your contact forms

Does your contact form look something like this?

Or is it more like this?

As a general rule, you should be making it as quick and easy as possible for people to get in touch with you. Try cutting your contact forms down to:

  • Name
  • Email address
  • Phone number (if necessary)
  • Comment

Only the name and email address should be required. Let your customers decide whether they want to give you more information than that.

Alternatively, even with a lengthy contact form, you may be able to boost conversions by making some of the boxes optional. Eduhub did this and improved form completion rates by 5.21%.

Test #36 – Trial removing the obligation to register before purchasing

This has always been a frustration of mine. Why should I need to register my details in order to buy something? I can see the benefits to the company – which now has my information to use – but it adds nothing to my experience as a customer.

Luckily, companies are increasingly catching on to the fact that the easier it is to buy, the more money they’ll ultimately make.

If you’re still forcing customers to register before buying, split test a checkout process where people can choose whether or not they want to create an account, or just want to complete their damn order.

Test #37 – Try different 404 pages

I see a lot of businesses overlooking their 404 pages, yet a good one could mean the difference between someone leaving your site and staying to make a purchase. That’s why they’re well worth taking the time to get right.

Here are a few good efforts:

Urban Outfitters

South Park Studios

Suspended Animations

If yours doesn’t stand up against these examples, try something new that’ll really resonate with your visitors.

Test #38 – Experiment with your use of customer testimonials

Customer testimonials can make or break a sale. If they don’t appear in a prominent place on your site, try giving them a starring role and making sure they appear consistently throughout the buying process.

Bonus tip: Try switching around the testimonials you choose to display. You never know which words or phrases will help others convert.

Test #39 – Test pop-up chat boxes versus chat boxes that customers have to click to activate

Live chat can be a really useful addition to your site, but it also serves as a distraction – and if the box keeps popping up after it’s already been clicked away, it can get really annoying.

Split test a chat box that makes itself known automatically against a box that’s clearly available, but only activates if the customer requests it. Doing so will help you find the balance that’s best for your customers.

Test #40 – Test auto-play videos versus videos that visitors have to click to play

This is another potential way to inadvertently annoy your customers – especially if the video auto-plays with sound.

If you’re seeing high exit rates on a page with an auto-play video, test the auto-play version against a video that a customer has to choose to start playing.

Test #41 – Try different video voiceovers

For example, pitch a male voice against a female voice. Or, try a voiceover with a different accent. Interestingly enough, British accents have proven very popular in the US.

Test #42 – Pit informational videos against sales videos

Some consumers respond best to passive information – they want to feel educated about a product without feeling as if they’re being “sold” to.

Other consumers respond to videos that are much more direct – they want to be told that they need this product rather than deciding for themselves.

Try both, and use your split test data to come to a decision.

Test #43 – Trial a horizontal navigation bar versus a vertical navigation bar

For years, left-hand vertical navigation was the norm. Yet times are changing and vertical navigation is fast falling out of favor.

However, despite horizontal navigation becoming increasingly common, some users (generally older generations) do still feel more comfortable with side navigation.

Unless your audience is primarily 40 and below, it may be worth testing both layouts.

Test #44 – Use icons in place of text in your navigation

This one can be a little controversial. From a visual standpoint it can look stunning, but even widely recognized icons can result in a poor user experience and diminished SEO results.

Mezcal Buen Viagje

Adam McFarland tested this and text came out on top, although his results weren’t exactly cut and dried.

Bonus tip: Test using icons alongside text for maximum usability.

Test #45 – Remove your navigation menu

Am I being serious? Yes. It’s a bold decision, but one that can work wonders (it increased conversions for Yuppiechef by 100%). Best of all, the beauty of split-testing means that if it all falls apart, you just stick with the version you love and know works.

Test #46 – Try increasing the size of your product photos

Consumers want to see large, crystal-clear images of the product they’re considering buying – remember that they can’t touch or hold the product before purchasing, so they need the next best thing.

Try increasing the size of your product images, even if that means other content is pushed below the fold. Czech home retailer MALL.CZ tested this exact change and saw a 9% increase in sales.

Test #47 – Go one step further and let your product photo dominate the space above the fold

Online imagery is more important today than it’s ever been – it’s why Pinterest and, more recently, Instagram have seen such incredible success – so why not make your entire product page about the image?

It’s a bold step, but split test case studies have demonstrated positive results. Auction site Skinner increased the size of their product photos by 28% and saw conversions from the page increase by 63%.

Test #48 – Label good value products as “best buys”

Indecisive shoppers often need extra guidance to help them choose a product and, ultimately, convert. Help them out by highlighting which products offer the best value.

Test #49 – Cut superfluous features in favor of speeding up your site

Studies show that the majority of users expect a site to load in less than 2 seconds, and will generally abandon a site that takes more than 3 seconds to load. If your site could be quicker, try removing large files such as videos or carousels.

Bonus tip: Check your site in PageSpeed Insights too. There, you’ll find all the info needed to get your site up to speed.

Test #50 – Try geotargeting

Geotargeting occurs when you serve different content to your visitors, according to their physical locations. This might mean changing languages, currency, imagery, offers, or even available products. Effective geotargeting increases the relevancy of the information that’s delivered to customers, which will almost always make a customer more likely to convert.
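
The mechanics are simple: resolve the visitor’s country (typically from their IP via a geolocation service), then pick localized settings from a lookup table. A minimal sketch of the idea – the table entries and fallback values here are purely illustrative:

```python
# Illustrative per-country overrides and a fallback for everyone else
GEO_SETTINGS = {
    "GB": {"currency": "GBP", "banner": "Free UK delivery"},
    "DE": {"currency": "EUR", "banner": "Kostenloser Versand"},
    "US": {"currency": "USD", "banner": "Free US shipping"},
}
DEFAULT = {"currency": "USD", "banner": "Worldwide shipping available"}

def settings_for(country_code):
    """Return localized settings for a two-letter country code,
    falling back to sensible defaults for unrecognized regions."""
    return GEO_SETTINGS.get(country_code, DEFAULT)

print(settings_for("GB")["currency"])  # GBP
print(settings_for("FR")["banner"])    # falls back to the default banner
```

The fallback matters: a visitor from a country you haven’t localized should still see a coherent page, not an error – so always split test the localized versions against that default, too.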

So there it is – 50 A/B split tests you can perform to get you started on the road to a perfectly optimized website.

But with all these different options, don’t forget about those best practices I listed at the start of this article. If you don’t perform a test for long enough, perform it at the wrong time, or take the data at face value, the results could be meaningless – and that’s not good, especially if you’ve invested a lot of time gathering your data.

Give them a try, then come back to this post to share your results. And of course, if you have any other tests to suggest, I’d love to hear them. Leave a note with your favorite tests in the comments below!

  1. I noticed on your mobile page that you have a banner for email signup. Every couple of seconds, the “Subscribe” button wiggles around. I didn’t catch the banner until the button moved; I bet that feature helped conversions! Some slight animation at key parts of a page can be really helpful. As usual, test test test.

  2. Thanks for all the great ideas. One thing I struggle with is speed of learning vs. signal quality. We have one main funnel we are trying to optimize and with our current traffic it takes about a month to get a result we can have confidence in. Obviously running one test a month is too slow, but on the other hand running multiple tests within the same funnel simultaneously makes it hard to understand which change is having an impact. How have you dealt with this issue?

    1. Peter,

      At WhenIWork.com we have the same issue, as we have a 30-day trial. It takes 60 days to get accurate data. I usually run three-way (A/B/C) tests at once and conduct tests on different traffic sources. For example, I’m always testing something on our AdWords traffic and landing pages, as well as Facebook Ads and our homepage. It’s not an apples-to-apples comparison across all of our traffic, but it helps.

      We also choose our tests carefully and have a simple rating system to help us prioritize what we test and when. Currently we have our next four months of tests already planned out.

  3. Nice post, Sujan. Thanks for the mention too 🙂
    Ref. Test #49: We’ve seen an increase in conversions since speeding up our site – an obvious but often overlooked piece of advice.

  4. Hi
    I run a small business technology and education consultancy in the East Midlands.

    I am looking to grow and build a solid customer base.


  5. Hi Sujan, I discovered your blog today. Also had a look at ContentMarketer.io and your upcoming book. I landed on this page when I was looking for an image on A/B testing.
