When I first got into digital marketing [REDACTED] years ago, I was blown away by a very simple concept: A/B testing. It was such a simple idea, but its impact was powerful.
I discovered an easy way to continually increase how much money my campaigns made.
I started A/B testing everything from landing pages to ad copy and right down to email subject lines.
To this day I am still an advocate of constantly testing every part of your funnel with A/B testing (though I realise the headline up there may make you think otherwise!). It's still almost a cheat code for constantly improving results.
I remember when I first got really into A/B testing - I had dreams of automatically testing hundreds of different variations across my websites, all the time, with the system choosing the best variations without me needing to change anything manually.
I actually think there are companies doing that with machine learning these days!
I'm not going to go into detail here on how best to do A/B testing - I have a feeling my audience has a good idea of what I'm talking about. Instead I want to spend some time on the other side of the coin: are there real problems with it?
You might say sure: it can't exactly be good for SEO, and it can be a lot of work - but those are too easy to guess. Let's talk about the stuff you may not have thought of:
Brand can't be measured fast enough
I talk about brand here a lot. I kind of have to because digital marketers tend not to think about it enough - or worse, consider it meaningless.
I talked about brand in depth here; I recommend giving it a good read.
The main issue: brand is both hard to measure and takes a long time to see the results of - both things that A/B testing isn't really compatible with.
You don't want to be making changes that cause a short-term gain in conversion rate but hurt your brand in a way that makes sales more expensive to get in the long run.
And I'm not talking about A/B testing just a landing page - while that's important - something like email subject lines could be hurting your brand if you spend all your time chasing open rates. Which brings me to:
The wrong metrics can lead down a bad path
I am subscribed to a lot of emails, many of which are run by marketers, and I've noticed a horrible trend: they focus on getting email opens.
The issue is that while open rate is important, it's possible that you are hurting your conversion rate or LTV by using these subject line tactics.
While you may get someone to open an email with a subject line like "Emergency, it's important you open this (name)!", it's also a bait-and-switch that just pisses people off.
A/B testing to the wrong metric, in this case open rate, can hurt both your brand, as mentioned above, and your sales and conversion rates.
Think about your other channels - are you making changes to your search ads that could be focusing too much on clicks or conversions before looking at the long game of LTV and margin?
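To make the wrong-metric trap concrete, here's a quick sketch of comparing two subject-line variants on open rate versus revenue per recipient. The subject lines and every number are invented purely for illustration:

```python
# Hypothetical A/B test results for two email subject lines.
# All numbers below are invented purely for illustration.
variants = {
    "A (clickbait: 'Emergency, open this now!')": {"sent": 10_000, "opens": 4_100, "revenue": 1_750.0},
    "B (plain: 'Your March product update')": {"sent": 10_000, "opens": 2_600, "revenue": 3_900.0},
}

for name, v in variants.items():
    open_rate = v["opens"] / v["sent"]           # the vanity metric
    rev_per_recipient = v["revenue"] / v["sent"]  # closer to what you care about
    print(f"{name}: open rate {open_rate:.1%}, revenue/recipient ${rev_per_recipient:.2f}")
```

Judged on open rate alone, the clickbait variant "wins" - but judged on revenue per recipient, the plain variant comes out well ahead. Same test, opposite conclusion, depending on the metric you picked.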
It created clickbait
"Number 6 will surprise you!" <-- a line like that used to be able to entice people to click a link. I'm sure that A/B testing article headlines led to that kind of addition.
The main issue is this: "Clickbait style headlines and listicles are far less effective at generating social engagement than they were." To be frank, people got wise to them and stopped sharing and clicking them. | [Archive.org link in case of linkrot]
This may just be the law of shitty CTRs, as Andrew Chen calls it. | [Archive.org link in case of linkrot] But in general: what if you hurt your brand and worked your butt off making those headlines, only to find they don't work any more and now you can't get any sales at all?
You might be optimising to the wrong customers
Some companies may say they don't have wrong customers: if the customer bought something, they are a good customer. Those companies are wrong.
Some customers are easy to acquire and convert but give you a crappy return. You need to look at the LTV of different customer segments so you can focus on the ones that actually deliver the best return.
Credit card companies don't want customers who sign up, collect all the benefits, and then never use the card again - those ones cost money. Telecoms don't want customers who are always calling customer service for basic issues - they cost money too.
You want the customers that give you the best return. Not only that, you want a regular group of brand advocates who can help you market your brand and lower your cost per acquisition.
While conversion rates or revenue might be increasing thanks to your A/B tests, it's possible you are hurting your long-term goals just for a short-term gain.
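As a rough sketch of what segment-level analysis looks like (the segment names and every figure are made up for illustration), compare LTV against acquisition cost per segment rather than looking at conversion rate alone:

```python
# Hypothetical customer segments. All figures are invented for illustration.
# Fields: (name, cost_to_acquire, conversion_rate, avg_lifetime_value)
segments = [
    ("Deal hunters", 20.0, 0.08, 35.0),    # easy to convert, poor return
    ("Loyal regulars", 60.0, 0.03, 400.0),  # harder to convert, great return
]

for name, cac, conv, ltv in segments:
    ratio = ltv / cac  # how many times the customer pays back their acquisition cost
    print(f"{name}: conversion {conv:.0%}, LTV/CAC {ratio:.2f}x")
```

Optimising purely for conversion rate pushes your tests toward the first segment, even though its lifetime value barely covers what it costs to acquire those customers.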
Remember: A/B tests generally focus on short-term gains
Again, I want to make it clear: I am an advocate for A/B testing (and I'm lumping in multi-variate testing here as well). It is incredibly important and everyone should be doing it. Just make sure you consider the above while you do it!