Why Bigger Is Better For Variations In Creative Testing
The truth about advertising is that a lot of money is spent on it — and a large portion of that money is spent on inefficient content. One of the root causes of ineffective content is the incorrect implementation of A/B testing, in particular drawing conclusions from small variations in ad creative. Small variations are changes that don't alter the underlying asset in an immediately recognizable way. These variations are often quick and easy to implement, as they don't involve making any major changes to the original ad concept.
While testing similar variations of an ad "works" in the sense that there will always be a nominal winner, the outcome is unlikely to lift performance across the board. What if some of your assumptions are false, and an element you overlooked or didn't believe needed testing makes all the difference? In Meta's campaign budget optimization (CBO) structure, the algorithm automatically distributes budget among all the ad sets and creatives in your campaign. When an ad set contains too many similar ads, Meta will not optimize distribution between them, because the algorithm treats assets with small variations as the "same". For example, if you have two versions of an ad that differ only in a few words or the CTA, Meta will likely place budget in only one of them. As a result, small variations are unlikely to improve overall performance.
Instead of changing just one or two elements, you might be better off thinking big and changing the visual or contextual concept of an ad. Large variations involve introducing prominent new graphic elements or reframing the concept entirely.
While making big changes can feel intimidating and even risky, they may be necessary to find out what converts best for your audience, leading to better, more optimized campaigns. To ensure that your variations are significant, test one variable at a time, and make sure your tests include a control version with no change at all. For example, let's say you want to test whether adding a specific image makes your ad more appealing to your audience. Try creating three versions of your ad:

1. A control version with no image added
2. A version with the first candidate image
3. A version with the second candidate image
This way, you'll see which of the two images helped your ad perform better — or whether neither of them did! In fact, you can go one step further and combine these large strategic variations with multivariate testing of creatives: one ad could feature multiple people, for instance, while another features none. With such significant variations, you can attribute any differences in performance between the ad variations to changes in creative treatment, not just luck. This also plays well with the algorithms: they won't lump ads with large variations together as the "same", helping you discover "champion assets" — the ads Meta places the majority of an ad set's budget into.
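To decide whether the gap between a control and a variant reflects the creative treatment rather than luck, a standard two-proportion z-test on click-through rates can be applied. The sketch below is illustrative only — the numbers are hypothetical and this is not Omneky's actual tooling:

```python
from math import sqrt, erf

def two_proportion_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Compare the CTRs of two ad variants; returns (z, two-sided p-value)."""
    p_a = clicks_a / imps_a  # CTR of variant A (e.g. the control)
    p_b = clicks_b / imps_b  # CTR of variant B (e.g. the large variation)
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Normal approximation of the two-sided p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: the control got 100 clicks on 10,000 impressions,
# the large-variation ad got 150 clicks on 10,000 impressions.
z, p = two_proportion_z_test(100, 10_000, 150, 10_000)
if p < 0.05:
    print(f"Variant wins (z={z:.2f}, p={p:.4f})")
```

If the p-value is below your significance threshold, the difference is unlikely to be noise; if not, neither version "won" and the test itself is the insight.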
Despite their overwhelming benefits, large variations in A/B testing are expensive. The time and resources that go into developing them mean you need to land on a winning design that actually converts. Omneky has put significant effort into building a platform that reduces friction in the A/B testing process, helping us generate optimized ads for hundreds of businesses.
We rely on creative analytics to drive A/B testing variations
At Omneky, we leverage creative insights to guide the ad variations we test across platforms. Specifically, we use computer vision to identify how numerous text and image ad features correlate with conversion metrics. Additionally, Omneky uses AI to determine which ads perform best, highlighting the top-performing ad for every customer on the Analyze tab of our dashboard. Our designers can then take an ad and create variations based on proven best practices and insights from top-performing ads. Armed with these data insights, Omneky develops strategic variations in the ads that we A/B test.

One Omneky customer experienced the power of these data insights firsthand. When analyzing the creative insights dashboard, Omneky discovered that video ads had much higher CTRs (some weeks a >15% increase) and impressions than image ads. Yet only 50% of the client's ad budget was being spent on video ads. Omneky acted on this insight by significantly increasing the number of video ads — and consequently the share of spend on video ads — helping drive CTR and conversions in the following months.

Ultimately, a robust data infrastructure and a tool like creative analytics save countless hours in the creative testing process. Creative analytics helps you narrow all of those competing hypotheses about what needs testing down to the ones that are backed by data.
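The media-type insight described above can be reproduced with a simple aggregation: group performance rows by media type, then compare each type's CTR against its share of spend. A rough sketch — the data, field names, and threshold are hypothetical, not Omneky's actual schema:

```python
def summarize_by_media(rows):
    """Aggregate clicks, impressions, and spend per media type."""
    agg = {}
    for r in rows:
        a = agg.setdefault(r["media"], {"clicks": 0, "impressions": 0, "spend": 0.0})
        a["clicks"] += r["clicks"]
        a["impressions"] += r["impressions"]
        a["spend"] += r["spend"]
    total_spend = sum(a["spend"] for a in agg.values())
    return {
        media: {
            "ctr": a["clicks"] / a["impressions"],
            "spend_share": a["spend"] / total_spend,
        }
        for media, a in agg.items()
    }

# Hypothetical weekly rows: video out-performs image on CTR
# while receiving only half of the total spend.
rows = [
    {"media": "video", "clicks": 300, "impressions": 10_000, "spend": 500.0},
    {"media": "image", "clicks": 150, "impressions": 10_000, "spend": 500.0},
]
summary = summarize_by_media(rows)
worst_ctr = min(s["ctr"] for s in summary.values())
for media, s in summary.items():
    if s["ctr"] > 1.15 * worst_ctr:  # a 15%+ CTR edge, as in the example above
        print(f"Consider shifting spend toward {media} "
              f"(CTR {s['ctr']:.2%}, spend share {s['spend_share']:.0%})")
```

The point is not the specific threshold but the workflow: let aggregated performance data, rather than habit, decide where the next round of creative budget goes.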
We experiment with dozens of ad styles
At the heart of Omneky's A/B testing strategy is experimenting with different ad styles. We take an ad and render it through multiple styles, producing unique variations that give us meaningful feedback we can use for further iteration. Here is an example of how we perform multivariate creative testing for one of our customers, simultaneously altering copy, graphics, and ad styles.

Changing media types allows Omneky to understand an audience's preferences across different ad formats. For example, instead of repeatedly running single-image ads, we might experiment with carousel ads to create a better storytelling experience. We might previously have run numerous testimonial ads because we thought they were best for branding — but experimenting with motion graphics can create a more engaging experience for the customer.

There are dozens of different ad styles, and many aren't difficult to implement. When you're ready to try something new, there's no need to start from scratch: dig into your asset library and experiment with old ad designs by switching up the media type! In today's mobile-first world, it's even more important to experiment with large strategic variations, as digital advertising viewing habits are changing at an unprecedented rate.
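Multivariate testing of this kind — simultaneously varying copy, graphics, and ad style — amounts to enumerating the full grid of combinations so each one can be rendered and measured. A minimal sketch, with made-up option lists for illustration:

```python
from itertools import product

# Hypothetical creative dimensions to vary simultaneously.
copies = ["Save 20% today", "Free shipping on all orders"]
graphics = ["people", "product-only"]
styles = ["single image", "carousel", "motion graphic"]

# Every combination in the Cartesian product becomes one candidate ad variant.
variants = [
    {"copy": c, "graphic": g, "style": s}
    for c, g, s in product(copies, graphics, styles)
]
print(len(variants))  # 2 copies x 2 graphics x 3 styles = 12 variants
```

The grid grows multiplicatively, which is exactly why data-driven pruning of hypotheses (as described in the previous section) matters: you test the combinations your analytics suggest are worth the spend, not all of them.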
Without the proper framework in place for your A/B testing, you are liable to waste precious time and money on ineffective ads and end up disappointed by the results. By making large variations in your ad sets — guided by data-driven insights and relentless experimentation with different media types — you will be able to generate the kinds of actionable insights that move the needle for your business.