How to set up a creative testing campaign

You didn’t hear it here first: content is king. 

Your floss startup might have the best dental-hygiene strategy, the most advanced technology (floss is crazy these days), and the hottest celebrity smile endorsements, but if your ad creative sucks then your CPAs will too. 

To scale your Paid Social, you need to know if it was the headline, “This floss saved my marriage!” or the authoritative voice of the dentist that made your CPA drop last week. 


We consistently launch dedicated and controlled creative tests to fuel our clients’ growth, and in case you missed our webinar, here is our long-awaited step-by-step guide on Creative Testing. 


Rules to play by

First, let’s establish a few ground rules to ensure a successful creative test.  

  1. The creative should be the only variable; everything else, including the on-site experience, stays constant. You need to be sure that the creative is responsible for the difference in performance, not the copy, the CTA, or the landing page.
  2. Run the creative test in a separate campaign to ensure a controlled environment. If new creatives are added to the evergreen (always on) campaigns, it may disrupt the learnings and result in a performance dip.
  3. The creative testing campaign is an exact copy of the main acquisition campaign/ad set. If the main spending campaign is a Conversions - Purchase campaign targeting a lookalike of past purchasers, the creative testing campaign will be too. The only differences should be:
    - The budget
    - The one ad per ad set structure (see Rule #4) 
    - The duration of the creative testing campaign
  4. Utilize a one ad per ad set structure. Forcing an equal (or almost equal) budget across all ads means the platform can't pick favorites before there's any data; every ad gets an opportunity to reach significance.
  5. Use a benchmark ad, whenever possible. A benchmark is an ad we pull from the main campaign, that is either the best performer, or the most average performer (more below).
  6. Don't run massive rounds. Keep the number of creatives tested between 5 and 7 (including the benchmark).
  7. Run the test for at least 7 days non-stop. This covers good and bad days in the week for the account, reducing the chance for misinterpreted data. 
  8. Don’t turn it off, even if performance looks poor. We learn almost as much from ads that did not perform well as we do from ads that did. Don’t get played by delayed attribution, misattribution, or “bad days” (see Rule #7).
  9. Wait 2-3 days after the test ends before analyzing the data. Allow for delayed data to come in before you decide on winners and losers. 
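A few of these rules are mechanical enough to double as a pre-flight check before you launch. Here is a minimal Python sketch of that idea; the function and argument names are our own illustration and nothing here touches an ads platform:

```python
# Hypothetical pre-flight check for a creative test plan.
# The thresholds mirror Rules #4, #6, and #7 above; names are illustrative only.

def check_test_plan(num_creatives: int, ads_per_ad_set: int, duration_days: int) -> list[str]:
    """Return a list of rule violations (an empty list means you're clear to launch)."""
    problems = []
    if ads_per_ad_set != 1:
        problems.append("Rule #4: use a one-ad-per-ad-set structure")
    if not 5 <= num_creatives <= 7:
        problems.append("Rule #6: keep the round between 5 and 7 creatives, benchmark included")
    if duration_days < 7:
        problems.append("Rule #7: run the test for at least 7 days non-stop")
    return problems


print(check_test_plan(num_creatives=6, ads_per_ad_set=1, duration_days=7))   # [] -> good to go
print(check_test_plan(num_creatives=9, ads_per_ad_set=3, duration_days=5))   # three violations
```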


Campaign setup

Now that we’ve got the theory covered, let’s talk logistics. How exactly does one set up a creative testing campaign? In this example, we'll assume the account is an average eCommerce account, where the flow is Link Click ➝ Landing Page View ➝ Add to Cart ➝ Checkout ➝ Purchase.

The main acquisition campaign/ad set is optimized for Purchases, and the audience is 100% Broad (both genders, 18-65+, no interests or audiences targeted, all platforms, all devices).


Step 1: Select your benchmark creative 

There are two ways to determine your benchmark creative: 

  1.  It is the best performer in your main campaign (i.e. lowest CPA). 
  2.  It is the most average performer in the acquisition campaign. 

We generally prefer the “most average performer” method, as the bar a new ad has to clear is lower. In theory, a “lowest CPA” benchmark would be the stricter and better option; in practice, we often see a new ad win with an even lower CPA, only to fall short of that performance once it graduates to the main campaign. The “average performer” method leans on more historical data, so its winners have a better chance of succeeding in the main campaign.
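If you prefer to pull the benchmark programmatically, here is a minimal sketch of both methods; the ad names and CPA figures below are made up purely for illustration:

```python
# Illustrative only: choose a benchmark ad from the main campaign by CPA.
# "Best performer" = lowest CPA; "most average performer" = CPA closest to the median.
from statistics import median

main_campaign_cpas = {            # hypothetical ad -> CPA ($) from the main acquisition campaign
    "dentist_voiceover": 42.0,
    "saved_my_marriage_headline": 35.0,
    "ugc_unboxing": 55.0,
    "before_after_static": 47.0,
    "founder_story": 61.0,
}

best_performer = min(main_campaign_cpas, key=main_campaign_cpas.get)

median_cpa = median(main_campaign_cpas.values())   # 47.0 with the numbers above
most_average = min(main_campaign_cpas, key=lambda ad: abs(main_campaign_cpas[ad] - median_cpa))

print("best performer benchmark:", best_performer)   # saved_my_marriage_headline (lowest CPA)
print("most average benchmark:", most_average)       # before_after_static (closest to the median)
```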


Step 2: Create your creative 

This step is a blog post in itself, but for the sake of time here, just keep a few things in mind: 

  • To get results that actually mean something, think about what hypotheses you are testing. Maybe you want to see whether a negative hook performs better than a positive one, which content creator resonates more (i.e. the same script with different actors), or which background color is most appealing.
  • You can test more than one variable at a time, but we recommend no more than 2-3 differing elements so as not to muddy the waters too much.

Step 3: Set up the campaign

  • First, duplicate the main ad set into a new campaign. This way all the settings are moved to the new creative testing campaign to ensure consistency. 
  • Duplicate the ad set in the new creative testing campaign enough times to have one ad set for each creative you want to test. Once the structure is set, you’ll have one ad set per creative, all sitting under the testing campaign, roughly as sketched below.
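A rough outline of the finished structure (every name here is a placeholder, purely for illustration):

```python
# Placeholder outline of the creative testing campaign (Rules #3 and #4):
# an exact copy of the main acquisition ad set, duplicated once per creative,
# with exactly one ad inside each ad set.
creative_testing_campaign = {
    "name": "Creative Test - Round 12",        # hypothetical naming convention
    "objective": "Conversions - Purchase",      # same objective as the main campaign
    "ad_sets": [
        {"name": "CT-R12 / Ad 1 (benchmark)", "ads": ["benchmark_ad"]},
        {"name": "CT-R12 / Ad 2",             "ads": ["new_creative_1"]},
        {"name": "CT-R12 / Ad 3",             "ads": ["new_creative_2"]},
        {"name": "CT-R12 / Ad 4",             "ads": ["new_creative_3"]},
        {"name": "CT-R12 / Ad 5",             "ads": ["new_creative_4"]},
    ],
}
```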


Step 4: Upload creatives 

Upload each of your baby ads into its own ad set.


Step 5: Set budgets 

If using CBO: 

  • Set a minimum spend limit at the ad set level to 20% of the campaign budget. That way the platform spends roughly equally on all ads each day.

If using ABO: 

  • Divide the budget by the number of ad sets (testing variables) and make sure all of them get the same daily budget (both options are sketched below).
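Here is a minimal sketch of the arithmetic behind both options; the function names are ours, and the 20% floor simply reflects an even split across five ad sets:

```python
# Illustrative budget math for the creative test (not an ad platform API call).
# CBO: one campaign budget with a minimum spend floor per ad set.
# ABO: the daily budget is set directly on each ad set, split evenly.

def cbo_min_spend(campaign_daily_budget: float, num_ad_sets: int) -> float:
    """Minimum daily spend to set on each ad set: an equal share of the campaign budget.
    With 5 ad sets this is the 20% floor mentioned above."""
    return campaign_daily_budget / num_ad_sets


def abo_ad_set_budget(total_daily_budget: float, num_ad_sets: int) -> float:
    """Daily budget to set directly on each ad set."""
    return total_daily_budget / num_ad_sets


print(cbo_min_spend(500, 5))       # 100.0 per ad set per day (20% of the campaign budget)
print(abo_ad_set_budget(500, 5))   # 100.0 per ad set per day
```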


Step 6: Press play 

Double check that your settings are the same as your main campaign, that each ad is in its own ad set, and that the budget is defined appropriately. Now launch that sucker! 


Step 7: Wait one week 

Don’t even think about touching the campaign with your grubby little fingers for a full seven days (168 hours). You can set an end date if you like automation (or don’t trust yourself), or pause the campaign/ad sets once the week is over.


Step 8: Let things settle 

Wait another 2-3 days before analyzing the data. Delayed conversions are a thing, so pick up a hobby if you must. Knitting usually calms the nerves. 


Step 9: Analyze the data

Use the benchmark to determine which creatives are winners and losers. 

  • If you selected the “best performer” method, all ads with performance SIMILAR to the benchmark are winners.
  • If you selected the “most average performer” method, all ads with performance BETTER than the benchmark are winners.

When looking at the data, use the main optimization event as the primary indicator. When deeper analysis is needed, bring in a few "supporting" events, such as upper-funnel metrics or CPC.
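If you like to script the comparison, here is a minimal sketch of the benchmark logic; the CPA numbers are invented, and the 10% "similar" tolerance is an assumption of ours rather than a hard rule:

```python
# Minimal sketch: classify test ads against the benchmark by CPA on the main optimization event.
# The 10% "similar" tolerance is an illustrative assumption, not a fixed rule.

def find_winners(test_cpas: dict[str, float], benchmark_cpa: float,
                 method: str = "average", tolerance: float = 0.10) -> list[str]:
    """method='best': winners perform similarly to (or better than) the benchmark.
    method='average': winners must beat the benchmark outright."""
    winners = []
    for ad, cpa in test_cpas.items():
        if method == "best" and cpa <= benchmark_cpa * (1 + tolerance):
            winners.append(ad)
        elif method == "average" and cpa < benchmark_cpa:
            winners.append(ad)
    return winners


test_round = {"hook_negative": 38.0, "hook_positive": 52.0, "creator_b": 44.0}  # hypothetical CPAs
print(find_winners(test_round, benchmark_cpa=45.0, method="average"))
# ['hook_negative', 'creator_b']
```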


Step 10: Extrapolate learnings

Congratulations stunner, you’ve got yourself some solid creative data to learn from! Even if your test did not produce obvious winners, you can still learn a lot from your “losing” creatives.

  • Separate your creatives into “top performers” and “bottom performers.” What do “top performers” have that the “bottom performers” don’t? Did you prove or disprove any of your hypotheses?


Step 11: Graduate the winners 

Move those glorious winners to the main acquisition campaign where they can take on bigger and better things.


Step 12: Rinse & repeat

Use your learnings to keep learning. Test new hypotheses. Make iterations of winning ads and see if you can beat them. Now that you’ve got some momentum, you’re unstoppable.

  • Wondering how often you should run creative tests? Check the FAQ below.

Frequently Asked Questions 

What should my testing budget be?

We decide on a budget based on the significance score that we want to see. 

  • Significance score is the number of results we want from each variable in the creative testing round. We want this to be as high as possible: the more significance, the more confidence we have in the results. At Pearmill, we aim for a significance score of 10; if the CPA is high, we will work with a significance score of 5, but no lower.
  • To get to the budget needed for the test, we use the following formula:

    b = n × s × CPA

    where:
      b   = budget
      n   = number of creatives
      s   = significance score
      CPA = average cost per conversion we want to optimize for

  • For example, if the number of creatives is 5, the significance score is 10, and the CPA is $50, the cost of this creative test would be 5 × 10 × 50 = $2,500.
  • Since the test needs to run for a week, we divide the $2,500 by 7 (days), which comes out to roughly $357/day. The $357 is then divided by 5 (the number of testing variables), which works out to about $71 per variable.
  • That $71 is the daily spending limit we set at the ad set level so each ad spends about $71/day (the snippet below runs this arithmetic end to end).
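Put into a few lines of Python (the numbers match the example above; the rounding is ours):

```python
# Budget for a creative test: b = n × s × CPA, then split across days and ad sets.

def creative_test_budget(num_creatives: int, significance: int, cpa: float,
                         duration_days: int = 7) -> dict[str, float]:
    total = num_creatives * significance * cpa          # b = n × s × CPA
    daily = total / duration_days                       # campaign budget per day
    per_ad_set_daily = daily / num_creatives            # spend limit per ad set per day
    return {"total": total, "daily": round(daily, 2), "per_ad_set_daily": round(per_ad_set_daily, 2)}


print(creative_test_budget(num_creatives=5, significance=10, cpa=50))
# {'total': 2500, 'daily': 357.14, 'per_ad_set_daily': 71.43}
```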


What if the budget is much, much lower than what we need to run a proper test?

In this case, we have a few options (in order of preference): 

  • Lower the number of testing variables
  • Optimize for a higher-funnel event (e.g. Add to Cart)
  • Lower the significance number. This is the last resort.


How often should we run creative tests?

For new accounts we run these tests up to 3X/month. 

For seasoned accounts, where you already have a sense of what is working creatively and can make decent predictions about the outcome of the tests, once per month is typically the most you should test.

  • Another factor to consider here is ad fatigue. If the creatives fatigue quickly, we may increase the frequency to 2X/month. If the creatives have a longer shelf life (i.e. they continue to perform for weeks or months on end) we may lower the frequency to once every 6 weeks.
  • Never test creative just for the sake of doing “more.” Plan and execute accordingly to your account needs.


Should we use pictures, videos, carousels, collections?

In general, videos tend to do the best; however, we have accounts where that is not always the case. Ideally, you should test both static images and videos often and see what works better.

Once you have more clearly defined which creative elements resonate with your audience, you can test formats (i.e. video, static, Story, feed, carousel, etc.) with more accuracy.


What if the account is 100% new and there is no evergreen campaign?

Run an audience test with roughly 5 ads per audience (the same ads across audiences). Once you find a winning audience, you can start running creative tests.
