Marketers as Vibe Providers

The other week, I wrote about Marketers as Eval Providers (and how event optimization works on ad networks). I figured I’d write about the other most important lever in advertising these days: ad creative.
The Two Constituents of Creative
Similar to how modern ad networks use your event signals to target your audience, they’ve also started using the contents of your ad creative to do the same. While the campaigns ultimately optimize towards your event, the creative data is available much earlier (i.e. immediately!) – so they use it to decide which initial users to target with your ads.
This means that when you’re producing your ad creative, you have to think about two constituents: the AI/algorithm consuming your data, as well as the potential users/customers that will actually see your ad and engage with it.
The AI analyzes every pixel, every word, every emotion your creative conveys. It's looking at the aesthetic, the messaging, the implicit promises you're making. Then it uses all of that information to find people who are most likely to respond based on everything it knows about them.
We are vibe providers for AIs.
The New Creative Brief
If you accept that your creative is talking to AI as much as it's talking to humans, your creative brief needs to evolve. You need to think about:
- What signals are you sending to the algorithm about who this is for?
- How clearly does your creative communicate the problem you solve?
- Are you communicating in aesthetically diverse ways so as to broaden the targeting?
You have to produce ads which are aesthetically diverse (because different people have different aesthetic sensibilities), and select for the audience (by clearly communicating the pain points and who the product is for).
To do this, it’s now best practice to run an “evolutionary process” to iteratively discover the right mix of copy, aesthetics, and overall approach to creatives that attract and convert customers.
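The evolutionary process boils down to a select-and-mutate loop: rank creatives by performance, keep the winners, and spawn variations that change one attribute at a time. Here's a minimal sketch of that loop — the attribute names, the `run_round` function, and the top-third cutoff are all illustrative assumptions, not a fixed recipe:

```python
# Hypothetical sketch of an evolutionary creative-testing loop.
# A "creative" here is just a bag of attributes (hook, aesthetic, copy).

def run_round(creatives, results):
    """Keep the best performers, then spawn variations of each winner.

    creatives: list of dicts with "name", "hook", "aesthetic", "copy"
    results:   dict mapping creative name -> observed CPA (lower is better)
    """
    ranked = sorted(creatives, key=lambda c: results[c["name"]])
    winners = ranked[: max(1, len(ranked) // 3)]  # keep top third (assumption)

    next_round = list(winners)
    for w in winners:
        # Vary one attribute at a time so each variant isolates what mattered.
        for attr in ("hook", "aesthetic", "copy"):
            variant = dict(w, name=f'{w["name"]}-{attr}-v2')
            variant[attr] = f"new {attr}"  # placeholder for a real variation
            next_round.append(variant)
    return next_round
```

Varying one attribute per variant is the key design choice: if a variant wins, you know which ingredient (hook, aesthetic, or copy) drove the lift.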
An Iterative Approach
At Pearmill, we've structured our entire creative testing process to iteratively discover creatives that target the right people and engage them enough to convert. Here's how that works in practice:
First, we test concepts systematically. We'll run distinct creative approaches against the same audience to see which direction is showing promise.
Then, we iterate on the winners. Once we identify a creative direction that the AI can work with effectively, we create variations that maintain the core "vibe" (either the hook, the aesthetic, or the overall approach) while testing different executions.

Notice how in Round 2, we’re testing different aspects of the Round 1 winners:
- The first creative tests whether the “your house might not be a mansion” message was what made the first ad from Round 1 win.
- The second creative tests whether the aesthetics of the first ad from Round 1 were what mattered (i.e. the wall with golden hour lighting).
- The third creative tests whether the “3 hour cleaning” hook from the second ad in Round 1 was what mattered.
Through these iterative tests, we can combine learnings to help the AI find audiences that convert at the lowest cost.
The 4 Quadrants
When we get results, we use the following analysis framework to figure out how to iterate on the creative.
We look at two metrics: CPA (Cost per Action) and CTR (Click-through Rate), and use the following heuristics to decide on action items after a test concludes:
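The quadrant logic can be sketched as a small decision function. The thresholds (target CPA, median CTR) and the recommended actions below are illustrative assumptions — in practice you'd set them from your own account benchmarks:

```python
def creative_action(cpa, ctr, target_cpa, median_ctr):
    """Map a creative's CPA/CTR into one of four quadrants.

    Thresholds and action labels are illustrative assumptions,
    not fixed rules from any particular ad platform.
    """
    cheap = cpa <= target_cpa      # converting efficiently
    engaging = ctr >= median_ctr   # attracting clicks

    if cheap and engaging:
        return "scale: winning on both metrics"
    if cheap and not engaging:
        return "iterate on the hook: converts, but few people click"
    if not cheap and engaging:
        return "iterate on the offer/landing page: clicks, but poor conversion"
    return "pause: losing on both metrics"
```

For example, a creative with a high CTR but a CPA above target is attracting the wrong clicks (or losing people after the click), so the fix lives downstream of the ad itself.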
❓FAQs
How do I know which creative approach to start with?
Start with diverse creative approaches that represent different aesthetics, messaging styles, and value propositions. The goal is to cast a wide net initially to see what resonates with both the AI and your target audience. Test at least 3-4 distinctly different approaches before diving deeper into iterations.
What's the minimum budget needed for creative testing?
Ideally, you aim for statistical significance. However, that’s not always possible with budget constraints. Generally, you want each creative to receive a minimum number of conversions before making decisions. The exact budget will depend on your industry and target audience, but plan for at least a few hundred dollars per creative test round.
How long should I run creative tests?
Each test round should run for at least 7 days to account for daily performance variations. However, don't let tests run too long - if a creative is clearly underperforming after reaching statistical significance, it's better to iterate quickly rather than waste more budget.
What if none of my creatives perform well in the first round?
This is actually valuable information! If nothing performs well, you likely need to reassess your core value proposition or targeting approach. Look for any small wins or patterns across the poor performers - even a slightly better CTR or CPA can hint at what might work better in the next round.
Should I be testing different platforms with the same creative?
While some creative principles work across platforms, it's best to adapt your creative to each platform's unique characteristics and user behaviors. What works on Facebook might not work on TikTok or LinkedIn. Consider testing platform-specific variations of your winning concepts.