As designers and creatives, we instinctively resist the notion that creative decisions can be made through analytics and metrics. But as someone who has converted to the dark side, where analytics and creative can live happily together, I am here to tell you that it will change how you design and how effective you can be as a designer.
I strongly encourage everyone to continually run small A/B tests that help you make concrete creative decisions and build your knowledge of your customers – what they like, what they don't like, and what they react to. A/B tests are easy to run in all WhatCounts platforms and take relatively little production time to set up. Most importantly, they can generate tremendous return for your business.
An A/B test simply compares a variable (an idea) against the control (what you are currently doing) by sending each to a different segment of your list (often a 50/50 split). The most important thing to remember when running an A/B test is to run it more than once. Pick one element to test and run it for two weeks or a month to get a true gauge of how the change affected your sends. Don't rely on a single send to make a final decision; you want to be sure the audience reacts consistently and that the result wasn't just a fluke.
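The mechanics above – shuffle the list, split it 50/50, send each half a different version, then compare response rates – can be sketched in a few lines of Python. The subscriber list, open counts, and function names here are all hypothetical illustrations, not WhatCounts platform APIs:

```python
import random

def split_ab(subscribers, seed=42):
    """Randomly shuffle a subscriber list and split it into two equal halves."""
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is repeatable
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def response_rate(responses, sends):
    """Open or click rate as a fraction of messages sent."""
    return responses / sends if sends else 0.0

# Hypothetical list of 1,000 subscriber addresses
subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_ab(subscribers)

# Hypothetical results from one send: A is the control, B is the variant
rate_a = response_rate(responses=180, sends=len(group_a))
rate_b = response_rate(responses=210, sends=len(group_b))
print(f"control: {rate_a:.1%}, variant: {rate_b:.1%}")
```

In practice you would log a rate pair like this for every send over the two-week or month-long window and only act on the variant if it wins consistently, not just once.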
Here are five small A/B creative tests that can make a difference in your program and help settle creative questions we hear clients ask all the time:
- Call to action color and size. What color performs the best? Should it be a big button, a small button, or a text link? Should it have an icon (arrow, no arrow, cart)? Play with these options to see which one gets the most reaction from your audience.
- Email length. Do users scroll and act on all of your content, or are they just reacting to the main message? Test a long version versus a postcard version of your email campaign. This test will not only give you insight about your customer base, but it will also help you determine whether the resources it takes to create a longer email are warranted.
- Layout. Do you generally send the same template week after week? Try something new. Instead of having a banner at the bottom of every email, try two or three buckets of content in its place. Always send out an email with one large main message? Break it up and try two or three messages.
- Imagery vs. type. Do your emails use large images with the main message overlaid? Test a version with no image and just well-designed HTML text, or a version that combines HTML text and imagery.
- Background color. Is your email background white? Try a color from your brand's palette, or one that complements the images and content within your send; if your background is already colored, try white.
Have you run some of these tests in the past without any conclusive, substantive results?
That is actually not all bad news. Look at it this way: if your change didn't make a big difference, you now know that rendering an element or creative that way is okay and won't hurt your numbers. Sometimes trying an idea you think is "off the wall" and getting inconclusive results can be freeing for a designer; it just means your idea wasn't all that crazy to begin with. It can also open the door to new creative concepts you wouldn't have tried otherwise.
Sure, we all want to uncover that one change that dramatically moves the needle for our programs and causes our fellow employees to hoist us onto their shoulders, carrying us off the field while chanting our name. But sometimes the test with little impact is just as powerful and refreshing. Even when you fail, you win. Behold the true power of creative testing.
Creative Manager, WhatCounts