A/B Testing Campaigns

Test different subject lines to find what resonates best with your audience before sending to your full list.

A/B test setup

How A/B Testing Works

A/B testing lets you compare two versions of your campaign to see which performs better. Here is how it works:

  1. You write two subject line variants (A and B)
  2. Outsprint sends each variant to a small portion of your list (10% each by default)
  3. After a waiting period (default: 4 hours), the variant with the higher open rate is declared the winner
  4. The winning variant is automatically sent to the remaining 80% of your list
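The split described above works out like this. A minimal sketch, assuming the default 20% test size divided evenly between the two variants (the function name and rounding behavior are illustrative, not Outsprint's actual implementation):

```python
def ab_split(list_size, test_fraction=0.20):
    """Return (variant A size, variant B size, remainder) for an A/B test.

    Assumes the test group is split evenly between A and B; any rounding
    leftover goes to the remainder, which receives the winning variant.
    """
    variant_size = int(list_size * test_fraction / 2)  # each variant gets half the test group
    remainder = list_size - 2 * variant_size           # sent the winner after the wait period
    return variant_size, variant_size, remainder

# For a 1,000-contact list: 100 contacts see A, 100 see B, 800 get the winner.
print(ab_split(1000))
```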

Set Up an A/B Test

  1. In the campaign builder, select A/B Test as the campaign type
  2. Write your Variant A subject line
  3. Write your Variant B subject line
  4. Set the test size (default: 20% of the list, split evenly between A and B)
  5. Set the wait time before picking a winner (default: 4 hours)
  6. Click Send or Schedule

Choosing a Winner

Outsprint determines the winner based on open rate. The variant with the higher open rate after the waiting period is sent to the rest of the list.

If the results are too close to call (the open rates are within 2 percentage points of each other), Variant A is sent by default.
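The selection rule above can be sketched in a few lines. This is an illustrative model of the documented behavior, not Outsprint's actual code; the 2-point threshold is taken from the text:

```python
def pick_winner(open_rate_a, open_rate_b, threshold=2.0):
    """Pick the winning variant from two open rates (in percent, e.g. 42.0).

    If the rates differ by less than the threshold, the result is too close
    to call and Variant A wins by default, per the documented tie-break.
    """
    if abs(open_rate_a - open_rate_b) < threshold:
        return "A"  # too close to call: default to Variant A
    return "A" if open_rate_a > open_rate_b else "B"
```

For example, `pick_winner(42, 51)` returns `"B"`, while `pick_winner(50, 48.5)` falls inside the 2-point threshold and returns `"A"`.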

> **You:** How did the A/B test for the February newsletter go?
>
> **Outsprint AI:** Variant A ("Our February Update") had a 42% open rate. Variant B ("New Features Inside") had a 51% open rate. Variant B was sent to the remaining 680 contacts.

Pro Tip

Test one variable at a time. Change only the subject line between variants to get clear results about what works.

Note

A/B testing is planned as a future enhancement and is not yet available. Check back for updates on availability.

What's Next