
Creating multivariate and A/B tests with Braze

You can create a multivariate or A/B test for any campaign that targets a single channel.

Step 1: Create your campaign

Click Create Campaign and select a channel for the campaign from the section that allows multivariate and A/B testing. For detailed documentation on each messaging channel, refer to Create a Campaign.

Step 2: Compose your variants

You can create up to 8 variants of your message, differentiating between titles, content, images, and more. The number of differences between the messages determines whether this is a multivariate or A/B test. An A/B test examines the effect of changing one variable, whereas a multivariate test examines two or more.

For some ideas on how to get started differentiating your variants, refer to Tips for different channels.

Step 3: Schedule your campaign

Scheduling your multivariate campaign works the same as scheduling any other Braze campaign. All standard delivery types are available.

Step 4: Choose a segment and distribute your users across variants

Select the segments to target, then distribute their members across your selected variants and the optional control group. For best practices around choosing a segment to test with, see Choosing a segment.

For push, email, and webhook campaigns scheduled to send once, you can also use an optimization. This will reserve a portion of your target audience from the A/B test and hold them for a second optimized send based on the results from the first test.

Control group

You can reserve a percentage of your target audience for a randomized control group. Users in the control group don’t receive the test, but Braze monitors their conversion rate for the duration of the campaign.

When viewing your results, you can compare the conversion rates of your variants against a baseline conversion rate provided by your control group. This lets you compare both the effects of your variants and the effects of your variants against the conversion rate that would result if you didn’t send a message at all.
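The baseline comparison described above is simple arithmetic over conversion rates. As a quick illustration with hypothetical numbers (not taken from any real campaign), the lift of each variant over the control group can be computed like this:

```python
# Hypothetical conversion rates; compare each variant against the control
# group's baseline to estimate the lift from sending a message at all.
control_rate = 0.020                    # control group: no message sent
variant_rates = {"Variant 1": 0.031, "Variant 2": 0.026}

for name, rate in variant_rates.items():
    lift = (rate - control_rate) / control_rate * 100
    print(f"{name}: {rate:.1%} conversion, {lift:+.0f}% lift over control")
```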

A/B Testing panel that shows the percentage breakdown of the Control Group, Variant 1, Variant 2, and Variant 3 with 25% for each group.

Control groups with Intelligent Selection

The size of the control group for a campaign with Intelligent Selection is based on the number of variants. If each variant is sent to more than 20% of users, then the control group is 20% and the variants are split evenly across the remaining 80%. However, if you have enough variants that each variant is sent to less than 20% of users, then the control group must become smaller. Once Intelligent Selection starts analyzing the performance of your test, the control group grows or shrinks based on the results.
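The sizing rule above can be sketched as a small calculation. The even-split fallback below is an assumption for illustration only; Braze's exact allocation (especially once Intelligent Selection starts adjusting it) may differ:

```python
# Illustrative sketch of the control-group sizing rule described above:
# the control group is capped at 20%, but shrinks to an even share once
# each variant's slice would otherwise fall below 20%.
def split_with_control(num_variants: int) -> tuple[float, float]:
    """Return (control %, per-variant %) for the initial even split."""
    control = 20.0
    per_variant = (100.0 - control) / num_variants
    if per_variant < control:
        # Assumed fallback: every arm, control included, gets an equal share.
        control = per_variant = 100.0 / (num_variants + 1)
    return control, per_variant

for n in (2, 4, 8):
    control, per_variant = split_with_control(n)
    print(f"{n} variants: control {control:.1f}%, each variant {per_variant:.1f}%")
```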

Optimizations

For email, push, and webhook campaigns scheduled to send once, you can select an optimization. There are two optimization options: Winning Variant and Personalized Variant (early access).

Optimization options listed in the A/B Testing section when choosing your target audience. Three options are listed: No Optimization, Winning Variant, and Personalized Variant. Personalized Variant is selected.

Both options work by sending an initial test to a percentage of your target segment. After the test ends, the remaining users in your audience are sent either the best performing variant (Winning Variant) or the variant they’re most likely to engage with (Personalized Variant).

Winning Variant

Sending the winning variant is similar to a standard A/B test. Users in this group receive the winning variant once the initial test is complete.

  1. Select Winning Variant, then specify what percentage of your campaign audience should be assigned to the winning variant group.
  2. Configure the following additional settings.
Optimization metric: The metric to optimize for. Choose between Unique Opens or Clicks for email, Opens for push, or Primary Conversion Rate for all channels. Selecting Opens or Clicks to determine the winner does not affect what you choose for the campaign’s conversion events.

Keep in mind that if you’re using a control group, users in the control group can’t perform Opens or Clicks, so the performance of the control group is guaranteed to be 0. As a result, the control group can’t win the A/B test. However, you may still want to use a control group to track other metrics for users who do not receive a message.

Initial test start date: The date and time the initial test starts.

Initial test end date: The date and time the initial test ends. This is when the winning variant is sent to the remaining users.

When sending in users’ local time or with Intelligent Timing, the winning variant must be sent at least 24 hours after the A/B test to ensure delivery to all users in the winning variant group.

Fallback: What happens if no variant wins by a statistically significant margin. Choose between sending the best-performing variant anyway, or ending the test and not sending any further messages.

Personalized Variant

Use personalized variants to send each user in your target segment the variant they’re most likely to engage with.

To determine the best variant for each user, Braze sends an initial test to a portion of your target audience and looks for associations between user characteristics and message preferences. Based on how users respond to each variant in the initial test, Braze then uses those characteristics to decide which variant each remaining user receives. To learn more about how personalized variants are determined, refer to Multivariate and A/B test analytics.

  1. Select Personalized Variant, then specify what percentage of your campaign audience should be assigned to the personalized variant group.
  2. Configure the following additional settings.
Optimization metric: The metric to optimize for. Choose between Unique Opens or Clicks for email, Opens for push, or Primary Conversion Rate for all channels. Selecting Opens or Clicks to determine the winner does not affect what you choose for the campaign’s conversion events.

Keep in mind that if you’re using a control group, users in the control group can’t perform Opens or Clicks, so the performance of the control group is guaranteed to be 0. As a result, the control group can’t win the A/B test. However, you may still want to use a control group to track other metrics for users who do not receive a message.

Initial test start date: The date and time the initial test starts.

Initial test end date: The date and time the initial test ends. This is when personalized variants are sent to the remaining users.

When sending in users’ local time or with Intelligent Timing, personalized variants must be sent at least 24 hours after the A/B test to ensure delivery to all users in the personalized variant group.

Fallback: What happens if no personalized variants are found. Choose between sending the winning variant instead, or ending the test and not sending any further messages.

Step 5: Designate a conversion event (optional)

Setting a conversion event for a campaign allows you to see how many recipients of that campaign performed a particular action after receiving it.

This only affects the test if you chose Primary Conversion Rate in the previous steps. For more information, refer to Conversion events.

Step 6: Review and launch

On the confirmation page, review the details of your multivariate campaign and launch the test! Next, learn how to understand your test results.

Things to know

Tips for different channels

Depending on which channel you select, you’ll be able to test different components of your message. Try to compose variants with an idea of what you want to test and what you hope to prove.

What levers do you have to pull, and what are the desired effects? While there are millions of possibilities you can investigate with multivariate and A/B tests, we have some suggestions to get you started:

Push
Aspects of the message you can change: Copy; image and emoji usage; deep links; presentation of numbers (e.g., “triple” versus “increase by 200%”); presentation of time (e.g., “ends at midnight” versus “ends in 6 hours”)
Results to look for: Opens, conversion rate

Email
Aspects of the message you can change: Subject; display name; salutation; body copy; image and emoji usage; presentation of numbers (e.g., “triple” versus “increase by 200%”); presentation of time (e.g., “ends at midnight” versus “ends in 6 hours”)
Results to look for: Opens, conversion rate

In-app message
Aspects of the message you can change: The aspects listed for push; message format
Results to look for: Clicks, conversion rate

In addition, the ideal length of your test may also vary depending on the channel. Keep in mind the average amount of time most users may need to engage with each channel.

For instance, if you’re testing a push, you may achieve significant results faster than when testing email, since users see pushes immediately, but it may be days before they see or open an email. If you’re testing in-app messages, keep in mind that users must open the app to see the campaign, so you should wait longer to collect results from both your most active app openers and your more typical users.

If you’re unsure how long your test should run, the Intelligent Selection feature can be useful for finding a winning variant efficiently.

Choosing a segment

Since different segments of your users may respond differently to messaging, the success of a particular message says something about both the message itself and its target segment. Therefore, try to design a test with your target segment in mind.

For instance, while active users may have equal response rates to “This deal expires tomorrow!” and “This deal expires in 24 hours!”, users who haven’t opened the app for a week may be more responsive toward the latter wording since it creates a greater sense of urgency.

Additionally, when choosing which segment to run your test on, be sure to consider whether the size of that segment will be large enough for your test. In general, multivariate and A/B tests with more variants require a larger test group to achieve statistically significant results. This is because more variants will result in fewer users seeing each individual variant.
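To get a feel for how large a test group needs to be, a standard two-proportion sample-size approximation can help. This is a generic statistics sketch using only the Python standard library, not Braze's internal calculation, and the baseline rate and minimum detectable lift below are hypothetical:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(p_base: float, p_variant: float,
                        alpha: float = 0.05, power: float = 0.8) -> int:
    """Users needed per arm to detect p_base -> p_variant (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_power = NormalDist().inv_cdf(power)
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    return ceil((z_alpha + z_power) ** 2 * variance / (p_base - p_variant) ** 2)

# Detecting a 2.0% -> 2.5% conversion lift. The total audience scales with
# the number of arms (variants + control), which is why tests with more
# variants need a larger segment.
per_arm = sample_size_per_arm(0.02, 0.025)
print(per_arm, "users per arm")
```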
