A Multi-variant test is a great way to test two or more variations of a design against one another to measure the impact on a pre-determined metric. During a Multi-variant test, anyone matching your target audience will see one variation of the campaign.
These types of tests are great at helping you understand which copy, creative, position or format is best for a particular job. Oftentimes, we unconsciously decide that a certain format is better than another; Multi-variant testing helps you validate these assumptions.
When to add a control group to a Multi-variant Test
Any Multi-variant Test can run with a control group. In this case, a control group refers to a portion of your website traffic that will qualify for all of the same targeting and triggering rules as consumers who see a Yieldify experience, but this group will not be shown any creative.
We recommend that you include a control group in any Multi-variant Test with the goal to increase on-site behaviours such as Conversion Rate, AOV, or any Yieldify Purchase Indicator. This way, you can confirm that your top-performing variant provides more value than showing no experience at all.
We recommend that you do not include a control group in any Multi-variant Test with the goal to increase engagement with a Yieldify experience such as increasing Lead Capture Rate or Click through Rate. For tests with these goals, we recommend that all qualifying consumers see a Yieldify campaign to quickly determine which creative is best.
Creating a Multi-variant Test
Creating a Multi-variant test in the Yieldify Conversion Platform is easy. Once you have created a new campaign and selected your template, you will find yourself in the Design Builder. To create a Multi-variant test, simply click on the 'Create multi-variant test' button located in the top left corner. You can decide to duplicate a variant of your choice, or create a new blank variant. When you duplicate a variant, it copies the current Desktop and Mobile designs of that variant. This is helpful to save time when you want to test different copy on the same base design.
A user can create up to five unique variants for any individual test.
Deleting a variant
To delete a variant from your test, simply hover over the variant you would like to delete and press the 'X' button. From here, confirm the deletion.
If a variant you delete was previously active, no future variant will reuse that variant's name. For example, if you paused a lead capture test with A/B/C variants to remove underperforming variant C and test a new option against A and B, the new option would be labeled variant D instead. This way, each variant name is unique to a test and can be consistently tracked across tests.
Adding a Control group
When you are finished designing your variants and are happy with your targeting rules, continue on to the settings page. From there, navigate to the bottom of the page, and you will see a section named Audience Split. To add a control group to any test, press the '+Add control group' button. Once pressed, traffic will be evenly divided between the control group and variants.
When you add a Control group, it becomes the Baseline: the point of comparison for your test's performance. The Baseline is always either the first variant (usually A) or the control group.
Altering Split Weights
By default, traffic is divided evenly across all variants to ensure experiences reach confident results as quickly as possible. When you add a control group, you have the option to manually configure its split percentage. To do this, click into the box labelled Control and enter a number between 1 and 99. Alternatively, you can drag the slider to the value you want.
Having a smaller control group can increase the potential value each variant drives to your website, but the test will take significantly longer to reach a confident outcome, so we only recommend this if you are willing to wait for results.
In the example above, the Control group will receive 50% of all traffic, while the other four variants will each receive 12.5%. To reset the Split Weights, click on the 'Evenly Split' button.
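To make the split arithmetic concrete, here is a minimal sketch of how a weighted audience split assigns sessions to groups. This is for illustration only, not Yieldify's actual implementation; the `assign_group` helper and the exact weights are hypothetical, matching the 50% control example above (50% remaining, divided across four variants, gives 12.5% each).

```python
import random

def assign_group(weights):
    """Pick a group at random, proportionally to its split weight.

    `weights` maps group name -> percentage; percentages sum to 100.
    (Hypothetical helper, for illustration only.)
    """
    groups, pcts = zip(*weights.items())
    return random.choices(groups, weights=pcts, k=1)[0]

# A 50% control group leaves 50% for the four variants: 50 / 4 = 12.5% each.
split = {"Control": 50, "A": 12.5, "B": 12.5, "C": 12.5, "D": 12.5}

counts = {g: 0 for g in split}
for _ in range(100_000):
    counts[assign_group(split)] += 1

# Over many sessions, roughly half land in Control and roughly
# one eighth in each variant.
```

Note that the split governs long-run proportions: over a small number of sessions the observed counts will wobble around the configured percentages.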
Reviewing your variants
To review the designs of each variant, continue to the Review stage of the builder. From here, scroll down the page until you see the Setup section. Here you will see the list of variants, their split weight allocations and the baseline.
A/Bn Campaign Status
You can determine which type of test you are running from the status of the campaign on the dashboard. These statuses cover four different types of tests: a Single variant test, a Multi-variant test, a Single variant + control test and a Multi-variant + control test.
Checking your Multi-variant test performance
To check the performance of your Multi-variant test, select the 'Performance' button located on the Dashboard.
On this page, you will be able to see a detailed breakdown of the performance of each variant, as well as a breakdown for each device (desktop, mobile & tablet). Each variant is compared against the performance of the baseline.
When a variant reaches confidence, it is declared the winner and highlighted on the performance table.
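Yieldify's exact statistical method is not described here, but as an illustration of what "reaching confidence" typically means, a common approach is a two-proportion z-test comparing a variant's conversion rate against the baseline's. The function and all numbers below are made up for the example:

```python
from math import sqrt, erf

def z_test_conversion(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate differ
    from baseline A's? Returns the z statistic and two-sided p-value.
    (Illustrative sketch only; not Yieldify's actual calculation.)
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: baseline converts 400 of 10,000 sessions (4%),
# the variant converts 500 of 10,000 sessions (5%).
z, p = z_test_conversion(400, 10_000, 500, 10_000)
significant = p < 0.05  # common 95% confidence threshold
```

With these hypothetical numbers the difference is statistically significant, so the variant would be declared the winner; with smaller samples the same 1-point lift might not be.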
Key metrics used in calculations
Sessions - The number of sessions where a user successfully triggered a campaign event. If a user triggers a campaign multiple times in the same session, this will still only count as one session.
Sales - The number of sessions in which a user triggers a campaign event and completes a sale. This does not include any sales made in future sessions.
Revenue - The amount of revenue generated from sales attributed to a campaign.
Clicks - Any time a user clicks on any CTA on a campaign. A CTA must be selected as Count Towards CTR to be tracked as a click.
Leads - Any event in which a user fills out a form and submits the information via a Submit CTA. This includes both campaigns that request an Email/SMS or survey campaigns.
Closes - Any time a user clicks on a campaign's close button. This does not include Dismiss CTAs (which count as clicks) or clicks outside of an overlay.