A/B testing (also called A/B/n testing or split testing) is synonymous with modern web development. In a simple A/B test, two sets of similar visitors are sent to two different versions of a page, an "A" version and a "B" version.
The A and B versions may be totally different, or may differ only in key areas, such as the headline, image, or call to action.
The goal of an A/B test is to determine which version performs better. Metrics that may be evaluated include email opt-ins, link clicks, bounce rate, and purchases.
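Symplify Conversion handles the split between the "A" and "B" versions for you, but the underlying idea can be illustrated with a minimal sketch. The function below is hypothetical (not part of Symplify's API); it shows one common approach: hashing the visitor ID so each visitor is bucketed randomly but consistently.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a variant.

    Hashing the visitor ID means the same visitor always sees
    the same version, while the overall split stays roughly even.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A given visitor is always bucketed into the same variant:
variant = assign_variant("visitor-42")
```

A stable assignment like this matters because a visitor who bounced between the A and B versions on repeat visits would contaminate both groups' metrics.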
An A/B test is conducted using Symplify Conversion, which evaluates page performance.
You need to do three things to run an A/B test:
Set up the test. Define which visitors should experience the test and which URLs will be included.
Track on-page analytics. Symplify Conversion tracks the actions users take on your site; for example, which links they click, how long they stay on the site, which URLs they visit, and how many of them opt in to an email form.
Analyze the results. Once you have the data on how the "A" and "B" variants performed, you analyze the results and determine which variation should be implemented permanently.