We will work with you to determine what can be tested fastest, with the least effort, and with the most noticeable effect on your final conversion rates.
We run at least one A/B test per month and keep downtime (periods without a test running) to a minimum.
We will agree on the primary goal of each test and provide at least two variations to run each month.
We will set up the test using Optimizely or any other tool you prefer to work with, and we will provide a bi-weekly report.
No. After a successful test, the responsibility for implementing the winner falls on your team. We can share the tested code to make your life easier, but it is best that your own developers implement any winning variation correctly.
For high-traffic sites we can consider a minimum of one week; however, we always suggest keeping a test running for at least one month. To reach sound statistical validity, we calculate the required duration based on your visit volume, the number of concepts involved, and the existing conversion rate.
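For illustration, that duration estimate can be sketched with the conventional two-proportion sample-size formula. This is a minimal sketch assuming 95% significance and 80% power; the function name and the example figures are hypothetical, not values from our agreements:

```python
from statistics import NormalDist

def required_duration_days(baseline_cr, relative_lift, daily_visits,
                           num_concepts=2, alpha=0.05, power=0.80):
    """Estimate how many days a test must run (illustrative only).

    baseline_cr:   existing conversion rate, e.g. 0.03 for 3%
    relative_lift: smallest relative improvement worth detecting, e.g. 0.10
    daily_visits:  visitors entering the test per day
    num_concepts:  number of concepts tested, including the control
    """
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # statistical power
    # Standard two-proportion sample size per concept
    n_per_concept = ((z_alpha + z_beta) ** 2 *
                     (p1 * (1 - p1) + p2 * (1 - p2))) / (p2 - p1) ** 2
    return n_per_concept * num_concepts / daily_visits

# Example: 3% baseline, 10% relative lift, 2,000 daily visitors,
# one variation against the control -> roughly 53 days
print(round(required_duration_days(0.03, 0.10, 2000)))
```

Note how quickly the required duration grows when the baseline conversion rate or the expected lift is small; this is why we rarely recommend tests shorter than a month.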
Yes. We base all of our design decisions on hypotheses and can justify every one of them. We make sure every test runs for at least a week or two. Afterwards, we don't simply tell you that you have a winner: we first look at statistical significance, margin of error, week-to-week consistency, and secondary metrics to arrive at an overall assessment.
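As an illustration of that significance check, below is a minimal sketch of a two-sided two-proportion z-test. The helper name and the visitor counts are hypothetical; in practice, testing tools such as Optimizely report these figures themselves:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conversions_a, visits_a, conversions_b, visits_b):
    """Return the z-score and two-sided p-value for two conversion rates."""
    rate_a = conversions_a / visits_a
    rate_b = conversions_b / visits_b
    pooled = (conversions_a + conversions_b) / (visits_a + visits_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return z, p_value

# Hypothetical counts: the control converts 300 of 10,000 visitors,
# the variation converts 360 of 10,000
z, p = two_proportion_z_test(300, 10_000, 360, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is about 0.018, below the usual 0.05
```

A p-value below 0.05 on its own is not enough for us to declare a winner; we weigh it together with the margin of error, week-to-week consistency, and secondary metrics mentioned above.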
We are happy to help and are confident that we can deliver positive results for you across our tests. Although projects and tests differ, we have achieved an average relative improvement of roughly 10% over previous projects. That is why we can offer performance-based agreements in which the risk is shared. When a test fails, we also try to retest the hypothesis at least once.