Strategy & Tactics

A/B Testing

Comparing two versions of a marketing asset to see which performs better.

What is A/B Testing?

A/B testing (also called split testing) is a method of comparing two versions of a webpage, ad, email, or other marketing asset to determine which one performs better.

The process: split your audience randomly into two groups, show version A to one group and version B to the other, measure performance, and determine the winner based on statistical significance.

Test one variable at a time for clear results: headline, CTA button colour, image, copy length, or offer. Testing multiple changes simultaneously makes it unclear which change drove the results.

Successful A/B testing requires: sufficient traffic/volume for statistical significance, clear hypothesis, single variable testing, adequate test duration, and proper tracking.
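The "statistical significance" step above can be sketched with a standard two-proportion z-test. This is a minimal illustration of the underlying arithmetic, not a library API; the `ab_significance` function name and the example numbers are hypothetical.

```python
from math import erf, sqrt

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    conv_a/conv_b: conversions for each variation
    n_a/n_b: visitors shown each variation
    Returns the z-score and two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 5,000 visitors per variation, 250 vs 300 conversions
z, p = ab_significance(250, 5000, 300, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 -> significant at 95% confidence
```

If the p-value stays above 0.05, treat the test as inconclusive and keep collecting data rather than declaring a winner.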

A/B Testing for NZ Businesses

NZ businesses with lower traffic volumes need longer test durations to reach statistical significance. Don't conclude tests prematurely.

Focus on high-impact elements: headline, offer, CTA, and social proof. These typically have the biggest performance impact.

Common NZ A/B tests: NZ vs international imagery, casual vs formal tone, pricing displayed vs "Get a quote", testimonials vs no testimonials.

NZ Business Examples

  • A Wellington SaaS company tests "$99/month" vs "From just $3.30/day" pricing display - the daily-rate framing increases conversions by 18%
  • An Auckland service business tests phone call vs form submission CTAs - the call button generates 2.3x more enquiries
  • A Christchurch e-commerce store tests free shipping thresholds of $50 vs $75 - the $50 threshold increases conversion rate by 14%

Real-World Industry Examples

E-commerce

Scenario

An online retailer tests checkout page variations

Outcome

Single-page checkout vs multi-step: single page wins with 19% higher completion rate and 23% more revenue

Lead Generation

Scenario

A B2B company tests long-form vs short-form landing pages

Outcome

The long-form page (2,400 words) converts at 9.2% vs the short-form page (600 words) at 5.8% for a high-ticket service

SaaS

Scenario

A software company tests "Start Free Trial" vs "Get Started Free" CTA button text

Outcome

"Get Started Free" increases trial signups by 12% - simpler language performs better

Frequently Asked Questions

How long should I run an A/B test?

Run tests for at least 1-2 weeks or until you reach statistical significance (usually a 95% confidence level, which often means 250+ conversions per variation). Account for weekly patterns - don't stop a test mid-week.

What should I test first?

Start with the elements that impact conversion most: headline, offer, CTA button, and hero image. These typically drive 70%+ of performance differences. Save minor elements (footer links, colour shades) for later.

What if my A/B test shows no difference?

Inconclusive tests are common. Either the change doesn't matter to users, or you need more volume. Try testing a more dramatic variation, or move on to test a different element. Not all tests will be winners.

Need Help With Your Lead Generation Strategy?

Our team specialises in delivering 30 qualified leads in 30 days for NZ service businesses. We handle the strategy, execution, and optimisation - you handle the sales.