
Stop Guessing: How to Know Which Website Changes Make Money

Learn how to stop wasting time on website 'tweaks' that don't work and how to actually prove which changes bring in more phone calls and sales.

AI Summary

This article explains the importance of using real data rather than 'gut feel' when making website changes. It breaks down the concept of statistical significance into plain English, using a plumbing case study to show why small sample sizes lead to bad business decisions. It provides a practical guide on what to test first—like headlines and offers—to ensure more phone calls and sales.

Look, I’ve seen it a thousand times.

You’re sitting at your desk, looking at your website, and you think, "That 'Book Now' button would look much better in lime green." Or maybe your nephew, who's "good with computers," tells you that you need a big photo of a smiling family on the home page to build trust.

So, you spend a few hundred bucks—or a few hours of your own time—making the change.

Then you wait.

A week later, you’ve had three phone calls. Was that more than last week? You think so. Maybe? But then you remember it rained last Tuesday, and usually, the phones go quiet when it’s pouring.

This is the problem with most small business marketing. It’s all gut feel and guesswork. You’re throwing spaghetti at the wall and hoping something sticks, but you’ve got no idea if the spaghetti is even hitting the wall or just landing on the floor.

If you want to stop burning cash, you need to understand one thing: how to tell if a change actually worked, or if you just got lucky. In the nerd world, they call this "statistical significance." In the real world, we call it "knowing if this actually made me money."

Imagine we’re at the pub in Paddington. I pull out a fifty-cent piece and flip it. It lands on heads.

Does that mean the coin always lands on heads? Of course not.

If I flip it three times and it's heads every time, you'd probably think I'm a bit lucky. If I flip it twenty times and it's heads every time, you'd start checking if I've got a double-headed coin or if I'm pulling some magic trick.

Website changes are the same.

If you change your phone number's colour and get two extra calls the next day, that's just a coin landing on heads once. It doesn't mean the colour worked. It might just mean a couple of people finally got around to calling you.

We need enough data to be sure that the "win" wasn't just a fluke.
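Don't take my word for it; the maths on coin streaks is dead simple. Here's a tiny Python sketch you can run yourself (nothing in it is specific to websites, it's just the odds of a fair coin):

    # Odds of a fair coin landing heads N times in a row: 0.5 ** N
    for streak in (1, 3, 20):
        print(f"{streak} heads in a row: {0.5 ** streak:.6%} chance by pure luck")

    # 1 in a row:  50%       -- happens constantly
    # 3 in a row:  12.5%     -- "a bit lucky"
    # 20 in a row: ~0.0001%  -- time to check the coin

Two extra calls after a colour change is the "one flip" end of that table, not the "twenty flips" end.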

A Real Story: The Plumber and the "Free Quote" Button

Let’s look at a real-world example from a client we worked with a while back. For the sake of his privacy, we’ll call him Mick.

Mick runs a plumbing business. His website was okay, but he wanted more enquiries. He had a big button that said "Contact Us."

He read somewhere that "Get a Free Quote" works better.

Now, most blokes would just swap the text and hope for the best. But Mick’s smart. We decided to run a test. We showed the old "Contact Us" button to half the people visiting his site, and the new "Get a Free Quote" button to the other half.

After four days, the "Free Quote" button had 5 clicks. The "Contact Us" button had 2.

Mick was ready to pop the champagne. "It’s more than double!" he said.

I had to be the buzzkill. "Mick," I said, "it’s seven total clicks. That’s not a trend, that’s a coincidence. If one person’s internet cut out or one bloke clicked by mistake, the whole result changes."
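I wasn't just being a grump, either. Here's roughly how you'd check Mick's four-day numbers in Python, using a standard significance test (Fisher's exact test, via scipy). The click counts are the real ones; the visitor counts (about 200 per button) are assumptions made up for the example:

    from scipy.stats import fisher_exact

    # [clicks, non-clicks] per button. The ~200 visitors per variant
    # is an assumed figure for illustration, not Mick's real traffic.
    free_quote = [5, 195]   # "Get a Free Quote"
    contact_us = [2, 198]   # "Contact Us"

    odds_ratio, p_value = fisher_exact([free_quote, contact_us])
    print(f"p-value: {p_value:.2f}")  # about 0.45 with these numbers

A p-value around 0.45 means a gap that lopsided shows up nearly half the time by pure chance. You want it under 0.05 (the flip side of "95% sure") before you call a winner.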

We kept the test running for three weeks until we had enough people through the site to be 95% sure the result was real.

In the end? "Get a Free Quote" actually performed worse once we had enough data. It turned out people just wanted to "Book a Service." If we had changed the site based on that first week of "gut feel," Mick would have actually lost money.

The three things that separate a win from a fluke

To stop guessing, you need to look at three things: the sample size, the margin, and the consistency. You don't need a maths degree, you just need to understand the logic.

1. Sample size: how many people saw it?

If you only have 50 people visit your site a month, you can't really run these tests. It's like asking two people their opinion on a beer and deciding it's the best beer in Australia.

You need enough "flips of the coin" to see a pattern. For most local businesses, we like to see at least 100-200 "conversions" (calls or forms) before we start making big calls.

2. The margin: how big was the win?

If Version A gets 50 calls and Version B gets 51 calls, that's not a win. That's a rounding error.

We're looking for clear daylight between the two options. If Version B gets 75 calls while Version A stays at 50, now we're talking. That's a 50% increase that is much harder to attribute to just "luck."
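If you want to check the "rounding error vs. daylight" difference yourself, here's a minimal two-proportion z-test in plain Python. One assumption to flag: the 1,000 visitors per version is a number I've invented for the example, so swap in your own:

    import math

    def p_value(conv_a, visitors_a, conv_b, visitors_b):
        """Two-sided p-value for a two-proportion z-test."""
        pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (conv_b / visitors_b - conv_a / visitors_a) / se
        return math.erfc(abs(z) / math.sqrt(2))  # normal approximation

    print(p_value(50, 1000, 51, 1000))  # ~0.92: pure noise, call it a draw
    print(p_value(50, 1000, 75, 1000))  # ~0.02: under 0.05, a real winner

Same website, same month, and the only thing that changes the verdict is the size of the gap.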

3. Consistency: did it win every day?

Did the new version win every day, or did it just have one massive Tuesday because you ran a cheap radio ad that day?

This is why we track every sale back to the source. If you don't know where the traffic is coming from, you might think your website change worked when really, you just had a different group of people visiting that week.

Don't call the race early

One of the biggest mistakes I see business owners make is stopping a test too early because they see the result they want to see.

It’s human nature. You spent money on a new logo, so you want to believe the new logo is bringing in more business. You’ll find any scrap of data to prove you were right.

But Google doesn't care about your feelings. Neither does your bank account.

If you make decisions based on "early wins" that aren't actually backed by enough data, you’re likely to "optimise" your business straight into the ground. You’ll be making change after change, wondering why your profit isn't moving even though every "test" was a "winner."

Measure money, not likes

When we talk about testing, we only care about the numbers that actually keep the lights on.

I’ve seen tests where a new page design got 300% more "engagement"—people clicking around, looking at photos, sharing it on Facebook—but actually resulted in fewer phone calls.

If you’re chasing likes instead of sales, you’re playing a losing game. A "statistically significant" increase in Facebook likes doesn't pay the lease on your truck.

When we test for our clients, we look at:

- Phone calls (using tracking numbers)
- Contact form submissions
- Actual sales (if they sell online)

Anything else is just vanity.

How long should you run a test?

Too short, and the data is rubbish. Too long, and you’re wasting time and money showing a "losing" version of your site to potential customers.

For most Brisbane businesses we work with, the sweet spot is usually between 2 and 4 weeks. This covers the natural ups and downs of a week—Monday morning rushes, Friday afternoon slumps, and the weekend quiet.

If you haven't reached a clear winner in a month, the change you made probably wasn't big enough to matter anyway. Move on to something else.

What should you test first?

Don't waste time testing the shade of blue in your footer. Nobody cares.

If you want to see a real jump in your bank balance, test the big stuff:

1. The Headline: What's the first thing people read? "Brisbane's Best Electrician" vs "We'll Be There in 2 Hours or It's Free."
2. The Offer: "Request a Quote" vs "Book a $50 Safety Inspection."
3. The Friction: Does your form have 10 fields or 3? Is your website form costing sales because it's too hard to fill out?
4. The Proof: Do people care more about your 5-star Google rating or the fact that you've been in business for 20 years?

What if you don't have much traffic?

Here's the truth: most small businesses don't have enough traffic to do complex A/B testing every single week.

If you're getting 200 visitors a month, trying to find a "statistically significant" winner on a button colour change will take you three years.
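That "three years" isn't me being dramatic. There's a well-worn rule of thumb for sample size (Lehr's formula: roughly 16 × p(1−p) ÷ lift² visitors per version, for the usual 95% confidence and 80% power). Here's a sketch; the baseline rate and expected lift are assumptions you'd replace with your own:

    # Rough sample size for a two-version test (Lehr's rule of thumb).
    baseline = 0.05   # assumed: 5% of visitors currently call or enquire
    lift = 0.015      # assumed: the change adds 1.5 points (5% -> 6.5%)

    per_version = 16 * baseline * (1 - baseline) / lift ** 2
    total = 2 * per_version
    months = total / 200  # at 200 visitors a month

    print(f"~{per_version:,.0f} visitors per version, ~{total:,.0f} in total")
    print(f"At 200 visitors/month: ~{months:.0f} months ({months / 12:.1f} years)")
    # ~3,378 per version, ~6,756 total: about three years at that traffic.
    # A subtler change (like a button colour) means a smaller lift,
    # which pushes the wait out even further.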

In those cases, don't obsess over the maths. Instead, focus on the big, obvious wins and use your common sense. But if you’re spending thousands on ads every month and sending that traffic to a website, you must test.

When you’re paying for every click, a 10% difference in how many people call you isn't just a "nice to have." It’s the difference between a holiday in Noosa and a weekend spent worrying about payroll.

How to run your first test

1. Pick one thing to change. Don't change the headline, the photo, and the price all at once. You won't know which one worked.
2. Decide what a 'win' looks like. Is it more calls? More emails?
3. Use a tool. Don't try to do this manually. Tools like VWO or Optimizely can split your traffic automatically (Google Optimize used to be the free option, but Google retired it in 2023). There's a sketch of how the split works below.
4. Wait. Don't look at the data for at least a week. Resist the urge.
5. Check the 'confidence'. Most tools will tell you "95% chance this is the winner." If it's lower than that, keep running it or call it a draw.
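For what it's worth, the traffic splitting in step 3 isn't magic. Under the hood, testing tools do something like this little sketch (the visitor ID here stands in for whatever cookie or identifier your site already uses; it's an illustration, not any particular tool's code):

    import hashlib

    def assign_variant(visitor_id: str) -> str:
        """Deterministic 50/50 split: the same visitor always lands in
        the same bucket, so repeat visits don't muddy the numbers."""
        digest = hashlib.md5(visitor_id.encode()).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    print(assign_variant("cookie-12345"))  # same answer every time for this ID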

Marketing isn't an art project. It’s an investment.

If you’re just making changes because you "feel" like it, you’re gambling. And the house (Google and Facebook) always wins when you gamble.

By understanding that you need a decent sample size and a clear margin of victory before claiming a win, you put yourself ahead of 90% of your competitors. You’ll stop chasing ghosts and start making decisions that actually grow your business.

If you're sick of guessing and want to see exactly which parts of your marketing are actually bringing in the bacon, come have a chat with us at Local Marketing Group. We help Brisbane businesses sort the facts from the fluff.

You can reach us here: https://lmgroup.au/contact
