Why do you think it's within the margin of error? I never mentioned error bars on the statistic. We did an extensive A/B test and it was a 3% stat-sig improvement.
A 3% improvement on a 3-second page load means it loads in 2.91 seconds. So even if it were statistically significant, it's not practically significant?
Unless your initial page load is taking like 10 seconds, a human wouldn't even notice. And if it is taking 10 seconds, well, then you've got better things to fix.
That's not an actual improvement. That's just noise.
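For concreteness, here's what a 3% cut works out to in absolute terms at a few illustrative baseline load times (the baselines are just examples, not measurements from anyone's site):

```python
# What a 3% load-time improvement means in absolute terms at a few
# illustrative baseline page load times (seconds).
for baseline_s in (1.0, 3.0, 10.0):
    saved_ms = baseline_s * 0.03 * 1000
    print(f"{baseline_s:>4.1f}s load -> saves {saved_ms:.0f}ms "
          f"(new load time {baseline_s * 0.97:.2f}s)")
```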
I worked at a big website and we could demonstrate lots of incremental revenue (many millions of dollars) from a change like this. Even though it doesn't seem noticeable to an individual, it adds up. If you draw a graph with page load time on one axis and usage on the other, it's a smooth curve, not a step function. Some fraction of your customers will get distracted and disengage after 3 seconds but not after 2.9 seconds.
Imagine you improve the grip on everyone's tires by 3%. Most people wouldn't notice, but for a few people it will prevent them from being T-boned when they wait too long to brake for a red light. That improvement would be measurable if your data were good enough. I'm not saying optimizing your website is as important as preventing car crashes, but it's an illustration of how a small change can lead to a measurable improvement.
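To put rough numbers on the smooth-curve point above, here's a toy model where the share of users who stay engaged decays smoothly with load time. The decay constant, traffic volume, and revenue per session below are invented purely for illustration:

```python
import math

# Toy abandonment model: assume the share of users who stay engaged decays
# smoothly (exponentially) with page load time -- a curve, not a step.
# All constants here are made-up numbers for illustration only.
DECAY_PER_SECOND = 0.10          # assumed: ~10% relative drop-off per extra second
SESSIONS_PER_DAY = 50_000_000    # assumed traffic volume
REVENUE_PER_SESSION = 0.02       # assumed dollars per retained session

def retained_fraction(load_time_s: float) -> float:
    """Fraction of users still engaged at a given load time."""
    return math.exp(-DECAY_PER_SECOND * load_time_s)

before, after = 3.00, 2.91       # a 3% improvement on a 3-second load
extra_sessions = SESSIONS_PER_DAY * (retained_fraction(after) - retained_fraction(before))
print(f"extra retained sessions/day: {extra_sessions:,.0f}")
print(f"extra revenue/year: ${extra_sessions * REVENUE_PER_SESSION * 365:,.0f}")
```

No individual user notices the 90ms, but shifting everyone slightly along the curve still shows up as real sessions and real dollars at scale.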
That's why you should also track other metrics when A/B testing performance changes. During my time at IG, even small perf improvements almost always corresponded to stat-sig improvements in engagement metrics. 100ms is a pretty significant performance win, definitely not noise. The other thing is that those wins often scale with the quality of the device and connection, so a 100ms win for a high-end desktop on fiber could be multiple seconds for a low-end phone on 3G.
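As a rough sketch of why those lifts come out stat-sig at that kind of scale, here's a two-proportion z-test on an invented engagement metric; the baseline rate, the 1% relative lift, and the per-arm sample sizes are all made-up numbers, not real IG data:

```python
import math

# Two-proportion z-test: with millions of sessions per arm, even a small
# lift in an engagement metric is easily statistically significant.
# All numbers below are invented for illustration.
n_a, n_b = 5_000_000, 5_000_000   # sessions in each A/B arm (assumed)
rate_a = 0.200                    # baseline engagement rate (assumed)
rate_b = 0.200 * 1.01             # a 1% relative lift in the faster arm (assumed)

pooled = (rate_a * n_a + rate_b * n_b) / (n_a + n_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = (rate_b - rate_a) / se
p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
print(f"z = {z:.1f}, p = {p_value:.2e}")
```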
I'm actually kinda curious about this with regard to browser fingerprinting.
This isn't like statistics in a political poll, where there's an expected margin of error. When it comes to sites logging their own traffic, you know pretty much everything about your users: browser, operating system, and sometimes even the hardware.
3% of a million is 30,000; 3% of 500,000,000 is 15,000,000. That's an absolutely real chunk of users that can produce real value (for Instagram, ad dollars).
Why would there be a margin of error at all? How would it be introduced? A large portion of crawlers masking themselves? I mean, it's possible, but I don't know if that's enough to throw off proper logging.
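For scale, a back-of-the-envelope calc using the round user counts above and a made-up 20% baseline engagement rate; the point is just that sampling noise at those volumes is tiny compared to a 3% effect:

```python
import math

# 3% of a user base at different scales, and the binomial sampling noise on a
# metric at that scale. The 20% baseline rate is made up; the user counts echo
# the round numbers above.
BASELINE_RATE = 0.20   # assumed metric (e.g. fraction of users who engage)

for users in (1_000_000, 500_000_000):
    affected = users * 0.03
    std_err = math.sqrt(BASELINE_RATE * (1 - BASELINE_RATE) / users)
    print(f"{users:>11,} users: 3% = {affected:>12,.0f}; "
          f"sampling noise on the metric ~ {std_err:.4%}")
```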
Feels like you should make that the start of the blog post and cut it down massively, so people know it's not worth bothering with?