Tags: abtesting



Friday, August 23rd, 2019

Is client side A/B testing always a bad idea in your experience? · Issue #53 · csswizardry/ama

Harry enumerates the reasons why client-side A/B testing is terrible:

  • It typically blocks rendering.
  • Providers are almost always off-site.
  • It happens on every page load.
  • No user-benefitting reuse.
  • They likely skip any governance process.

While your engineers are subject to linting, code-reviews, tests, auditors, and more, your marketing team have free rein of the front-end.

Note that the problem here is not A/B testing per se, it’s client-side A/B testing. For some reason, we seem to have collectively decided that A/B testing—like analytics—is something we should offload to the JavaScript parser in the user’s browser.
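To make those bullet points concrete, here is a minimal sketch (in TypeScript, with hypothetical helper and selector names) of the pattern most client-side testing snippets follow: hide the page, assign a bucket in the browser on every page load, rewrite the DOM, then reveal. Real providers differ in detail, but this is the shape that blocks rendering.

```typescript
// A minimal sketch of the typical client-side A/B testing pattern.
// Helper names and selectors are hypothetical; real providers differ
// in detail but not in shape.

// 1. "Anti-flicker": hide the page so visitors don't see the control
// version flash before the variant is applied. Snippets like this load
// synchronously in <head>, so rendering is blocked until the experiment
// code has downloaded and run.
document.documentElement.style.visibility = 'hidden';
const failSafe = setTimeout(() => {
  // Give up after 3s so a slow third-party script can't blank the page forever.
  document.documentElement.style.visibility = '';
}, 3000);

// 2. Assign (or recall) a bucket. Because assignment lives in the
// browser, this work repeats on every single page load.
function getBucket(experiment: string): 'control' | 'variant' {
  const key = `ab-${experiment}`;
  const stored = localStorage.getItem(key);
  if (stored === 'control' || stored === 'variant') {
    return stored;
  }
  const bucket = Math.random() < 0.5 ? 'control' : 'variant';
  localStorage.setItem(key, bucket);
  return bucket;
}

// 3. Once the DOM exists, rewrite it for the chosen variant, then reveal.
document.addEventListener('DOMContentLoaded', () => {
  if (getBucket('checkout-cta') === 'variant') {
    const cta = document.querySelector('.cta');
    if (cta) {
      cta.textContent = 'Book now';
    }
  }
  clearTimeout(failSafe);
  document.documentElement.style.visibility = '';
});
```

Do the same work on the server (or at the edge) and every one of those costs disappears: no blocking script, no third-party round trip, no per-page-load assignment.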

Monday, May 27th, 2019

Plain Text vs. HTML Emails: Which Is Better? [New Data]

Spoiler: it’s plain text. Every time.

Nothing boosts opens and clicks as well as an old school, plain-text email.

I feel vindicated.

People say they prefer HTML emails …but they actually prefer plain text.

This seems like a plausible explanation:

Think about how you email colleagues and friends: Do you usually add images or use well-designed templates? Probably not, and neither does your audience. They’re used to using email to communicate in a personal way, so emails from companies that look more personal will resonate more.

Now get off my lawn, you pesky HTML-email lovin’ kids.

Saturday, November 18th, 2017

Hooked and booked

At Booking.com, they do a lot of A/B testing.

At Booking.com, they’ve got a lot of dark patterns.

I think there might be a connection.

A/B testing is a great way of finding out what happens when you introduce a change. But it can’t tell you why.

The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.

If I were trying to convince you to buy a product, or use a service, one way I could accomplish that would be to literally put a gun to your head. It would work. Except it’s not exactly a good solution, is it? But if we were to judge by the numbers (100% of people threatened with a gun did what we wanted), it would appear to be the right solution.

When speaking about A/B testing at Booking.com, Stuart Frisby emphasised why it’s so central to their way of working:

One of the core principles of our organisation is that we want to be very customer-focused. And A/B testing is really a way for us to institutionalise that customer focus.

I’m not so sure. I think A/B testing is a way to institutionalise a focus on business goals—increasing sales, growth, conversion, and all of that. Now, ideally, those goals would align completely with the customer’s goals; happy customers should mean more sales …but more sales doesn’t necessarily mean happy customers. Using business metrics (sales, growth, conversion) as a proxy for customer satisfaction might not always work …and is clearly not the case with many of these kinds of sites. Whatever the company values might say, a company’s true focus is on whatever they’re measuring as success criteria. If that’s customer satisfaction, then the company is indeed customer-focused. But if the measurements are entirely about what works for sales and conversions, then that’s the real focus of the company.

I’m not saying A/B testing is bad—far from it! (although it can sometimes be taken to the extreme). I feel it’s best wielded in combination with usability testing with real users—seeing their faces, feeling their frustration, sharing their joy.

In short, I think that A/B testing needs to be counterbalanced. There should be some kind of mechanism for getting the answer to “why?” whenever A/B testing provides the answer to “what?” In-person testing could be one way of providing that balance. Or it could be somebody’s job to always ask “why?” and determine if a solution is qualitatively—and not just quantitatively—good. (And if you look around at your company and don’t see anyone doing that, maybe that’s a role for you.)

If there really is a connection between having a data-driven culture of A/B testing, and a product that’s filled with dark patterns, then the disturbing conclusion is that dark patterns work …at least in the short term.

Monday, November 16th, 2009

Statistical significance & other A/B test pitfalls

Cennydd delivers a slap of common sense to A/B testing. With science!
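On the statistical-significance pitfall in particular: before declaring a winning variant, the observed lift should at least pass something like a two-proportion z-test. A rough sketch, with invented numbers for illustration:

```typescript
// Two-proportion z-test: is variant B's conversion rate significantly
// different from variant A's? All figures below are invented.

function zScore(convA: number, visitsA: number, convB: number, visitsB: number): number {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

// erf approximation (Abramowitz & Stegun 7.1.26), used to turn a
// z-score into a two-tailed p-value without a stats library.
function erf(x: number): number {
  const sign = x < 0 ? -1 : 1;
  const ax = Math.abs(x);
  const t = 1 / (1 + 0.3275911 * ax);
  const poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t + 0.254829592) * t;
  return sign * (1 - poly * Math.exp(-ax * ax));
}

function twoTailedP(z: number): number {
  return 1 - erf(Math.abs(z) / Math.SQRT2);
}

// A 20% relative lift that still isn't significant at the 5% level:
const z = zScore(200, 10_000, 240, 10_000);
console.log(z.toFixed(2), twoTailedP(z).toFixed(3)); // ≈ 1.93, 0.054
```

And even a genuinely significant result only tells you what happened, not why.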