
Mobile A/B Testing

Yesterday I read Dan Waldschmidt's post "Why A/B Testing is For Idiots" and it pissed me off. If you haven't read the post, don't; it's not going to help you improve your business or move the needle on your startup. It will only serve up more useless, buzzword fluff suggesting that everyone can be Steve Jobs if only they follow a simple life philosophy. The post annoyed me enough to write this article, for a number of reasons:

  1. clearly it's a click-bait post
  2. it is an ideological argument based completely outside of factual evidence
  3. its suggestions are downright dangerous to startups with potential

A dangerous thesis

Dan's thesis is that you cannot A/B test your way to transformative, successful change. He makes a very aggressive statement, "Find a company anywhere who achieved massive, mind-blowing success by A/B testing their way to greatness. I dare you to find one. I’m begging you to find one.” His suggestion is that A/B testing is at best for small changes and at its worst, is damaging to your business.

The points the article makes to support this thesis are:

  1. A/B tests are for small changes
  2. Experimenting gives you information that isn't actionable
  3. A/B tests are always flawed
  4. You can't A/B test to radical change
  5. You can't measure boldness in numbers

I intend to present an alternative view of A/B testing and contest each of these points, so that anyone reading this will understand why his suggestions are dangerous to follow.

An improved thesis

I believe that A/B testing enhances and amplifies transformative change. True transformative, radical change that makes businesses explode in popularity needs the rigorous data analysis and objectivity that A/B testing provides. In his article, Dan says that A/B testing is for idiots. In my opinion, *not* A/B testing is far more idiotic. The worldview Dan is suggesting is one where people make a change just for the sake of being different. Change for change's sake will get a business nowhere and can be intensely damaging to its potential. Untracked, unmeasured change can confuse customers and frustrate founders. A/B testing is limited only by your creativity and your willingness to apply it to the types of transformative changes that create the biggest impact.

Isn't A/B testing only for small changes?

This belief about A/B testing comes from reports and stories about how Google A/B tests shades of blue. These stories are interesting because they show tiny optimizations making small percentage changes in behavior that result in millions of dollars of additional revenue. This kind of success is a great benefit of A/B testing, but it isn't the whole story.

People rarely reference the major feature A/B tests that Facebook, LinkedIn and other massive technology companies constantly run. These seem to sit as outliers to what A/B testing is in the popular psyche. But they are proof that some of the biggest companies in the world today live and die by deep analysis of what every change they make means to their customers. If you read those articles, you'll see some crazy things. You'll see that LinkedIn is running 200 tests in parallel every day, or that Facebook cares so much about A/B testing that they built an internal system for it. If that isn't testing at the core of product development, I don't know what is.

Facebook thoroughly tested the navigation system in their mobile app back in 2013. The result was dropping the "hamburger" menu and adopting a more prominent tab bar navigation system.

Experimenting gives you the most actionable information you can find

The article suggests that whatever information you get from A/B testing is most likely not credible and not actionable. It doesn't do much to back up this claim; it says, "just because you get results doesn't mean your conclusions are accurate." Well, I'm going to say that a well-run test on any part of your product will give you credible, actionable results. You just need proper controls, proper distribution across variants, and a complete analysis that includes confidence intervals, statistical significance and variance, and the test will show you very accurately how your users react to changes.
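To make that concrete, here is a minimal sketch of how the significance analysis works for a conversion-rate experiment: a two-proportion z-test plus a 95% confidence interval for the lift. The conversion counts are hypothetical, and this uses only Python's standard library.

```python
from math import sqrt, erf

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: conversions and totals for control (a) and variant (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that the variants don't differ.
    p = (conv_a + conv_b) / (n_a + n_b)
    se_pooled = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se_pooled
    p_value = 2 * (1 - normal_cdf(abs(z)))  # two-sided
    # 95% confidence interval for the lift, using the unpooled standard error.
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    ci = (p_b - p_a - 1.96 * se, p_b - p_a + 1.96 * se)
    return z, p_value, ci

# Hypothetical test: control converts 100 of 1000 users, variant 130 of 1000.
z, p_value, ci = ab_test(100, 1000, 130, 1000)
print(f"z = {z:.2f}, p = {p_value:.4f}, 95% CI for lift: ({ci[0]:.3f}, {ci[1]:.3f})")
```

Because the confidence interval for the lift excludes zero and the p-value is below 0.05, you would conclude the variant genuinely converts better, rather than just eyeballing the raw rates.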

No test and no bold move can prove a causal relationship beyond doubt, but running enough experiments and tracking your data appropriately will get you far closer than blindly making changes.

A/B tests, done right, are powerful

The article makes the dual point that both the way you structure your test and the dataset you're testing are flawed from the outset. Dan's point is that A/B testing can fall victim to cyclicality, bias, regression toward the mean and, most importantly, small sample sizes. You should not disregard these issues; they are very real. But they are not sufficient to dismiss A/B testing out of the gate.

No matter the situation, you can set up a test to give you reliable data if you take the time to understand the pitfalls and what the data is telling you. Set up appropriately, A/B tests accurately reflect the behavior of your user base as a whole and can predict widespread effects.
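The small-sample pitfall in particular is one you can guard against before the test even starts, by sizing the experiment up front. Here is a rough sketch using the standard sample-size formula for comparing two proportions, at 5% significance (z = 1.96) and 80% power (z = 0.84); the baseline rate and minimum lift below are illustrative assumptions.

```python
from math import ceil, sqrt

def sample_size_per_variant(p_base, min_lift, z_alpha=1.96, z_power=0.84):
    """Users needed in EACH variant to reliably detect an absolute lift of
    `min_lift` over a baseline conversion rate of `p_base`."""
    p_new = p_base + min_lift
    p_avg = (p_base + p_new) / 2
    numerator = (z_alpha * sqrt(2 * p_avg * (1 - p_avg))
                 + z_power * sqrt(p_base * (1 - p_base) + p_new * (1 - p_new))) ** 2
    return ceil(numerator / min_lift ** 2)

# Detecting a 2-point lift on a 10% baseline takes thousands of users per arm.
print(sample_size_per_variant(0.10, 0.02))
```

Running this kind of calculation first tells you how long the test needs to run, so you never end up drawing conclusions from a sample that was too small to support them.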

One of the more than 200 parallel A/B tests run by LinkedIn determines what content you see in your feed.

Radical change works best, coming out of an experiment

Dan believes that simply taking a radical position in the marketplace will lead to success. He seems to suggest that just doing things your way and not following the crowd will make riches fall in your lap. Well, what if you were making sandwiches and the radical change you decided to try was a pickle sandwich with chocolate sauce? That's a pretty different idea that no one else is doing. Let's say you even tried it yourself and thought it was delicious. Being radical in this situation and not testing your concept on customers would lead to ruin. No one is going to buy that sandwich (except a few random adventurous folk).

The same concept applies to any business idea, technology or otherwise. If you have a radical, transformative idea, it will work best and be guided best if run through an appropriate experiment. Contrary to Dan's suggestion that experiments kill radical change, experiments serve to prove whether people react positively to the change. Your experiment can also tell you how best to roll it out to the rest of your customers by revealing which users are early adopters, accelerating the effects of your transformative change.

You can measure what matters

Dan finishes his arguments by saying you can't measure boldness in numbers. I'll agree with him there; you aren't going to get a definitive, objective rating of boldness. But who cares? I'm not in the business of being bold; I'm in business to be successful. Do you really care about the label for what you did to get there? Whether it's boldness, savvy, genius or whatever, measuring that doesn't matter. So not being able to measure it doesn't matter either.

What you can measure very precisely is how many new users you get, how many users stick around, how many people purchase, how much they purchase, and on and on. These measurements are the things that matter for your business. No matter what the change, you can test in a controlled environment whether any of these numbers goes up or down. And no matter which way the test results go, you have learned something about the change you made. If revenue tanks from a change, you know to STOP RIGHT NOW! and do something different; if revenue doubles, you know to double down as quickly as possible. This is what A/B testing is all about, and this is why the best companies in the world rely on it so heavily.

How to succeed

To be truly successful in your business, you need to break away from the inherent gravity that tries to pull you down. The way you do this is to build something that people actually want, not what people say they want. The implication of this statement is that people may tell you that they want what you’re building, but all you should care about are actions. When you are thinking about a new feature you should only ask yourself, will this new feature cause more people to use our platform or will it cause our current users to engage more often? The problem in business is that you rarely know the answer to that question before you push a new feature or product.

Are you making something people want?

My goal has been to present a true alternative thesis about how to build great businesses. Great businesses do not come from some stroke of genius or from arbitrarily acting differently from the crowd. Great businesses come from a deep understanding of and care for your customers or users. They come from looking critically at everything you do at every stage of the business and asking the hard question of whether what you're building matters to people. Great companies and products come from hard work, sweat, tears and creativity guided to fix the problem at hand. Great companies squeeze every last bit of value out of the tools at their disposal. The tools aren't the reason for the success; they enable and enhance it. Tools like A/B testing software give you the data necessary to answer the question:

Am I making something people want?

Make Something People Want


Cobi Druxerman

Co-Founder and CMO of Taplytics


Taplytics Blog

The latest and greatest on Mobile A/B testing, analytics and growth from the Taplytics team
