Chris Castiglione Co-founder of Console.xyz. Adjunct Prof at Columbia University Business School.

Confessions of a Growth Hacker


I want to come clean.

I don’t always practice what I preach when it comes to growth hacking. It’s easy to say “test everything.” In the growth hacking community, testing is dogma. At One Month, we can’t possibly test everything that we’re doing. The reality of a startup hits you hard: whether you do it explicitly or not, you have to decide what you’re going to test, because you can’t do it all.

Instead of doing another post about all the things you could and should be doing to growth hack your startup, I want to talk about some of the problems you’re going to run into trying to follow the techniques that growth hackers (like myself) have talked about.

Implementing tracking systems is hard. Like, really hard. Your data is going to be off from the real numbers, no matter how hard you try. If you’re running Javascript-based tracking, it’s going to fail to load in some cases (ad blockers, broken browser extensions, and some users who simply disable Javascript). As a result, the numbers you see in various dashboards will be off from what you see internally in your own logs.
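To make “check your own logs” concrete, here’s a rough sketch of a server-side pageview log you could compare against whatever your Javascript tracker reports. The Express setup, filenames, and fields are illustrative assumptions, not a description of our actual stack.

```typescript
// Sketch: log pageviews on the server so you have a baseline count
// that doesn't depend on client-side Javascript loading.
import express, { Request, Response, NextFunction } from "express";
import fs from "fs";

const app = express();

// Append one line per request to a local log file; in practice you'd
// ship these entries to whatever log pipeline you already have.
function logPageview(req: Request, _res: Response, next: NextFunction) {
  const entry = {
    ts: new Date().toISOString(),
    path: req.path,
    referrer: req.get("referer") ?? null,
    userAgent: req.get("user-agent") ?? null,
  };
  fs.appendFile("pageviews.log", JSON.stringify(entry) + "\n", () => {});
  next();
}

app.use(logPageview);

app.get("/", (_req, res) => res.send("ok"));

app.listen(3000);
```

Counting lines in that log for a given page and date range gives you a number to hold up against Mixpanel or Google Analytics; the gap between the two is roughly how much your client-side tracking is undercounting.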

Here’s what I’ve learned so far.

Which system should you trust?

To this day, the conversion rate we see in Mixpanel differs from the conversion rate we see in Optimizely by about 0.5%. That may not seem like a lot, but it is when you’re talking about the difference between 2% and 2.5%.

Consistency is key here. Of course, try to get your data to match up as much as possible, but once you’ve gotten close enough, just pick one data source for your experiments and stick with it.

Building and automating a process around these systems is hard.

It turns out that the more things you’re tracking, the harder it is to keep up with each one. Eventually you reach a point where you’re spending all your time analyzing data and not actually acting on it.

Which brings me to another point:

Who’s responsible for monitoring the data?

For a long time, we were tracking a lot of stuff but never looking at most of it. Even today, there are some metrics that we only check monthly or even less frequently.

Acting on the data.

I wish I could say we’re running multiple A/B tests and have a running log of tests to run once those are done and validated. We’re not. We haven’t tested our homepage in weeks, because we’re testing paid ads, course landing pages, and our new learning library.

Tracking your validations and learning.

Your experiments take place across all these different tools, from Optimizely to Customer.io. After a while, you lose track of what you’ve actually tested. And how do you make sure your learnings actually get distributed to the rest of the team, so that they learn from your experiments? Just managing your data (the systems, process, and evaluation) becomes a whole ordeal, which is hard to justify spending time on when you’re already trying to build a company.

We try to do this with a Google document archiving all of our experiments, but it’s a pain to keep up to date and other people don’t always refer to it to see what they can learn. I know a few companies are trying to build solutions to this problem but I haven’t seen one that is very compelling.

Doing the real A/B testing that matters is extremely technical.

Sure, tools like Optimizely and Unbounce make it relatively easy to test superficial stuff on your pages by manipulating the page itself with Javascript. But what about that new feature you’re thinking of releasing? How do you make sure half the users keep seeing that new feature? How do you track the results of that over a long time? That test actually has to be written into your code, which can be quite difficult.
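As an illustration of what that in-code test looks like, here’s a minimal sketch of deterministic bucketing by user ID, so the same half of your users keeps seeing the new feature across visits and you can join exposures against conversions weeks later. The experiment name, hashing choice, and logging are assumptions for the example, not how any particular tool does it.

```typescript
// Sketch: server-side feature experiment with stable user bucketing.
import { createHash } from "crypto";

type Variant = "control" | "new_feature";

// Deterministically assign a user to a variant based on a stable ID,
// so the assignment survives sessions, devices, and deploys.
function assignVariant(userId: string, experiment: string): Variant {
  const hash = createHash("md5").update(`${experiment}:${userId}`).digest();
  // Use the first byte as a cheap uniform-ish number in [0, 255].
  return hash[0] < 128 ? "control" : "new_feature";
}

// Record that the user saw this variant; in practice you'd send this to
// your analytics store so you can measure results over a long window.
function logExposure(userId: string, experiment: string, variant: Variant) {
  console.log(JSON.stringify({ ts: Date.now(), userId, experiment, variant }));
}

// Usage: decide once per request whether to render the new feature.
const variant = assignVariant("user-42", "new-learning-library");
logExposure("user-42", "new-learning-library", variant);
if (variant === "new_feature") {
  // render the new feature
}
```

The point is that none of this lives in a visual editor: the assignment, the exposure logging, and the eventual analysis all have to be built and maintained in your own codebase.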

Prioritizing tests.

You can’t test all the things. Some people argue about whether you should even test most things. Should you only test optimizations? What about changes that are obviously going to make the product better? We regularly roll out changes that we strongly believe in without testing them in advance. There are only so many things you’ll have the time and resources to test.

Getting caught up in the stupid shit.

I know that there’s a method for identifying where the bottleneck in your funnel is (I’ve written about it before). I also know that companies should focus on engagement and activation when building up traction, not acquisition. But I still end up getting caught up in the buzz of PR articles, social media, and driving traffic to our site. I still crave those small spikes in traffic because they feel good.

Taking big risks is hard.

We know it’s good for us, but there’s a temptation to just do the safe thing and not try the crazy stuff. It takes courage. We follow the conventional methods far more than we should, and we often assume we’re more likely to be right than we actually are (see confirmation bias).

Having one person manage the entire growth process is almost impossible.

It’s too massive. The whole thing ends up getting split and different people focus on different parts. But when you’re small you have to monitor both acquisition and retention at the same time. And then your attention is divided.

Growth hacking is like spinning plates. If you take your eye off of one for too long, it starts to wobble.

But that’s okay.

That’s the reality of growth hacking. It’s not always as clean and easy as people say it is. The truth is, you’re going to fuck up a lot.

You won’t be able to measure everything you do. Finding the right data and the right things to measure is sometimes way harder than people say it is. Organizing the systems to keep everyone looped in and to take action after you run experiments takes a significant amount of energy.

Just try to do more good than bad. As long as you do more good than bad, you’ll probably be fine. And make mistakes. That’s what it’s all about anyway, right?

What have you learned about growth hacking? What’s working well, and how’s reality treating you?

Thanks to Justin Mares and Sarah Kathleen Peck for reading drafts of this.
