How Data-Driven Decisions Set You Up for Growth

Elizabeth Chung

Many leaders think they know what their audiences want, so when they test a theory, they often lean on anecdotal evidence or feedback to call it a success or failure. Without data that proves a theory right or wrong in a statistically significant way, however, you’ll never really know for sure.

Data-driven decisions take the guesswork out of your choices and can even improve your organization’s metrics. Below, we look at the importance of testing and how a data-driven culture can positively impact your bottom line.

Question Your Assumptions

It’s easy to make assumptions about what drives user behavior, but those assumptions can cause you to misjudge your audience altogether. The same can be said of procedures and decisions based on habit and tradition rather than data. For example, you may:

  • Rely on general industry benchmarks in a vacuum
  • Fall back on operational processes because they’ve worked in the past
  • Skip A/B tests altogether

In each case, a lack of data to support decisions and corroborate test results can stall your nonprofit’s progress toward a goal, prevent growth, or even damage your relationship with your audience. For example, if you’ve historically focused on outbound marketing, it’s worth considering how inbound strategies like content marketing, which generates three times more leads than paid search advertising, can help.

The key is to implement this new strategy in a way that doesn’t disrupt your current engine, which is why A/B testing is so important. Check out the example below to see what this process looks like in action.

A Real-World Example at Classy

We recently experienced an eye-opening moment around user behavior with our weekly blog newsletter, the Classy Roundup. We considered whether adding images to each item in the newsletter would improve click-through rates.

Up to that point, none of the sections in the email included images. Given what we knew about the power of visual content in engaging and attracting readers, however, we assumed adding images would improve our numbers.

Despite our strong conviction, we knew we couldn’t rely purely on a hunch, so we set up an A/B test to determine the effectiveness of adding images to the email. Version A was the control:

The control: Version A (the Classy Roundup with no images)

We incorporated images in the test, Version B:

The test: Version B (the Classy Roundup with images)

Through our email client, we split the audience so that 50 percent of subscribers received Version A and the other half received Version B. The test ran for several weeks, and our initial assumption was, much to our surprise, proven wrong.

Version A, the version without images, maintained a higher click-through rate than Version B. If we hadn’t stopped to test our theory, we would’ve implemented a change that hurt more than it helped, reducing the number of click-throughs to our blog content.
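A note on mechanics: our email client handled the 50/50 split for us, but if you ever need to split a list yourself, a deterministic assignment keeps each subscriber in the same group for the full run of a multi-week test. Here’s a minimal Python sketch under that assumption; the function name and sample addresses are purely illustrative:

```python
import hashlib

def assign_variant(email: str) -> str:
    """Deterministically assign a subscriber to Version A or B."""
    # Hashing the email (rather than randomizing on every send) keeps a
    # subscriber in the same group for each issue of a multi-week test.
    digest = hashlib.sha256(email.lower().encode("utf-8")).hexdigest()
    # Even hash values land in A, odd in B: roughly a 50/50 split.
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Hypothetical subscribers, for illustration only.
for email in ["ada@example.org", "grace@example.org", "alan@example.org"]:
    print(email, "->", assign_variant(email))
```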

Improve Your Bottom Line

Beyond answering one-off questions, like whether adding images improves an email’s click-through rate, consistent testing can have a long-lasting, positive impact on your organization. That’s because testing different variables in isolation, whatever they may be, helps you determine which changes nudge your supporters toward a desired behavior.

Once you know what you want to influence with your changes, and what the outcome of those changes will likely be, you can tailor the experience to drive better results. In the example above, the desired action is clicking through to content in the newsletter.

Including images doesn’t contribute to that goal; in fact, it lowers clicks. So instead of adding powerful images to the newsletter, doubling down on creative, attention-grabbing copy and calls to action (CTAs) will have more impact.

This mentality carries over into almost every area of your supporter and donor experience. Some areas, like the newsletter, might indirectly improve your bottom line, while others have a direct impact. For instance, improving the conversion rate of donors clicking the donate button on your homepage will likely yield higher donation revenue. To run an A/B test around this, you could experiment with:

  • Different deadline copy
  • Where your button is located on the page
  • What color your button is
  • Additional CTAs for recurring monthly donations

To go a step further, you could then look at the actual donation form people click through to and A/B test things like:

  • Headlines
  • Copy
  • Overall layout
  • Suggested giving levels
  • Custom forms

Pro Tip

No matter what you A/B test, it’s imperative to confirm that your results are statistically significant. This will tell you, with a very high degree of certainty, whether your changes will help or hurt.
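As an illustration, here’s how that check might look in Python, using the two-proportion z-test from the statsmodels library. The click and recipient counts below are hypothetical, and an online significance calculator works just as well:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results, for illustration only.
clicks = [230, 198]        # click-throughs for Version A, Version B
recipients = [5000, 5000]  # emails delivered for each version

# Two-sided test: are the two click-through rates genuinely different?
z_stat, p_value = proportions_ztest(count=clicks, nobs=recipients)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A common convention treats p < 0.05 as statistically significant.
if p_value < 0.05:
    print("The difference in click-through rates is statistically significant.")
else:
    print("Not enough evidence yet; keep the test running.")
```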

Tips to Encourage Data-Driven Decisions and Testing

To help you support data-driven mindsets among your team, we rounded up a few tips and best practices to inform your strategy:

Encourage brainstorms

Get your team members together and encourage everyone to ideate on how your organization can improve processes or the overall business, no matter how unconventional the idea may seem. This helps you take a hard look at your traditional way of doing things and question what you could do better or differently.

Create a schedule of A/B tests

A/B testing is a specific way to support data-driven decisions. Whether it’s for your emails, website, social media strategy, or landing pages, create a list of things you want to test and a schedule for each experiment. Always ask:

  • How many times will you be running the test?
  • What is your cadence for running tests?
  • How will you circulate the findings to your team?
  • What strategies will you use to make sure the takeaways are applied to your organization?

Ask “why?”

As you continue to brainstorm and make decisions, reassess your team’s choices and ask why each route was chosen. This applies to habits as well as to the suggestions and assumptions you make while looking at data.

For example, data might show that visual content increases retweets on Twitter, but that doesn’t necessarily mean you should include images in every tweet. With some testing, you might find that images increase retweets but actually decrease clicks on links. It’s a matter of asking why and ensuring your decision aligns with your actual business goals.


The importance of testing cannot be overstated. When you take the steps to “know” rather than “assume,” you pinpoint the strategies that will impact user behavior and power your growth. Have any other thoughts or tips on how to support data-driven decisions at your organization? Let us know in the comments below.

This post was originally published in July 2016 and has been recently updated with new best practices. 

