Maximizing ROI with Marketing Experimentation

Written by Jorie Munroe | Jun 6, 2023 7:00:00 PM

Building Effective Experiments for Your Audience

Let’s face it: It’s not enough anymore to simply report on your marketing activity. In order to move from a reactive marketing strategy to a proactive marketing strategy that solves for your customers, you need to turn your data into action.

One of the best ways to do that is to find out where there may be friction in the buyer’s journey (often identified with conversion rates) and figure out how to optimize for that interaction.

Choosing the right attribution model

Reporting on the various aspects of the buyer's journey results in a lot of metrics and a lot of discussion. The good news (and bad news) is that there are no hard rules for which attribution model to choose. There are simply models that fit certain situations and questions better than others.

Ultimately, the one you choose will depend on the report you’re trying to build and what you need it for. The more you use attribution models and see how they influence your reporting, the easier this decision will be.

This way, every update, optimization, or change you make to your website, strategy, or approach is backed by data that forecasts it's the right move to make.

How do you turn small proven ideas into huge hits at your organization? It all starts with great marketing experiments.

Marketing experiments

A marketing experiment is a form of market research in which your goal is to discover new strategies for future campaigns or validate existing ones. For instance, a marketing team might create and send emails to a small segment of their readership to gauge engagement rates, before adding them to a campaign.

Think of running a marketing experiment as taking out an insurance policy on future marketing efforts. It’s a way to minimize your risk and ensure that your efforts are in line with your desired results.

A/B Testing

The most common way to incorporate marketing experiments into your strategy is to start with A/B testing. A/B testing, also known as split testing, is a marketing experiment wherein you split your audience to test a number of variations to a piece of content to determine which performs better. To run an A/B test, you need to create two different versions of one piece of content with changes to a single variable.

You’ll show these two versions to two similarly sized audiences and analyze which version performed better over a specific period of time, one long enough to draw accurate conclusions from your results. A/B testing has a multitude of benefits for a marketing team, depending on what you decide to test.


With so many benefits, the question becomes, what A/B tests can you run to start optimizing for your customer experience at scale?

Common A/B tests

The most common A/B tests involve landing pages, CTAs, and email. Let’s review some key elements to successfully conduct your own tests.

Landing Page A/B Testing

With landing page A/B testing, you have one URL and two or more versions of the page, and visitors are randomly sent to one of the variations. For statistical validity, standard landing page A/B testing tools set a cookie on each visitor, so a returning visitor keeps seeing the same variation every time they go to the tested page.
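The sticky-assignment behavior described above can be sketched in a few lines. This is a simplified illustration, not any particular tool's implementation; the cookie key and variant names are assumptions.

```python
import random

VARIANTS = ["A", "B"]          # hypothetical variant names
COOKIE_NAME = "ab_variant"     # hypothetical cookie key

def assign_variant(cookies: dict) -> str:
    """Return the variant stored in the visitor's cookie,
    or randomly assign one on the first visit."""
    if COOKIE_NAME in cookies:
        return cookies[COOKIE_NAME]   # returning visitor: same page as before
    variant = random.choice(VARIANTS)
    cookies[COOKIE_NAME] = variant    # "set" the cookie for future visits
    return variant

# A visitor sees the same variant on every subsequent request.
visitor_cookies = {}
first_visit = assign_variant(visitor_cookies)
assert all(assign_variant(visitor_cookies) == first_visit for _ in range(10))
```

The point of the cookie is consistency: without it, a visitor could bounce between variations, muddying both the experience and the data.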

This is how HubSpot’s landing pages tool works. HubSpot’s landing pages enable you to create A/B tests and track several metrics to evaluate how your experiment is performing. It keeps a record of the number of people who viewed each variation and who took the intended action.
For example, say each of your landing page variations was viewed 180 times.

However, the top-performing landing page generated 20 clicks and the lowest performing one generated five clicks. You’d want to stick with the version that was inspiring your visitors to take the desired action, which in this case is clicking an element on the page. That way, if you continue to drive traffic to that page, you know it’s optimized for conversions.
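As a quick sanity check on numbers like these, you can compute each variant's conversion rate and run a two-proportion z-test, a standard significance test (not necessarily what any given landing page tool uses under the hood), to see whether the gap is likely real rather than noise:

```python
import math

# Figures from the example above: each variant viewed 180 times,
# with 20 and 5 clicks respectively.
views_a, clicks_a = 180, 20
views_b, clicks_b = 180, 5

rate_a = clicks_a / views_a   # ~11.1% conversion
rate_b = clicks_b / views_b   # ~2.8% conversion

# Two-proportion z-test: pool the samples, compute the standard error,
# and compare the difference in rates against it.
pooled = (clicks_a + clicks_b) / (views_a + views_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
z = (rate_a - rate_b) / se    # ~3.1, above the usual 1.96 threshold
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}")
```

A z-score above roughly 1.96 corresponds to 95% confidence that the two variants genuinely differ, which is why sample size and test duration matter before declaring a winner.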

Landing page design is about creating an enticing site page for your target audience and website visitors. It should encourage them to convert from leads into subscribers or customers. So, if you’re looking to optimize said landing pages, where should you start?

On your landing pages, conduct tests on your offers, copy, and your form fields. You can use a similar approach for your blog’s conversion strategy.

HubSpot's landing page A/B testing results: 

Based on previous data, HubSpot found that non-bounce desktop users who engage with search have a 163.8% higher blog lead conversion rate than those who do not. However, only a very small percentage of blog traffic interacts with the search bar. That’s why HubSpot decided to test the visual prominence and functionality of the site search bar.

HubSpot used three variants for this test, using offer thank you page views as the primary metric. For variant A, the site search bar had increased visual prominence and the placeholder text was altered to "search by topic." 

For variant B, the search bar had increased visual prominence, the placeholder text was altered to "search by topic," and the search function searched the blog, rather than the whole site.

For variant C, the search bar had increased visual prominence, the placeholder text was changed to "search the blog," and the search function searched the blog, rather than the whole site.

HubSpot found that all three variants increased the conversion rate. However, variant C showed a 3.4% increase in conversion rate and a 6.46% increase in users who engage in the search bar.

CTAs

You can also run A/B tests on your CTAs. CTA split testing works pretty much the same way as landing page split testing. You create two or more variations of your CTA, place them on the same page and display them to visitors randomly. The goal is to determine which CTA attracts the most clicks.

HubSpot’s CTA module enables you to quickly build A/B tests and identify the data that matters the most to your organization. For instance, you might look at the view-to-click rate in an effort to optimize the CTA, but if your click-to-submission rate is surprisingly low, the problem might lie with the landing page.

That is why, ultimately, you want to optimize your view-to-submission rate.

You should only run one A/B test at a time, so don’t try to optimize both the CTA and the landing page simultaneously. Change one variable at a time to understand which element triggered the results.

When running a CTA A/B test, consider testing your placement, size, color, copy, and any graphics on your button. HubSpot uses several different calls-to-action in its blog posts. On any given blog post, you’ll notice anchor text in the introduction, a graphic CTA at the bottom, and a slide-in CTA as you scroll through the post. However, on mobile these CTAs might seem intrusive, so HubSpot tested its mobile CTAs.

HubSpot's CTA A/B testing results: 

Previous A/B tests revealed that HubSpot’s mobile audience was 44% more likely to click through to an offer landing page and 18% more likely to convert on the offer if all CTAs were stripped from blog posts and there was only one CTA bar at the bottom of the page with no ability to exit.

HubSpot decided to test different versions of the bottom-of-the-page CTA bar, using thank you page views as the primary metric and CTA clicks as the secondary metric. HubSpot used four variants for this test.

For variant A, the control, the traditional placement of CTAs remained unchanged. For variant B, the CTA had a maximize/minimize option so readers could dismiss the CTA using an up/down caret. For variant C, the CTA had an X that would completely dismiss it from the post, leaving no formal CTA on the blog. For variant D, the CTA had neither an X nor a maximize/minimize option.

Variant B saw a 7.9% increase in thank you page views, variant C saw an 11.4% decrease, and variant D saw a 14.6% increase.

HubSpot used that data to project that using variant D on mobile would lead to about 1,300 additional submissions each month.

Email A/B tests

Last, but certainly not least, you can choose to A/B test your emails. Most email providers automate the split testing process, enabling you to compare different elements of your emails.

These email tools randomize your list of recipients into two or more groups (you need to ensure the groups are big enough to give you a statistically significant result) and send each email variation accordingly. HubSpot, for instance, splits your email campaign to help you find the best subject line and time of day to send an email. Such a tool can also send the winning variation to the remainder of your list.
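The split-and-send-to-the-remainder workflow can be sketched as follows. This is a minimal illustration of the randomization logic, not any provider's actual implementation; the recipient list and the 20% test fraction are assumptions.

```python
import random

# Hypothetical recipient list; in practice this comes from your email provider.
recipients = [f"user{i}@example.com" for i in range(1000)]

def split_test_groups(recipients, test_fraction=0.2, seed=None):
    """Randomly carve off a test sample, halve it into groups A and B,
    and return the remainder, which later receives the winning variation."""
    rng = random.Random(seed)
    shuffled = recipients[:]           # copy so the original list is untouched
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    group_a = shuffled[:test_size // 2]
    group_b = shuffled[test_size // 2:test_size]
    remainder = shuffled[test_size:]
    return group_a, group_b, remainder

group_a, group_b, remainder = split_test_groups(recipients, test_fraction=0.2)
print(len(group_a), len(group_b), len(remainder))  # 100 100 800
```

Shuffling before slicing is what makes the groups comparable: each recipient has an equal chance of landing in either variant, so differences in open or click rates can be attributed to the email rather than to who received it.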

This is a solid way to optimize your list and deliver the message that attracts the most attention. HubSpot and most standard email providers enable you to pick a winner based on either open-rate or clickthrough rate.

However, you also want to see which email is bringing in the most conversions. Identify which variation, combined with the right landing page, delivers the best results. For this type of reporting, you need to integrate your email marketing with your marketing analytics. There are a variety of different areas you could test in your emails, including: format, layout, timing, sender, subject lines, and target group.

HubSpot's email A/B testing results: 

HubSpot found that, unlike emails, in-app notifications are often overlooked or missed by users. The emails outperformed in-app notifications by 140%: 24.9% of people who opened either email variant left a review, compared to 10.3% of those who opened the in-app notification.

This is by no means a comprehensive list of all the A/B tests you could run. While landing pages, CTAs, and emails are the most common areas to experiment with, you may find yourself wanting to test certain web forms or the layouts of key site pages (like your pricing page). That’s great! The appeal of A/B testing is that it can be applied to many different marketing assets.

Ready to optimize your audience experience and drive conversions?

Test whichever type of content you work with the most to ensure you get a better sense of the experience your audience wants from your organization. That way you can maximize your potential conversions and continue to solve for your customers. 

Marketing experimentation not only helps you mitigate risk; it also puts your marketing data to good use by helping you forecast future success.

Ready to get started? Reach out to our team at Aptitude 8 and let's see how we can help you optimize your efforts.