If message testing isn’t part of your marketing strategy, you’re likely wasting resources on copy that doesn’t resonate, spending thousands on campaigns that don’t engage, and publishing generic landing pages.
Message testing gives you insight into which messaging actually makes an impact.
In this article, we'll share how to get started with message testing, methodologies that improve your message’s effectiveness, and three practical examples of revamped brand messaging.
Message testing analyzes a company’s marketing message to evaluate how well it resonates with its target audience.
Marketing messages are your company’s offer to potential customers. A compelling marketing message makes customers feel understood, while showing how your product benefits them and how it can solve their problems.
Your message will only resonate if it clearly communicates solutions that matter most to your audience.
Let’s say you’re creating messaging for an email marketing platform targeted at senior marketing leaders. If your messaging focuses on ease of use, it’s unlikely they’ll care. Marketing leaders are measured against return on investment (ROI).
With message testing, you put your landing page copy and value proposition in front of your ideal customer profile, or buyer persona, and collect qualitative insights to improve it.
Message testing is different from user testing because it tests the messaging only, not how your customers use your website.
Too many marketers aim to be clever and original with their marketing. However, cleverness can be confusing. Clarity, on the other hand, communicates precisely why your audience should care.
For example, a clever marketing subheading might read:
“Entrepreneurial-approved time-savers.”
This leaves customers confused about what they’re being sold and why it matters to them.
Clear brand statements set up the value and intentions from the beginning.
For example, Webflow’s landing page states, “The site you want—without the dev time.”
Its purpose is clear: to provide a no-code web design tool that allows customers to build the site they’ve always wanted in a fraction of the time. Why should they care? Because, unlike with other tools, you don’t need coding experience.
When marketing messages work, they are 2x as influential as design in converting customers. When they don’t, you’re missing out on conversion potential—and your business's success depends on how well you can convert.
However, message testing isn’t only for conversions. Conducted well, it can:
In contrast, when messaging isn’t working, it can confuse, frustrate, and alienate your customers.
Message testing uses qualitative and quantitative methods.
Qualitative research relies on observation and non-numerical analysis, such as in-depth interviews. It’s ideal for building a more rounded understanding of customer motivations.
Quantitative research collects hard data using techniques like surveys, polls, and other close-ended questions. It’s a scalable way to identify patterns and trends.
For instance, in a focus group designed to gather user input about a specific product, you can use both qualitative and quantitative observations.
Qualitative questions will sound something like: What do you like about this product? How does it solve your problems?
Quantitative questions will sound like: How likely are you to recommend our product to a friend? Rate from 1 to 5, with 1 being very unlikely and 5 highly likely.
In message testing, your quantitative and qualitative research should focus on these key dimensions of your message:
If any of these is lacking, your message might fail to land. Collecting information on all of these dimensions may be a weeks-long process, but there are tools to make this easier.
Wynter delivers results in 12–48 hours, far quicker than any method on this list. We have our own B2B panel of validated professionals across industries, with different seniority levels, job titles, and company sizes, to help test your messaging with your target customers.
We test your website and landing page messaging, overall brand positioning and narrative, sales and marketing funnels, and outbound email messages and sales demos.
How a message test works with Wynter:
Wynter is the most efficient way to gather the needed qualitative data to refine your message. However, using Wynter isn’t the only approach to message testing. We’ll review alternative methods and how you might deploy them for your company.
The Likert scale is a psychometric tool used to measure attitudes, values, and beliefs. A typical scale has five points, each representing a degree of agreement or disagreement with a statement.
To compile Likert scale questions, you’ll use the six dimensions discussed above: clarity, relevance, value, timeliness, consistency, and differentiation.
For example, sample statements might look like this:
The message is clear, and you know exactly what the company offers.
The company’s messaging motivates you to take action.
Likert scales have many advantages over other types of quantitative research surveys. They’re simple to create, easy for respondents to answer, and provide precise quantification of people's feelings about the topic surveyed.
However, there are limitations. Most fundamentally, Likert scales cannot capture nuanced responses and typically fail to pinpoint why a customer disagreed with a statement.
They’re also only as good as the person who writes them. Even a minor ambiguity in wording or presentation can compromise the validity of responses. It takes time and professional expertise to achieve reliable results.
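Once responses come in, the analysis is simple averaging. Here’s a minimal sketch (with hypothetical scores, and only three of the six dimensions for brevity) of how you might aggregate Likert responses per dimension and flag the weakest one for rework:

```python
from statistics import mean

# Hypothetical Likert responses (1 = strongly disagree, 5 = strongly agree),
# grouped by the message dimension each statement tests.
responses = {
    "clarity":         [4, 5, 3, 4, 5],
    "relevance":       [2, 3, 2, 3, 2],
    "differentiation": [3, 3, 4, 2, 3],
}

# Average each dimension, then surface the lowest-scoring one.
scores = {dim: mean(vals) for dim, vals in responses.items()}
weakest = min(scores, key=scores.get)

for dim, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{dim}: {score:.1f}")
print(f"Focus rework on: {weakest}")
```

In this made-up sample, relevance averages lowest, so that’s where the messaging rework would start. In practice you’d also segment scores by respondent seniority or role before drawing conclusions.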
Open-ended questions cannot be answered with a single word or number. They permit a broad range of responses and allow for the nuance the Likert scale lacks. As a result, you can draw out as much information as possible from your customers.
Examples of strong open-ended questions are:
These questions help you gauge how well (or not) your customers understand your product. If your customers struggle with their answers (or miss the intended point), it’s a sign you need to rework your messaging.
Review mining is the process of digging into customer reviews and surveys to see what customers think of your offering.
Using this qualitative method, you can identify the common words customers use to describe your product and uncover hidden insights, like which features they like best. Then, use this data to craft the right messaging.
For example, software solution finder Cuspera is a goldmine for Wynter customer testimonials.
As you can see, there are common themes in our reviews:
These are results from real people. Use this data to highlight your customer’s outcomes and favorite product features.
It’s worth noting that consumer trust in reviews is dipping. In 2020, BrightLocal reported that 79% of survey respondents trusted reviews as much as recommendations from family and friends. By 2022, just 49% did.
This is likely down to consumers wising up to fake reviews. Steady Demand’s SEO expert Ben Fisher says:
“It’s no surprise that Google and Amazon lead the pack here, followed by Facebook and Yelp. Google is horrible when it comes to detecting and removing fake reviews.
Amazon has always had an issue with them as well. Yelp happens to be the best when it comes to detecting and removing fake reviews, but, then again, reviews really are their business.” [via BrightLocal]
When looking for reviews to mine, be on the lookout for fake reviews as this could skew your results. Anything that sounds repetitive (key terms worded the exact same way across several reviews), overly positive (a suspicious amount of exclamation points), or like it came straight out of a boardroom deserves a skeptical eye.
The review mining process might look like this:
This process helps you gain insight into user preferences to better inform your website, landing page and ad copy, blog post topics, and more.
However, while review mining can produce useful results, it can be time-consuming. Worse still, the insights won’t uncover flaws specific to your brand’s current messaging.
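You can take some of the grunt work out of that process with a simple word-frequency pass over your reviews. This sketch uses hypothetical review text and a tiny stopword list; a real pass would pull exports from review sites and use a fuller stopword set:

```python
import re
from collections import Counter

# Hypothetical customer reviews; in practice you'd pull these from a
# review-site export or survey results.
reviews = [
    "Fast results and actionable feedback from real marketers.",
    "The feedback was actionable and the turnaround was fast.",
    "Great panel quality. Feedback arrived fast.",
]

# Minimal stopword list for the sketch; use a real one in production.
STOPWORDS = {"the", "and", "was", "from", "a", "an", "of", "to"}

def top_terms(texts, n=3):
    """Count the most frequent non-stopword terms across all reviews."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

print(top_terms(reviews))
```

Terms like “fast” and “feedback” bubbling to the top of the count are exactly the kind of customer language worth echoing in your copy.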
Website polls are questions that pop up as a customer navigates a website. Polls help you collect customers’ feedback about your product or service through open- and closed-ended message testing questions.
Common survey tools include Hotjar, Mentimeter, and Qualaroo.
Website polls work because they’re convenient for customers to answer quickly while continuing (or not) on their buying journey.
Unfortunately, poll feedback rarely surfaces specific customer issues, and there’s no way to target your ideal customers.
Heat maps are visual representations of data showing where users click on a webpage. They identify and quantify user behavior patterns such as what buttons users click and where they spend the most time on the website—and can predict what may happen next.
“Heat map” is a general term that includes:
To draw accurate conclusions from your heat map reports, use a substantial sample size. We recommend 2,000–3,000 pageviews per design screen.
Heat maps are an effective way to collect, visualize, and analyze data quickly and easily, but they’re severely limited: they only reflect the user’s actions on the site and communicate little about their motivations, such as why specific keywords, text, links, or images draw their attention.
If you are on a budget or are more interested in evaluating user experience on your website, heat maps can be a good start.
Website analytics consists of tracking website activity and how visitors interact with a site using tools such as Google Analytics and Adobe Analytics.
These tools provide insight into how users navigate a site, where they go when they visit, what they are looking at, what they click on, and other relevant information.
Various types of analytics exist, but all aim to provide insight into website behavior.
These include:
Website analytics provide great quantitative analysis but miss the emotional qualitative piece of the customer experience.
Quantitative message testing and A/B testing are both forms of empirical testing, but they’re often inaccurately lumped together. They are not the same thing.
A/B testing is a method in which two different versions of a web page or website are created and tested against each other. It does involve testing, but it doesn't determine what customers find important or how they interpret your message.
Message testing gives you all these details.
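To make the distinction concrete, here’s a minimal sketch (with hypothetical traffic numbers) of what an A/B test actually measures: a two-proportion z-test telling you which variant converts better, and nothing about why:

```python
import math

# Hypothetical A/B test results: (visitors, conversions) per variant.
variant_a = (1000, 50)   # current headline, 5.0% conversion
variant_b = (1000, 72)   # new headline, 7.2% conversion

def z_score(a, b):
    """Two-proportion z-test: is the conversion difference likely real?"""
    (na, ca), (nb, cb) = a, b
    pa, pb = ca / na, cb / nb
    pooled = (ca + cb) / (na + nb)
    se = math.sqrt(pooled * (1 - pooled) * (1 / na + 1 / nb))
    return (pb - pa) / se

z = z_score(variant_a, variant_b)
print(f"z = {z:.2f}  (|z| > 1.96 is significant at the 95% level)")
```

Even when the result is statistically significant, all you learn is that version B won; message testing is what tells you which words did the work and why they resonated.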
The method of messaging research you choose will depend on your needs and resources. For a more comprehensive understanding, let’s look at three companies and their message testing processes.
In this section, we’ll look at three B2B companies whose messaging we tested at Wynter. We’ll explore their case from problem to solution, including target audience responses and our recommendations.
PandaDoc is a cloud-based SaaS company helping users create proposals, quotes, contracts, and more. But you wouldn’t know it based on their early messaging.
For PandaDoc, we put together a 50-person marketing panel, analyzed the results, and found that the overall brand messaging was confusing.
The company uses unclear, generic words like “on-brand docs.” “On-brand” left marketers uncertain, and the word “docs” made some think the company was targeting doctors.
PandaDoc is suffering from a lack of clarity. They could mine their reviews for commonly used words; in this case, “professional look” appeared multiple times, making it a great term to add to their marketing efforts.
Apart from this, “on-brand docs” isn’t the most essential value proposition for the target market. It left marketers wondering, “why does this matter?”:
Because there is plenty of competition in this market, we’d recommend differentiation and clarification:
Another missed opportunity is the sub-heading “easy-to-use, hyper-personalized sales documents.”
Although clear, it provides no information on how PandaDoc provides more personalized documents than anyone else. Spell out the customer relevance by identifying the problem and selling the customer on it in their own words.
Finally, PandaDoc should back everything up with social proof: add the specific number of businesses using their software (or that have switched from a competitor) and include authentic customer testimonials to build trust.
Metadata is an AI demand generation platform for B2B marketing companies that automates mundane tasks.
The first thing we noticed was the graphics. There are too many elements on the page fighting for our attention, namely Benjamin Franklin and the woman. This detracts from our focus on the value proposition.
The copy also lacks consistency; the rest of the homepage doesn’t match the quirkiness of the material above the fold.
Apart from toning down their hero section and aiming for a more consistent tone of voice throughout, Metadata could also aim for specificity and clarity.
“Drive more revenue” is an overused, obvious, non-specific phrase. Instead of this buzzphrase, Metadata should explain what the platform does. By the end of the landing page, we still don’t know how it automates paid campaigns, and neither did the audience.
Finally, Metadata should include better examples of differentiation. They use a vague chart comparing themselves to other companies but never mention what makes them different from competitors.
This chart doesn’t clarify who the competitors are. There are logos but no company names, so it leaves customers to do the guesswork.
To more clearly demonstrate where it excels, Metadata could use a comparison chart more like the one featured on their ROI page.
This gives customers a clear view of their competition and how they stack up against them.
Find more insights on Metadata, like we did for PandaDoc, in this Google Sheet.
Loom is a screen recording application allowing users to record audio, video, browser windows, or whole screens. Loom does many good things on its website, but the messaging could be clearer, according to our panel.
We put this in front of project management directors to find the problems.
First, the use cases are unclear. The photo shows “Q1 Closed Deals,” which focuses on sales, but this is an excellent opportunity to showcase more use cases and call out target users other than salespeople.
There’s also a lack of differentiation. It’s clear that it’s a video service, but how is this different from Zoom or Camtasia? Loom fails to communicate this well.
Lastly, distracting visuals draw our attention away from the value prop. The video of the woman distracts from the product preview. They should feature this in another section on the page or do away with it entirely.
Loom needs to explain who it’s for. Since the company is relatively new (founded in 2016), the copy needs to be explicit about their offer and their audience.
Loom could list several popular use cases for their video software and why customers choose them over competitors (the ability to send video links instead of files, notifications when the video has been viewed, etc.).
Many customers requested more information on how it works: does it record your whole screen? Can you limit the recording area? Do you have to be in a circle alongside your screenshare?
Finally, Loom should also feature their value proposition more prominently, as right now the hero section is dominated by the graphics.
Find more insights on Loom in this Google Sheet.
If you want to connect with customers, message testing must be part of your marketing strategy.
Without it, you’re guessing whether your marketing communications resonate with customers and potentially wasting valuable time and money on misguided campaigns.
Luckily, there are several ways to test your messages—both before and after launch. Validate your copy, resonate with your audience, and increase conversion rates with message testing.
Out now: Watch our free B2B messaging course and learn all the techniques (from basic to advanced) to create messaging that resonates with your target customers.