It may seem almost impossible to distance yourself from judging designs and your brand against personal preference, but it’s really important to put your target audience first when making decisions about web design, branding and advertising.
Quite a few years ago now I attended an APG event titled ‘On the contrary’, where the words of esteemed marketing professor Mark Ritson have forever been etched in my mind: ‘You’re not the customer’...
Just like parents with their own kids, brand managers think their brand is beautiful. Once you start working for a company, you stop being a customer of your brand.
This is so true. When ‘meeting’ a new company for the first time, usually via a brief, I like to retain as much of a ‘fresh pair of eyes’ objective viewpoint of their website and brand as possible. Once you become familiar with them, you no longer have that objective point of view (you start to know too much and have access to more information than the customer). As a marketer you may also find that you’re already biased when assessing subjective matters, because you’re more aware of what the marketing is trying to achieve. This is why data comes in handy: it can cut through personal opinion and provide evidence of strengths and weaknesses.
For more thought-provoking content on your own personal biases, check out this podcast from Hidden Brain: The Double Standard.
It’s easy to spot bias in other people, especially those with whom we disagree. But it’s not so easy to recognise our own biases. Psychologist Emily Pronin says this is partly because of our brain architecture. In this episode, Hidden Brain explores what Pronin calls the ‘introspection illusion’.
Use data objectively to make decisions
Recently I undertook some website performance analysis for a client where I focused on what their Google Analytics had to say about how their website was being used, and what recommendations should be made based on their targets and expectations.
Not all the information presented was well received, as it jarred with their pre-existing decision making and marketing activity; however, you can’t argue with the results. Yes, data is open to interpretation, but if an interpretation lacks conviction then there is definitely room to consider alternatives.
I would advise being open-minded when reviewing your web analytics data. Don’t be precious over the stats just because you’ve spent loads of time nurturing your social media presence, writing blogs, or spending on PPC. And don’t try to paint a picture that supports this effort if the evidence is tenuous. If you’re tracking correctly, the data should indicate where you may need to pivot if you’re not getting the desired impact.
Sometimes a peer review, or a third party reviewing activity with a fresh pair of eyes, can offer the objective analysis you need to make pragmatic decisions.
Likewise, search is riddled with tension between fact and personal preference. To meet search demand you need to align with what the user is searching for, not what you’d like them to search for. Misspellings make this a little tricky: you don’t want to purposefully spell things wrong, but you do want to align with what the customer is looking for (and they’re probably none the wiser if they’re spelling something incorrectly). So tread carefully with poor spellings and find the balance between matching customer expectations and conforming to brand guidelines.
Test with your target audience
Now, I’m not the target audience for every brief that comes in, so again I can’t pretend that my opinion is any less flawed, but my naivety spurs me to seek out audience insights: finding people who are, or could be, customers and asking them; promoting a culture of testing and experimenting with live activity to gather data that infers preference; and evaluating the category norms.
With conversion rate optimisation
Recently we finished a project whose outcome perfectly illustrates how personal opinion can fail to align with customer preference. We’ve been working with postal gold buyer Postgoldforcash.com for the past 10 years (so you’d think we’d know them and their customer base quite well by now), and over that time we’ve launched a few A/B tests to evolve their website performance. In most cases these tests didn’t really move the needle on performance, but our last, most comprehensive conversion rate project really surprised us.
For full details of the case study click here. To summarise, we deployed a series of landing page tests in pursuit of a better conversion rate. These tests were mostly iterative, concentrating on changes to the call to action, the colour palette and the introduction of more urgency into the decision making. With each new test we made headway with tiny improvements to conversion rate, but we really wanted to push the boundaries, so we came up with a completely different redesign spanning colour palette, tone of voice and visual style.
In all honesty, none of the Adido team (including the designer!), nor the client, actually liked the newer design, but its construction was born out of thinking ‘what would the customer like?’ and ‘what would be best for them?’ We put it in front of some members of the client’s team who were more attuned to the target audience, and their reaction was a lot more positive. This gave us the confidence to test with our live PPC audience, and following a period of statistically significant testing it was (surprisingly) proven to be the winner!
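If you want to check significance yourself rather than rely on a testing tool, the standard approach for comparing two conversion rates is a two-proportion z-test. Here is a minimal sketch in Python (standard library only); the visitor and conversion numbers are hypothetical, not the actual project data.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-tailed z-test for the difference between two conversion rates.

    conv_a/n_a: conversions and visitors for the control page
    conv_b/n_b: conversions and visitors for the variant page
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120 conversions from 4,000 visitors on the control,
# 156 conversions from 4,000 visitors on the redesign
z, p = two_proportion_z_test(120, 4000, 156, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If p is below your chosen threshold (commonly 0.05), the uplift is unlikely to be noise; if not, keep the test running rather than calling a winner early.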
Objectively reviewing the data clearly had a bearing on the final decision to roll out a site design the client was less keen on, but they couldn’t argue with the impact it had on results. At the end of the day, that’s all that really matters.
So, when reviewing designs, try to distance yourself from personal preference and think about your target audience. Wherever possible you should seek out their feedback, whether qualitative or quantitative, and test with enough data to draw concrete conclusions. That is the beauty of conversion rate optimisation: you get to see what actually happens, without relying on the post-rationalised answer you’d get if you asked someone why they did what they did.
I honestly don’t think we use conversion rate optimisation enough, and whilst this is an article championing the need to engage your target audience with designs and content, it is also a piece about inspiring us all to experiment more.
It can seem like an (exhausting) never-ending journey of ideas, tests and changes, but we should embrace this advantage of digital activity: observe, analyse, test and evolve to push website and marketing performance further.
From an advertising perspective we often lean on behavioural science to help formulate tests and determine what motivates the customer to take action.
One of our B2B clients has always led with a powerful statement about the total number of people in their community, but we wanted to know whether it was the size of the whole, the relevance of a smaller number (of people and/or companies), or simply a strong call to action that would get people to act on their social ads.
We were testing cognitive biases, with a particular focus on social proof in this example. For this professional audience we weren’t confident about what would drive click intent, but it turned out that the relevance of the audience size amongst peers was the most impactful. This aligns with behavioural science theory (relevant social proof), but it was the first time the client had been prompted to consider that the total number may not be as persuasive as the smaller, more relevant one.
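When you’re comparing more than two ad variants at once, a chi-square test across all of them tells you whether click-through rates genuinely differ before you start picking a winner. Here is a rough sketch in Python; the three variants and their click counts are invented for illustration, not the client’s real figures.

```python
def chi_square_stat(observed):
    """Chi-square statistic for variant vs. (click / no-click) counts.

    observed: list of (clicks, impressions) tuples, one per ad variant.
    Compares each variant against the click rate you'd expect if all
    variants performed identically.
    """
    total_clicks = sum(c for c, _ in observed)
    total_imps = sum(n for _, n in observed)
    overall_rate = total_clicks / total_imps
    stat = 0.0
    for clicks, imps in observed:
        exp_click = imps * overall_rate          # expected clicks
        exp_no = imps * (1 - overall_rate)       # expected non-clicks
        stat += (clicks - exp_click) ** 2 / exp_click
        stat += ((imps - clicks) - exp_no) ** 2 / exp_no
    return stat

# Hypothetical click counts for three message angles:
# total community size, smaller relevant number, plain call to action
variants = [(90, 6000), (130, 6000), (95, 6000)]
stat = chi_square_stat(variants)
# Critical value for 2 degrees of freedom at the 5% level is 5.991
print(f"chi-square = {stat:.2f}, significant = {stat > 5.991}")
```

If the statistic clears the critical value, at least one variant really does behave differently, and you can then look at which message is pulling ahead.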
Testing messages in ad copy is far easier, quicker and cheaper than changing visual elements of a website, but both require putting the customer first. If your message doesn’t resonate, or your website doesn’t meet expectations, then your efforts to engage will be wasted.
If you’re going through a website design or redesign, or deciding on ad copy for a campaign, and you have some polarising (personal) opinions, then as long as they’re not likely to damage the brand or business it’s definitely worth being willing to test them all. Know what it is about each that makes it special, right or appropriate (the hypothesis) and ensure they are all evaluated on the same basis.
With user experience baked into your web design
Long gone are the days when personal preference dictated the look and feel of your brand’s digital presence. Done correctly, crafting a user experience will give your customers what they want, when they want it, delivered how they would expect it.
There is a lot that goes into web design that turns it from a pretty-looking website into a high-performing one. Never underestimate the role user experience plays in crafting an effective website.
Five-second tests on design elements and user testing with prototypes can form part of the UX phase of a web project, bringing the target audience into your world. It’s really important to stay connected with WHO you’re building this website for, and to gain as much input from them as possible to steer your designs in the most effective direction.
So, all that’s left to conclude is this: every time you notice yourself making a creative decision based on personal preference, pause and ask whether it should be tested rather than committed to straight away. Can it be tested? Can it be validated by your target audience via qualitative or quantitative methods? A straw poll with a few people in your target audience, whilst not statistically significant, is better than relying on your own personal opinion.
Remember. You are not (and never will be) the customer. You know too much.
You talk to your audience in their language, not yours. That's the only route into their minds.