In a previous blog post I wrote about avoiding ‘fake’ KPIs when judging performance. I use the term ‘fake’ because in most cases they aren’t the true key performance indicators, just the ones we’ve become used to reporting on. In the same post I also highlighted the need to observe averages rather than absolutes.

I want to take this recommendation a step further and, to borrow a phrase from <intent>’s Richard Harris, encourage you to de-average your users. In other words, go beyond a single average and explore multiple averages based on behaviour.

Basic explanations of this principle could lead you to believe it’s just another way of talking about segmentation, as it aims to find a range of answers related to customer type: those who simply browse vs. high-value shoppers vs. loyal customers, and so forth, judging each group’s web behaviour against conversion rates and traffic sources.

However, <intent> argue that segmentation and de-averaging are not the same:

  • First, de-averaging is done in real-time. It’s not an asynchronous segmentation/execution model. It has to happen in milliseconds.

  • Second, it requires predictive analytics to make the decision on what to do... In most cases the consumer you are applying the decision to does not even exist in your database (or if they do, you don’t know it).

If you were to segment your web traffic into percentiles, you could plot a curve showing conversion rate rising from a very low level (bouncers) to an extremely high level (converters). Your average site conversion rate is drawn from the performances of these clusters.

[Figure: segmenting web traffic into percentiles and plotting conversion rate]
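To make the idea concrete, here is a minimal sketch of that percentile view using synthetic data. The engagement scores and the 3% site-wide conversion rate are illustrative assumptions, not figures from the post; in practice you would export real session data from your analytics tool.

```python
import random

random.seed(42)

# Hypothetical session data: each session gets an engagement score in [0, 1]
# and a conversion outcome. Conversion probability is tied to engagement,
# mimicking the curve from bouncers up to converters.
sessions = []
for _ in range(10_000):
    engagement = random.random()
    converted = random.random() < engagement * 0.06  # ~3% site-wide average
    sessions.append((engagement, converted))

# Sort by engagement and split into ten percentile buckets (deciles).
sessions.sort(key=lambda s: s[0])
bucket_size = len(sessions) // 10
for i in range(10):
    bucket = sessions[i * bucket_size:(i + 1) * bucket_size]
    rate = sum(c for _, c in bucket) / len(bucket)
    print(f"decile {i + 1}: conversion rate {rate:.1%}")

site_average = sum(c for _, c in sessions) / len(sessions)
print(f"site average: {site_average:.1%}")
```

The site average sits somewhere in the middle of the curve, which is exactly why optimising to it alone hides the very different behaviour at each end.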

What de-averaging your user base does is create additional opportunities to communicate with each cluster in real time, with a level of investment matched to its likely value.

[Figure: best means of communicating to different web traffic segments]

The trick is to know which cluster someone belongs to the moment they land on your site (in real time), even when they’re a first-time visitor. To gain access to personalisation tools like <intent>, your website needs at least 200,000 users per week for the machine to learn and predict. For most businesses this is far beyond the realm of possibility, so how can you apply this logic if you have fewer?

Google Ads offers in-market audience targeting, which can be applied as bid adjustments on your paid search ads. Google has already clustered audience groups based on intent to convert, which is a step up from targeting everyone with the same intensity.

Remarketing Lists for Search Ads (RLSA) let you use your own traffic data to cluster audience groups and bid higher or lower depending on recent behaviour on your site, or on customer purchase history.

Tagging your website to generate new audience clusters based on dwell time, page depth or scroll depth, while also factoring in the source of traffic, could open up new avenues to identify high-worth and low-value visitors, insight which could then be fed back into campaign optimisation.
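As a rough illustration of that clustering step, here is a sketch that scores a visitor from those three signals. The thresholds, weights and cluster names are all assumptions for the sake of the example, not recommendations; you would tune them against your own data.

```python
def engagement_cluster(dwell_seconds: float, pages_viewed: int,
                       max_scroll_pct: float) -> str:
    """Assign a visitor to a coarse value cluster from three behavioural
    signals: time on site, page depth and maximum scroll depth."""
    score = 0
    if dwell_seconds >= 60:    # stayed at least a minute
        score += 1
    if pages_viewed >= 3:      # went beyond the landing page
        score += 1
    if max_scroll_pct >= 75:   # read most of at least one page
        score += 1
    return {0: "bouncer", 1: "browser", 2: "engaged", 3: "high-intent"}[score]

# A visitor who stayed 2 minutes, saw 4 pages and scrolled 80%:
print(engagement_cluster(120, 4, 80))   # high-intent
# A visitor who left after 5 seconds on one page:
print(engagement_cluster(5, 1, 10))     # bouncer
```

Each cluster label could then be pushed back into your ad platform or automation tool as an audience list, so bids and messaging vary by likely value rather than by the site-wide average.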

Marketing automation tools like HubSpot, Force24 and Dotdigital can flag site visitors and tag them based on behavioural patterns. These tags can then trigger automated email campaigns that follow up on observed behaviour, or adjust landing pages to personalise content for maximum impact.

Whilst the above ideas may not be de-averaging as <intent> intends, applying the principles behind it to your marketing strategies could open up new avenues for optimisation and personalisation that reap greater rewards.

Meet the author ...

Kherrin Wade

Strategy Director

Kherrin works with clients to develop effective marketing strategies, whether that's introducing brands to digital for the first time or pushing the boundaries with more ...