What even is Measurefest?
Measurefest is an annually held conference all about the wonders of Web Analytics, Big Data, Business Intelligence, and Conversion Rate Optimisation (CRO). This year the conference was held at the beautiful Barbican Centre in Central London. Once we had all navigated our way through the maze of corridors, caffeinated up and devoured the weight of a small horse in muffins, we descended into the rather lava lamp-esque theatre. Pencils, laptops, and Twitter accounts at the ready, it was time to learn.
Data > Gut instinct
We all know that person who has been in the biz that bit longer and thinks they know best. They just have that gut feeling, so their argument wins, right? Well, only if the data agrees. Welcome to the stage Guillaume of Moonpig, here to equip us against those gut-instinct gurus who think their experience gives them commanding power.
But first, who is this man? Mr Guillaume Lombard is Moonpig’s “Product Analyst and Optimiser” – their pro A/B tester, their Data Master, one might say. This is an important role to consider in a data-driven company, as each team should know its own data but may not understand that of another. By having a dedicated data analyst position, this person (or people) can understand each team’s needs and ideas, and the priority of these, to benefit the overall evolution of the business. Further to this, testing can happen more often, meaning faster optimisation and better, faster results.
So, our weapon, you ask? DATA. You can’t argue with the facts. But you must be careful too: data must be statistically significant, relevant, and important to the question at hand. Thus, it pays to follow a well-structured testing strategy:
The Approach
1. Formulate a hypothesis and establish the importance of the outcome
E.g. if the design team wants to try a fancy new button in the checkout process but the SEO team wants to test a new landing page for a product, which is more important to test? End the argument by looking at the data. If the checkout process has a low drop-off rate yet the product page is losing users all over the place, it is clear that the SEO team wins this round.
2. Run the test
3. Report results
Tailor what you are looking for to each test:
- What are the KPIS?
- What is good to one team may appear bad to another, so establish this beforehand to give a common understanding
- What counts as statistical significance?
- Keep a record of how the test was done and why; this gives future context for similar or repeat tests
Guillaume’s key takeaways:
- Let the data do the talking to win an argument
- Test, test and keep testing
- Never underestimate a test
- Data rules
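To make “statistical significance” concrete: one common way to compare two test variants is a two-proportion z-test. A minimal sketch in Python – the function name, variants, and numbers here are hypothetical, not from Guillaume’s talk:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: control converts 200/5000 visitors, variant 260/5000.
z = two_proportion_z(200, 5000, 260, 5000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

A |z| above 1.96 corresponds to significance at the conventional 95% level; in practice a stats library (or your testing tool) will also give you a p-value and confidence interval.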
Why you shouldn’t trust GA with your life... or account
Step up Mr Tom Capper, Analytics Consultant at Distilled, to blow our minds and make us re-evaluate our relationships with Google Analytics.
For starters, here is a little taster of MISLEADING metrics:
Average Time on Page:
One may be silly enough to think that GA would define this metric as:
“The average amount of time users spent viewing a specified page or screen, or set of pages or screens”
Ha! What a fool! – just like the rest of us in the room. GA actually defines it as:
“Average time on page” = (“Time on page”) / (Pageviews – Exits)
Thus, the following example will destroy any of your logical/mathematical sense. Imagine three pageviews lasting 20s, 10s, and 0s, two of which are exits. By the laws of the (mean) average, the time on page would be:
Average time on page = (20s + 10s + 0s) / 3 = 10s
But in fact, GA calculates this metric as:
Average time on page = (20s + 10s + 0s) / (3 Pageviews – 2 Exits)
= (30s) / 1
= 30s
So, GA concludes that the AVERAGE time on page is greater than any session time involved in the calculation. I may as well throw my Engineering Maths degree out the window now.
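Tom’s formula is easy to replay in a few lines of Python. Using the three hypothetical pageviews from the example above (20s, 10s, and 0s, two of which are exits), you can see the GA figure drift away from the true mean:

```python
times = [20, 10, 0]   # seconds on page for three pageviews
exits = 2             # two of those pageviews ended the session

true_mean = sum(times) / len(times)             # (20 + 10 + 0) / 3  = 10s
ga_average = sum(times) / (len(times) - exits)  # 30 / (3 - 2)       = 30s

print(f"True mean: {true_mean}s, GA 'Avg. Time on Page': {ga_average}s")
```

The fewer non-exit pageviews you have, the smaller the denominator – and the more inflated the reported average becomes.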
Sessions and average session duration are more metrics to be wary of. The midnight strike of the clock can cut one visit into two separately recorded sessions. Getting a cuppa, or leaving a tab open beyond the inactivity timeout, will also lead to one visit being recorded as two. If one person does this, no need to fret; but if a thousand people disappeared to grab a cuppa and got distracted by the biscuit tin… this could wreak havoc on your data.
Though the timeout can be changed, you will always have a “midnight” splitting point, and the problem of distinguishing genuinely returning users from those who were merely scrolling their Twitter feed.
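To see how one visit becomes multiple “sessions”, here is a rough sessionisation sketch. It assumes GA’s default 30-minute inactivity timeout plus the midnight cut; the function and timestamps are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)  # GA's default inactivity window

def count_sessions(hits):
    """Count sessions: split on gaps over 30 minutes and on date changes."""
    sessions = 1
    for prev, curr in zip(hits, hits[1:]):
        if curr - prev > TIMEOUT or curr.date() != prev.date():
            sessions += 1
    return sessions

# One visitor, three hits: a 40-minute tea break, then a midnight crossing.
hits = [datetime(2019, 5, 9, 23, 10),
        datetime(2019, 5, 9, 23, 50),   # >30 min gap: counted as a new session
        datetime(2019, 5, 10, 0, 5)]    # new day: counted as yet another session
print(count_sessions(hits))
```

One continuous visit, three reported sessions – which also quietly drags down your average session duration.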
Tom’s presentation explained a few other metrics to be aware of:
- Avg. Session Duration
- Avg. Time on Page
- Bounce Rate
- Pages/ Session
- Exit Rate
- Social Interactions
- Scroll Depth
- Micro-Conversions (e.g. Signup)
Further key points:
- Don’t assume what GA is on about – you will end up misleading yourself and misinterpreting the data
- Know your metrics, and which ones are important to your account.
- Nerd up on all things GA.
In this blog, I have merely dipped my toe into the pond lake ocean of knowledge that was shared that day. The number of amazing insights, tips, and tricks I learnt would cause me to write a blog nearly as long as my dissertation, and I don’t believe anyone out there really wants to trawl through all my musings. So, I strongly advise you go check out all the MeasureFest speakers' talks, and head down to MeasureFest next year! I look forward to seeing you there!