Chris Cherrett presented his keynote from Attention: The Digital Marketing Summit.
Usability and testing are explored so we can question why something works the way it does and create tests to find the answer.
Confirmation bias is the tendency to selectively search for information that confirms our beliefs; the idea being that if you think you already know, you won't test an issue or trend.
The end of the build is the start of the project: build, measure, and learn about different and better ways of doing things.
Watch the keynote here: https://www.youtube.com/watch?v=gd5UhPMJ5XE
7. CONFIRMATION BIAS
ARE YOU BEING OBJECTIVE?
• Your brain is trying to find evidence to prove what
you already think.
• The trap is to accept what you know in your head, feel in your heart, and just "know" to be true.
• Confirmation Bias shows how limiting your own
perspective can be.
• Sometimes it's not even your perspective, but the perspective of your seniors.
• HiPPO (Highest Paid Person's Opinion)
minimalbias.com
8. OTHER BIASES
ARE YOU BEING OBJECTIVE?
AMBIGUITY EFFECT
Choosing something with a known probability over something
unknown.
SELECTIVE PERCEPTION
The tendency for expectations to affect perception.
STATUS QUO BIAS
The tendency to like things to stay relatively the same.
BANDWAGON EFFECT
The tendency to do or believe things because many other
people do or believe the same.
9. Bias blind spot
a bias that allows you to view yourself as less biased than other people
Disagree?
10. YOU’RE NOT TESTING
YOU THINK YOU ALREADY KNOW
• If you think you already know, you might not test an
issue or trend.
• You're so convinced by your understanding of the problem that you miss the opportunity to see the truth.
HOW TO TEST
• Question why something works the way it does.
• Create and run tests to find the answer.
11. REINFORCEMENT
YOU'RE ONLY LOOKING AT DATA THAT
WILL REINFORCE YOUR BELIEFS
• Even if you are testing, confirmation bias can still
cause problems.
• If you believe users love your Buy Now button, you
might test changing the main image instead, rather
than testing your assumption.
• Even if real data from real tests reveal something
negative, wouldn't you rather know as early as
possible?
13. The end of the build,
the start of the project
Your first encounter with a design makes you
think it’s the best, and you’re less likely to test it.
16. DEFINE SUCCESS
DEFINE QUANTIFIABLE SUCCESS METRICS
• Are the things you're moving and changing actually helping your bottom line?
• Understand the user flow and identify bottlenecks (drop out, bounce, don't convert).
• Great tools to quantify your success metrics:
• Google Analytics
• HotJar
• Inspectlet
• Mixpanel
• Kissmetrics
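The "identify bottlenecks" step can be sketched in a few lines. This is a minimal illustration, not from the talk: the funnel step names and visitor counts below are invented example data, and `drop_off_report` is a hypothetical helper.

```python
# Hypothetical sketch: measure step-to-step drop-off in a user flow
# to find the bottleneck. All funnel numbers are invented examples.

funnel = [
    ("Landing page", 10_000),
    ("Product page", 4_200),
    ("Add to cart", 1_100),
    ("Checkout", 600),
    ("Purchase", 450),
]

def drop_off_report(steps):
    """Return (step, continue-rate-from-previous, drop-off-rate) tuples."""
    report = []
    for (_, prev_n), (name, n) in zip(steps, steps[1:]):
        rate = n / prev_n
        report.append((name, rate, 1 - rate))
    return report

for name, rate, drop in drop_off_report(funnel):
    print(f"{name}: {rate:.1%} continue, {drop:.1%} drop off")
```

The step with the largest drop-off rate is the bottleneck worth testing first; tools like Google Analytics or Mixpanel surface the same funnel numbers without hand-rolled code.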
24. START TODAY
GREAT TOOLS
• Always reset your metrics every time you make a
change.
• As with most things – use the tool that meets your
goals, but define your success metrics ahead of
time for optimal tool selection.
27. Reduce time and risk
Reduce the time it takes to deliver your best
performing variant to the majority of your traffic.
Reduce the risk of lost conversions because users
aren’t being served the better variant.
30. MULTI-ARMED BANDIT
SCENARIO
A gambler stands in front of a row of slot machines.
Each machine varies in probability of reward.
The objective of the gambler is to maximise the sum of
rewards earned through the minimum number of lever pulls.
QUESTIONS
• Which machines do they play?
• How many times do they play each machine?
• What order do they play them in?
N-ARMED BANDIT PROBLEM
31. THE RULE IS…
Try the variant with the highest average reward (conversion rate).
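The rule above is the "exploit" half of a bandit strategy. A minimal sketch, not from the talk, is epsilon-greedy: mostly pull the arm (variant) with the highest observed average reward, but explore a random arm a small fraction of the time. The true conversion rates, the `eps` value, and the pull count below are all invented for the simulation.

```python
import random

def epsilon_greedy(true_rates, pulls=10_000, eps=0.1, seed=0):
    """Simulate an epsilon-greedy bandit over arms with the given
    (invented) true conversion rates. Returns pulls and wins per arm."""
    rng = random.Random(seed)
    n = [0] * len(true_rates)     # times each arm was pulled
    wins = [0] * len(true_rates)  # conversions observed per arm
    for _ in range(pulls):
        if rng.random() < eps or sum(n) == 0:
            arm = rng.randrange(len(true_rates))  # explore
        else:
            # exploit: the arm with the highest average reward so far
            arm = max(range(len(true_rates)),
                      key=lambda a: wins[a] / n[a] if n[a] else 0.0)
        n[arm] += 1
        wins[arm] += rng.random() < true_rates[arm]  # Bernoulli reward
    return n, wins

n, wins = epsilon_greedy([0.02, 0.05, 0.04])
print("pulls per arm:", n)  # traffic concentrates on the better arms
```

Because exploitation keeps routing pulls to whichever arm looks best, traffic shifts toward the strongest variant while the experiment is still running, which is exactly the "reduce time and risk" argument above.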
45. A/B TESTING vs MULTI-ARMED BANDIT
Chart comparing the two approaches on the loss in conversions due to poorly performing versions and the time taken to reach statistical significance.
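For context on the comparison: a classic A/B test splits traffic evenly and waits for statistical significance, commonly judged with a two-proportion z-test. The sketch below is not from the talk, and the visitor and conversion counts are invented; during the wait for significance, half the traffic keeps seeing the weaker variant, which is the "loss in conversions" a bandit reduces.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic and two-sided p-value comparing the conversion
    rates of variants A and B (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: 4.8% vs 5.6% conversion over 10,000 visitors each
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With small rate differences, reaching a small p-value takes many thousands of visitors per variant, which is the "time taken for statistical significance" axis of the chart.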
46. SUMMARY TO TAKE AWAY
• Watch out for Confirmation Bias. Are you limited by your own perspective?
• Are you testing enough? Every design is a hypothesis.
• The end of the build is the start of the project.
• What are your success metrics? How do you measure success?
• Use the tools that meet your goals.
• Learn quickly and fail fast.