Ever tried to build a jigsaw puzzle without knowing what the picture is? It’s harder than you think. With so many pieces to choose from, most attempts to start send you down the wrong path and force you to begin again. Eventually you get frustrated, say “Screw it!” and throw all the pieces back in the box until another day – one that never comes.
Understanding what you are looking at in Google Analytics can be just as frustrating, especially when all you want to know is whether your website, email or banner ads are doing what you want them to do.
Having made plenty of mistakes early in my career trying to understand what’s happening online and what visitors are actually doing, I have learned a lot. From those initial mistakes I’ve developed a few key guidelines that help me make sense of the piles of data I now have access to. Here they are.
1. What are your objectives? What data do you need to look at?
Most people just want to dive straight into the data to see the basics – how many visitors have landed on the site, how long they’re staying, what they’re looking at, and how many of them bounced. This is great, but essentially meaningless unless you know what success for your website, email or banner ad really looks like. Always start with clear objectives, so when you’re successful you’ll know it.
I had a client who wanted us to redesign their newsletter, providing us with content around lifestyle, technology, and how-to topics. Initially, they were very happy with the newsletter’s open rate – an impressive 53%, especially for a mailing list of 100,000 recipients per month. But the objective of the email was to engage the customer with content, so the open rate didn’t tell the whole story.
A deeper dive into the data found that only 5%–10% of the customers who opened the email were actually clicking through to that content, meaning the emails were not engaging customers the way the client had thought. They were looking at a single piece of the puzzle and seeing success; adding another metric revealed much more of the picture. Setting your objectives and deciding what data you need to look at will help you understand how successful your campaign really is.
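To make the arithmetic concrete, here’s a quick sketch in Python. The numbers are hypothetical (roughly in line with the example above, not the client’s actual figures); the point is that the click-to-open rate, not the open rate, is what measures engagement with the content:

```python
# Hypothetical numbers, roughly matching the example above.
recipients = 100_000
opens = 53_000    # a 53% open rate looks like a win on its own
clicks = 4_000    # clicks through to the lifestyle/tech/how-to content

open_rate = opens / recipients        # 53% - the number the client saw
click_to_open_rate = clicks / opens   # ~7.5% - the number that matters
click_rate = clicks / recipients      # ~4% of the whole list engaged

print(f"Open rate:          {open_rate:.1%}")
print(f"Click-to-open rate: {click_to_open_rate:.1%}")
print(f"Overall click rate: {click_rate:.1%}")
```

Each metric answers a different question: the open rate measures your subject line, while the click-to-open rate measures whether the content did its job.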
2. Understand the definitions.
I have to admit that the most confusing part of any analytics system is understanding what each data point really means, because those definitions change how you read the data in front of you. What’s the difference between a visitor and a unique visitor? Between a bounce and an exit page? A bounce, for example, is a single-page session, while an exit simply marks the last page of any session. Terms like these sound very close in meaning but give you very different views of the data.
Another client, whose existing website I worked on, was very pleased with the number of visitors to the site – a few thousand per day – but was concerned about their bounce rate, a pretty high 70%–80%. Their understanding of a bounce was correct – a visitor who lands on the site, doesn’t engage, and quickly leaves. So they wanted to know why visitors were leaving right after landing.
While they had clearly outlined objectives for the site, their reading of the bounce rate was flawed. The CTA (Call to Action) was sending visitors to a separate site – a desired result that Google Analytics nonetheless scores as a bounce, because the visitor views only one page before leaving. The client was still reaching their objectives; they just weren’t accounting for the fact that the CTA was deliberately sending people away. Once we subtracted the visitors who were following through on the CTA, the real bounce rate dropped to about 25%–30%. Understanding how the definition of a bounce applied to their site, and what the data truly meant, gave them a much clearer view of the site’s performance.
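Here’s a minimal sketch of that adjustment, again with hypothetical numbers: single-page sessions that followed the CTA off-site get pulled out of the bounce count before the rate is calculated.

```python
# Hypothetical numbers matching the ranges described above.
sessions = 10_000
bounces = 7_500          # raw bounce rate: 75%
cta_click_aways = 5_000  # single-page sessions that clicked the CTA off-site

raw_bounce_rate = bounces / sessions
adjusted_bounce_rate = (bounces - cta_click_aways) / sessions

print(f"Raw bounce rate:      {raw_bounce_rate:.0%}")       # 75%
print(f"Adjusted bounce rate: {adjusted_bounce_rate:.0%}")  # 25%
```

Depending on your setup, a cleaner fix is to track the CTA click as an interaction event in Google Analytics, so those sessions stop counting as bounces in the first place.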
3. Lies, Damned Lies and Statistics. Question what you are looking at.
I had a second-year statistics professor who often cited this quote:
“There are three kinds of lies: lies, damned lies, and statistics.” – attributed to Benjamin Disraeli
It has stuck in my head ever since. What he was trying to teach us was that statistics are only as good as the data they come from. Questioning the data and knowing where it comes from – understanding the percentage, index number or chart you are looking at – is critical.
I was developing a dashboard for a client showing the results of every channel in play, giving them a top-line view of whether the campaign was working. We had several animated banners running on different sites, and when the dashboard was complete and we reviewed the charts internally, we noticed that one site was outperforming all the rest. This didn’t make sense to me. We were only a few weeks into the campaign, and the other banners were underperforming with a CTR (Click-Through Rate) of 0.06%–0.08%. How was this one banner delivering a 1.08% CTR?
Reviewing the raw data, we discovered that the high-performing banner had served far fewer impressions than any of the other banners. That’s like claiming a 50% open rate when you only sent the email to two people. The data was telling us a statistical lie: the sample was too small to be meaningful. By questioning the data and understanding where it comes from, you can avoid falling for a statistical lie and believing your campaign is succeeding when it is actually failing.
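One way to catch this kind of statistical lie is to put a confidence interval around the CTR instead of trusting the point estimate. Here’s a sketch using the Wilson score interval, with hypothetical click and impression counts: with only a couple of thousand impressions, the “winning” banner’s CTR could plausibly sit anywhere in a wide range.

```python
import math

def wilson_interval(clicks: int, impressions: int, z: float = 1.96):
    """95% Wilson score confidence interval for a click-through rate."""
    if impressions == 0:
        return 0.0, 1.0
    p = clicks / impressions
    denom = 1 + z**2 / impressions
    center = (p + z**2 / (2 * impressions)) / denom
    half = z * math.sqrt(p * (1 - p) / impressions
                         + z**2 / (4 * impressions**2)) / denom
    return center - half, center + half

# Hypothetical counts: the "winner" has barely been served.
for label, clicks, impressions in [
    ("small banner", 22, 2_000),      # 1.10% CTR
    ("large banner", 350, 500_000),   # 0.07% CTR
]:
    lo, hi = wilson_interval(clicks, impressions)
    print(f"{label}: CTR {clicks / impressions:.2%}, "
          f"95% CI [{lo:.2%}, {hi:.2%}]")
```

The small banner’s interval comes out enormous relative to its CTR (roughly 0.7%–1.7%), while the large banner’s is tight. Until the “winner” has served a comparable number of impressions, its outperformance is mostly noise.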
There is still much to be learned about the capabilities and insights analytics can offer. I hope some of the lessons I’ve learned can help you avoid the same mistakes when reviewing your own data. Know the objectives of your communication, know what data you need to review and how that data is defined, and be prepared to question that data when it looks out of place. If you can do all that, you will have a good foundation for the rest of your analytics.