State of Web and Mobile Analytics 2018


My goal with this report and GrowthHop, in general, is to help companies make the most out of their data, understand the world of analytics and feel confident that they are not behind but headed in the right direction. I hope this report gives you an idea of what is possible and how you should be thinking about analytics for your company. 

I'd be thrilled if you left a comment or sent me an email with your thoughts. What did you feel was wrong, what did I miss, and what questions do you have?

Table of Contents

The report opens with a brief introduction to give you some context on the world of analytics and why this report is needed at this time. It is then followed by 5 key pillars that every online business should follow when working through its analytics goals, strategies, data implementations, team structure, tool selection and so forth.

  • Introduction - Why is this report needed right now?
  • Pillar 1 - How should companies instrument and capture data?
  • Pillar 2 - Why is data distribution and automation so important?
  • Pillar 3 - What types of tools should be used for data visualization?
  • Pillar 4 - Why should companies be seeking advanced analysis?
  • Pillar 5 - What should the new era of business intelligence teams look like?

Introduction and context to this report

I've been doing analytics for online businesses my entire career. I've seen the analytics world evolve over the years, with some amazing technologies emerging and driving innovation. If I were to boil it down, analytics is the means by which we make better decisions for our business. However, with data being created at an exponential rate in our businesses, there is a great need for new understanding and structure. The way I see many companies doing "data" is a myriad of methodologies and tools that don't always make sense. This is a new era, and I'd go so far as to say that analytics deserves to be its own category just like marketing, sales or customer service.

The scariest thing I have seen in recent years, and what led me to write this report, is a lie that has been spread by various analytics products in the market: that signing up with their data tool will solve your business's analytics problems. I see a lot of executive teams falling into this trap. Analytics isn't solved by tools, in the same way marketing isn't solved by tools. Principles, strategies, and execution are what make the difference between a data-driven company and one that pretends to be.

To add clarity to a new and comprehensive way of thinking about analytics in your business I have isolated 5 key pillars that every online business should consider when tackling their analytics and data management. To foreshadow, most companies skip right to pillar 3 and think that is what they need when in reality they are missing several important considerations. 


Pillar 1 | Instrumenting and Capturing Data

Your analytics are only as good as the integrity with which the data is captured. They are also only as good as how far you've gone to comprehensively capture the various datasets possible. This means that in this phase of capturing data you should place a premium on how thoroughly you QA the data. I've seen time and time again companies instrument some analytics tagging on their website or app, not take the time to QA the data, and then months later find they had done it incorrectly. The nature of analytics is that you can't time travel and fix bad implementations down the road. So spend the extra few days or week being thorough on the QA.

When it comes to data capture I see a lot of companies following an out-of-the-box approach. To pick on Google Analytics, the free analytics tool of the internet: you are not doing analytics by placing their tracking code on your website and calling it good. What you get out of the box with these analytics tools is less than 1% of what is possible for your business. As an analyst, I don't use Google Analytics unless I am forced to do so. For the small business with very limited resources, sure, go ahead and use it. But the savvy online business with funding should invest in a better platform, or at least take the time to do a custom implementation of Google Analytics.

I want to cover some of the data sets you can track from your product that you may not even know about. If you follow the guide below on which data sets to track, you will enrich your analytics and your ability to gain deeper insights into user behavior and your business.

There is one more important principle I need to cover before we dive into the specifics: how data is captured. There are events and there are properties. Both of these, in the analytics world, get loaded into one payload or packet of data. For example, a Pageview is an event, but the URL, Referrer, Timestamp and so forth are the properties of that event. So most of what I talk about next is capturing events that have extra properties. To see what this looks like in a tool, think of Google Analytics Events, to which you can add Dimensions.

Here is an example of Events (Pageviews) and a few dimensions (Page, Referral Path, etc.)

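As a minimal sketch of the event-plus-properties principle, here is what a single pageview payload might look like (the field names are illustrative, not any specific tool's schema):

```python
from datetime import datetime, timezone

def build_event(name, properties):
    """Bundle an event name with its properties into one payload,
    the general shape most analytics tools ingest."""
    return {
        "event": name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": properties,
    }

pageview = build_event("Pageview", {
    "url": "https://example.com/pricing",
    "referrer": "https://www.google.com/",
    "title": "Pricing",
})
```

Every event discussed below follows this same shape; only the event name and the set of properties change.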

Who - If you have the ability to track individuals, I would start here. What data do you have on your users? This could be as simple as a few attributes on email subscribers or as advanced as the data Facebook has on its users. It may be sitting in your backend database or stored in a CMS of some sort. Try to push the meaningful data that you collect on your users to your analytics aggregator. Almost every analytics platform I have seen is able to ingest data from client, cloud or server-side data sources, so if you store it somewhere, odds are you can have an engineer push it to your tool.

When you are thinking about which user attributes to serve up to your analytics, think about what would be important to slice data by down the road. For example, with FarmLogs we sent the following attributes: farmer type, acreage, location, crop types, customer type, and so forth. As one example of what we were able to do with this data, we ran a customer propensity research project and lifted sales conversion by over 40% - see the case study here.
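A sketch of what pushing those user attributes might look like as an identify-style payload (the function, attribute names, and payload shape here are illustrative, not a specific vendor's API):

```python
import json

def identify(user_id, traits):
    """Serialize an identify-style payload; a real implementation would
    send this to your analytics tool's user-attribute endpoint."""
    return json.dumps({"type": "identify", "userId": user_id, "traits": traits})

payload = identify("farmer-123", {
    "farmer_type": "row crop",
    "acreage": 1200,
    "location": "Iowa",
    "crop_types": ["corn", "soybeans"],
    "customer_type": "premium",
})
```

Once these traits arrive in your analytics tool, any report can be sliced by them - conversion by farmer type, retention by acreage band, and so on.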

Where - This is the most common analytics tagging on the web. All I have to say is one word: Pageviews. You want to know where users are located on your website or app (for apps, the equivalent is Screenviews). Most companies leave it at that, simply capturing pageviews and some simple data around them like the timestamp, the source users came from, and so forth. Most analytics tools will capture this data out of the box; here is a list of the standard properties captured: Name of Page, Path, Referrer, Search, Title, URL, Keywords, and a Timestamp.

From here there is more metadata you can capture that will add more value to your analytics. Think about what exists on the various pages that you can serve up to your analytics. Here are some examples of a few different industries:

ECommerce - Try to capture in your pageview analytics other metadata properties that will allow an analyst, or anyone looking at the data, to slice by the types of products your customers are viewing. Examples could include Product Name, Brand, Price, Other Product Specs, Color, Size, etc. Still not convinced? Think of one example for a dashboard: instead of tracking conversion overall, you could track conversion by any product attribute you capture with this framework.
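A sketch of an enriched e-commerce pageview under this framework (the property names are examples, not a required schema):

```python
def product_pageview(url, product):
    """Attach product metadata to a pageview so that conversion can
    later be sliced by any product attribute."""
    return {
        "event": "Pageview",
        "properties": {
            "url": url,
            "product_name": product["name"],
            "brand": product["brand"],
            "price": product["price"],
            "color": product.get("color"),
            "size": product.get("size"),
        },
    }

pv = product_pageview("https://shop.example.com/p/trail-runner", {
    "name": "Trail Runner", "brand": "Acme", "price": 120.0, "color": "red",
})
```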

Media Publishers - This is one of my favorite learnings in recent years. Most media sites that are heavy on anonymous traffic with no logged-in experience feel like there isn't much data they can track - I'm happy to share that they are WRONG! Every media publisher that produces content can track the following data on their pageviews: Author, Content Date, Category, Topic, Series, and any other data you serve up through your CMS. From here you can produce dashboards for all your authors to show them their distinct audience and how it reacts to their content. The other powerful metric a lot of media sites don't track is what I would call the most important metric for editorial strategy: do your readers and/or viewers actually consume your content? You can't know whether someone watched your video or read your entire article from a pageview alone; you need a combination of scroll depth, active time on page, and the word count of the resource.
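As a sketch of how those three signals might combine into a single consumption metric (the 250 words-per-minute reading speed and the min-of-signals rule are my assumptions, not an industry standard):

```python
def read_completion(scroll_depth, active_seconds, word_count, wpm=250):
    """Estimate what fraction of an article was actually read.

    scroll_depth:   fraction of the page scrolled, 0.0-1.0
    active_seconds: time the tab was active and in focus
    word_count:     length of the article
    """
    expected_seconds = word_count / wpm * 60
    time_ratio = min(active_seconds / expected_seconds, 1.0)
    # A reader must both scroll through the content and spend enough
    # active time for the words to plausibly have been read.
    return min(scroll_depth, time_ratio)

# A 1,000-word article scrolled 90% with 4 minutes of active time:
score = read_completion(0.9, 240, 1000)  # -> 0.9
```

Aggregated per article, a score like this tells an editorial team far more than raw pageview counts do.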

SaaS and Consumer Products - I've worked with many SaaS and consumer products that had a ton of complexity and dynamism because they were custom for every user, based on that user's data and use of the product. Static sites like blogs are the same for everyone, but a product built around users, like a credit reporting website or a credit card rewards tool, is going to be very dynamic. These companies have to rely on richly capturing all the user data possible, and it takes a very talented analyst who understands the product inside and out to do meaningful analysis. At the pageview level, capturing data about the "where" for these types of products is critical to serving up any information indicative of what the user is looking at. For example, at a free credit reporting and lead generation website I worked on, we had different offerings for users based on their credit score and report, so it was important to know whether a user was mortgage-qualified, had poor credit, or had excellent credit. We could also serve up to our pageviews data on the specific offers they would see, like mortgage companies, credit card companies, personal loan lenders and more.

What - What are your users actually doing? Think about things like the click-stream, CTA clicks, form fills, search entries, and so forth. By this point, if you are following along, you know who the user is and their profile attributes. You know where they are in your product and the various attributes on the page or screen. Now you need to capture the specifics of what they are doing. This can get super complicated quickly. To give you a pro tip on capturing click-stream data, I wrote a post on this specific piece to save you from a bunch of headaches down the road.

The fundamental reason this is a complicated part of the tagging puzzle is that analysts often aren't sure whether to go super wide with granular event names or to rely on properties to differentiate clicks. The post I referenced above will help you make sense of that.
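To make the trade-off concrete, here are the two styles side by side (event and property names are hypothetical):

```python
# Style 1: go wide - a distinct, granular event name per click.
wide_events = [
    {"event": "Homepage Pricing CTA Clicked", "properties": {}},
    {"event": "Homepage Signup CTA Clicked", "properties": {}},
]

# Style 2: one generic event, with properties doing the differentiating.
property_events = [
    {"event": "CTA Clicked",
     "properties": {"cta": "pricing", "page": "/home", "location": "header"}},
    {"event": "CTA Clicked",
     "properties": {"cta": "signup", "page": "/home", "location": "hero"}},
]

# With style 2, one dashboard filter on the "cta" property replaces
# maintaining a separate report per event name.
ctas = {e["properties"]["cta"] for e in property_events}
```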

Why - This is the part we can't capture in analytics. I put this here as a reminder that we can't fully know "why" users do what they do. We have to rely on baselines, trends, intuition, customer surveys, interviews and feedback from users to interpret our analytics. There is an art form to analytics that you should always challenge your analysts on. It is dangerous if an analyst thinks he/she knows with 100% certainty the "why" behind user behavior. In my career, I have always been quick to admit when we are entering the "speculative" part of analytics. Oftentimes I will be presenting data to executives and they will start to say things like "So if we just optimize this CTA or {you fill in the blank} then we can expect this result?" because executives love certainty in their business. It is a fool's errand to think analytics is the end-all, be-all of knowledge.


Pillar 2 | Distribution and Automation

Data on your users is like a river: it is fluid and will flow to many tributaries and endpoints. This is an important principle we can learn from. User data, since it constantly flows, needs to be automated. If you rely on manual data migrations and workflows you are creating a lot of hassle. Your user data may end up in a marketing dashboard, an email service provider tool, a customer service portal or a sales CRM. All the core data, however, is linked to one true source - the original spring feeding the river, which is where your source of truth is stored.

If you are following the pillars: after capturing data in pillar 1, we now need to distribute it to all the right people, tools, and storage. Here are a few examples of where I typically send data in online businesses:

  • Marketing - Email Service Provider, A/B Testing Tools, Ad Management Platforms, SMS and Push Notifications Tool
  • Sales - CRM, Sales Email Outreach
  • Customer Service - Chat Tool, Customer Service or Relationship Tool
  • Editorial - Custom Dashboards for Reporting
  • Executive Office - Custom Dashboards for Reporting, Data Visualization Tools
  • Product - Data Visualization Tool, and A/B Testing Tools
  • Business Intelligence - Data warehousing, Developer Environment for SQL and/or Python Analysis, and Data Visualization Tools

You may get to this point and think to yourself: do I need to hire 10 engineers to go and write all the APIs, ETLs, and engineering to do all this? Up until about 5 years ago, I would have said yes. I remember early in my career we wanted to send some web analytics data to a warehouse so we could connect Tableau to it, and we bought a few ETL books and sat in a room for a few months with our best engineers learning how to write an ETL to accomplish this. What a waste of time! I have good news on the distribution of data. I'd say most of the time you will need an engineer or a savvy data person, but nothing like the old days. Here are a few recommendations to consider.

    Data Aggregation and Tagging Products - This is the method I use most often in online companies. It requires some engineering effort, but once you do it you will save a lot of hassle down the road, and any non-technical person can connect various data sources, integrations and destinations together. These companies are great because they allow you to tag all of your data once; they have already written all the API connections to send that data to a few hundred tools and destinations, including data warehouses. Another great feature is their growing support for capturing cloud sources and sending that data to your destinations. With Salesforce CRM, for example, you can send all your CRM data automatically to a database to build custom reports and connect to other data sources to enrich your view of your users.

    • Segment - I've used Segment personally over 6 times now and have collected nearly 5B rows of data through these guys. They have a battle-tested product and have continued to develop some great data products and features including their release of Personas in 2017.
    • mParticle - Segment's biggest competitor, with a bent toward mobile. They have a great product, and if you are considering going this route I'd definitely compare them to Segment, as both are great options.
    • - A similar company to Segment and mParticle, but one that has focused on some interesting areas of analytics with their Embedded Analytics feature, which has been very successful for them.
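The "tag once, send everywhere" idea behind these products can be sketched as a simple fan-out (the destination functions are hypothetical stand-ins for the vendor-maintained API connections):

```python
received = []

def send_to_warehouse(event):
    """Stand-in for a data warehouse destination."""
    received.append(("warehouse", event["event"]))

def send_to_crm(event):
    """Stand-in for a CRM destination."""
    received.append(("crm", event["event"]))

DESTINATIONS = [send_to_warehouse, send_to_crm]

def track(name, properties):
    """One tracking call, dispatched to every configured destination."""
    event = {"event": name, "properties": properties}
    for destination in DESTINATIONS:
        destination(event)

track("Signed Up", {"plan": "pro"})
```

The point is that adding a new destination is configuration, not a new engineering project: the single `track` call never changes.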

    Data ETL Products - This is by far the quickest and easiest way to send cloud data to a data warehouse. A non-technical person can set up an ETL in a matter of minutes. If you are looking to export all your data from email marketing, advertising, SEO, etc., then I'd recommend looking into these tools. I use them all the time because I'd rather have my data in raw SQL'ized format than have to conform to a platform's limited exporting options. Stitch Data, Fivetran, and Segment are all great options in this category.

    Custom Tool Implementations - My least favorite option of the three, but still a good one. Some tools out there allow you to tag and collect data on their platform, where you can then do some SQL or Python data manipulation or translation to prepare the data for sending to other sources. Looker is a good example of this.


Pillar 3 | Visualization

The most elusive part of analytics. Most analysts and executives jump right to visualization and the infamous dashboards, thinking this will help them run their business. It is like buying a car that has no engine or fuel in the tank. This isn't to say visualization isn't important - in fact, it is extremely important. The key is not to jump straight to it: get the hard work done first of tagging and distributing data effectively. Helpful data visualization is impossible without capturing good data first. I've been asked countless times to create reports and dashboards with data that the company never captured.

To help the readers of this report I have put together three tiers of data visualization sophistication based on toolset. The sophistication is based solely on what is possible at each level. When you are looking for a tool you can use these tiers and questions to guide your selection process. One thing to know: not all tools in each tier are created equal.

Tier 1 - JavaScript or SDK sourced data tools. The only way the tool can ingest data is by adding its custom JavaScript pixel or mobile software development kit to your website or app. This also means you can only view data from one place at a time, specifically a website or native app. If you have a very simple setup and are looking to get your feet wet in analytics, this can be a great option for beginners.

Tier 2 - Database or data warehouse level data tools. You can only connect your visualization tool to a live database or warehouse. For companies who are able to get various data sets into a warehouse or database cluster, this is a powerful option for advanced visualization because it opens you up to an array of tools with really powerful dashboarding capabilities.

Tier 3 - Multi-source data tools. These have been dominating the market because, like most online businesses, you probably have tons of data sources, whether native apps, databases, cloud sources, etc. I won't say they are the best option, but a lot of businesses find themselves in a situation where they don't own all their data in a specific warehouse or database, so they have to rely on a bunch of 3rd party integrations.

A few more questions to ask yourself when choosing between different tiers and tools are the following:

  • How many data sources do you have and where do they live?
  • What engineering resources do you have available to you?
  • What is your analytics tools budget?
  • How many MTUs (monthly tracked users) do you have? A lot of companies charge by MTU.
  • Do you want to be able to write SQL or Python in your visualization tool?
  • Who will be using the tool to access the data?
  • How sophisticated are the users who will be creating reports and dashboards in the tool?
  • Do you want live links with real-time reports?
  • How many custom events and properties can the tool ingest? Some platforms like Google Analytics only allow a set number of properties before you have to upgrade.
  • Do you want to be able to create drag-and-drop reports?

Pillar 4 | Advanced Analysis

I can't stress enough how powerful it can be to go further than your standard analytics tool for the sake of gaining deeper insights about your users and business. My biggest career insights as an analyst did not come from creating a cool dashboard, Google Analytics custom report, Mixpanel view, or the like. They came from spending weeks at a time data mining our user databases and writing SQL against our raw web and mobile analytics data. There is a level of insight possible only when you step out of the norm and move away from out-of-the-box reports.
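As a toy illustration of this kind of ad-hoc SQL work, here is a query over raw pageview and purchase events, run against an in-memory SQLite table (the schema, brands, and numbers are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, brand TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "Pageview", "Acme"), ("u1", "Purchase", "Acme"),
     ("u2", "Pageview", "Acme"),
     ("u3", "Pageview", "Bolt"), ("u3", "Purchase", "Bolt")])

# Conversion rate by brand: distinct purchasers / distinct viewers.
rows = conn.execute("""
    SELECT brand,
           1.0 * COUNT(DISTINCT CASE WHEN event = 'Purchase'
                                     THEN user_id END)
               / COUNT(DISTINCT CASE WHEN event = 'Pageview'
                                     THEN user_id END) AS conversion
    FROM events
    GROUP BY brand
    ORDER BY brand
""").fetchall()
```

No off-the-shelf report answers arbitrary questions like this; once the raw events live in a warehouse, a few lines of SQL do.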

If you don't have an analyst who can write SQL, then consider contracting one or hiring a solid data analyst who can help add deep insights to various areas of your business. Then urge your product, marketing, and sales leaders to pester this person to research for insights and test the hypotheses behind the strategies in place for your business. Time and time again I've come into a company that had a strategy based on a hypothesis put forth by a leader, disproved the strategy's effectiveness with data, and pivoted our strategies. This isn't a negative thing either; it is why analysts exist: to test and challenge our assumptions with the data we collect.


Pillar 5 | Business Intelligence

I made a claim at the beginning of this report: "This is a new era and I'd go so far as to say that analytics deserves to be its own category just like marketing, sales or customer service." This pillar addresses that point.

I want to highlight the key areas that this employee or team should be responsible for in the business. If a company is considering forming a business intelligence team or hiring its first BI employee, this can serve as guidance on the scope of the team/role.

Data Vision - It starts with a vision. If you have a non-data person trying to set the vision for what data looks like at your company, then you are shooting yourself in the foot. The BI team should interpret the strategy and vision of the company and translate that into the current and future needs of the business. They should always be looking ahead: what will we ask of our data in the future? Pro tip - If you don't have a clear vision because the company is still searching for its product-market fit, then err on the side of capturing more data.

Semantic Data Tagging - Going back to the Who, What and Where data tagging, the BI team should clearly define tracking specs for their engineers to implement, and keep a close eye on documentation for what is and isn't being tracked, along with the flight dates of those events. See here for a free tracking spec guide.

Maintenance and Quality Control - The BI team should be diligently focused on QA'ing all new events and data tracked. This should be of utmost priority to ensure the integrity of the data. There are also methods to automate the testing of events on an ongoing basis.
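A minimal sketch of such automated event QA: validating incoming events against a tracking spec (the spec format and event names here are invented):

```python
# Tracking spec: each event name maps to its required properties.
SPEC = {
    "Pageview": {"url", "referrer"},
    "CTA Clicked": {"cta", "page"},
}

def validate(event):
    """Return a list of problems; an empty list means the event passes QA."""
    name = event.get("event")
    if name not in SPEC:
        return [f"unknown event: {name!r}"]
    missing = SPEC[name] - set(event.get("properties", {}))
    return [f"{name}: missing property {p!r}" for p in sorted(missing)]

issues = validate({"event": "Pageview", "properties": {"url": "/home"}})
```

Run against a sample of live traffic on a schedule, a check like this surfaces broken tagging in days instead of months.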

Data Pipeline Management - The BI team should have a clear understanding of how all their data flows together, with documentation, testing methods, and protocols to ensure there are no issues in the data pipeline. Advanced data pipelines can have tens of tools working together through API keys, connections, and warehouses.

Tool and Technology Selection - Based on the needs of the business - the number of data sources, goals for the data, product complexity, budget, team size and so forth - the BI team will need to recommend a technology stack that matches those needs.

Data Requests and Support - When a BI team gets created, there will be an influx of requests from non-data team members. I've seen this time and time again: it is not that non-data team members don't want data for their area, it is that they don't know how to access it, so when the expertise gets added there is an old backlog of ideas, questions and so forth for the team. Pro tip - treat BI just as you would engineering. Use a project management tool or ticketing system for requests to the team and prioritize them weekly, because this is just one part of a BI team member's job.

Data Visualization and Reporting - BI team members are always thinking in two categories for data work: is this ad-hoc, or is this something that will be needed in the future? If it is the latter, then it needs to become an automated report or dashboard. It will be important for the BI team to create reports and dashboards for various teams and executives so they can stop doing manual data pulls or living in spreadsheets.

Business Discovery and Insights Analysis - By far the most fun part of any BI team member's job is doing research on business questions that have been put forward, or challenging a given hypothesis. As mentioned before, my biggest career wins and joys have come from this type of work. Usually, this is where the big growth attributable to the BI team comes from.

Business and Data Modeling - At certain points in a company's lifecycle there will be a need to model out ideas based on historical data or to create new business models altogether. A good analyst should be able to create models factoring in various inputs and using an algorithm to project results.

Teaching the Company - Last but not least, this can be the difference between a good BI team member and a great one. Most employees I've worked with couldn't tell me the difference between a pageview and a session. That is not meant to be offensive; it is to paint a picture of the reality in which BI teams live. The BI team should take pride in their ability to present and share information in a way that their parents could easily understand. The longer the BI team is around, the savvier and more data-driven the entire company should become. To share another personal example: in one of my earlier roles, engineers, customer service reps and other general team members loved data and would always be looking at our daily dashboard to see how the business was doing. Oftentimes it was people not on the BI team who were catching errors or insights. That is exactly what you want to happen over time.