Apple’s Intelligent Tracking Prevention and Google Analytics

I’ve tried not to think too much about Safari’s Intelligent Tracking Prevention (ITP) lately. The topic tends to get me worked up and grumpy. For those of you who don’t know much about ITP, I’ll link to a few resources here, as this post is not meant to be an introductory description of the issue.

https://www.seerinteractive.com/blog/what-is-intelligent-tracking-prevention/

https://digiday.com/marketing/what-is-itp2-2/

The following short post is a friendly musing: me pulling some of our own clients’ data to see what sort of practical impact living in a totally ITP world would have. Also, note that this is not a post about how to circumvent ITP. Currently, Adobe is by far leading the pack as far as vendors go in terms of implementing a quality solution. Simo Ahava also put together a server-side solution which seems attainable, though not trivial to implement. At Analytics Ninja, we tend to work directly with our clients’ development teams to handle the cookie management piece of work, using localStorage for a quicker client-side patch. One of my major gripes is that for the majority of companies that I work with, implementing a “solution” to deal with Apple’s heavy-handed approach is costly and time consuming.

Methodology

I ran queries across 10 client sites collecting data from October 28th, 2018 until June 1st, 2019. To make the data analysis easier, I tried to choose sites where I had the “First Visit Date” stored as a Custom Dimension.

The GA Client ID includes a timestamp indicating the moment of _ga cookie creation. This can be converted into a YYYY-MM-DD string using JavaScript and then stored as a Custom Dimension. I’ve found this dimension useful for cohort analysis, especially as Google does not expose first visit date in their API. It’s worth noting that a First Visit Date parsed from the clientId is not identical to the First Visit Date in GA (server-side). There was certainly some minor “noise” in my data that I had to clean up due to people’s browser time being set to the distant past or future.
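For illustration, a minimal sketch of that parsing, assuming the standard GA1.2.&lt;random&gt;.&lt;timestamp&gt; cookie format (the cookie value is hard-coded here; on a real page you’d read it from document.cookie):

```javascript
// Derive a "First Visit Date" string from a _ga cookie value.
// The last dot-separated segment is a Unix timestamp (in seconds)
// of cookie creation.
function firstVisitDate(gaCookie) {
  var parts = gaCookie.split('.');
  var tsSeconds = parseInt(parts[parts.length - 1], 10);
  var d = new Date(tsSeconds * 1000);
  // Format as YYYY-MM-DD (UTC, to avoid local-timezone drift)
  return d.toISOString().slice(0, 10);
}

firstVisitDate('GA1.2.123456789.1571234567'); // → '2019-10-16'
```

The same value pushed to the dataLayer as a string is what gets stored in the Custom Dimension.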

I focused on ITP 2.1’s seven day cookie-gaeddon as opposed to the ITP 2.2 cookie nuclear winter simply because it’s not clear to me yet if ITP 2.2 is going to wipe all cookies if the site has a visit with a gclid or fbclid or elqtrackid etc, or if Apple is just going to attack Google Ads, Facebook Ads, and Eloqua directly by deleting their cookies while sparing other first party cookies for an additional six days. For this ITP 2.1 analysis, I wanted to see the impact on tracking visitors whose “first visit date” was more than 7 days earlier than the current date. Ostensibly, if all browsers played by WebKit rules, those users become “new”, and attribution in GA is wiped clean.

I originally started my data exploration using Excel + PowerQuery. Even though my desktop computer is chock full of RAM, the millions of rows that made up the data set made Tableau seem like a better choice.

I did not, for the purposes of this post, break down the numbers by browser share. Sites can have browser shares which differ significantly from global averages, so “your mileage will vary.” I ran a theoretical “what would happen if ITP 2.1 was the norm tomorrow” exercise across all browsers.

The Data

The purpose of the first breakdown of the data was to look at each site and visualize numerically the number of sessions that took place before or after the 7 day cut-off. The result showed the number of transactions attributed to sessions occurring before or after the cut-off (not all sites had ecommerce), and the amount of revenue that would be classified as before or after the 7 day cookie lifespan cut-off.

I then ran a quick table calculation to turn those numbers into percentages to provide a bit more context.



With regard to sessions, sites ranged between about 15% and 40% of sessions stemming from a user with a _ga cookie creation date more than 7 days before the date of their current session. Not surprising, but really nice to see quantified, was how many conversions (for sites with online revenue) would have been snipped by the ITP 2.1 cookie scissors.

Some highlights:

  • Retailer #01 had about a third of their overall sessions with a cookie created more than 7 days from the date of the current session, while about 45% of their sales came from users with a 7+ day old cookie. $3.4M in revenue would be incorrectly attributed in GA during the 7 month period of the data set.
  • Lead Gen site #01 had about 45% of sessions attributed to 7+ day old cookie.
  • Retailer #02 had a greater percent of “new users” than Retailer #01. Their 25% of sessions with a 7+ day old cookie was responsible for 42% of revenue during the time period of the data set.
  • B2B company #01 has a longer sales cycle that shows itself in the data. While 45% of the sessions to their site were attributed to a 7+ day old cookie, a full 58% of their online revenue came from those sessions. Over $11M in revenue would have been incorrectly attributed in GA if cookies were wiped after 7 days.
  • The company I referred to as Something would be less impacted than the other companies by a 7 day cookie cap. About 15% of their sessions were attributed to a 7+ day old cookie. Still, that is almost 20M sessions in the date range which would have been negatively impacted by a cookie cap.
  • Retailer #03 saw 17% of their overall sessions attributed to a 7+ day old cookie, with 38% of transactions and 46% of revenue attributed to those same cookies. The average order value for the 7+ day old cookie holders is 38% higher than their newer counterparts. About $2.3M of their revenue would have been incorrectly attributed in GA if cookies were capped to 7 days.

Here is the same data as above, visualized using bar charts (with the largest site removed, as it was so much bigger in scale than the others that it made the smaller sites difficult to read).

So far, I have not seen much of an impact from the general roll out of ITP 2.1 across the 10 sites I’ve analyzed. Clearly, that will change over time, though it is important to note that currently Safari does not have a huge browser share, and Google is not about to take an axe to cookies in Chrome until they develop a method of completely demolishing all competition first.


https://www.statista.com/statistics/544400/market-share-of-internet-browsers-desktop/

So, as a result of this exercise, I was able to confirm my belief that ITP 2.1 is a heavy-handed, B.S. move by Apple in the name of “privacy” that will make a real difference to companies trying to measure user behavior and run A/B tests, unless they invest in a solution which circumvents the WebKit logic in the browser.

Methods of Workflow Sanity with Google Tag Manager.

Also known as: a story of one ninja’s journey to tag management nirvana.

This blog post was co-authored by my good friend and esteemed, highly respected, super-expert colleague Sam Briesemeister. Any of the good ideas and “smarts” presented herein are definitely Sam’s. This post is his brain child. Any snark, poorly structured sentences, or failed jokes are solely my (Yehoshua Coren’s) responsibility. Same goes for the first-person voice in this post; Sam is not to blame.

Unembarrassed plug: you may obtain a license to the code that has changed the way my consultancy does implementations by contacting me directly; details at the end of the post.

Some background

If you don’t want to join me around the GTM campfire while I tell my story and just jump to the good stuff, scroll down to the section heading Implementation Manifesto.

Analytics Ninja was an early adopter of Google Tag Manager. I remember when a little birdy whispered in my ear that Google was going to be launching something called Google Tag Manager (I’ve never applied for GACP status, so I haven’t ever had much lead time with regard to product launches), and I basically shrugged and said “so what?” The bird then loudly chirped, “Google TAG MANAGER“. I still didn’t get it.

Fast forward a few months and I landed my first large enterprise client. When I asked them for access to their GA account, they said “which one?”.

“How many do you have?” asked a naive Yehoshua.

“50”.

“50!! Oh my! Why is that?”

“Well…. One for US Men’s clothing, one for US Women’s clothing, one for EU Men’s, one for EU Women’s, etc etc”.

I quickly realized that this brand had basically no online brand visibility and needed to consolidate all of their tags, and fast. BAM! Google Tag Manager had just launched. Perfect use case. I (smartly) waited about 4 months to make sure that the newly launched product was up to the challenge for an enterprise client, and after a tremendous amount of planning and code work, we removed all the inline GA code from all of the multiple regions’ websites and I hit the publish button.

I was even smart enough to record a screencast of the launch; it was the GA Real Time interface lighting up with little circles of traffic as data from around the globe started populating the new GA property. A beautiful sight…

But I digress…. The way that my small team and I initially worked on an implementation was to use the tag management system as a way to inject custom code and then use GTM to send data to tags. In particular, I was a big fan of having one tag for GA Event Tracking. The dataLayer was the messaging mechanism to read values from {{gaEventCategory}}, {{gaEventAction}}, and {{gaEventLabel}}; and the {{gaEvent}} event was the trigger for the tags. We (which means Eliyahu at the time) even wrote a whole library that served as a wrapper to pass data to _gaq, ga(‘send’), and the dataLayer depending on whether or not the client had GTM.
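That messaging pattern looked something like the following push (the category / action / label values here are illustrative):

```javascript
// Stand-in for the GTM dataLayer in this sketch; on a real page
// GTM initializes window.dataLayer itself.
var dataLayer = dataLayer || [];

// One push per interaction; a single GA Event tag fired on the
// {{gaEvent}} trigger reads the three gaEvent* values below.
dataLayer.push({
  event: 'gaEvent',
  gaEventCategory: 'Product Page',
  gaEventAction: 'Add to Basket',
  gaEventLabel: 'SKU-12345'
});
```

One event tag plus three Data Layer Variables covers every interaction on the site, which is what made the pattern so maintainable.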

The code itself was composed of small code blocks that would push values from the DOM into the dataLayer based upon jQuery-dependent CSS selectors and event listeners. Versioning was managed through our own internally made, UI-based Jenkins / Grunt build system (NinjaBuild), fully integrated into GitHub, and files were hosted on my own S3 / CloudFront buckets on AWS. Especially since GTM didn’t support large Custom HTML tags at the time, this seemed to be a good approach. Once GTM started supporting larger Custom HTML tags, I came to realize that it wasn’t in the best interest of my clients to have my company host the code that was being injected into their site (apart from the fact that I was paying for the nominal network fees).

To this day, I have a slew of former client code gathering dust within my S3 bucket that I simply don’t touch lest I torch their tracking and analytics. This is something that I am not proud of and did not do a good job communicating to clients at the time. As Analytics Ninja began to work with more and more large clients, I learned a lot along the way about governance and now am able to effectively steer my clients away from practices such as the ones I used to do. #AlwaysLearning. #ProfessionalGrowth.

Fast forward a few years…. I was having trouble staying on top of the tremendous demand of client work and was frustrated that when I was in a bind to get something done I didn’t have the coding skills to maintain or update the multiple hundreds of lines of jQuery code that was powering our implementations. So my friend Sam (mentioned in the preamble to this post) said, “What if I were to make this EASIER for you?”

And behold….. The following approach was gifted to me.

Our Implementation Manifesto

Continue reading Methods of Workflow Sanity with Google Tag Manager.

Guide: Facebook Pixel with Google Tag Manager

How to set up the Facebook Pixel using Google Tag Manager.



Recently I was tasked to deploy a Facebook Pixel on an ecommerce site. While deploying tags with Google Tag Manager is normally a relatively straightforward thing to do, I found the documentation around Facebook’s Pixel to be sub-par, a bit confusing, and in need of improvement. So I decided to write this guide both to help out those readers who are perplexed about how to deploy the pixel, as well as to spill some digital ink critiquing the FB Pixel documentation.

Also, I very humbly have found that this blog is listed in “top analytics blogs” type articles, oftentimes with a caveat that “Analytics Ninja doesn’t write very often, but when he does….” A blog post from yours truly was well overdue, and I wouldn’t mind boosting up this site’s SEO as a result (Google being all keen on fresh content and such). Feel free to link to this post with an anchor-text rich “do follow” text. Maybe something like [Google Tag Manager Consultant]. You’ll be helping me cover the costs of raising 5 kids. 🙂

So, here we go….

Step 1 – Create a Facebook Pixel


Create a Facebook Pixel by clicking the button

Once you click the create pixel button, you’ll need to give your pixel a name and agree to Facebook’s Pixel Terms. As an informed reader, I went through the TOS that they have on their pixel pretty in-depth, and I still don’t really fully understand what’s going on in terms of the data they collect. As with most things in the modern digital advertising world, you basically accept that Facebook is probably going to track as much as they can via their pixel and hoard all of that data so that they can build better advertising algorithms to make as much money for the company as possible. Facebook, like Google, is in many ways primarily a Big Data company whose largest strategic business asset is data. Indeed, the Facebook data set is really second to none because of the nature of the personal information that is given to them freely by almost 2 BILLION people.

One additional point about the Facebook Pixel which makes me raise my eyebrows is the following quote from their Help Center. “To improve your ads delivery, how Facebook measures the results of your ads, and in an effort to enhance the relevancy and usefulness of ads, we’re enhancing the Facebook pixel. The Facebook pixel will start sending more contextual information from your website to better understand and categorize the actions that people take on your site to optimize for ads delivery.
The additional information sent through pixel will include actions on your page, like “add to cart” or “purchase” clicks, and will also include information from your page’s structure to better understand context associated with these actions.”


Facebook then makes you dig into their developer docs to get an understanding of the impact of that statement. Because marketers are:

A) going to dig into developer docs and understand what they mean, and
B) stop using the Facebook pixel because it is collecting too much information, or
C) update their code to opt out of the data collection.

PLEASE somebody remind me to update this post with an explanation of what this *really* means on a technical level once we can start inspecting hits to the FB servers. My guess is that FB is going to collect a HUUUGE amount of information about every click on the site. Kinda like Heap Analytics auto-tracking without providing the end-user access to the data.

To quote my own tweet:


And to quote Facebook:


Click data and page metadata? And you need to add a “set” command to your code if you don’t want to provide that info? Wow! To be honest, I’d love to be a part of that analytics team that has access to that much data just gobbled up from so many websites. If you’re listening, Facebook, for $220,000/yr I’d consider joining your team.
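For the record, that opt-out is a “set” command issued before init. A minimal sketch, with fbq stubbed so it runs outside the browser, and PIXEL_ID as a placeholder for your real pixel ID:

```javascript
// Stub of fbq for illustration; on a real page the Facebook base
// code defines window.fbq.
var fbqCalls = [];
function fbq() { fbqCalls.push([].slice.call(arguments)); }

// Disable the automatic collection of click data and page metadata
// by turning off the pixel's auto-configuration before init.
fbq('set', 'autoConfig', false, 'PIXEL_ID'); // 'PIXEL_ID' is a placeholder
fbq('init', 'PIXEL_ID');
fbq('track', 'PageView');
```

In a GTM deployment, the set command would go in the same Custom HTML tag as the base code, above the init call.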

But I digress…

Step 2 – Install Your Pixel Code



As a part of the process for installing the pixel, you can give Facebook access to GTM via API and they’ll create a tag and trigger for you. Let’s “not” do that, and install the tag manually so that we have more control over configuration. Continue reading Guide: Facebook Pixel with Google Tag Manager

Tracking Brightcove Videos with Google Analytics

This post is authored by David Vallejo.

Brightcove is an online video platform that allows you to embed on-demand videos on your websites. It seems there are some people out there looking for a way to track it, so we’re going to learn how to track those videos with Google Tag Manager.

In order to be able to track our videos from Brightcove we need to keep some things in mind:
  • The API needs to be enabled within the Brightcove interface.
  • Embeds need to have the includeAPI and templateLoadHandler params in place.

First Step: Enabling the JS API for our videos

This needs to be done within the Brightcove site for all the different players that we may have. Follow these steps:
  1. Go to your Video Cloud Studio Publishing Module and then select your player
  2. Click on the Settings link
  3. Under the Web Settings option within the Global tab, select the option “Enable ActionScript/JavaScript APIs”
brightcove_tracking_1

Second Step: GTM Configuration

To track Brightcove videos, we’re going to use 1 Custom HTML tag, 1 Trigger, and 1 Variable.

First, let’s create the Variable. This one will allow us to know when a Brightcove video is present on a page, so we only fire the tracking script when it’s needed. We don’t want to inject the tracking code where it is not going to be used:

brightcove_tracking_2

Now, we’re going to create the Trigger that is going to fire our tracking code. Just to be safe, we’re going to fire it on DOM Ready and only when our previously created Variable equals “yes”.

brightcove_tracking_3

OK, now let’s add the tracking tag (you’ll be able to copy the JS code at the bottom of the post):
brightcove_tracking_4
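The detection Variable described above can be sketched as a GTM Custom JavaScript Variable. The selector for the legacy Brightcove Smart Player embed is an assumption; adjust it to match your own embed code:

```javascript
// Stub a minimal document so this sketch runs outside the browser;
// inside GTM the real document is available.
var document = { querySelector: function (sel) { return null; } };

// Hypothetical Custom JavaScript Variable body: returns "yes" when a
// Brightcove player embed is found on the page, "no" otherwise.
var brightcovePresent = function () {
  return document.querySelector('object.BrightcoveExperience') ? 'yes' : 'no';
};

brightcovePresent();
```

The Trigger then fires the Custom HTML tag on DOM Ready only when this Variable equals “yes”.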
We’re done. Remember to test this code in preview before going live. This script will track the following events:
  1. Play
  2. Pause
  3. Complete
  4. Progress (25% steps, it can be personalized within the code)
  5. Seek
We’ve used an agnostic dataLayer push, with a lot of extra info about the video in case you need it, like the related thumbnail, the video length, the video codec, the video name, and the publisher info. Check the following screenshot for a guide to all the info you may use for your custom dimensions: brightcove_tracking_5
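As a hedged sketch, an agnostic push of this kind can look like the following; the key names are illustrative and not necessarily the exact ones the script uses:

```javascript
// Stand-in for the GTM dataLayer in this sketch; on a real page
// GTM initializes window.dataLayer itself.
var dataLayer = dataLayer || [];

// One push per player event (play, pause, complete, progress, seek),
// carrying the extra video metadata described above.
dataLayer.push({
  event: 'video',
  videoProvider: 'brightcove',     // illustrative key names
  videoAction: 'play',
  videoName: 'Homepage Hero',
  videoDuration: 183,              // seconds
  videoThumbnail: 'https://example.com/thumb.jpg'
});
```

Because the push is vendor-agnostic, the same Trigger and Data Layer Variables can feed GA events or any other tag.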

Google Analytics Custom Metrics & Calculated Metrics

 

Metrics, Metrics, Metrics, oh my!!


With the recent (very welcome) release of calculated metrics in Google Analytics, I thought that the time was ripe to write a post about custom metrics in GA.  Towards the end of this post you will see why I believe that custom metrics are quite a germane topic to discuss in relationship with calculated metrics.  I will also touch on how the launch of calculated metrics is another big value prop when it comes to GA Premium (even though they aren’t a GAP only feature).

I believe that custom metrics are painfully underused in Google Analytics implementations.  Believe it or not, my browser console is oftentimes open when I’m browsing the web (you should believe it).  I do a lot of spying (ahem, investigating) of websites’ implementations by looking at the data that they send to GA, and I hardly ever see custom metrics being deployed.


Dimensions and Metrics

Before we dive into custom metrics and calculated metrics, let’s take a step back and define “metrics”.  There are a number of different articles which define dimensions and metrics.  I have chosen to quote Paul Koks and then go on to discuss in my own words (recognizing that my own understanding of dimensions and metrics comes from reading those who precede me in the industry).  In Paul’s words:
  • A dimension is a characteristic of an object that can be given different values —> a dimension describes data
  • A metric is an individual element of a dimension which can be measured as a sum or ratio —> a metric measures data
So, there we have it.  Dimensions describe data and metrics measure data.  Dimensions will make up the rows in a table report whereas metrics will populate the columns.  Metrics are the data.  Metrics increment, they count things.  Metrics will invariably be a number, be it an integer, a ratio, a percentage etc…  As of today (if I counted correctly), there are 189 metrics available in the Google Analytics API.



Custom Metrics & Event Tracking (Measuring User Interactions)


So here is where metrics begin to get interesting.  Or, as my friend Jacques Warren once said to me, custom metrics are “basically where reports really start making sense to a business.”  Standard metrics in Google Analytics are meaningless by themselves.  This is mostly true of custom metrics as well.  Remember, they are just counters, so when they are stand-alone they lack any context.  Meaning must be derived through segmentation, and that means drilling down into the dimensions which describe your data.  While custom metrics won’t solve the context problem for you completely, they do so partially, which is why I love them so much.

Monthly Active Users


Let’s start by taking a look at how some standard metrics in Google Analytics are used to measure user behavior.  I’m going to start with Event Tracking, using an ecommerce site as my example.  Event tracking is used to track interactions with a website or app.  Those interactions are usually some form of click, form submit, tap (for mobile), etc., which translate into discrete actions that a user is taking.


event action on product page

In the above example, my Event Category (a dimension) is “Product Page”.  It serves as a descriptor for the Event Action (which is also a dimension), which in turn describes the metric (total events).  Notice how total events is simply a number.  It counts how many times an action in rows 1-6 happened.

Now let’s say I want to get some more context about how the Add to Basket event is being tracked.  One way that I can get at this information is via Custom Reports.  With a custom report, I can drill down into different dimensions as a way of getting more granular context in my data.

google analytics custom report

custom report drilldown




Similarly, using the Pivot feature in reports allows you to provide additional context to your metrics.


pivoted report



But in all of these examples, the metric itself, “total events”, still lacks context.  In order to know “total events” of what, I need to refer to my dimensions.

Custom Metrics in Google Analytics are unique in the fact that they are “named values”; they can also describe the interaction that they are measuring.  Just like custom dimensions, the name of the custom metric is set in the GA Admin section.  

custom metrics on hit level


Notice how each of the actions that the user can take on the product page is now being tracked as a metric.  This is really powerful as now I can build a custom report and pair these custom metrics with almost any dimension (limitations to be discussed below).



awesome custom report
Translation from Google Translate.  I don’t speak French

Another example is from a site that has restaurant reviews.  On an individual restaurant details page, the user is able to view a menu, check hours, add a calendar reminder about their reservation, or even order food online.


place related actions

As you can see, custom metrics make the data model in Google Analytics much more flexible.  As far as event tracking goes, I highly recommend taking the 10 or 15 most meaningful events (user interactions) and tracking them with a custom metric as well.  In my humble opinion, custom metrics can be seen as “event tracking on steroids” (almost the title of this post).  Context gives meaning to data.  These data points, since they are named, are now much more meaningful than just being a “total event”.  While the pivot function in certain GA reports can get you kind of close to what you see in the table above, it is by far not a replacement for having a native metric in GA.


Tracking $$$



In addition to super charging your tracking of user interaction with your site, I find custom metrics to be wonderful when it comes to tracking financial data.  My favorite custom metric in this regard is gross profit, though with some hard work tracking net profit is possible in Google Analytics as well (though that would likely require some calculated metrics too).
product discounts


In almost all of our ecommerce implementations, we try to track the following metrics when available.
  • Original Price / List Price
    • This is $119.99 for the item above
  • Displayed Price
    • This is $109.00 for the item above
  • Product Level Discount
    • This is product level savings, $10.99
  • Order Coupon Value per Product
    • If a 15% coupon is applied to the whole cart, that 15% is applied equally to every product.  For our product above that would be $16.35
  • Product Revenue
    • This is a standard metric (not custom), that represents the actual value of transacted revenue per product
  • Cost of Goods Sold
    • FWIW, I bet a bunch of folks are making a killing on that Lego set.
  • Gross Profit
    • Product Revenue minus COGS


The results are totally terrific.  🙂

 money in google analytics

The above image shows the breakdown of sales by brand (including how much was discounted at point of sale by brand).   With the above custom metrics in place, I can quite clearly see the impact of discounting / sales / promotions etc on any product level dimension or on session and user level dimensions.  In other words, I could run reports to see how much money was being left on the table by channel, by geography, by customer type, by gender, by age group.  The list goes on and on.

Some Technical Points



WARNING:  If you don’t like reading about technical aspects of Google Analytics and implementations, just skip to the next section.

For reasons I understand, but don’t particularly like, custom metrics are “scoped” (as are custom dimensions).  In other words, they need to be set to a particular data scope in the Google Analytics admin in order to be processed into reports.  Custom metrics support two scopes, Hit and Product.  Metrics will be processed and then reported on as either integers, currency, or time (measured in seconds).

scope and formatting type

When hit level data is sent to Google Analytics, the parameter / value combination matches the following format:  cmXX={{value}}, where XX is the index of the custom metric.  The actual name of the metric that you see in reports is configured in the admin, just as you would configure a custom dimension (as in the image earlier in this post).  So let’s say I wanted to track an Add to Wishlist interaction with a custom metric; I would use the following code:
ga('send', 'event', 'Product Page', 'Add to Wishlist', 'Expensive Star Wars Lego Set', {'metric2': 1});


data endpoint for hit level custom metric

In Google Tag Manager, you can configure the event tag to set a custom metric index and value directly.



custom metrics in GTM


Unfortunately, GTM uses a “set” command for a custom metric, which means that the value will persist on the same tracking object.  If you’re using a named tracker, you’re in trouble because all future hits will have that value set.  Even though most people default to not using a named tracker, this still creates a problem for custom metrics that are sent with pageviews, as page timing hits will use the same tracking object even when no default tracker name is chosen.  It can get really nasty with single page sites (think angular.js)  🙁

custom metrics set in GTM

Hashtag #booooo.  (There are workarounds using hitCallbacks, but I still #boo the need to resort to those implementation acrobatics, when pure hit level manipulation in GTM should be available).


Product scoped custom metrics are coded as part of the productFieldObject.  This makes perfect sense, as the metric is meant to track a particular value associated with a particular product action.  For example, the value of the product added or removed from cart (a good article if you’re interested in a practical walkthrough of how product level custom metrics are implemented).  Or any of the discount / profit metrics that I mentioned above for the “purchase” action (say that five times fast).  Since multiple products can be associated with a particular hit, you can pass through the product level discount for each product on a “purchase” action that uses a pageview hit as the data transport mechanism.
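A sketch of what that looks like in analytics.js, with ga() stubbed so it runs outside the browser; the metric indices (metric4 for product discount, metric7 for gross profit) and values are illustrative and depend on your Admin configuration:

```javascript
// Stub of ga() for illustration; on a real page analytics.js
// defines window.ga.
var gaCalls = [];
function ga() { gaCalls.push([].slice.call(arguments)); }

// Product-scoped custom metrics ride along in the productFieldObject.
ga('ec:addProduct', {
  id: 'SKU-123',
  name: 'Expensive Star Wars Lego Set',
  price: 109.00,
  quantity: 1,
  metric4: 10.99,   // product-level discount (illustrative index)
  metric7: 42.50    // gross profit: revenue minus COGS (illustrative)
});
ga('ec:setAction', 'purchase', { id: 'T-1001', revenue: 109.00 });
ga('send', 'pageview'); // the pageview hit is the transport mechanism
```

Each product in the hit carries its own metric values, which is what makes per-product discount and profit reporting possible.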


The data is sent to GA in a &prXXcmYY format, where XX stands for the product {{number}} and YY for the custom metric {{number}}


product scoped custom metrics data sent


Limitations of “product scope”



One particular thorn in my side when it comes to implementing custom metrics for Enhanced Ecommerce is the inability to track interactions with products outside of the predefined “dictionary” of product actions that Google supports.  Those actions are (I’m quoting):

  • click
    • A click on a product or product link for one or more products.
  • detail
    • A view of product details.
  • add
    • Adding one or more products to a shopping cart.
  • remove
    • Remove one or more products from a shopping cart.
  • checkout
    • Initiating the checkout process for one or more products.
  • checkout_option
    • Sending the option value for a given checkout step.
  • purchase
    • The sale of one or more products.
  • refund
    • The refund of one or more products.
  • promo_click
    • A click on an internal promotion.


That means that Adding to Wish List, Adding to Registry, Viewing a Product Video, Clicking on a Cross-sell, Social Sharing, Writing a Review, Asking a Question, etc. cannot be tracked on a product level in the same way that the other items are tracked. As of this post, changing the actionFieldObject to use some other name (so that you don’t inflate “adds” or “clicks”) and setting a product level custom metric to capture the interaction with the product will simply fail.  This means that in order to track information about product interactions that are not part of the Enhanced Ecommerce dictionary, you’ll need to use hit level tracking (events).  As such, collecting important data such as Wish List Adds per product or per brand will require the use of “hit level” custom dimensions, not product level.


Calculated Metrics



So far we’ve seen how custom metrics are super useful, that they add a particular flexibility to the data you collect in GA (pivoted data galore), and how they are indeed critical for tracking the metrics that really matter (profit!).  The reason that the release of calculated metrics was a catalyst for me to write about custom metrics is because in order to create calculated metrics, you need to use metrics.  “HUH?”  Yes, that’s correct.  I did just say that.  Let me illustrate what I mean with a few examples.

My first example is from my friend Peter O’Neill’s post about using calculated metrics to measure conversion funnel completion rates.  

Peter O'Neill is awesome. You should hire him.

Image ripped without written permission from Peter



To see the percentage of sessions where users viewed a product out of the total sessions where users viewed the ecommerce portion of the site (i.e. the store as opposed to the blog), you need two metrics: sessions with views of the store and sessions with views of a product page.  The simplest way to get these two metrics is to create goals for each of these steps (something that you should be doing in any case).  You then divide Goal X by Goal Y.  Pretty straightforward, relatively benign.

Create goals to get measures of the number of sessions where a core action was taken
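In the Admin UI, the resulting formula would look something like this (the goal names below are placeholders for whatever you called yours; GA autocompletes the {{…}} tokens as you type):

```
{{Viewed Product Page (Goal 2 Completions)}} / {{Viewed Store (Goal 1 Completions)}}
```

Setting the calculated metric’s Formatting Type to Percent then presents the ratio as a completion rate in reports.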

My second example is from my friend Charles Farina, who has a nice article with 25 calculated metric examples which includes some basic “how-to” in terms of setup.  The example is “video completion rate.” He mentions, as an aside, that in order to create this calculated metric, you’ll need custom metrics.  Why are custom metrics needed, you ask?  Well, let’s look at how I would normally calculate video completion rate.

video completion rate
Note the numbers here.  The number of sessions where the video was completely viewed was 20,451.  That metric, 20,451, is called “unique events” in Google Analytics.  It is not “number of video completions”.  A calculated metric in GA cannot divide the value in row 7 by a value in row 1.  It can only calculate values (add, subtract, multiply, divide) between columns. I also need to stress that custom metrics, since they can be used for a wide variety of purposes (including creating custom visit scoring!!), will not have any “uniques” applied to them the way pageviews, content views, or events do.  This is a pretty big limitation, though I don’t see it changing any time soon.
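The workaround is to increment a custom metric on the relevant events (say, one metric on video play and another on video completion), since custom metrics show up as columns in reports. A hypothetical calculated metric formula, assuming both numbers are collected as custom metrics with these names, would then be:

```
{{Video Completions}} / {{Video Plays}}
```

This is why calculated metrics and custom metrics go hand in hand: the division only works once both quantities exist as metric columns.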

In any case, the ability to calculate metrics in Google Analytics has made the platform much more powerful, but a huge amount of that power now lies in properly implementing custom metrics.


Custom Metrics, Calculated Metrics, and Google Analytics Premium



The launch of calculated metrics creates another major differentiation between Google Analytics Standard and Google Analytics Premium.  For a long time, I have believed that the number of custom dimensions available for GAP (200!) vs. GA Standard (20) was one of the core selling points for larger companies considering Premium.  Then I began to fall in love with custom metrics and added those to the list too (again, 200 vs. 20).  With calculated metrics, the GA Premium folks get a healthy 50 to get them going, while the rest of the world must settle for a meager 5.  Yup, just 5.  :-/


Closing Thoughts



Over the past 2 years or so, I have become a bigger and bigger fan of custom metrics. The release of calculated metrics in GA makes the possibilities of what the tool can do only more awesome. To be clear, the release of calculated metrics doesn’t change the final output that analysts such as myself or my colleagues would be providing to our clients. And I don’t expect tools like Excel to drop significantly in their utility just because GA launched calculated metrics. But I totally adore the direction Google Analytics is going in allowing for better and better analysis capabilities natively in the tool. To whatever extent GA ends up developing into a one ring to rule them all central BI hub, well… I, for one, welcome our analytics overlords.

Using Google Analytics to Grow Your Business

The following post is an email that I wrote for the Traffic1M course presented by SumoMe.com.  It is geared to folks who are just getting started with Google Analytics, and online marketing in general.  A big thanks to Noah Kagan and team for the opportunity to participate and provide some content to a wide readership.  If you are a reader of this blog and don’t know Noah, I highly recommend checking out okdork.com and sumome.com; there’s great stuff there.


Introduction to Google Analytics

If your website is like millions of other websites, then you have Google Analytics installed.  

When I say “millions”, I’m referring to data from BuiltWith.com that estimates almost 30 million (!) websites are using GA.  

I also note that Google Analytics’ accounts are numbered sequentially, so as of the writing of this email more than 67 million accounts have been created.  Not all are in use, so the number of active accounts is probably somewhere in between.

New Google Analytics Account


So you have GA on your site, and tons of data is being collected.  What now?  How can you use that data to help grow your business?


Why Use Google Analytics?



This Traffic1M course is all about getting that first million visitors to your site.  Put simply, Google Analytics is the most popular tool you can use to measure those visitors.  You shouldn’t use a tool because it is popular, though.  

Most folks who choose GA do so because

 
  • It is really powerful (“enterprise-class”) software.  
  • The Standard version is free.
  • Google’s tremendous cloud infrastructure allows them to rip through very large data sets rather quickly.
  • It has a nice, clean User Interface.
  • They want to use data to make informed business decisions.

That last point is the most important one.  Let’s not be naive; data on its own is meaningless.  But when used properly, data will help you see what’s working well, what’s not working, and help you make decisions about “what to do next.”



Where do I start?


One of the biggest challenges I have seen for businesses / website owners is that they don’t know where to begin when it comes to analytics.  People find themselves looking at an imposing mountain of data, and don’t know how to begin climbing.

Continue reading Using Google Analytics to Grow Your Business

Smooth Google Analytics Migrations using Google Tag Manager

This post is authored by David Vallejo.  At the current juncture, this blog is not configured to support multiple authors.  I hope to remedy that in the future.  ~~Yehoshua

 

Since Google Tag Manager was released, here at Analytics Ninja we have faced a number of problems when migrating a “hard-coded” Google Analytics implementation.

The most common problem relates to the actual moment of deployment, when the old Google Analytics code has to be removed from the site. This is not much of a problem if we have access to the site and can remove the hard-coded snippet at the same time we publish the new container.  But let’s be real, this is not the usual scenario. We usually rely on some other company or the client’s IT department to try to synchronize the deployment. This leaves us in a challenging situation: if they remove the code and we don’t publish the container right away, some data may get lost.  Or, on the other hand, if they don’t remove the code and we publish the container, the hits will be sent twice.  Or even worse: if we are migrating from Classic to Universal at the same time we’re moving to Google Tag Manager, we could end up messing up sessions, users, bounce rates, and any other metrics/dimensions.

This problem is aggravated if we are working on a multi-domain implementation where each of the domains is run by a different business group. If it was hard to synchronize with one team, just imagine if there are 2 or more business groups in different time zones, with everyone required to make changes at the same time.

Even if we are able to get everyone involved in the migration, there will always be some little time gap between the code removal and the container publication.

One solution is to use a piece of code that will allow us to block all our new tags if the old code is still on the site.  We just need to schedule a date for the migration, and using a simple macro and rule, we’ll fire our new tags only on the pages that have the old code already removed. This way, even if there is a long timeframe to get everything sorted out, the GA data won’t be affected at all.

For this to work, we need a macro that returns the current Google Analytics status on the page.  In the macro we’ll configure the UA property IDs (we’re using an array because we may have a dual-tracking implementation, or the page may have a third-party GA tracker that we don’t want to mess with). Then we’ll loop through all the trackers available to get their configured UA account, and if one matches our properties array, we’ll return true.  See the following flow diagram for the macro’s logic:

Google Tag Manager Migration Macro Flow Chart


The next step we need to take is to set up a new firing rule that will allow us to block our tags if the old trackers are still on the pages.


Macro Code
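A sketch of what such a detection macro might look like is below. The property IDs and the window stub are assumptions for illustration (in GTM you would paste only the anonymous function as a Custom JavaScript macro, and the page itself would supply window); this version checks Universal Analytics trackers via ga.getAll().

```javascript
// Stand-in for the browser environment so the sketch runs on its own:
// one fake tracker configured with a property we consider "ours".
var window = {
  ga: {
    getAll: function () {
      return [{ get: function (field) {
        return field === 'trackingId' ? 'UA-123456-1' : undefined;
      } }];
    }
  }
};

// The macro body: return true if any tracker on the page is sending to
// one of our (hypothetical) property IDs, false otherwise.
var oldGaPresent = function () {
  var ourProperties = ['UA-123456-1', 'UA-123456-2'];
  try {
    var trackers = (window.ga && window.ga.getAll) ? window.ga.getAll() : [];
    for (var i = 0; i < trackers.length; i++) {
      if (ourProperties.indexOf(trackers[i].get('trackingId')) > -1) {
        return true; // an old hard-coded tracker is still on the page
      }
    }
  } catch (e) {}
  return false;
};
```

The macro's return value then feeds the blocking rule: tags fire only when it returns false.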

We then add a blocking rule to our Google Tag Manager tags so they are not executed if any previous tracker has initialized on the page.

We’ll need to keep one more thing in mind. Since the GA/UA code is asynchronous, it may happen that Google Tag Manager tags fire before the old code gets executed, so we’ll need to delay our tags while we’re migrating. This can be done by setting the tags to fire on the DOM Ready event ( gtm.dom ), or even better, on Window Load ( gtm.load ). This is not the best way to run an analytics implementation, as we normally want our analytics tags to fire ASAP, but we’ll change our firing rule back to gtm.js/All Pages once we confirm the old code is gone from the pages.

Let us know what you think about this implementation method in the comments section below.

REAL Time On Page in Google Analytics

A better way to measure content engagement with Google Analytics

This post is inspired by a conversation that I had with my friend and colleague Simo Ahava at Superweek, as well as a recent work request from a well-established Italian publisher. In short, the publisher was quite challenged by the fact that they had an 85% bounce rate and that their time on site was so low. Their articles tend to get many hundreds, if not thousands, of Facebook likes, so “how could it be that users were spending so little time on site?!” Their average time on page was around the three-minute mark, so how could it be that average session duration was significantly lower?

bounce rate, time on page


  • Challenge 1:  Google Analytics tracks time on page / on site by measuring the difference between the timestamps of hits.  If the page is a bounce, no time will be recorded.
  • Challenge 2:  Even if the page viewed is not the bounce/exit page (and thereby has a time greater than zero), GA doesn’t distinguish whether the browser tab was hidden or visible during that time.
 

After a lengthy explanation to the client about the way Google Analytics tracks time on page (and by extension, time on site), they were still stuck without a way to accurately measure content engagement.  First of all, there are a number of different ways to measure engagement besides time on page / site.  Many posts have been written about this, and I urge readers to seek those out, since time metrics gain too much undue focus as it is.  As things stand, since this publisher’s site was not configured with any event tracking (a scroll tracking module would be great), they were seeing many users come to their site, view one page, and then leave. Unfortunately for them, “out of the box GA” does not provide very good insight into how users are interacting with their content. “Are they even reading the content?”
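One way to approach the visibility problem is the Page Visibility API: accumulate time only while the tab is actually visible, then send the total to GA. Here is a minimal sketch of the accumulator; the 15-second heartbeat, the event names, and the GA call in the wiring comments are all illustrative assumptions, not the publisher's actual setup.

```javascript
// Accumulates seconds of *visible* time on the page.
function VisibleTimer() {
  this.total = 0;      // seconds accumulated while the tab was visible
  this.visible = true; // mirrors document.visibilityState === 'visible'
}
VisibleTimer.prototype.onVisibilityChange = function (isVisible) {
  this.visible = isVisible;
};
VisibleTimer.prototype.tick = function (seconds) {
  // Called by a heartbeat interval; hidden tabs add nothing.
  if (this.visible) this.total += seconds;
};

// In the browser, the wiring would look roughly like this:
//   var t = new VisibleTimer();
//   document.addEventListener('visibilitychange', function () {
//     t.onVisibilityChange(document.visibilityState === 'visible');
//   });
//   setInterval(function () { t.tick(15); }, 15000);
//   // On page exit, send the total as a non-interaction event:
//   // ga('send', 'event', 'Engagement', 'Visible Time',
//   //    { eventValue: t.total, nonInteraction: true });
```

Because the heartbeat events carry timestamps, even a "bounce" session now records real engagement time instead of zero.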
Continue reading REAL Time On Page in Google Analytics

Advanced Remarketing with Google Analytics & Google Tag Manager

From Data Layer to Dollars…

Some visitors are more profitable than others, and thoughtfully created Remarketing Lists can help businesses focus their ad spend on the most valuable visitors.  This can improve revenue, reduce costs, or both.

The key is discovering the common characteristics of visitors which make them more valuable than an average visitor, and then preferentially delivering ads (i.e. bid higher) to users who have these characteristics.  In other words, if you can segment your visitor base to identify which users have a higher potential value, you’ll be able to make smarter decisions with your advertising budget.  Utilizing features of Google Analytics and Google Tag Manager provides the opportunity to do this.

One of my favorite features of the Google Analytics / DoubleClick integration is the ability to add users to Adwords Retargeting lists with the click of a button.  Here’s an example of how I might come up with a good remarketing list:

Let’s start with a curious question –> How long does it take users to convert on the site?  The first place I would go to begin answering this question is the Session Duration report, with a “converted” segment (in this case, a purchase) applied.

transaction segment

transactions duration segment



Right away I notice that a large percentage of users take over 10 minutes to make a purchase, and almost 13% require half an hour or more.  While I very much like segmenting the Engagement Reports, in this particular case I’m going to look at the User Timings report, as I believe the data visualization is more helpful there (you can expand the histogram). Continue reading Advanced Remarketing with Google Analytics & Google Tag Manager

Google’s Universal Analytics is Out of Beta – Time to Switch?

Universal Analytics


The big news last week (at least for folks like me) was that Universal Analytics finally came out of beta.  Is it time for you to switch?  

Short answer  –> yes, soon.  🙂

What exactly is the big deal about Universal Analytics? Here is my current take on the product’s features:

UserID

One of the most touted feature improvements over Google Analytics Classic is the introduction of UserID.  Google lists 4 benefits of using UserID.

  1. More accurate user count
  2. Analyze the signed-in experience
  3. Access the User ID View and Cross Device Reports
  4. Connect your acquisitions, engagement, and conversions
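For context, enabling the feature on the collection side is a one-field change. A minimal sketch follows; the property ID and user id value are placeholders (in practice the id comes from your authentication system after sign-in), and the tiny ga() stand-in exists only so the snippet runs on its own.

```javascript
// Stand-in command queue for analytics.js; on a real page, the standard
// GA snippet provides ga() and you would ship only the two calls below.
var commands = [];
function ga() { commands.push([].slice.call(arguments)); }

// On pages where the user is signed in, pass their (non-PII) id at create time:
ga('create', 'UA-XXXX-Y', 'auto', { userId: 'USER_ID_FROM_YOUR_AUTH_SYSTEM' });
ga('send', 'pageview');
```

Hits sent with the same userId across devices and browsers can then be stitched together in the User ID view.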


While I see the move towards a more person/customer-centric view by the GA team as a big step in the right direction, I think that at the current juncture the UserID reporting (and data model) falls flat.  Full disclosure: I’ve only had access to the UserID reports for a few days.  AND I am acutely aware that the GA team is constantly innovating and improving their product at a dizzying pace.  That means the only thing I can truly count on when it comes to GA is that the product will continue to improve (and hopefully not make this blog post completely irrelevant in the next 3 days).

So, why does UserID currently fall flat?  Doesn’t the ability to connect all the dots sound like a marketer’s dream?

Continue reading Google’s Universal Analytics is Out of Beta – Time to Switch?