Bounce Rate in Google Analytics



Every few months or so, I see a (re)tweet pointing to this infographic from KissMetrics.

Here’s a snippet:

Kiss Metrics Bounce Rate Infographic

The thing that frustrates me the most about this infographic is that the definition of Bounce Rate is wrong (well, at least for GA).  Yes, I know that the definition is taken directly from the Google Analytics Help Center.  But a bounce in Google Analytics is NOT a visit with a single pageview.  A bounce is a visit with a single engagement HIT.  (Justin Cutroni has a great post explaining these hit types and how to understand Google Analytics time calculations based upon how these hit types work.)  To briefly summarize here, there are 6 types of hits that can be sent to the Google Analytics server (a quick sketch of how these calls look follows the list).

  • Pageviews (sent via _trackPageview)
  • Events (sent via _trackEvent)
  • Ecommerce Items (sent via _addItem)
  • Ecommerce Transactions (sent via _trackTrans)
  • Social (sent via _trackSocial)
  • User Defined (deprecated, though still functional; sent via _setVar)
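
For anyone who hasn’t seen these calls in the wild, here is a minimal sketch of how a few of these hits get sent using the classic ga.js async queue (the method names in the list belong to that syntax).  The category, action, and label values below are made up purely for illustration.

```javascript
// Classic ga.js (async) examples of the hit types listed above.
// The _gaq queue is assumed to be defined by the standard GA snippet on the page.
_gaq.push(['_trackPageview']);                                   // pageview hit
_gaq.push(['_trackEvent', 'Video', 'Play', 'Homepage Hero']);    // event hit
_gaq.push(['_trackSocial', 'twitter', 'tweet', '/some-post']);   // social hit
_gaq.push(['_setVar', 'logged-in']);                             // user-defined hit (deprecated)
```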


As Justin explains, 5 of these hit types are used in calculating some form of engagement, thereby impacting time on page / time on site calculations as well as bounce rate.  With regard to bounce rate in particular, an additional Pageview, an Event that hasn’t been set to non-interaction, or a Social Media share (that is configured to be tracked in GA) can all impact your bounce rate.
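
To make the non-interaction distinction concrete, here is a hedged sketch (again in classic ga.js, with hypothetical category and action names) of the difference between an event that kills a bounce and one that doesn’t:

```javascript
// An engagement event: if this fires on the landing page, the visit now has a
// second engagement hit and can no longer be counted as a bounce.
_gaq.push(['_trackEvent', 'Chat', 'Auto-Invite Shown']);

// The same event with the fifth argument (opt_noninteraction) set to true:
// it still appears in the Events reports, but it is excluded from engagement,
// so the visit can still bounce.
_gaq.push(['_trackEvent', 'Chat', 'Auto-Invite Shown', 'Homepage', 0, true]);
```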
Here is an example of why understanding this technical principle is important when it comes to analysis.  In the example below, we see that this client’s Paid Search campaigns have a particularly low bounce rate.

bounce rate analysis 1

However, you might have noticed that something is a little bit fishy here.
Hint: Pages/Visit

bounce rate analysis 2

The fact that GA was reporting less than one page per visit is a clear indication that this site has problems with its implementation; Pages/Visit can only drop below 1 when some visits contain no Pageview hits at all (for example, visits made up solely of Event hits).  Indeed, when looking at the site’s bounce rate over time, we see sharp changes.  Big Problem City.

Bounce Rate Over Time

Knowing that bounce rate is impacted by the technical issue of more than one engagement hit getting sent to GA is critical to making sure you’re getting your analysis right.  It is quite common for software developers to build a GA integration that doesn’t take bounce rate into account.  The most common culprits I’ve seen are live chat (where the auto-invite sends a Pageview or Event) and auto-playing videos tracked by virtual pageviews or events (where non-interaction was not invoked).  The use of a particular live chat application is what caused bounce rate to plummet in the example above.  Bounce rate is also impacted by cookie integrity issues that cause sessions to get reset.  Additionally, and I’m still surprised by how many times I see this, having the GA tracking code more than once on a page is a surefire way to bring your bounce rate down to zero (the sketch below shows why).  As my friend Caleb Whitmore puts it, “A 3.8% bounce rate isn’t really good, it’s broken.”
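
As an illustration of that last point, imagine a page that ends up with the classic GA snippet included twice, say once hard-coded in the template and once injected by a plugin.  This is a hypothetical example (the property ID is a placeholder and the ga.js loader is omitted for brevity), but the mechanics are the same on any site where it happens:

```javascript
// First copy of the snippet, hard-coded in the page template.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-1']);
_gaq.push(['_trackPageview']);

// Second copy, injected later by a plugin. This queues a second pageview hit
// for the same property, so every landing-page visit already contains two
// engagement hits and the reported bounce rate collapses toward zero.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-1']);
_gaq.push(['_trackPageview']);
```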

An important side note:  As web developers have an uncanny tendency to break GA implementations, make sure you use GA Intelligence Alerts.  (After this post, read yet another great post by Justin about data alerts.)

Bounce Rate Alerts

Bounce Rate and SEO



Based upon what we’ve seen above about how bounce rate can be a) broken and b) impacted by code, I just want to say that I totally take Matt Cutts at his word when he says that search rankings do not take Google Analytics into account.  Google Analytics’ metrics are far too easy to manipulate for the Search Quality Team to use them in rankings, IMHO.  Furthermore, there are so many broken implementations that it would be foolhardy to treat pages/visit or bounce rate metrics on a global scale as a reliable search quality signal.  </my two cents>

Matt Cutts Google Analytics    

Bounce Rate in Context

 

One of the most often quoted lines about bounce rate is Avinash Kaushik’s famous definition: “I came, I puked, I left.”  While this definition does hold water a lot of the time, I believe that ultimately it is too simplistic.  Avinash certainly makes great points in the video above.  If you haven’t seen it before, it is 4 minutes and 45 seconds of classic Avinash awesomeness.  I appreciate that with that quote Avinash is trying to express something deeply powerful through the crystallization of a concept.  Nevertheless, when bounce rate is looked at in a monolithic fashion, without exploring the nature of the site or its page types, it is quite possible to draw incorrect conclusions about user behavior on one’s site.

For example, the following site publishes new content multiple times per day.  It gets a lot of traffic and has a “high” bounce rate.  Even the Direct Traffic to the site bounces at a high rate.

bounce rate direct traffic

 

Creating an advanced segment for Direct Traffic that Bounced and applying it to the Frequency & Recency reports reveals a completely new perspective on the nature of this traffic.

direct that bounced  

Almost 30% of the Direct Traffic that bounced was from visitors who had been to the site 9 or more times.  It certainly does not seem like these people were puking and leaving.  Indeed, most of the time that I find myself at Avinash’s blog, I spend a significant amount of time reading an article and then “bounce.”  A measure of success for content sites is not necessarily whether the reader bounces, but whether they read the article and (more importantly) come back.

Eivind Savio has a great post where he shares a script (originally from Thomas Baekdal) that adds a tremendous amount of context to traditional bounce rate metrics; namely, it is a relatively complex (and extremely elegant) script that tracks user scrolling behavior.  And yes, another shout out to Justin Cutroni, who wrote extensively about this in a two-part post.  Personally, I like Eivind’s approach of using non-interaction events instead of messing with bounce rate; a stripped-down sketch of the idea appears below.  I’m curious what you readers think.  Perhaps it really is appropriate to change the way that GA “normally” treats bounce rate based upon indications that a person is indeed reading an article.  Not sure…
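
To give a feel for the approach, here is a bare-bones sketch of scroll tracking with a non-interaction event.  It is not Eivind’s, Thomas’s, or Justin’s actual script (theirs do considerably more, such as timing and distinguishing the content area from the full page); the category and action names are made up, and the classic ga.js _gaq queue is assumed to be loaded on the page.

```javascript
// Fire a single non-interaction event when the reader scrolls to (roughly) the
// bottom of the page. Because the last argument is true, the event does not
// change the page's bounce rate; it only records that the content was read.
window.addEventListener('scroll', function onScroll() {
  var scrolledToBottom = window.pageYOffset + window.innerHeight >=
                         document.documentElement.scrollHeight - 10;
  if (scrolledToBottom) {
    window.removeEventListener('scroll', onScroll);  // report at most once per pageview
    _gaq.push(['_trackEvent', 'Reading', 'Reached Bottom',
               document.location.pathname, 0, true]);
  }
}, false);
```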

Eivind has put together a wonderful Excel worksheet that allows one to pull all of the scroll data into a dashboard using Next Analytics.  Here is some data from this blog.

   

My most popular posts (about the change in how Google Analytics defines a session, about unique visitors, and about multi-touch attribution) all have bounce rates above 60%.  However, on between 23% and 33% of the pageviews, users scrolled to the bottom of the content section of the post, taking more than 30 seconds to do so.  Indeed, for those posts, more than 75% of the pageviews had some sort of scrolling behavior.  If I didn’t have scroll tracking on these pages, I would be stuck with a low time on page metric and a high bounce rate.  At least now I know that at least a few people are reading my blog posts.  :)

Bounce and Exit Rates  

Interestingly, the bounce and exit rates for “Content Readers” are practically the same as the overall averages for those posts.  This is just another indication that Bounce Rate is not the “be-all and end-all” metric when it comes to understanding user behavior.  Always keep things in context.

I want to end this post by pointing you to an excellent piece by Kayden Kelly from Blast Advanced Media about Bounce Rate.  He touches on a number of points that were brought up here, and a lot of other issues (like the difference between bounce rate and exit rate), which I specifically didn’t address because he did such a good job.  It is really a worthwhile read.