Uploading an SSL certificate to an AWS load balancer

So you've got an SSL certificate for the domain name you want to use to collect data. How do you install it on your AWS load balancer?

  • Open the AWS console
  • In the top-left select Services > EC2
  • Click "Load balancers"
  • Select the load balancer and click [Actions] > Edit listeners
  • Add a listener for HTTPS (port 443)
  • Click Change under "SSL Certificate"
  • For Certificate Type select "Upload a new SSL certificate to AWS Identity and Access Management (IAM)"
  • Give the certificate a name
  • Add the PEM-encoded Private Key, Public Key Certificate and then Certificate Chain.
    • The Private Key was created when you made the Certificate Signing Request you sent to the SSL certifier.
    • The Public Key Certificate is what your SSL certifier sends you back.
    • Certificate Chain defines the signatures between your certifier upwards to the root certificates for SSL. Search your certifier for "intermediate certificate".
  • All these things need to be PEM encoded (base64 text between BEGIN/END markers). If your certificate doesn't look like that, you can convert it with the OpenSSL command-line tools.
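If the OpenSSL conversion step is new to you, here's a self-contained sketch. It generates a throwaway self-signed certificate in DER (binary) form to stand in for the files your certifier sent you, then converts it to PEM; all filenames are placeholders for your own files.

```shell
# Stand-in files: a throwaway key and certificate, playing the role of
# what your certifier sent you. Skip this with your real files.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=example.com" \
    -keyout private-key.pem -out cert-tmp.pem 2>/dev/null
openssl x509 -in cert-tmp.pem -outform der -out certificate.der

# The actual conversion: DER (binary) in, PEM (base64 text) out.
openssl x509 -inform der -in certificate.der -out certificate.pem

# A PEM file is recognizable by its BEGIN/END markers:
head -1 certificate.pem
```

If you received a PKCS#12 bundle (.pfx/.p12) instead, `openssl pkcs12` with the `-nocerts -nodes` and `-clcerts -nokeys` options will extract the private key and certificate respectively.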

beaconWatch: Prototype of automated testing for web analytics

I've finally got my tool for testing web analytics beacons running reasonably well.

What it does is open a browser via Selenium, spin up a proxy server and tell Selenium to use it, then browse to your URL. The proxy parses out the URLs of beacons according to some basic rules.
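The beacon-parsing rules might look something like this minimal Python sketch. The vendor patterns and the `parse_beacon` helper are my own illustration of the idea, not the tool's actual code.

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical beacon-matching rules: each rule names a vendor and the
# host/path pattern that identifies its tracking beacon.
BEACON_RULES = {
    "google_analytics": ("google-analytics.com", "/__utm.gif"),
    "omniture": ("2o7.net", "/b/ss/"),
}

def parse_beacon(url):
    """Return (vendor, query params) if the URL looks like a known beacon."""
    parts = urlsplit(url)
    for vendor, (host, path) in BEACON_RULES.items():
        if host in parts.netloc and parts.path.startswith(path):
            return vendor, parse_qs(parts.query)
    return None
```

The proxy would run every captured request URL through a function like this and log the parsed parameters for inspection.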

It's very rough, but I want to get the concept out there and hopefully better developers than I can help me make it suck less!


Stripped referrer header on iOS Facebook app

I worked with Jethro recently to diagnose an odd issue we had with Facebook referrers across a couple of publications. One publication's traffic came through, as expected, with the Facebook referrer mostly intact, and our analytics tools picked it up as social referrals. The other showed a huge chunk of "Direct" traffic when, clearly (judging from the campaign codes), it wasn't direct. What was going on?

We only worked out the cause by hooking up an Android and iOS device to proxy through Charles on a wireless network and inspecting the traffic as it flowed through.

What we discovered was interesting. Facebook links that bounce through bit.ly will lose their referrer header on iOS clicks made from the Facebook application. Facebook actually goes to some lengths to pass through referrers without leaking identifiable information (try not to snigger at the line "As part of our continued efforts to protect users’ privacy" from Facebook) but it seems Safari doesn't cooperate when it's redirected through something like bit.ly.

Moral of the story: obviously, always use a campaign code and (for non-GA) remap your sources based on what is in the campaign code. Also, avoid things like bit.ly if you get a lot of traffic from the Facebook app on iOS.

Data layer standard released

After over a year of work, the W3C Customer Experience Digital Data Community Group has released the Customer Experience Digital Data Layer specification document. This is about as close as we're likely to get to a standard in the web analytics space. It's been a long slog and much of the credit has to go to Viswanath Srikanth at IBM for herding the cats to get the job done.

This is an exciting time in web analytics. A data layer standard allows our industry to move to the next layer of abstraction. Once this gets implemented, we'll be able to focus on more interesting things than basic implementations. For example, with the common standard you shouldn't need to do anything special for regular things like an ecommerce implementation. Shopping cart software vendors will implement the data layer once, and you just pull what you need from there in your tag manager.
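As a taste, a CEDDL-style data layer looks roughly like this. The top-level digitalData name and the section names follow the spec; the values here are made up for illustration.

```javascript
// A data layer object rendered with the page. Analytics tools and tag
// managers read from this one common structure instead of each needing
// their own platform-specific code.
var digitalData = {
  pageInstanceID: "ProductDetailPage-Production",
  page: {
    pageInfo: {
      pageID: "Nikon Camera",
      destinationURL: "http://example.com/product/nikon-camera"
    },
    category: { primaryCategory: "Cameras" }
  },
  product: [{
    productInfo: {
      productID: "nikon-d90",
      productName: "Nikon D90"
    }
  }]
};
```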

There's still much to be done, in particular the data layer helper library being built by Brian Kuhn at Google. The current data layer is static, rendered with the page at load time, echoing old-school web analytics.

Brian's helper library makes the data layer dynamic, using the Google Analytics queue mechanism to enable changes to be made during the lifetime of a page, essential for modern web applications. The crucial change here is that while Google Analytics replaces the push() method of _gaq once loaded and handles everything itself, the helper library allows multiple listeners to register and be notified of any new updates to the data layer. So once an update occurs to the data layer, multiple analytics tools or tag managers are able to react to the change and do things. Very cool.
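The queue-plus-listeners pattern described above can be sketched like this. This is a hypothetical miniature of the idea, not the actual helper library's API.

```javascript
// The data layer starts as a plain array, so pushes queue up safely
// before any tool has loaded.
var dataLayer = [];
dataLayer.push({ event: "pageview" }); // queued via ordinary Array.push

function DataLayerHelper(queue, listener) {
  this.listeners = [listener];
  // Replay anything pushed before the helper loaded.
  queue.forEach(function (msg) { listener(msg); });
  // Replace push so future messages notify all listeners immediately,
  // instead of one tool swallowing the queue for itself.
  var self = this;
  queue.push = function (msg) {
    Array.prototype.push.call(queue, msg);
    self.listeners.forEach(function (l) { l(msg); });
  };
}

new DataLayerHelper(dataLayer, function (msg) {
  console.log("update:", msg.event);
});
dataLayer.push({ event: "newsletter-signup" }); // listeners notified at once
```

The key difference from the `_gaq` approach is that `push` fans each message out to every registered listener, so several tools can react to the same update.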

It's a great day. Really excited and planning my first implementation right away. 

Data layer standards

I just listened to the latest episode of Rudi and Adam's Beyond Web Analytics podcast, all about data layer standards. This may sound like an esoteric subject but it's going to become really important in the future, and is key to our industry moving to the next level.

As technologies mature, there is always a tendency to standardize so that we can move to the next layer of abstraction. It means we've worked out the details of things that practitioners have embedded in their practices and we can move on to bigger and better things at higher levels.

So what is a data layer? In "traditional" web analytics implementations, you push information to your web analytics platform using its own platform-specific mechanisms. To record a "Newsletter signup" event in Google Analytics you might use:

_gaq.push(['_trackEvent', 'Newsletter', 'Subscribe', 'Customer list']);

in Omniture you might set something like this (illustrative custom-link code; the exact variables depend on your implementation):

s.linkTrackVars = 'events'; s.linkTrackEvents = 'event1'; s.events = 'event1'; s.tl(this, 'o', 'Newsletter signup');

If you want to switch between vendors or have something like an ad server conversion beacon inserted on the page, you have to write yet more platform-specific code. More code means more scope for error and for the events to fire on subtly different criteria, so your numbers never line up across systems.

A standardized data layer means instead you'll record things in a common manner, and if you're using tag management you can very easily set up whatever analytics, ad server or other tools you want to fire on the same criteria. If well adopted, we'll see platforms like Shopify, BigCommerce, Magento, CMSes and the like all supporting it and offering turnkey web analytics implementations for 80-90% of use cases. That's a good thing: we practitioners can start working on more cool stuff and fewer tedious implementations and reimplementations.

This is a fantastic initiative and incredibly important. Check out the W3C community: http://www.w3.org/community/custexpdata/wiki/Main_Page

I'll be giving a talk on this next month at Web Analytics Wednesday Sydney.