As we head into 2025, I find myself preoccupied with the idea of measurability. Early in my career, I was drawn to the work of W. Edwards Deming, and in particular I took this quote to heart:
If you can’t measure it, you can’t manage it.
This concept has framed my thinking countless times. I’ve built products based on it and taught classes about it. Arguably, I chose a career that would allow me to live it every day.
The problem is, he never really said it. What he actually said was:
It is wrong to suppose that if you can’t measure it, you can’t manage it – a costly myth.
Yeah, that’s pretty much the opposite of what I hung my hat on.
I only recently learned the full quote, but my confidence in measurability has been waning for a while. At first it eroded as I began to understand how inadequate user metrics are, because an analytics tag can only really track a browser or device, not a person.
I also spent years of my life on the Sisyphean task of identifying and excluding bot traffic from measurement.
Along the way, I told myself and others that I could extrapolate from the fraction of data that was real to arrive at general truths. You’ve probably heard something along these lines from me or someone else: “it is directionally accurate.”
But with the expansion of privacy regulations and controls, and the growing number of devices, websites, platforms, etc. that people use to interact with businesses, I have finally come to accept that only a small fraction of what matters is measurable. If only I’d comprehended the full Deming quote back then.
Where does this leave us? In an exciting place, in my opinion.
Some people I talk to have given up on GA4 (and by extension digital analytics in general) for the reasons I describe, and are favoring institutional knowledge and instinct over data. Others cling to the hope that server-side tagging, BigQuery, or an alternative to GA4 will somehow fix the problem.
I believe the path forward incorporates both perspectives. Market experience, what we can measure, and even wisdom and instinct all have a place. We absolutely do get valuable data from GA4, Google Search Console, Meta, TikTok, etc., and it is worth doing what we can to make it as accurate and integrated as possible. But we also have to accept that it will only tell part of the story.
I’m excited to delve into frameworks and methodologies for incorporating everything we know into optimization and decision making. 2025, here we come!
Google hasn’t announced any significant GA4 updates this month, but Michele Pisani spotted a cool new section in the Admin interface: Consent Settings. I checked, and I’m seeing it in my properties as well, below ‘Data collection and modification’.
This section tells you whether your Google Consent Mode signals are working for GA4 tags, which is helpful. Be warned, though, that it does not tell you whether your site complies with relevant privacy regulations.
My favorite part about this feature is that it explains why the ad_storage, ad_user_data, and ad_personalization checks are necessary for Google Analytics tags, and how they impact features. I was previously a bit confused as to why Google Tag Manager would automatically apply those checks.
The Looker Studio team was a lot more productive than the GA4 team this month, with multiple noteworthy releases. Here are a few highlights:
I remember years ago wondering “why the heck can’t I preview my data in Data Studio?!?” (now Looker Studio). They finally added it, at least for a few types of data sources. Much appreciated.
It took me a minute to figure out what this feature does, but now that I have, I like it! Previously, the scorecard chart type was primarily useful for showing metrics, but you can now choose to display a dimension value in a scorecard. I’ve included an example below – I used the YouTube Analytics connector, selected Video Title as the dimension, and sorted by Views descending. Voilà, my top-viewed video! I changed the dimension name to make the scorecard title more meaningful.
You can now create filters with date conditionals – for example, you can filter a data source to only include dates before or after a certain date. What makes this different from setting a custom date range for a chart or dashboard is that the dates can be relative. I’m sure people will come up with all kinds of crafty tricks with this feature, but one I discovered is that you can use it to create a sortable date comparison column in a chart. This is very useful if you want to see which keywords, ads, pages, etc. had the most growth in the last X days, weeks, or months.
I wrote a blog post demonstrating how to make sortable date comparisons here.
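If you would rather compute that kind of growth comparison upstream in BigQuery instead of in Looker Studio, the underlying calculation is easy to sketch in SQL. This is just an illustration of the idea; the project, dataset, table, and column names are placeholders, not a real schema:

```sql
-- Compare the last 28 days to the 28 days before that, per page.
-- `my-project.my_dataset.daily_pages` is a hypothetical table with one row per page per day.
WITH windows AS (
  SELECT
    page,
    SUM(IF(date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY), sessions, 0)) AS current_sessions,
    SUM(IF(date < DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY), sessions, 0)) AS previous_sessions
  FROM `my-project.my_dataset.daily_pages`
  WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 56 DAY)
  GROUP BY page
)
SELECT
  page,
  current_sessions,
  previous_sessions,
  current_sessions - previous_sessions AS growth  -- the sortable comparison column
FROM windows
ORDER BY growth DESC;
```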
Google Dataform is a toolset that works with BigQuery to develop, test, and schedule data workflows. I’m a big fan, but every time I create a new repository, I stumble over some aspect of the process for setting up GitHub integration. This article from Measurelab is way clearer and more concise than Google’s documentation:
How to set up a Dataform repository with GitHub & Google Cloud integration
Thank you Measurelab, this is going to be one of my most dog-eared bookmarks!
And if you aren’t using Dataform yet, Measurelab also recently published a simple guide for converting scheduled queries to Dataform workflows – this is a great way to get started!
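To give a sense of what the end result looks like, here is a minimal sketch of a scheduled query rewritten as a Dataform SQLX file, say definitions/daily_users.sqlx. The output dataset and the source name passed to ref() are placeholder assumptions on my part, not anything from the Measurelab guides:

```sqlx
config {
  type: "table",
  schema: "reporting",
  description: "Daily users by source/medium, rebuilt on the workflow schedule"
}

SELECT
  PARSE_DATE('%Y%m%d', event_date) AS date,
  traffic_source.source AS source,
  traffic_source.medium AS medium,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM ${ref("events")}  -- ref() registers the dependency; "events" is a placeholder source
GROUP BY date, source, medium
```

Instead of a cron schedule attached to a single query, the table is rebuilt whenever the workflow runs, and Dataform works out the execution order from the ref() dependencies.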
Speaking of Dataform…
If you follow me at all, you know that I am a big fan of Johan van de Werken. His website ga4bigquery.com got me started with GA4 data in BigQuery several years ago, and his Simmer course took me to the next level. I’ve also learned a lot from the folks at Data to value.
Well, they’ve joined forces (along with a few others) to build an open-source Dataform repository for transforming raw GA4 event data into data models suitable for reporting. I’ve written before about Pia Riachi’s GA4 Dataform repository, which is also an amazing resource, but if I had to choose, I like this one even better. The code is a bit easier to follow, which makes it a better starting point for adapting to custom needs. It also supports a wider variety of reporting needs out of the box. And I think this new resource is likely to evolve more quickly, given the vaunted place the authors hold in the world of BigQuery and GA4.
So if you’ve set up the GA4 export and aren’t sure what to do next, I recommend checking it out. I was able to do the full setup in less than 30 minutes. That’s just the automated install, though; I’m still learning the ins and outs of how the transformations work and how the data is structured.
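For context on what the raw export looks like before any modeling, here is a minimal sketch of a query against the daily events_* tables (the project and dataset names are placeholders). Even pulling a single event parameter means unnesting the event_params array, which is exactly the kind of boilerplate a modeled reporting layer spares you from repeating:

```sql
-- Count events by day, name, and page for one week of the raw GA4 export.
-- Replace my-project and analytics_123456789 with your own project and dataset.
SELECT
  event_date,
  event_name,
  page_location,
  COUNT(*) AS events
FROM (
  SELECT
    event_date,
    event_name,
    (SELECT value.string_value
     FROM UNNEST(event_params)
     WHERE key = 'page_location') AS page_location
  FROM `my-project.analytics_123456789.events_*`
  WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250107'
)
GROUP BY event_date, event_name, page_location
ORDER BY events DESC;
```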
One tip: check out the Post Installation Guide before selecting the ‘Let GA4Dataform process your data now’ option – if you have created custom event parameters or added custom user parameters, you will want to configure them before building the workflow.
(If you want all of the GA/BigQuery goodness, but don’t have the time or skills – please reach out.)
This is another useful, free resource. Siavash Kanani built this Looker Studio dashboard for auditing a GA4 property. It literally took me about a minute to get it set up to evaluate twooctobers.com (and don’t worry, it doesn’t share access to your data with anyone). It is far from a comprehensive analytics audit, but we will definitely be incorporating it into our standard audit process. Some of the things it checks are quite ingenious!
Sign up for our newsletter to get Nico’s monthly Analytics Roundups delivered to your inbox.
Nico loves marketing analytics, running, and analytics about running. He's Two Octobers' Head of Analytics, and loves teaching. Learn more about Nico or read more blogs he has written.