GA4 is now our de facto analytics tool. Regardless of how familiar we were with the previous tool, GA4 is here to stay so we may as well get good at using it.
I’ve got just the person to make the transition relatively painless for us.
Our guest’s love for analytics was a happy accident. She was working in marketing when the company’s sales director told the executive team that marketing’s budget would be put to better use hiring new salespeople. Beyond a warm fuzzy feeling in one’s tummy, no one could articulate marketing’s impact in the language executives spoke.
Not willing to watch her department disappear, she dug in and found the data that showed marketing was having an impact. She took the evidence to the next board meeting and her department was able to continue with its work.
She chose to go out on her own so she could empower marketers to do as she’d done. She now heads up The Colouring In Department, a consultancy that has completed close to 230 GA audits and trained thousands of people on how to get good at their analytics.
Whenever your marketing is being assessed by an analyst, they will use one of two approaches.
The first is multi-touch attribution: it takes a customer who’s made a purchase decision, looks at the touchpoints they had across various channels on the way to that point, and weights each touchpoint to say which were most influential (Google calls its model “data-driven attribution”).
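If you want a feel for how that weighting works, here’s a minimal Python sketch of three common rule-based approaches – last-touch, linear, and position-based – applied to a made-up customer journey. Google’s data-driven model is proprietary and far more sophisticated; this just shows the basic mechanics.

```python
# Minimal sketch of rule-based attribution on one customer's journey.
# The journey and channel names are invented for illustration; Google's
# "data-driven attribution" uses a proprietary, much richer model.

journey = ["paid_search", "email", "social", "paid_search"]  # touchpoints in order
conversion_value = 100.0

def last_touch(path, value):
    """All credit goes to the final touchpoint before conversion."""
    return {path[-1]: value}

def linear(path, value):
    """Credit is split evenly across every touchpoint."""
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + value / len(path)
    return credit

def position_based(path, value, endpoint_share=0.4):
    """40% to the first touch, 40% to the last, 20% split across the middle."""
    if len(path) == 1:
        return {path[0]: value}
    middle = path[1:-1]
    # With no middle touchpoints, split all the credit between first and last.
    share = 0.5 if not middle else endpoint_share
    credit = {}
    credit[path[0]] = credit.get(path[0], 0.0) + value * share
    credit[path[-1]] = credit.get(path[-1], 0.0) + value * share
    for channel in middle:
        credit[channel] = credit.get(channel, 0.0) + value * (1 - 2 * share) / len(middle)
    return credit

print(last_touch(journey, conversion_value))      # {'paid_search': 100.0}
print(linear(journey, conversion_value))          # paid_search 50, email 25, social 25
print(position_based(journey, conversion_value))  # paid_search 80, email 10, social 10
```

Each rule tells a different story about the same journey, which is exactly why analysts argue about which model to use.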
The other approach they may use is Media Mix Modeling (MMM). From what previous podcast guest Kevin Hartman told me, MMM is a ‘tremendous undertaking.’ It involves collecting and analyzing historical data across different geographies at different times of the year: sales figures, both legacy and digital marketing channels, and external factors like economic indicators and even weather. It has its own jargon: incrementality, ratios, betas, impact on objectives. Then there’s the math: regression methods, both linear and non-linear, and frequentist versus Bayesian statistics.
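To give a flavour of that math, here’s a toy Python sketch of MMM’s core move: regress sales on transformed media spend. All the numbers are fabricated, and a real model would add saturation curves, seasonality, and external controls; this shows only the skeleton, including a simple adstock (carryover) transform.

```python
import numpy as np

# Toy marketing-mix model: regress weekly sales on media spend.
# Data is fabricated; real MMMs use years of history plus saturation
# curves, seasonality, and controls like price, promotions, and weather.

rng = np.random.default_rng(42)
weeks = 104
tv = rng.uniform(0, 100, weeks)     # weekly TV spend ($k)
search = rng.uniform(0, 50, weeks)  # weekly paid search spend ($k)

def adstock(spend, decay=0.5):
    """Geometric adstock: each week's effect carries over, decaying weekly."""
    out = np.zeros_like(spend)
    carry = 0.0
    for t, x in enumerate(spend):
        carry = x + decay * carry
        out[t] = carry
    return out

# Fabricate "true" sales so the regression has something to recover.
sales = 200 + 1.5 * adstock(tv) + 3.0 * adstock(search, 0.3) + rng.normal(0, 20, weeks)

# Ordinary least squares: sales ~ intercept + adstocked TV + adstocked search.
X = np.column_stack([np.ones(weeks), adstock(tv), adstock(search, 0.3)])
betas, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"baseline={betas[0]:.1f}, tv beta={betas[1]:.2f}, search beta={betas[2]:.2f}")
# The betas are the incrementality estimates: extra sales per $k of spend.
```

Those betas are where the jargon lands: they’re the numbers a modeler would translate into statements like “every extra $1k of TV drove roughly X incremental sales.”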
I get so overwhelmed with these modeling solutions, it’s like the old Who’s On First skit. I needed someone who would sort this out for me.
Our guest has been a consultant in the marketing and digital analytics space for 15 years. He currently focuses on helping clients quantify the impact of their marketing efforts using Marketing Mix Models, experimentation, and various attribution methodologies.
He is so passionate about the subject that he started a newsletter called MMM Hub.
He graduated from Carnegie Mellon with a Master’s degree in Information Technology, focused on Business Intelligence & Data Analytics.
Jim is great at showcasing other people in the analytics community. He truly believes that all of us are smarter than any one of us. He and Simon Poulton co-host the MeasureUp podcast.
He talked with me from his home in Pittsburgh. Let’s meet Jim Gianoglio.
Market Mix Modeling (MMM) 101 – This is a good intro-level article highlighting the important high-level concepts of MMM
A Complete Guide to Marketing Mix Modeling – although this article/site is littered with ads, the content is actually pretty good. It touches on the concepts and provides some code snippets for R, Python and SAS.
Videos / Courses to help get started with modeling:
MASS Analytics – Marketing Mix Modeling Master Classes – (free) 14 courses (YouTube videos) – very well done, starting with a beginner introduction to MMM and going all the way through advanced modeling techniques. It’s about 3 hours in total.
Marketing Mix Modeling 101 – (free) online course (YouTube videos). This is 2.5 hours over 5 lessons that focus on MMM using Robyn, so it’s good if you’re comfortable using R.
Johan van de Werken thrives at the sweet spot between data, business & technology.
Graduating with a philosophy degree from the University of Utrecht, my guest started his career as a journalist for several Dutch publications, writing about everything from events and pop culture to media, politics and economics. Around 2014 he switched from letters to numbers, working in CRO for several European e-commerce businesses. That led him to building dashboards and leveraging cloud platforms to turn raw data into usable marketing insights.
Working at an analytics firm exposed him to BigQuery, and he thought about sharing what he was learning. Seeing that the domain GA4BigQuery.com was available, he registered it and started posting there as a side gig. It got noticed by Simo Ahava, the founder of Simmer, which led Johan to release the GA4 and BigQuery course on their training platform. Fast forward to 2023: GA4BigQuery is now a well-known resource for marketers, and its creator is consulting full-time on data analytics under his own brand, Select Star. That is, except for when he’s having fun playing in a punk rock cover band.
Data warehouses are amazing things: you can toss all kinds of information into them and pull mind-blowing insights out the other end. This feat happens because you’re connected to outside systems holding their own database tables. A copy of whatever has recently gone into a table is pulled out, shot through a data pipeline, and pushed into your data warehouse. But today’s data stacks span multiple clouds and hybrid environments, with so many data pipelines that the programs in charge of monitoring and logging the flows can barely keep up. Manually checking the quality and integrity of the data becomes overwhelming, and the more sophisticated the systems, the more errors creep into the data. If we rely on flawed data, the outcomes and insights we generate will be equally flawed. This is where data observability comes in.
In this episode you will hear about something called an observability platform, which identifies real-time data anomalies and pipeline errors in data warehouses. There’s a twist here, because we’re in a cloud computing environment that charges by compute usage: you don’t want an observability tool that’s yet another pipe accessing client data and running up the meter. The good news is there’s an easier way to detect when data has gone awry: analyzing log files – basically metadata – which are just as effective at alerting you to problems.
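Here’s a rough sketch of that idea in Python – not Masthead’s actual implementation, which isn’t public, and with an invented log format – showing how row counts pulled from load-job logs can flag a broken pipeline without ever touching the data itself.

```python
import statistics

# Illustrative sketch only: spot trouble from pipeline log metadata
# (rows inserted per load job) instead of querying the data itself.
# The log format here is invented; this is not Masthead's implementation.

def latest_is_anomalous(daily_row_counts, z_threshold=3.0):
    """Compare the newest day's inserted-row count against prior history.

    daily_row_counts: list of (date, rows_inserted) read from job logs,
    oldest first. Returns (date, rows, z_score) if the newest count sits
    more than z_threshold standard deviations from the baseline, else None.
    """
    *history, (day, rows) = daily_row_counts
    baseline = [r for _, r in history]
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    z = abs(rows - mean) / stdev if stdev > 0 else 0.0
    return (day, rows, round(z, 1)) if z > z_threshold else None

# Example: a steady table suddenly receives almost no rows one day --
# often a sign that an upstream pipeline silently failed.
counts = [10_200, 9_950, 10_480, 10_105, 9_890, 10_320, 310]
log = [(f"2023-05-{d:02d}", rows) for d, rows in enumerate(counts, start=1)]
print(latest_is_anomalous(log))  # ('2023-05-07', 310, 44.1)
```

Because the check reads only job metadata, it never scans client tables, so it catches the problem without running up the compute meter.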
If you’d like what this is doing described in a completely non-technical way, think of Hans Christian Andersen’s The Princess and the Pea. A girl comes to a castle seeking shelter from the rain, claiming to be a princess. The queen doubts whether she is truly of noble blood and offers her a bed, but this bed has twenty mattresses and twenty down-filled comforters on it. A pea is placed underneath the bottom mattress to test whether the girl detects anything. The next morning, the princess says she endured a sleepless night; there must have been something hard in the bed. They realize then and there that she must be a princess, since no one but a real princess could be so delicate.
I spoke with Yuliia Tkachova, the co-founder and CEO of Masthead Data, a company which recently raised $1.3M in a pre-seed round. Originally from Ukraine, Yuliia came to found Masthead after work that convinced her of the need for an observability solution. She held Product Manager roles at OWOX BI and Boosta, where the data solutions she worked on ran into problems. Prior to that, she did marketing for RAGT. She has Bachelor’s and Master’s degrees from Sumy State University, specializing in MIS & Statistics. She also serves as an organizer at MeasureCamp, a volunteer community where analytics professionals come together to learn.
Analytics is something that everyone says they want, and some brag that they do it very well. But few people know what investment is required to build a quality analytics function, and even fewer are good at justifying its value.
Our guest Martin McGarry is so passionate about analytics that, as you’ll see from his backstory, if anyone can articulate its business value, it’s him.
After completing a Bachelor of Science at The University of Manchester and studying at the University of Cambridge as a doctoral candidate, our guest worked in the UK analytics practice of a global management consultancy. Due for a change after 6 years of that, he moved to Ottawa, Canada and founded his own consultancy so he could offer a more independent approach. A while later he started the firm he’s been leading for nearly 15 years, Bronson Analytics.
In 2018 he began a recurring event in Ottawa called Beer & Analytics, which draws hundreds from the field together for learning and socializing. In 2022, the event was held outside Ottawa for the first time, in Toronto.