HASSAN ELBAYTAM:
Alright, good afternoon, everyone.
12:45 on the mark, I like to be on time. My name is Hassan Elbaytam. I'm from Toronto, Canada. I have a slide about me, so I'll just use this as a background slide. I'm an electrical engineer by training, but I never worked in the field, because I ended up going into marketing by mistake. A friend led me into digital marketing, and six years later, here I am talking about Google Analytics. And I work at Bounteous. Bounteous is a co-innovation digital transformation company. We help our clients be more digitally focused. We help them build apps and websites, and we use Drupal for a lot of clients' websites. So I'm going to start this off with a bit of history because it's a fun topic. Google Analytics as we know it started out as a product called Urchin. A little fun fact: there are the UTM source, UTM medium, and UTM term tags that tell you where your traffic comes from, and UTM stands for Urchin Tracking Module. The name just stuck, up until today. So Urchin was acquired by Google back in 2005.
It got rebranded to Google Analytics in 2007, and back then apps weren't so popular, they were still relatively new. This is right before or right around the time when Steve Jobs got on stage and introduced the concept of mobile apps. So Google Analytics kind of took its time to adapt to apps; it wasn't until 2012 that it launched an SDK to track mobile apps. And that was a problem, because Google Analytics was focused towards the web, and the thought process behind web interactions is very different from how app interactions happen. So it was never cohesive: it was a different SDK and a different kind of tracking model, and they ended up splitting apart, with the app side becoming Firebase Analytics. So apps were Firebase Analytics and web was Universal Analytics. They were both under the Google Analytics umbrella, but if someone had a website and an app, they tracked in two different places. Around then, the Cambridge Analytica news also broke, and a lot of issues with data privacy started coming out.
And then the EU dropped the GDPR concept and kind of ruined the dreams of the perfect marketing world where we can collect all information on everyone and know what your grandmother likes to have for lunch. Yeah, that's not true. But we could. So GDPR got accepted back in 2016, and that posed a problem for Universal Analytics because it utilized a lot of third-party cookies, so companies weren't ready to adopt this right at the onset. It was a process to get there. The solution back then from Google Analytics was to rebrand Firebase into Google Analytics for apps. Basically, they did nothing other than change the name, and it worked. Fun fact, that's around when I went into the field and started learning more about Google Analytics and understanding how it works. What we didn't know back then was that Google, in the background, was developing this new thing called web and app tracking. We didn't know what it was; it wasn't available to the public. But around that time Safari came out with ITP 2.0, which kind of stopped everything in its tracks.
If you're not aware, ITP, Intelligent Tracking Prevention, basically stops you from tracking people after seven days on Safari. So if your conversion funnel is longer than a week, you're out of luck; you have to start over every single time. It was fine for e-commerce, but for B2B and bigger purchases, where people need up to three weeks or a month to convert, that posed a big problem, especially for analytics. In 2020, GA4 dropped for the public, rebranded as Google Analytics 4. Why four? No one knows, because there was no three or two or one, but it's like Windows 7, 8, and then 10. So Google Analytics 4 launched, and it kind of changed everything in terms of how things are tracked. And I'll explain what I mean by changed everything: the thought process unified both web and app, and the experience of tracking got unified. It's a whole new model of thinking about how we track interactions on websites.
Now, the biggest news to come out was that Universal Analytics will be sunsetting on July 1st, 2023. So counting from today, we have 66 days left to transfer from Universal Analytics to GA4. I'm hoping everyone who works on a website already transferred, just for the year-over-year tracking, but you have 66 days left, so hopefully there's some time. So the main question is: what changed? Why the big shift from Universal to GA4? And the answer is the data model. The data model in Universal Analytics focused on one main thing, the user, and then it bucketed all the user interactions into sessions. Once the website loads, a session starts, and everything the user does during that time is called a hit. So a session starts, I open a page, that's a pageview hit; I scroll, I click on a button, I click on a form, that's all under event hits. So it's a pageview hit and then everything else is an event hit. And if I close the page and come back the next day, a new session starts, another pageview hit is sent, and then more event hits on your site.
Whereas in GA4, all of this goes away. It's still focused on the user, but everything is event-based. That fits well with apps, since every interaction on an app is an event and there's no concept of a pageview on an app; it's a screen view, you could say. So the concept of a pageview hit went away, and they had to unify this: everything became an event, even a session. Once a page loads, or a screen on the app loads, a session start event fires, indicating that this is the start of a session, and there's a session ID that connects all those events together. So in the back end, GA4 says, oh, I know this event traces back to this session. A first-time user who visits my site gets a first visit event, which connects to the session. They view a page or view a screen, and it all connects into one session because of a unique session ID. Now, let's say I close the page and visit again the next day: a new session start event fires, separate from the first one, and it encompasses all those new events.
The events string together in a series so that in the backend, Google Analytics connects it all into a session and does similar calculations. But instead of being session-based, where a counter starts for a session and everything falls under it, it's a series of events that shows you the path: OK, they started with this, and then this, and then this. Which still works well for tracking on the web, but also fits better for tracking an app, especially in a world where apps are king. The other thing that changed is how an event is structured. In Universal Analytics, an event was very structured; you only had those three or four things to fill out, and it all had to fit under a category, the type of event, and then the action, the label, and a value. A value would be something like: if I scroll, how much did I scroll? The percentage, 10%, 20%, 30%, all fits under value. What page did I scroll on? That's something I would have to make up a way to squeeze into the label.
So it was very rigid, and I had to fit the event into the structure. Google Analytics 4 said, well, this is too rigid; we need to make it easier for users to create an event, and an event should be understandable just from its name. So they put in an event name and then parameters. Parameters work similarly to action, label, and value, but they give a lot more depth to that event. Let's say you had a button on a website or an app and you wanted to measure how many people clicked on it. You'd say, OK, this is an X button click, and the parameters are: what time of day, how this was fired, what page this button was on, the type of user, are they logged in or logged out? It gives a lot more depth to what happened with that event, so you can build a story just from that. The parameters, you send them with the event. This is where it opens up what you can track and what you can understand from each interaction, each event happening. So I'm going to show you just a live demo.
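Before the demo, to make the contrast concrete, here is a rough sketch of the same interaction sent both ways, assuming the standard analytics.js and gtag.js snippets are already on the page; the event and parameter names below are made up for illustration, not taken from the demo.

```js
// Universal Analytics (analytics.js): the event has to fit the fixed
// category / action / label / value slots.
ga('send', 'event', 'Engagement', 'scroll', '/pricing', 30);

// GA4 (gtag.js): a descriptive event name plus arbitrary parameters
// that add depth to the same interaction.
gtag('event', 'scroll_depth', {
  percent_scrolled: 30,      // how far they scrolled
  page_path: '/pricing',     // which page it happened on
  user_status: 'logged_in'   // what type of user it was
});
```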
But in general, let's just get going. You know what, I'm going to show a quick overview of Google Analytics. The homepage just gives you a quick snapshot: the number of users, how many events were fired, whether there were any conversions, and how many new users came in. And a new user is also based on an event: the first visit, or first open for an app, is how Google identifies a new user. When they come back again, it says, oh, this person already came in, so it's a returning user. So it measures everything differently. And you can see it's 19,000 users, 15K of them new. It gives you suggestions based off of how you use Google Analytics on what you'd like to see. For me, I care more about where traffic is coming from, so it shows me this: which countries in the world, and a set of event counts. The main thing here is events, and under Reports you'll find Events, where all the events come in. And to showcase what those parameters do: we know that there's a view promotion event, so people are seeing a promotion on the site.
But which promotion are they seeing? What pages are they seeing this promotion on? We can click on it, and it gives me a lot more depth and detail about this event. It tells me where in the world the users are when this event happens. Are they male or female? How many times do those events fire per session? Do they see it just once, or do they see it five or six times? Roughly 5.3 times. The title of the page where this event is firing. The location, as in the URL. If there's a payment type, because it's a view promotion and there's a specific type of payment involved, we can see all this information. Now, I'm using Google's demo account, so you will see a lot of (INAUDIBLE ), but it's the concept: it gives a proper understanding of the type of information you can see within that specific event. And you can track all this depending on your analytical needs.
MAN:
Do you have to set up those variables independently, or does Google give those to you out of the box?
WOMAN:
You have to set it all up, I don't know.
HASSAN ELBAYTAM:
Yes, so there are some that are recommended by Google, and they will tell you, set those up, please. And there are others that are custom, where you can do whatever you want. This is all customizable; you can let your imagination run wild. Well, within limits, because we still have GDPR to worry about, and (UNKNOWN) in Canada. I'm not sure about the States, if they even care about data privacy. I know that California does. (LAUGHS)
WOMAN:
We pretend to care. Yeah, we put a little notification.
HASSAN ELBAYTAM:
So it is fully customizable. There is a set of recommended events, especially for e-commerce, that Google says please set up, because Google first and foremost built this for e-commerce. But you can play around with it and customize it in a way where you can track a SaaS product entirely through Google Analytics. It takes a little finessing, but it does work. Basically, there's a lot more information that you can send in with an event outside of just the category, action, label, and value structure that you had in Universal Analytics, which is great. The other thing that GA4 introduced, and this is from way before ChatGPT and all the fun AI talk, is machine learning in GA4. This was the crux of why Google said they wanted to create a new thing, because this is a feature that required an overhaul of the backend of Google Analytics. So I talked about suggestions based off of my activity: that's basically it learning how I use Google Analytics and suggesting those cards, because it knows, oh, you're a marketer, so you care about where the traffic is coming from or which countries the users are in.
But if someone was more on the technical side, they might see something different, different cards. It's basically the machine learning aspect learning how you're using GA4. Another thing is automated insights, where it just learns from your data and tells you right off the bat, oh, we noticed there's something happening here that's worth a look. And you can customize it too, basically saying, I want to track this specific metric to see if it increases or decreases by a set percentage. To show you this, there's an Insights and Recommendations section. In this section you'll see all the automated insights. It's telling you that the organic channel drove 33% more revenue than the rest. As an analyst, I would have to go in and check every channel to find this out, but Google Analytics 4 just told me: OK, here's your job done for you. I'm not going to be replaced yet, I hope. It also shows you other information, like revenue last week versus this week, certain insights that someone would want to go in and dig deeper on.
Just knowing where to start, because of this machine learning aspect, is a really nifty feature. Another thing they added is predictive metrics. These are auto-calculated by Google, already pre-set up; you just need to set up the recommended events, like the e-commerce ones.
Add to cart and begin checkout, all those fun recommended events. Based off of those events, the machine learning aspect can tell you: is there a probability of purchase, and does this specific set of users have a higher or lower probability of purchase? Churn means there are users who buy your product but may end up stopping. It predicts, OK, this subset of users may end up churning in the next week, or 60 days, and so on. So it's a good, nifty metric to understand, and you can create audiences from it for marketing to use, to remarket, bring them back in, entice them with a promotion, and so on. This takes a lot of crunching of numbers to figure out, and GA4 does it in, like, two minutes. It also predicts revenue. It'll say, based off of your current trend, we expect that you will make $1 million next month, maybe. So it's really good in the sense that you don't have to export the data and utilize it in another tool; it's all done automatically by GA4 with those predictive metrics.
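As a rough sketch of the web side of those recommended events, assuming gtag.js is in place: the event and parameter names below follow Google's recommended e-commerce schema, and the values are made up.

```js
// GA4 recommended e-commerce events. The predictive metrics (purchase
// probability, churn probability, predicted revenue) depend on events
// like these coming in consistently. Values are illustrative.
gtag('event', 'add_to_cart', {
  currency: 'USD',
  value: 29.99,
  items: [{
    item_id: 'SKU_12345',    // hypothetical product
    item_name: 'Blue Jeans',
    price: 29.99,
    quantity: 1
  }]
});

gtag('event', 'begin_checkout', {
  currency: 'USD',
  value: 29.99,
  items: [{ item_id: 'SKU_12345', item_name: 'Blue Jeans', price: 29.99, quantity: 1 }]
});
```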
And the last thing is, if you're building charts and graphs to showcase to stakeholders, it'll show you automatically in those time-series charts where the anomalies are. Let's say we're expecting revenue to be at 100K, but for some reason it dipped to 50. It'll show you that anomaly, explaining, OK, this is the kind of traffic we were seeing and this is what happened. Or if there's an influx, it will show you, OK, there was an influx, something happened here. It just gives you the entry point to start your analytical process, rather than you discovering this yourself and figuring out where to start analyzing. So those are a lot of the main thoughts behind why Google went the route of rebranding. But we are at a Drupal conference and we're mostly a developer crowd, so I do want to make it more relevant. One thing I help developers do as an analyst is analyze how the website is doing. Not by the number of users coming in or how many people clicked on a button, but, one of the things we used to face a lot back when I was in in-house marketing: why is our website slow, or why are some pages slow while others aren't, and what is causing the slowness?
So, like anyone would, I jumped into PageSpeed Insights, plugged the website URL in there, and it told me, oh, you're doing well. But I was still getting reports that, no, there are slow pages. So I tried to figure out what the issue was, and then I discovered Core Web Vitals. It's much easier to analyze with GA4, and it shows the power of GA4 and what you can do with it. Core Web Vitals are just a set of performance metrics that tell you how your website is doing. Google tracks this at a high level with the PageSpeed Insights test, and it focuses on something called CLS, LCP, and FID. I'll explain what those mean in a second, but those are the three key metrics we look at to analyze whether the page loads fast or slow and what should be fixed to improve the speed. The web-vitals library was developed by Google. It can be added to any website, and it's essential if you want to improve your SEO rankings. If marketing is saying we're ranking too low, just throw this in and tell them, OK, because of your popup the website is slow, so blame them.
As a former marketer, I'm giving you steps to avoid marketing requests. So, the main three metrics: CLS, LCP, and FID. Google created this library, and they have all the code and documentation on GitHub. I don't want to go through it too much, but this is the main thing to get the library loaded. The slides are on the session page, so if you want to download them as a PDF and follow along, the link goes to the GitHub, or you can scan the QR code and go through it yourself on your phone. There are different ways you can load the library. I opted for this one because I use Google Tag Manager, and it won't accept anything other than standard JavaScript, not even ES6, just this. The main concept of what this is doing: it's loading the library and API and then console logging CLS, FID, and LCP. There's nothing too complicated. You can play around with it to do something else, but in the end, this is what it's going to do: log the metrics for CLS, FID, and LCP. That's pretty much it.
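The snippet on the slide isn't reproduced in the transcript, but a minimal sketch of that pattern, assuming the IIFE build of Google's web-vitals library (version 3) pulled from unpkg and written in plain ES5 so a Tag Manager custom HTML tag will accept it, looks roughly like this (it would sit inside a script element in the tag):

```js
// Load the web-vitals library from a CDN, then console.log each metric.
// ES5 only, because GTM custom HTML tags reject ES6 syntax.
(function () {
  var s = document.createElement('script');
  s.src = 'https://unpkg.com/web-vitals@3/dist/web-vitals.iife.js';
  s.onload = function () {
    // The IIFE build exposes a global `webVitals` object.
    webVitals.onCLS(function (metric) { console.log(metric); });
    webVitals.onFID(function (metric) { console.log(metric); });
    webVitals.onLCP(function (metric) { console.log(metric); });
  };
  document.head.appendChild(s);
})();
```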
Now let's go into what they mean. CLS is the most complicated one, and even I don't fully understand what the point of it is, but it stands for Cumulative Layout Shift. In a nutshell, if you have an element that moves on the page, that can give the user the perception that, OK, this site is still loading or it's not fully loaded. So knowing what is moving on the page, which elements are moving, whether because of a CSS rule or a JavaScript function or something else, and knowing where it is and how much it's moving, is important for improving usability. It's not that the website actually loads slowly; it's more a perception the user has: because this is moving or not fully loaded yet, it seems like the website is slow. A good score should be under 100. Because Google is so smart, they never told us what 100 represents, milliseconds, pixels, just 100. So we're just going with what Google said, under 100. Now, in the previous slide I showed you the function console logging CLS.
What that does is return a JSON object with all the information, and here is what it looks like. On its own it means nothing, but there are specific points that are important to take note of. First of all, the name, of course, we know what the name is, but also the value. This is an amazing number to hit; it's way under 100, so perfect. Another thing to look at is what the element is and how it moved. It gives you where it was initially and where it is right now, so how much did it move? According to this example, it did not move by much; it just went down or up a couple of pixels, nothing too major. It also gives you the ID, or in this case the class, of the element that is moving. That's important to know because when you're analyzing what is going wrong, let's say this was 200-something, then, OK, I need to know what this element is that's moving, figure out why it's moving, and how I can improve on this metric, maybe lower how much it moves, and so on.
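The object on the slide isn't captured in the transcript, but the shape of what the CLS callback receives from the web-vitals library is roughly the following. The values are illustrative, and the node is really a DOM element; it's shown here as a selector for readability.

```js
// Roughly the shape of a CLS metric object (illustrative values).
var exampleClsMetric = {
  name: 'CLS',
  value: 0.012,       // the cumulative layout shift score so far
  rating: 'good',
  delta: 0.012,
  id: 'v3-1682000000000-1234567890123',   // unique per page load
  entries: [{
    value: 0.012,
    sources: [{
      node: 'div.hero-banner',                                   // the element that shifted
      previousRect: { x: 0, y: 420, width: 1200, height: 300 },  // where it was
      currentRect:  { x: 0, y: 424, width: 1200, height: 300 }   // where it is now
    }]
  }]
};
```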
So, in a nutshell, that is CLS. The next one is the Largest Contentful Paint. On homepages, that's usually the hero image; it's basically the biggest element on the page that's loading. Here we want to make sure it loads as fast as possible, because the slower this loads, even if the rest of the page is fully loaded, the more the user gets the perception that the page is slow. That's why this is very important: it's more about perception than about the page actually loading slowly. A good score is anything under 2.5 seconds, and at least for this one they told us what 2.5 stands for: seconds. The problem is that when you look at the JSON object, it shows you 786, and the rating is good, because this is milliseconds. No idea why Google does this; they love to create drama and suspense. Anyway, here you see that it gives you the value and does the same thing: it gives you the element, the class or classes in this case, and also, if it's an image or a video, it gives you the URL that you should be worried about.
In this case, I'm not using a CDN, so maybe that's a side effect. So that's the kind of information you can expect from the return, make note of, and start collecting in GA4. Lastly, there's First Input Delay, and this is also more about perception than something that slows the website. It's the time between the user's first input and the response they get back from the front end. So if I scroll, say on my mouse wheel because I live in the 90s, and it takes more than 100 milliseconds for the page to actually move down, the user gets the perception that the website is too slow to respond and too slow to load. Having something under 100 milliseconds is perfect, because it's too fast for the user to notice whether something is slower or faster. And this is the JSON response you might expect. The value is three milliseconds, so that's perfect. It shows what type of interaction happened, and also... oh, wait, no.
Yeah, that's because it's an interaction. So it's just the value and the name of the interaction. So far I haven't had any trouble with this, so I don't know what an issue would even look like or how to fix it, which is weird. I haven't heard of anyone having trouble with this before, so maybe that's a good sign, but it's important to keep track of nevertheless.
MAN:
Where is this being output?
SPEAKER:
So based off of the original code...
MAN:
Yeah? Would you load it in tag manager?
SPEAKER:
Yeah, loading in tag manager. It's console.log. Now we'll do a live demo on how.
MAN:
(INAUDIBLE)
SPEAKER:
Yeah.
MAN:
Yes, good.
SPEAKER:
I will do a live demo on how you can push it into Tag Manager and utilize it. But basically, for now, everything is being printed in the console. Alright, now we want to connect this to GA4. We start off with using this Tag Manager module from Drupal, trying to keep it very relevant. I'm hoping it's working. Based off of just using this module, you can just input... and here I'll show you what you need to do. OK, there it is. Just put in the Tag Manager ID and you're fully connected; that's all you need to do. Now, I've preloaded everything. So first off, we need to load the Core Web Vitals library. What I did, is this clear enough or should I zoom? What I did differently from the original code is I put in a dataLayer.push. The data layer is just an array of JSON objects, a lot of information that your website holds. For Drupal, you can send in stuff like information about the author of the page, or the type of content of the page, and so on.
All this can be sent to Google Analytics for further analysis, but in this specific case we're just focusing on CLS, FID, and LCP. Then, based off of the returned objects, I created a couple of variables in Tag Manager that just capture the value for LCP, FID, and CLS. For the node it requires a bit of finessing: you have to go in and select the object, and then it's an array, so you have to choose the first one. And the way GTM works, it doesn't accept brackets; it has to be dots, because it's weird. So .source.0.node. That's how you get at least the element out of it. Then we added the value, and for funsies I captured the ID, just because I like collecting data; I just add more information to send. And I'm going to showcase exactly how this works. So when I click on Preview and go to my own personal blog, which is empty, first we can look at the data layer here. There we go. We got the FID, we got the value of 18, and in the entries we have its first input, a keydown. OK, so I pressed a button, so that's the first input.
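The exact snippet from the demo isn't in the transcript, but the idea, swapping the console.log calls for dataLayer.push, might look like the sketch below. The data layer key names are made up here, so match them to whatever Data Layer Variables you create in GTM; a variable path like cwvEntries.0.sources.0.node then uses dots instead of brackets for the array index, which is the quirk mentioned above.

```js
// Push each Core Web Vitals metric into the GTM data layer instead of
// logging it. The key names (cwvName, cwvValue, ...) are invented for
// this sketch.
function pushMetric(metric) {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'coreWebVital',      // event name a GTM trigger can listen for
    cwvName: metric.name,       // 'CLS', 'FID' or 'LCP'
    cwvValue: metric.value,
    cwvId: metric.id,
    cwvEntries: metric.entries  // reach into this with dot notation in GTM
  });
}

webVitals.onCLS(pushMetric);
webVitals.onFID(pushMetric);
webVitals.onLCP(pushMetric);
```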
It's not the most user-friendly way, so there is another way, in the Tag Assistant. It shows you all the messages coming into the data layer. We can see this one was actually an FID, and then there's another message, LCP. And if we look at the variables that I created, it didn't capture the node, fun, but it did capture the value. And there should be... there it is, the ID. Basically, this is how it captures information, based off of what's in the data layer and how I set up the variables; it captures them in this laundry list of variables. And now I can utilize this to build an event and send it to GA4. Let's build it together. So, as we said, because of the structure I mentioned, this configuration can have an event name and then parameters under it. The event name for this would be drupal, and then we have the event parameters. This is just an example of how we can build the event parameters: we have the ID, the node, and the value of those events.
HASSAN ELBAYTAM:
We also want to know the name of the specific event, so we can build events for each and every single one: LCP, FID, and CLS. For now I'm just gonna say ID. Also, GA4 prefers snake case, so if anyone prefers camel case, I'm sorry, we're going with snake case. I know this is a touchy topic. Alright. And then we can add one for value. Now for the trigger, I'll just put All Pages for now. I don't wanna go too deep into how to trigger the events and how Tag Manager works, but this is basically how we build the event and how the events are structured in GA4. Once those events are fired, they come into... I'm gonna go into another... OK. They start showing up in the list here. When I created this (INAUDIBLE) demo, I named them CLS, LCP, and FID to track each one separately, and each one has its own parameters, with the IDs and the values coming in. Now, this is all well and good. We collected the values. We have everything stored in GA4. Now we want to understand what's happening.
So internalize the data.
SPEAKER:
So we just call you at that point?
HASSAN ELBAYTAM:
Yes. (LAUGHS) Another way: there is this dashboard that I built. You can scan the code and copy it yourself and connect the data once you have everything set up. You can copy this dashboard and just connect it to your Google Analytics; it's all ready. This is kind of a freebie that I want to share with you. I'm gonna show you how it works. So here is (INAUDIBLE). What the dashboard shows is the three metrics we're looking at and, based off of their good, OK, and bad thresholds, where the metrics stand. Anything lower than 100 is good, between 100 and 250 is OK, and anything above 250 is bad and needs to be fixed right now. And the same thing for LCP and FID. I started with looking at overall averages for the metrics, and then the next part is individual pages: what the metric is for each individual one. I did a similar color scheme, green, yellow, and red, so that you know, OK, this is good, OK, and bad. For now, let's say we're looking at the homepage; its LCP is... So this needs immediate fixing.
So, as we said, LCP is the Largest Contentful Paint, so it's important to know what elements of this page need to be fixed. Here we're checking issues. The LCP for the home page comes from this specific element, the intro header, and this is the URL for the image that's causing the problem for the home page. We do have another one, for signup, that's causing issues. So those are the first two steps to fix the LCP for those two pages, and then you're back on point. We can look at the same thing for CLS, the layout shift: we can see which elements are moving around on the page a lot. And then for FID: on which pages FID is slow and, if there are any, what the interactions are, and so on. I know I kind of ran through this quickly, but I did want to leave a lot of time for questions, because I have a feeling there will be some. Yes.
SPEAKER:
So how much emphasis do you think Google is putting on those three metrics? Like, is it part of your search engine rankings and how well your site's performing, et cetera? How much emphasis are they putting on that?
HASSAN ELBAYTAM:
Basically, all the emphasis is on them when it comes to speed. Search engine optimization is a topic on its own, but when they say site speed should be this, it's basically referencing one of those three metrics, mostly LCP, the Largest Contentful Paint, because it's the easiest fix and it's the biggest issue, the one users actually see. For the others, I've talked to users after we fixed CLS and FID and they're like, "I didn't notice the change." But LCP, because you see it loading, is very important to them. And I've seen this, and it affects your ranking on Google. With Bing and other search engines, I'm not sure how much of a difference it makes, because they focus more on keywords and the type of content and so on. But for site speed, Google definitely wants you to improve that speed.
SPEAKER:
OK. And I have one other really high-level question. I don't know if it's worth mentioning the way GA4 collects data, because it's very different from UA. It's collecting data in three or four different ways depending on how you set it up, signed-in Google users and so on, and a lot of the data they used to collect, you can't see it anymore; it just says, you know, unknown. The reason they're collecting data differently now is because they can't know a lot of that, because of people opting out of data collection, right?
HASSAN ELBAYTAM:
Yeah.
SPEAKER:
So there are all these reasons why the way they're collecting data is different now. So I guess, given that, my next question is: is there another analytics platform that people are using now? We've all been using Google Analytics for the last 15, 20 years; are people looking at other platforms?
HASSAN ELBAYTAM:
From the clients I've worked with, very few. Mostly what I've seen is that Google still holds a big chunk of the market, but there are other platforms popping up, mainly used in Europe for GDPR reasons. There is a platform called Piwik PRO (UNKNOWN) that basically does not do session-based tracking; it hashes a user ID and then deletes it when the user signs off. So every time a new user comes in, it's a new user with a different hashed ID, and according to GDPR, as long as that is not saved anywhere, you're fine. So there are those types of platforms coming in that are more privacy-focused. The reason people still opt for Google Analytics, and I don't see them leaving it, is the connections with Google Ads and the ads platforms, mostly the remarketing and marketing tools, because of how easy it is to connect Google Analytics to Google Ads or Display & Video 360. All this makes people want to use GA4, because it's just easier than figuring out how to export this data and put it into those ads tools and so on.
Also, Salesforce is now jumping into the mix. There are a lot of integrations between Salesforce and Google Analytics, and a lot of enterprises use Salesforce heavily, so it's easy to connect Salesforce data with Google Analytics data and do their customization and personalized marketing stuff. Yes, let's start with you.
SPEAKER:
OK. Great job, by the way. I was always a Lighthouse fan for inspecting some of those values, and then just recently got hip to PageSpeed Insights. I understand the difference: Lighthouse is lab-only data under perfect conditions that they've set, while PageSpeed Insights is a combination of the lab data and the real-world Chrome user data, so the values are different in that way. But I like the way PageSpeed Insights reports it. When you were showing those values, something I didn't understand at first, and then found in another Google blog under Chrome Developer Tools, was the Chrome UX Report, the CrUX report, which is a lot like PageSpeed Insights.
HASSAN ELBAYTAM:
Yes. This is all based off the CrUX.
SPEAKER:
CrUX. I thought so, because what you're not showing is the different experience between mobile and desktop. PageSpeed Insights breaks it down between those two, and you can get very different scores. Something is red on mobile but not desktop, of course. So-
HASSAN ELBAYTAM:
So I don't wanna jump too deep into this topic, but as I said earlier, you can send parameters with the events. And also, this is automatically done by Google, telling you this is desktop, tablet, or mobile. You can split the data into all three.
SPEAKER:
I'm sure you can. Yeah.
HASSAN ELBAYTAM:
So you can see if it's mostly your... of course, this is all web experience. You can basically say, OK, I wanna focus on mobile, and on types of mobile devices. Maybe a Samsung Galaxy S4 from way back in the day loads the site at different speeds, and different values come in, and that's why there's an increase in FID or an increase in LCP. It's spiking because people are loading the website on this specific device, and it's causing all this. So there are a lot of ways you can slice and dice the information to dig deeper. And the difference between this and Lighthouse or PageSpeed Insights is that it's per-user, per-page information; it's the rawest of raw data. Whereas Lighthouse, as you said, is perfect conditions, and it just combines all this information and averages it out for you. Maybe it does break it down per page, but it's just an average under perfect conditions: this is what you should expect. Same thing with PageSpeed Insights, where it looks through all the pages on the site, gives you an average, and says, OK, this is the average total and this is a problem page.
But with this, you have access to the raw data, so there's a lot more you can do with it and a lot more insights you can grab out of it. Bec-
SPEAKER:
It's also historical because-
HASSAN ELBAYTAM:
...because it's also historical. Yes.
SPEAKER:
Yeah, exactly. So when you see combined scores like that, is that just taking the average between everything, all the devices?
HASSAN ELBAYTAM:
Yeah, it is. And we can build filters to break it down based on device. For the purposes of this presentation, and because I only had an hour, I couldn't go deep, but we can break this down to, like, a per-user level.
SPEAKER:
We use the CrUX thing for that. OK. Thank you.
HASSAN ELBAYTAM:
Let's... Yes, you're standing so.
SPEAKER:
I'm a developer. The GA4 tool is owned by the (INAUDIBLE). As a developer, one is, how do I know if I'm using GA4? Two is how do I tell them to use GA4?
HASSAN ELBAYTAM:
If they're not using GA4, I would go to leadership and tell them to fire them, they're doing something wrong. (LAUGHS) Google Analytics is important overall, and if they're not using GA4, they're going to be losing data. So I would start off with the first slide, the timeline, telling them, "July 1st is the deadline. Use GA4 now." That's number one.
SPEAKER:
How do I know though?
HASSAN ELBAYTAM:
How do you know? I'll show you something quickly.
SPEAKER:
Ask the question. (CROSSTALK)
HASSAN ELBAYTAM:
So, this is called Google Tag Assistant Legacy. If I enable it and load the page, it tells you all the tags that are on the site. And if you find anything called Global Site Tag or gtag.js, that means you're using GA4. That's how you know. And if they don't use it, just call me up. (LAUGHS) You have my LinkedIn; call me up, I'll talk to them for you. Alright. Yeah.
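If you'd rather check without an extension, a rough console sketch of the same idea is below: look for the gtag.js loader script and whether the measurement ID in its URL starts with G- (GA4) or UA- (Universal Analytics). It's only a heuristic, not an official detection method; GA4 installed purely through a GTM container won't necessarily show up this way.

```js
// Browser console heuristic: list gtag.js script tags and read the
// measurement ID from the URL. 'G-...' means GA4, 'UA-...' means
// Universal Analytics.
var tags = document.querySelectorAll('script[src*="googletagmanager.com/gtag/js"]');
Array.prototype.forEach.call(tags, function (s) {
  var match = s.src.match(/[?&]id=([^&]+)/);
  if (match) {
    console.log(match[1], match[1].indexOf('G-') === 0 ? 'GA4' : 'Universal Analytics or other');
  }
});
```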
SPEAKER:
How often do you check in with developers to verify and fix this? Like, are you a part of that conversation? How often would you recommend we as developers look at these and hand off fixes?
HASSAN ELBAYTAM:
It depends on how complex the site is and how complex the analytical information we need is. There are times where I don't talk to developers at all, because I can do everything through Tag Manager, so all I need from the developer is: just input this code snippet and we're good. But when it comes to... there are other parts (UNKNOWN). I worked with a client where I had to basically become the developer. I would build whole documentation: OK, put in this code snippet, send this function on that click, this is the information I need. And we would check in on a regular basis because everything needed to be done through the developers. And I didn't wanna overwhelm them by leaving them to do all the research, so we would work together: OK, this is what I found, does this fit the code base you're working with? How can we improve this or clean up the code? There was a lot of back and forth and collaboration there. Of course, that was at the expense of client hours, so I would definitely not recommend that, but it's important to at least have a good information meeting at the get-go, setting the expectations and understanding what the marketing or analytics people need.
I feel like there's always that disconnect, because we have an initial point, OK, yeah, we kicked off the project, we know what needs to be done, then we go. And then we find out, OK, this is not detailed enough, or the developers didn't ask us the right questions, or analytics didn't give us enough information. I know we do that, so I'm sorry. So having that long meeting at the get-go, just understanding all the requirements, helps define how many meetings you have and on what kind of basis you need to meet with the developers, the marketing team, or the analytics team. Yes.
SPEAKER:
Just to build on what that gentleman was saying about how you can tell if a site is using GA4, can you do that through Inspect?
HASSAN ELBAYTAM:
Yes, you can. It is a bit more... I don't like it, but there is a way. Most of what I do is on Chrome, but I think it translates over to Firefox and Opera. If we go to the Network tab... just gonna reload this. And it shows everything. I already have this filter, so that's great. Anything with collect... here, let me see this... no, I won't zoom this one, but.
So, anything with collect and v equals two. This is basically the API call that the browser sends to GA4. You can see it's a collect, and the version is two, which is GA4. And then you can see the second one, tid, the ID of the Google Analytics property, the G-E-something. This is how you can see all the API calls that were sent from the browser to GA4. Is there anything you can find within the head code? Like, maybe, you know how they mark the UA, there is a GTM equals UA or something like that. Yeah, so, there is, but it depends on how Google Analytics was set up. You can find here, for example... I have Google Analytics set up through Tag Manager, so I do find this script saying that Google Tag Manager is there, but that doesn't tell you whether GA4 is installed through Tag Manager or not. There are times where, as you said, there is a snippet of code built into the head element that tells you, OK, this is connecting to Universal or GA4. Yeah, the easiest thing is the Tag Assistant.
It's just something you add to your browser; it takes just a couple of seconds to do. I'm interested to see if that's available on Firefox. It is. Oh, that's a good question. Oh, really? Yeah. OK. It's easy to do. If you have access to their Tag Manager, you can see that immediately. And that is not your responsibility as a developer; somebody should be able to log into their Tag Manager and see that there's... It is more like an upsell. Do you have GA4 installed? Yep. I can see that you don't. I see, yeah. I apologize if I missed it, I got in pretty late, which is not cool, but you said in your first slide that you talked about a timeline. Are they migrating users to GA4 automatically? I saw at some point that they said they'd set up a GA4 property for you. Yes. It was bad. The reception was bad. You remember that communication? Yeah. And it's like, wait. So the whole point was, and I'll show you, I can show you an example of this. The whole point was... This is my empty Universal Analytics property.
A lot of people actually did already migrate, but the problem was there is a specific setting in admin here. Actually, sorry, it's in admin here, or they moved it. No, no. Here it is. Yeah. This should say connected. So even if you migrated, if you did not connect this, you got a message saying we will migrate for you. And by migrate for you, they mean we'll just create the property for you. Create a property. They didn't do anything. I validated. Yeah. So this is where it's like, why are you sending me this? I already created a GA4 property. I migrated. Why are you telling me this? It's because of this pesky little setting that a lot of people didn't notice. So that's why the reception of this was really bad, and Google kind of said, oh, but you can opt out if you want to. And you were automatically opted in unless you opted out of them migrating for you. Because we host and maintain a lot of our client sites, and we oftentimes install Google Analytics, or GA4 now, for ourselves, more than for the client tracking their own sales data or anything like that, just to get the site Core Web Vitals and whatnot.
And we're trying to have a discussion of, OK, to migrate: you're not very interested in your own analytics, but we want them. But we also want to get paid for the effort of going through and setting this up. It shouldn't take long, but, you know, you have to validate and test it in your hosting environments and whatnot. So I wasn't sure how much of that was going to carry over after July. Do you have any advice or strategies for moving clients to GA4? Is it something that you think should even be charged to a client, upgrading them? We already charge for it at Bounteous, so you can. But it does need a structure. Understanding how the measurement works in GA4 can make it a bit easier to figure out how to migrate. One thing we do is analyze all the events in our client's Universal Analytics and say, OK, based off of what you're tracking, this is what we recommend the event structure should look like: this is the event name, those are the parameters coming in, some sample values of what the event would look like fleshed out. And then we can start building that based off of what they already have.
Any triggers in Tag Manager, we build off of those. If they use on-site code, we'll write the code for them and tell them, OK, tell your developers this is what they should put in place of Universal, and build from there. It's something that clients do look for: especially if they use Universal Analytics and understand the value of it, they will definitely jump on board to migrate. Can I just add to that really quick? The very first step of setting this up takes ten minutes. So, I guess, the way that we did it is we said: we're going to take ten minutes and set this up for you, and then we're going to analyze it, just like you said, and tell you all the things that you need to do, and quote out for you the amount of time it will take for us to do the rest of it. Migrate goals, set up events, because there's a lot there. That was one of the big changes too: like you said, it was very rigid, and they were telling everybody what was important to them, instead of us saying, these are the analytics that are important to us, right?
So we were able to say, hey, we're doing this ten-minute task for you. That's how we did our upsell. We're just going to do this for you, because it takes ten minutes for us to do, and then we're going to say, hey, it takes 5 hours or 20 hours for us to do the rest of your events and goals. And then, whether they're in support or build, on the support side we would say, hey, do you want to spend your hours this month on this and we'll set it up for you. That's kind of how we handle that. Just to add to that, I mean, we're operating under the assumption that everything in Google Tag Manager always has to be rebuilt for GA4. Not everything. Not necessarily. So, just as an example, this one is GA4 only. But one thing we do for clients: if they already have an event being sent based off of a specific trigger, like CE - Signup, we just use the same trigger for a different tag. So basically we're just rebuilding that one tag and putting in the information.
It's the same variables and the same triggers; it's just the tag sending to a different property that's different. So another way of restating that is that the tags in general have to be rebuilt. Yeah, unless there's something completely different you want to do for tracking. But if it's just migrating event for event, then basically you're just rebuilding the tag. So, like the dashboard, right? If you wanted to implement this, can you compare this data with last year's data? Yes. Where did it go? Oh, I didn't put it in? But yes, you can. Basically, it's just a checkbox where you choose: OK, a week ago or last year it was this, and now it's this. You can see whether it improved or not, increased, decreased, better or worse, and so on. So there is a lot you can do with those types of dashboards, and a lot of filtering that could be built in. Again, for the sake of the presentation and getting the concept of how it can be used across, what I did is very simplistic.
But basically, there's a lot more you can do with this data, because you're collecting the raw data of these vitals. You're not depending on Google or Lighthouse telling you what the data is under perfect conditions. This is real-life raw data that's owned by you, so you can do whatever you want with it. And it does not fall under GDPR or any data privacy rules, because it's anonymized, so you're all good there. Collect as much as you want. Have you been suggesting to your clients, once the migration is complete, to disable their old property, or do you let them run in parallel until the end of July? Let them run in parallel, because one thing you'll find is that there is a difference in the numbers of the metrics, the number of sessions and the number of users and so on, in Universal versus GA4. If you see that the difference is five, six percent, that's fine. If you see a difference of 20%, there's something wrong in either Universal or GA4, and most probably it's GA4.
So this at least helps you debug and fix any issues that pop up before the cutoff, and then you have the data already there. Now, the question that should be asked, and that Google should answer, is: will the data go away? Can we utilize the data from Universal Analytics, or will it stay there just so we can extract reports, or not? It is still very unclear how this will happen. We might get a definite answer by next year, July; that's when the 360 enterprise properties will be cut off. So maybe by then we'll have a more solid answer on what happens with the Universal Analytics data. But for now, run them side by side up until the cutoff, just so that you have something to fall back on if GA4 fails for any reason. Any other questions? Yes. Is there anything you could do proactively for data retention off the grid? Yes. If you're using the standard one, then I would say: if there are reports that you know are being used regularly, just start exporting those reports, historical reports rather.
Like, you know, last month, last two months, last three months, and so on. Save them in a CSV file, or save them in any cloud hosting, some BigQuery-like platform, so at least you own this data and it won't be lost when UA is eventually wiped out. If you're using the enterprise one, there's a connection between Universal Analytics and BigQuery that you can utilize to just export everything for the past, not 30, 18 months. 18 months, a year and a half. So that gives you somewhat of a year of data that you can look back on and compare. There are paid tools that you can connect to the standard Universal Analytics to get similar exports to what you would get from the enterprise one; they use the API and just flatten out the tables for you. So there are different ways you can go about it. The simplest, well, not the simplest, the most time-consuming but cheapest one, is figuring out which reports are being used and just exporting the historical data from there.
All the way back at the beginning, we talked about that churn data point. What's an example of marketing you've seen implemented to stop the churn? Stop the churn. Well, that's... Or improve it. Like, if you've seen a high number and you're like, oh, that's bad, what did they do to help that? So, churn is inevitable. No matter how good your marketing campaigns or marketing workers are, there will be churn; it's just how big of a percentage it is. 50% churn every month means you're losing 50% and buying 50% new users, and that's an issue, because that's a waste of marketing dollars. The lower the better, but there's no optimal industry standard, like anything above this number is bad. At least I haven't heard of one. What I've usually done over the years is basically coupons. We know that this segment, based off of some machine learning algorithm we put in, it says, oh, this group might be falling off and churning, so we send them a coupon. And you might see this if you sign up, like I do, to a lot of e-commerce websites or SaaS websites.
When they notice that you might be canceling, they'd say, oh, next month is on us, or here's 20% off your next pair of jeans, and stuff like that. This is based off of the recommendations that come from the churn metric, or the backend machine learning models that marketing teams deploy. Based on that too, do you look at what people leave in their cart, abandoned carts and stuff? Yes. That depends on the business. I've worked with NGOs that build their donation funnel as an e-commerce site, so adding to cart is like: I add $5 to this cause and $10 to that cause. There I collected this information, what is being added to cart and then not checked out, and based off of that I'd try to connect it. They didn't have a login feature, so I would send the Google client ID to the backend and connect it to someone who donated before. If it matched, then I'd be like, we saw that you forgot this in your cart. They hated me for this; there was a lot of negative feedback: how did you get my email?
But they ended up checking out in the end, so this is one way I utilize it. For e-commerce, it depends on whether they're logged in or not logged in and all that; it just gets a bit murky. Is there a specific nomenclature you use for parameter types? Yes, the naming: snake case. Yeah, the recommendation by Google is just use snake case. In Universal it was camel case, and I think someone changed their mind, so they changed the whole nomenclature. So now we're sticking with snake case. Yes, it is time. Thanks, everyone, for putting up with me. I appreciate your patience, and I hope I didn't bore you.
(APPLAUSE)