Dealing with Cookie Apocalypse: Data ETL and Performance Reporting

***Please note that Integromat evolved into Make in February 2022. All content for Integromat is also valid for Make.***

The ongoing advertising cookie sunset poses serious challenges to reporting your campaign results. If your site does not use any sneaky cookie consent designs, you can expect to lose around 60-90% of advertising cookies. This means that 60-90% of conversions may not be reported in the tools you are using – be it Google Analytics, Google Ads, Bing Ads, Facebook Ads… the list goes on and on.

If you are lucky, your legal department may allow you to use at least Google Analytics cookies without requesting consent. However, many privacy-oriented companies will not even fire Google Analytics without collecting consent from their site visitors. When you don’t even have GA data, your reporting is simply broken – or, put better, incomplete. And the question is: now what?

Apocalypse Edited

The solution to incomplete performance stats in your reporting is not easy, is not universal for everybody, and will never be perfect. You will likely end up creating dashboards outside your current tracking/marketing tools. This will first require getting your data from many source systems into a common data warehouse.

In today’s article, we’ll look at how you can ingest data from…

  1. A CSV email attachment
  2. Google Analytics (events + sessions)

…into Google BigQuery without writing any code.

Problem to Solve

Imagine this situation:

  1. You are not even fully tracking GA sessions.
  2. You know what percentage of visitors give you tracking consent.
  3. Obviously, you are missing tons of conversions in your GA.
  4. The client sends you sales from their store via email in a daily CSV file.
  5. You are in desperate need of justifying marketing spend, otherwise you will be jobless. Basically, you need to build a solid, trustworthy, and automated dashboard.

The first step is to get all your data into a data warehouse so you can build dashboards on top of it, and this is what needs to happen programmatically:

  1. You get an email with a CSV file attachment.
  2. You take the CSV attachment, parse it out, and pass the content as new rows to a BigQuery table (see the code sketch after this list).
  3. You download GA event data and pass it to a BigQuery table.
  4. You download GA session data and pass it to a BigQuery table.
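To make step 2 less abstract, here is a minimal Python sketch of what that route does under the hood. The table name, column names, and CSV layout are my assumptions for illustration – in the actual solution, Integromat handles all of this through modules, so this code is purely conceptual:

```python
import csv
import io

from google.cloud import bigquery


def load_sales_csv(csv_bytes: bytes) -> None:
    """Parse a sales CSV attachment and append its rows to BigQuery."""
    # Hypothetical destination table; create it with a matching schema first.
    table_id = "my_project.marketing.sales"

    # Assumed CSV columns: date, order_id, revenue.
    reader = csv.DictReader(io.StringIO(csv_bytes.decode("utf-8")))
    rows = [
        {
            "date": row["date"],
            "order_id": row["order_id"],
            "revenue": float(row["revenue"]),
        }
        for row in reader
    ]

    client = bigquery.Client()  # uses Application Default Credentials
    # Streaming insert; returns a list of per-row errors (empty on success).
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```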

Prerequisites

Before we go deep into the data ingestion, which is just one piece of a bigger puzzle, let’s talk more about the whole reporting problem, because solving it may require you to level up your skills:

  1. You will need to become a pretty decent data analyst.
  2. You will need to understand the basics of data warehousing.
  3. You will likely need to understand SQL basics (see the example query after this list).
  4. You will need to understand how to pass data between systems to get all the needed data into one place.
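To give prerequisite #3 some shape, below is the flavor of SQL you will end up writing once everything sits in BigQuery. The dataset, table, and column names are hypothetical; the point is that a single join across your sales feed and your GA data already answers questions no single tool can:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Compare the client's real sales (from the CSV feed) with the
# consent-limited sessions GA managed to track, day by day.
query = """
    SELECT
      s.date,
      SUM(s.revenue) AS real_revenue,
      ANY_VALUE(g.sessions) AS tracked_sessions,
      SUM(s.revenue) / NULLIF(ANY_VALUE(g.sessions), 0)
        AS revenue_per_tracked_session
    FROM `my_project.marketing.sales` AS s
    LEFT JOIN (
      SELECT date, SUM(sessions) AS sessions
      FROM `my_project.marketing.ga_sessions`
      GROUP BY date
    ) AS g
      ON g.date = s.date
    GROUP BY s.date
    ORDER BY s.date
"""

for row in client.query(query).result():
    print(row.date, row.real_revenue, row.revenue_per_tracked_session)
```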

Solving #2 and #4 a few years ago would be a task for a marketer, an analyst, and an API programmer. Nowadays, you can solve them by yourself. The IT world is moving fast, and you can make use of publicly available tools even if you cannot write a single line of code. We’ll be using:

  1. BigQuery as a data warehouse. It’s a relatively cheap option well embedded into Google’s ecosystem and it talks nicely with Google Data Studio.
  2. Integromat to shuffle the data between systems. Integromat is a no-code platform that allows you to tap into more than 1000 APIs without writing a single piece of code. The beauty of Integromat is that it allows you to run very simple automations as well as extremely complex flows which require many steps, routers, and conditions. Today’s example is of “intermediate” difficulty.

Solution

The YouTube video below shows the Integromat scenario configuration of each module step by step. The bubbles are called “modules” and they represent “an action” (or “an API call” in most cases). The scenario is built mainly by dragging & dropping modules, connecting them, and mapping items from preceding modules to subsequent modules.

The scenario starts with a module that monitors my Gmail for emails with a specific subject. Assume the email contains the CSV attachment with sample sales data.

Whenever the email comes, a sequence of actions is triggered:

  1. The email is retrieved.
  2. A variable called “yesterday” is set to yesterday’s date since I will need the date multiple times later.
  3. The scenario is split into 3 routes.
  4. The first, upper route retrieves the attachment from the email, parses the CSV, and runs an INSERT statement in BigQuery to insert the sales data. At the end, I send a confirmation email to myself so I know the procedure has succeeded.
  5. The middle route retrieves event data from GA and INSERTs it into a different table in BigQuery. An email is also sent when this part is finished.
  6. Finally, the bottom route retrieves session data from GA and INSERTs it into yet another table in BigQuery. An email is also sent when this part is finished (a code-level sketch of the two GA routes follows this list).
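For the curious, here is roughly what the two GA routes boil down to if you were to code them yourself. This sketch assumes a Universal Analytics view queried through the Reporting API v4 (the GA4 Data API would work similarly), Application Default Credentials, and a hypothetical view ID, table, and column set – Integromat wraps all of this into its GA and BigQuery modules:

```python
from datetime import datetime

from google.cloud import bigquery
from googleapiclient.discovery import build

VIEW_ID = "123456789"  # placeholder GA view ID


def ingest_ga_sessions() -> None:
    """Pull yesterday's sessions from GA and append them to BigQuery."""
    analytics = build("analyticsreporting", "v4")  # default credentials
    response = analytics.reports().batchGet(
        body={
            "reportRequests": [{
                "viewId": VIEW_ID,
                "dateRanges": [
                    {"startDate": "yesterday", "endDate": "yesterday"}
                ],
                "dimensions": [{"name": "ga:date"}],
                "metrics": [{"expression": "ga:sessions"}],
            }]
        }
    ).execute()

    rows = []
    for report in response.get("reports", []):
        for r in report.get("data", {}).get("rows", []):
            # GA returns dates as YYYYMMDD; normalize to ISO for BigQuery.
            ga_date = datetime.strptime(r["dimensions"][0], "%Y%m%d").date()
            rows.append({
                "date": ga_date.isoformat(),
                "sessions": int(r["metrics"][0]["values"][0]),
            })

    client = bigquery.Client()
    # Hypothetical destination table, matching the query example above.
    errors = client.insert_rows_json("my_project.marketing.ga_sessions", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```

The event route would look the same, only with event dimensions and metrics (e.g. ga:eventCategory and ga:totalEvents) and a different destination table.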

For the record, this scenario took about 30-40 minutes to create. Once the scenario is ready, you can schedule it to check for new emails every 5 minutes, every hour, every day, every week…

Showtime

Showing the configuration is one thing, but the question is whether it really works… Well, watch the video below:

  1. First, I wipe my BigQuery tables to start fresh.
  2. Then I show you the “sales” file waiting in my mailbox.
  3. Then I run the scenario.
  4. The scenario passes all the data into BigQuery by ingesting the CSV and GA data.
  5. Finally, you can check out the confirmation emails.

Conclusion

Yes, what I showed you today in Integromat may look frightening at first. I would feel exactly the same if I had not seen Integromat before. However, after you start playing with Integromat yourself, you will get it and realize it’s not that difficult. You only need to make the first “breakthrough” scenario, and then you will be able to build whatever you need. Some of you will actually like working with Integromat more than building yet another PPC campaign which likely will not work anyway. 🙂

As already mentioned, this article solves only one piece of the puzzle – the data ingestion. I chose this part on purpose – solving it with Integromat in less than an hour is just amazing given that a few years ago you’d have to spend hours on API authentication, coding, and configuring your own server.

Do you want to replicate this for your business yourself? 
GO TO MAKE

Or looking for someone to build your integrations?
HIRE ME

Scenario Blueprint

You can download this scenario blueprint and import it into your Integromat account, but I don’t think the scenario will be directly applicable to your specific case because your column set will be different. The blueprint is just good for inspiration so you can play with something tangible right away. You will also need at least a “Basic” Integromat account (9 USD/month) to be able to use the import feature.

Blueprint Import
