{"id":5563,"date":"2023-03-28T11:45:31","date_gmt":"2023-03-28T08:45:31","guid":{"rendered":"https:\/\/trackingchef.com\/?p=5563"},"modified":"2023-03-28T11:47:27","modified_gmt":"2023-03-28T08:47:27","slug":"optimizing-bigquery-reporting-with-looker-studio","status":"publish","type":"post","link":"https:\/\/trackingchef.com\/google-data-studio\/optimizing-bigquery-reporting-with-looker-studio\/","title":{"rendered":"Optimizing BigQuery\u00a0reporting with Looker Studio"},"content":{"rendered":"\n

**Originally published in Hebrew on the Lixfix blog**

Google’s Looker Studio, previously called Data Studio, was one of the most useful tools for anyone using Google Analytics.

I write “was” because on November 10, 2022, Google dropped a bomb, applying multiple restrictions that make it difficult to build reports.

In short, Looker Studio uses the Google Analytics 4 API to fetch data, and Google imposed usage limits on that API that cap the number of queries you can run per day.

For a lot of people, the reaction was “What’s the problem? We’ll fetch the data with BigQuery.”

In case you’ve been living under a rock: Google’s BigQuery is a database with a free native connector to Google Analytics 4. All the data that reaches your analytics property is exported automatically to tables, which lets you access the data in its raw form (that is, before processing and without limitations).

What many don’t know is that using BigQuery incorrectly with Looker Studio can cost you a lot of money.

Let’s look at an example.

A few days ago, a client asked me to build a report that shows how users came to a certain page, and to which page they continued.

In Universal Analytics this exists as a standard report, and the client wanted me to create a similar one in Google Analytics 4.

[Screenshot: the report as it appears in Universal Analytics]

Well, I immediately turned to BigQuery and wrote a query that produced the exact same report:

[Screenshot: the query in the BigQuery console]

My goal was to use BigQuery to display the data, using a custom query:

[Screenshot: the custom query option in Looker Studio’s BigQuery connector]

The problem was that this query, which I ran over only 30 days of data, processed 16 GB!

[Screenshot: the query’s bytes-processed figure]

## What’s the problem with that?

The connection of Google Analytics to BigQuery is indeed free, but you only get 1 TB of free query processing each month. Any querying beyond that costs money.

If you do a back-of-the-envelope calculation: it’s enough for my client to open the report once a day and play a little with the date ranges (every such change runs a new query in BigQuery), and he’ll burn through the free 1 TB pretty quickly.
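To put rough numbers on it (my own illustration, not figures from the client’s billing): at 16 GB per run, one daily open plus two date-range changes comes to 16 GB × 3 runs × 30 days ≈ 1.4 TB a month – past the free tier with a single report.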

So although each terabyte only costs a few dollars, consider a dashboard composed entirely of BigQuery queries (built that way to avoid Looker Studio’s new limitations), where each report scans a lot of data for a rather short date range. These quickly add up to many terabytes, which can bill at hundreds of dollars a month.

And besides, if it can be done in a more economical way, why not do it?

## The solution – Scheduled Queries

When you use a custom query in Looker Studio, you query your raw data every time. In practice, this is a huge table that contains all the columns, for all your events (the rows of the table).

Given that most of the time you don’t need at least 80% of that data, this is completely wasteful.

What you can do instead is create a compact table that contains only the relevant information, and make sure it is updated daily. Then, every time you run your custom query, it will scan only that table and not your entire dataset.

To do this, I simply click on Schedule and then Create new scheduled query:

[Screenshot: the Schedule menu in the BigQuery console]

Note that I did not reference a specific daily table, but rather wrote it as:

```sql
events_*
```

and then:

```sql
where
  _table_suffix between format_date('%Y%m%d', date_sub(current_date(), interval 30 day))
                    and format_date('%Y%m%d', current_date())
```

That means the last 30 days (on 2023-03-28, for example, this resolves to _table_suffix between '20230226' and '20230328').

\n

**A note for advanced users:**

Today’s and yesterday’s data sit in events_intraday_* tables until they are processed, and only then are they moved into the daily events_* tables. So if you really want to be on the safe side, it is better to use:

```sql
where
  regexp_extract(_table_suffix, r'[0-9]+') between format_date('%Y%m%d', date_sub(current_date(), interval 30 day))
                                               and format_date('%Y%m%d', current_date())
```

Then it will query both events and events_intraday: the wildcard events_* matches both table families, and regexp_extract strips the intraday_ prefix so the date comparison still works.

Now, this screen will open up on the side:

[Screenshot: the New scheduled query pane]
1. Name your query.
2. Set the query’s run schedule: how often (I chose daily), at what time (pay attention to the time zone), and when the schedule should start and end.
3. Set the location of the new table: tick the *Set a destination table* checkbox, then under *Dataset* start typing the name of your dataset, and under *Table Id* give your table a memorable name.
4. Choose whether each run will overwrite the existing table or append to it. I chose to overwrite, so that if the schema changes (i.e. new columns appear), they will be included in the table.

That’s it. Now click Save, and your query will appear here:

[Screenshot: the Scheduled queries list]

After your query runs, you can find the new table under your dataset, and you can see that it contains only the few fields we selected:

[Screenshot: the compact table’s schema]

If you click on Details, you can see that the table is significantly smaller – in this case, only 463 MB:

[Screenshot: the compact table’s size in the Details tab]

Compare that to the original table, which is 700 MB for a single day of data:

[Screenshot: the size of one daily events_* table]

And back to Looker Studio:

Now we have two options for bringing the data into Looker Studio.

### Option 1 – Select the new table we created
[Screenshot: selecting the new table in the BigQuery connector]

After selecting the new table, I went to Personal History to see how much the query “cost us”, and I can see this:

[Screenshots: the job in Personal History and the bytes it processed]

And this is what it looks like in Looker Studio:

Of course, I added a date-range control, and every time I change the dates, the report uses the data that was already fetched when I connected the table – it does not run a new query against BigQuery:

[Screenshot: the report in Looker Studio with a date-range control]

You do have to remember that if I select dates earlier than the last 30 days, the report will show nothing, because our BigQuery table only ever contains the last 30 days.

You can change this, of course, but it means your base query will scan more data on every run (a cheaper alternative is sketched below).
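One economical variant (my own suggestion, not part of the original workflow): switch the scheduled query’s write preference to *Append* and scan only yesterday’s daily table on each run, so history accumulates while each run stays tiny:

```sql
-- appends only yesterday's page_views; assumes the destination table is set to "Append to table"
select
  event_date,
  user_pseudo_id,
  (select value.int_value from unnest(event_params) where key = 'ga_session_id') as session_id,
  event_timestamp,
  (select value.string_value from unnest(event_params) where key = 'page_location') as page
from `my-project.analytics_123456.events_*`
where event_name = 'page_view'
  and _table_suffix = format_date('%Y%m%d', date_sub(current_date(), interval 1 day))
```

The trade-off is the one mentioned in step 4 above: appending will not pick up schema changes the way overwriting does.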

### Option 2 – Run a custom SQL query on the new table

We’ll just do a SELECT * from the new table, only this time we have to use the @DS_START_DATE and @DS_END_DATE parameters for our date control to work; otherwise, Looker Studio will take the table as-is and nothing will change when we change the dates:

[Screenshot: the parameterized custom query in Looker Studio]

## An important final note

In the examples listed here, I only dealt with a single report.

Although the raw data was heavy, my table contained only four columns besides the date, which is a very simple use case.

When creating an entire dashboard based on BigQuery data, it is of course advisable to plan differently – perhaps create one query that builds all the columns you need, and then simply select the relevant columns in Looker Studio (a sketch of such a base query follows below).
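As a minimal sketch of what such a wider base query might look like, assuming the standard GA4 export schema (the columns here are chosen purely for illustration):

```sql
-- one wide, compact base table serving several reports; names are placeholders
select
  event_date,
  event_name,
  user_pseudo_id,
  (select value.int_value from unnest(event_params) where key = 'ga_session_id') as session_id,
  traffic_source.source,
  traffic_source.medium,
  device.category as device_category,
  geo.country,
  (select value.string_value from unnest(event_params) where key = 'page_location') as page
from `my-project.analytics_123456.events_*`
where _table_suffix between format_date('%Y%m%d', date_sub(current_date(), interval 30 day))
                        and format_date('%Y%m%d', current_date())
```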

This will of course affect how you bring the data into Looker Studio – whether you connect the entire table or use a custom SQL query.

Another important thing to note is that your scheduled query will run at the frequency you set and consume gigabytes, even if no one actually opens the reports.

That’s why you should revisit it from time to time and remove any redundancies.

In my case, it consumes about 15 GB per day. That’s not much – but it’s a shame for it to keep running if no one is using the report.
