Google Analytics 360 Source
MorphL makes it easy to import data from Google Analytics 360 via Google Cloud BigQuery.
A common use case is predicting your website visitors' behaviour from this data, for example predicting when a visitor will add a product to the shopping cart.
This tutorial uses the Google Cloud Console to:
- Create a Google Cloud BigQuery dataset
- Create a Google Cloud Storage bucket
- Create a service account
Prerequisites
Google Analytics 360 account
This source requires a Google Analytics 360 account and website tracking ID.
Enhanced Ecommerce
Enhanced ecommerce enables the measurement of user interactions with products on ecommerce websites across the user's shopping experience, including: product impressions, product clicks, viewing product details, adding a product to a shopping cart, initiating the checkout process, transactions, and refunds. See the Enhanced Ecommerce doc for instructions.
Google Cloud BigQuery dataset with your analytics data
Your Google Analytics 360 data must be imported into a BigQuery dataset. See the Set up BigQuery Export doc for instructions.
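Once the export is running, you can check that enhanced ecommerce hits are arriving in BigQuery. The sketch below assumes the standard GA360 export schema (one ga_sessions_YYYYMMDD table per day) and uses hypothetical project and dataset IDs; it counts add-to-cart actions for a single day:

```python
from google.cloud import bigquery

# Hypothetical project ID; the export dataset ID is usually your GA view ID.
client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT COUNT(*) AS add_to_cart_hits
    FROM `my-gcp-project.ga360_export.ga_sessions_20240101`,
         UNNEST(hits) AS h
    WHERE h.eCommerceAction.action_type = '3'  -- '3' = add product(s) to cart
"""
row = list(client.query(query).result())[0]
print(row.add_to_cart_hits)
```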
Setting up access to Google Cloud BigQuery
Open the Google Cloud Console and follow the steps below.
Google Cloud BigQuery destination dataset
A new BigQuery dataset must be created. This dataset will be used as a placeholder for running daily queries and creating temporary tables before the data is exported to a Google Cloud Storage (GCS) bucket. The dataset will remain empty, as temporary tables are deleted after the data is ingested by the MorphL pipelines. From Google Cloud Console > BigQuery, create a new dataset and leave it empty.
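If you prefer to script this step, the sketch below creates the same empty dataset with the BigQuery Python client (project and dataset IDs are hypothetical placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Hypothetical dataset ID; leave the dataset empty after creation.
dataset = bigquery.Dataset("my-gcp-project.morphl_staging")
dataset.location = "US"  # keep the same location as your GA360 export dataset

client.create_dataset(dataset, exists_ok=True)
```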

Google Cloud Storage bucket
A new GCS bucket must be created. This bucket will hold temporary .avro files that the MorphL pipelines download. From Google Cloud Console > Storage, create a new bucket with default settings and leave it empty.
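The equivalent bucket creation with the Cloud Storage Python client (the bucket name is hypothetical and must be globally unique):

```python
from google.cloud import storage

client = storage.Client(project="my-gcp-project")

# Default settings are fine; the pipelines only need to write and
# later delete temporary .avro files here.
bucket = client.create_bucket("morphl-ga360-avro-export", location="US")
print(f"Created bucket {bucket.name}")
```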

Service account
A new Google Cloud service account must be created with the following roles: BigQuery User and Storage Object Creator. See the (Optional) Refine BigQuery permissions section below for more granular BigQuery permissions.
- From Google Cloud Console, go to the IAM & Admin > Service Accounts section.
- Click the + CREATE SERVICE ACCOUNT button.
- Add a name and description for your service account and click the CREATE button.
- Select the BigQuery User and Storage Object Creator roles. See the (Optional) Refine BigQuery permissions section below for further restricting BigQuery permissions.
- Add a condition on the Storage Object Creator role to allow access only to the empty bucket you created in the Google Cloud Storage bucket step above. Please note that the bucket name must be prefixed with projects/_/buckets/; see this guide for details.
- In the final step, press the CREATE KEY button.
- Keep the default JSON format and press the CREATE button. Your key file will be downloaded automatically; a quick way to verify it is sketched below.
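Before uploading the key to MorphL, you can sanity-check it by authenticating with it directly. A minimal sketch, assuming the google-cloud-bigquery library and hypothetical file and dataset names:

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Hypothetical key file name; use the JSON file you just downloaded.
credentials = service_account.Credentials.from_service_account_file(
    "morphl-service-account.json"
)
client = bigquery.Client(
    credentials=credentials, project=credentials.project_id
)

# With the BigQuery User role (or the custom role from the next
# section), listing tables in the GA360 export dataset should succeed.
for table in client.list_tables("my-gcp-project.ga360_export"):
    print(table.table_id)
```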
(Optional) Refine BigQuery permissions
You can further restrict BigQuery permissions by creating custom roles. This step is optional and recommended for advanced Google Cloud users. The required permissions for each dataset are:
| BigQuery source dataset (contains Google Analytics 360 data) | BigQuery destination dataset (placeholder for exporting data) |
| --- | --- |
| bigquery.tables.list, bigquery.tables.get, bigquery.tables.getData, bigquery.tables.export | All bigquery.tables.* permissions |
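For context, bigquery.tables.export covers the kind of extract job the pipelines run when moving data from the destination dataset to the GCS bucket. A sketch with hypothetical names:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Export a temporary table from the destination dataset to the GCS
# bucket as Avro; this is the operation bigquery.tables.export permits.
job_config = bigquery.ExtractJobConfig(destination_format="AVRO")
extract_job = client.extract_table(
    "my-gcp-project.morphl_staging.tmp_ga_sessions",
    "gs://morphl-ga360-avro-export/tmp_ga_sessions-*.avro",
    job_config=job_config,
)
extract_job.result()  # block until the export finishes
```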
- From Google Cloud Console, go to the IAM & Admin > Roles section.
- Click the + CREATE ROLE button.
- Enter a title for your custom role and click the + ADD PERMISSIONS button.
- In the Add Permissions window, search for all BigQuery permissions and select them.
- Further refine the selection by keeping only the permissions you need, for example the four permissions listed above for the BigQuery source dataset.
- Press the ADD button and then CREATE to save your custom role.
- From the IAM & Admin > IAM section, edit your service account and attach your custom role.
- From the BigQuery section, go to your source dataset and press the SHARE DATASET button. See the Controlling access to datasets doc for more details.
- In the Add members box, enter the name of your service account and select the custom role you have just created; this grant can also be scripted, as sketched after this list.
- Repeat the steps above for your destination dataset. Since the required permissions differ, we recommend creating two separate custom roles, one for the BigQuery source dataset and one for the destination dataset.
- Once done, go back to IAM & Admin > IAM and edit your service account to remove the BigQuery User role.
- From IAM & Admin > Service Accounts, edit your service account by generating a new key and deleting the old one.
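The sketch below appends a dataset-level access entry for the service account with the BigQuery Python client, assuming custom roles are referenced by their full resource path and that the project, dataset, role, and account names (all hypothetical) match yours:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
dataset = client.get_dataset("my-gcp-project.ga360_export")

# Grant the custom role to the service account on this dataset only.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="projects/my-gcp-project/roles/morphlSourceAccess",
        entity_type="userByEmail",
        entity_id="morphl@my-gcp-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```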
Add data source in the MorphL dashboard
- Log in to your MorphL account.
- From the left side menu, go to Data Sources.
- Select Google Analytics 360 or Google Cloud BigQuery and activate it.
- Enter the BigQuery datasets and the Google Cloud Storage bucket.
- Upload your service account key file.
- Verify your settings.