Google BigQuery

The Attribution ETL service supports a daily sync between your exported data and Google BigQuery.

πŸ“˜

This article is intended for a Google Cloud admin or DevOps engineer.

Make sure you fully understand each step described below before performing it; you carry out these changes at your own responsibility.

Before you continue with this guide, make sure you have completed the Data Export for Google Cloud Storage, since it is a prerequisite for ETL for Google BigQuery.
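As a quick sanity check, you can confirm that the export has been producing files before setting anything else up. The sketch below assumes the example bucket name used later in this guide (attribution-data-export); substitute your own bucket name.

```shell
# List the contents of the export bucket to confirm the Cloud Storage
# data export is in place. Requires an authenticated gcloud/gsutil setup.
# "attribution-data-export" is the example name from this guide.
gsutil ls gs://attribution-data-export/
```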

Create Service Account with access to Cloud Storage bucket and BigQuery dataset

  1. Create a new service account under your project with the "Storage Insights Collector Service" and "BigQuery Job User" roles (roles/storage.insightsCollectorService, roles/bigquery.jobUser).
  2. Create a service account key and download it.
  3. At this point you should already have the bucket created. In the bucket's Permissions, grant the new service account the "Storage Object User" role (roles/storage.objectUser).
  4. Create a BigQuery dataset (e.g. attribution-dataset).
  5. Share your dataset with the new service account, granting the "BigQuery Data Owner" role (roles/bigquery.dataOwner).
  6. Send the following information to Attribution support:
    1. Your project ID (e.g. attribution-1234)
    2. Your bucket name (e.g. attribution-data-export)
    3. Your dataset name (e.g. attribution-dataset)
    4. The JSON key file (e.g. the attribution-1234-xxxxxxxxxx.json file)
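The steps above can also be performed with the gcloud and bq command-line tools instead of the Cloud Console. This is a sketch, not an official script: it assumes the example names from this guide (project attribution-1234, bucket attribution-data-export, dataset attribution-dataset) plus a placeholder service account name, attribution-etl. Adjust all of them to your own setup, and note that the commands require an authenticated gcloud session with permission to manage IAM.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Example names from this guide; replace with your own values.
# "attribution-etl" is a placeholder service account name.
PROJECT=attribution-1234
BUCKET=attribution-data-export
DATASET=attribution-dataset
SA_NAME=attribution-etl
SA_EMAIL="${SA_NAME}@${PROJECT}.iam.gserviceaccount.com"

# 1. Create the service account and grant the project-level roles.
gcloud iam service-accounts create "$SA_NAME" --project="$PROJECT"
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/storage.insightsCollectorService"
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.jobUser"

# 2. Create and download a JSON key for the service account.
gcloud iam service-accounts keys create "${PROJECT}-key.json" \
  --iam-account="$SA_EMAIL"

# 3. Grant the service account object access on the export bucket.
gcloud storage buckets add-iam-policy-binding "gs://${BUCKET}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/storage.objectUser"

# 4. Create the BigQuery dataset.
bq --project_id="$PROJECT" mk --dataset "${PROJECT}:${DATASET}"

# 5. Share the dataset with the service account as a data owner.
#    bq has no one-shot flag for dataset-level access, so dump the
#    dataset metadata, append an access entry, and write it back.
bq show --format=prettyjson "${PROJECT}:${DATASET}" > dataset.json
python3 - <<EOF
import json
with open("dataset.json") as f:
    ds = json.load(f)
ds["access"].append({"role": "OWNER", "userByEmail": "${SA_EMAIL}"})
with open("dataset.json", "w") as f:
    json.dump(ds, f)
EOF
bq update --source dataset.json "${PROJECT}:${DATASET}"
```

If you prefer the Console, the numbered steps above cover the same ground; the CLI version is mainly useful when you want the setup to be repeatable.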

Once the steps described above are completed, we will enable the ETL service for your project.