# Google BigQuery

Google BigQuery can be used as a data source or a data destination in Zuar Runner.

.. image:: assets/bigquery-1.png
   :alt: Google BigQuery Icon

Source plugin example: [Query](/jobs/query/)

Destination plugin examples: [CSV](/jobs/csv/), [Salesforce](/connectors/salesforce/), [SQL](/jobs/sql/)

## Google BigQuery as a data destination

Zuar Runner automatically:

- Creates Google BigQuery datasets if they don't exist
- Creates Google BigQuery dataset tables if they don't exist
- Determines data types for Google BigQuery columns
- Adds new columns to Google BigQuery tables based on new fields in source systems
- Adjusts Google BigQuery tables based on changes in source data

## Google BigQuery specific setup

The [Google BigQuery API requires using a service account key](https://cloud.google.com/docs/authentication/getting-started), which is downloaded as a JSON file and stored in Zuar Runner. More information on how to generate that key is available [here](https://m2msupport.net/m2msupport/generate-service-account-key-in-google-cloud-platform-gcp/).

### How to generate a service account key in GCP

1. Log in to Google Cloud Platform.
2. Go to the GCP Console home page: https://console.cloud.google.com/.
3. Click 'Go to project settings'.
4. Click the 'Service Accounts' menu in the left navigation bar.
5. Click 'Create Service Account'.
6. Enter the service account name and description, grant the service account access to the project, and create the account.
7. Select the newly created service account and click the 'KEYS' tab.
8. Click the 'ADD KEY' dropdown, select 'Create New Key', and choose JSON as the key type. The service account key JSON file is automatically downloaded to your local machine.

Below is an example of what this key file contains:

```json
{
  "type": "service_account",
  "project_id": "mitto-183418",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "...",
  "client_id": "...",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "..."
}
```

The JSON key file contains the `project_id`, which is used in the database URL.

Add the role `BigQuery User` to the service account.

.. image:: assets/google-bigquery__userrole.png
   :alt: BigQuery User Role

Below is the [database url](/databases/database-urls/) structure for connecting to a Google BigQuery database:

`bigquery://<project_id>/<dataset>`

Here's an example of using Google BigQuery as a destination in a [CSV](/jobs/csv/) job:

.. image:: assets/google-bigquery__input.png
   :alt: BigQuery Destination

.. NOTE::
   When outputting to Google BigQuery, leave "**Schema**" blank and append the dataset name to the end of the output database URL.

## SQL

Zuar Runner can send [SQL](/jobs/sql/) statements to Google BigQuery. Use Google BigQuery syntax in these Zuar Runner SQL jobs.
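
As a sketch of what such a job might send, the statement below uses BigQuery Standard SQL to build a summary table from previously loaded data. The `analytics` dataset and the `orders` and `daily_order_totals` tables are illustrative placeholders, not objects created by Zuar Runner; substitute the dataset and tables used by your own jobs.

```sql
-- Illustrative BigQuery Standard SQL for a Zuar Runner SQL job.
-- Dataset and table names below are placeholders.
CREATE OR REPLACE TABLE `analytics.daily_order_totals` AS
SELECT
  DATE(order_timestamp) AS order_date,   -- truncate timestamps to calendar days
  COUNT(*)              AS order_count,  -- rows per day
  SUM(order_total)      AS revenue       -- daily revenue total
FROM `analytics.orders`
GROUP BY order_date;
```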