I have set up my service account and I can run queries on BigQuery from Python using client.query().
I could rewrite all my scheduled queries in this new client.query() format, but I already have many of them, so I was wondering whether there is a way to get/list the scheduled queries and then use that information to run them from a script.
Answer
Yes, you can use the APIs. When you don't know which one to use, here is a tip: use the command proposed by @Yev
bq ls --transfer_config --transfer_location=US --format=prettyjson
but log the API calls it makes. For that, use the --apilog <logfile name>
parameter, like this:
bq --apilog ./log ls --transfer_config --transfer_location=US --format=prettyjson
And, magically, you can find the API called by the command:
https://bigquerydatatransfer.googleapis.com/v1/projects/<PROJECT-ID>/locations/US/transferConfigs?alt=json
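If you want to sanity-check that endpoint before writing any Python, you can call it directly. A minimal sketch with curl, assuming the gcloud CLI is installed and authenticated (the URL is exactly the one from the log, with your project id in place of <PROJECT-ID>):
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" "https://bigquerydatatransfer.googleapis.com/v1/projects/<PROJECT-ID>/locations/US/transferConfigs"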
Then a simple Google search leads you to the correct documentation.
In Python, add this dependency to your requirements.txt: google-cloud-bigquery-datatransfer
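(If you're not using a requirements file, you can install it directly:
pip install google-cloud-bigquery-datatransfer
)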
and use this code:
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# List all transfer configs (scheduled queries included) in the project
parent = client.common_project_path("<PROJECT-ID>")
resp = client.list_transfer_configs(parent=parent)
print(resp)
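Since the end goal is to run those scheduled queries from a script, here is a minimal sketch that builds on the listing above: it filters the transfer configs down to scheduled queries (their data source id is "scheduled_query") and triggers a manual run of each one with start_manual_transfer_runs. The <PROJECT-ID> placeholder is the same as above; the filter and the "run now" timestamp are my assumptions about what you want.
from datetime import datetime, timezone

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("<PROJECT-ID>")

# Scheduled queries are transfer configs whose data source is "scheduled_query"
for config in client.list_transfer_configs(parent=parent):
    if config.data_source_id != "scheduled_query":
        continue

    # Trigger a manual run of this scheduled query "now"
    response = client.start_manual_transfer_runs(
        bigquery_datatransfer.StartManualTransferRunsRequest(
            parent=config.name,
            requested_run_time=datetime.now(timezone.utc),
        )
    )
    print(response)
Each call returns the TransferRun(s) that were started; if you need to wait for completion, you can poll their state with client.get_transfer_run.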