I have a JSON file with BigQuery credentials. To connect to BigQuery from Python, I need to pass the file path to service_account:
from google.cloud import bigquery
from google.oauth2 import service_account

cred = service_account.Credentials.from_service_account_file(filename="credentials.json")
client = bigquery.Client(credentials=cred, project=cred.project_id)
The JSON looks like a dictionary:
{
    "type": "xxxx",
    "project_id": "xxx",
    "private_key_id": "xxx",
    "private_key": "xxxxxx",
    "client_email": "xxxx@xxxxx.iam.gserviceaccount.com",
    "client_id": "xxxxxxxxxx",
    "auth_uri": "xxxxxx",
    "token_uri": "xxxxxx",
    "auth_provider_x509_cert_url": "xxxxx",
    "client_x509_cert_url": "xxxxx.iam.gserviceaccount.com"
}
I don’t want to keep a credentials file in the project. Is there a way to connect to BigQuery using the JSON content as a dictionary instead of a file path?
Answer
You can use the constructor service_account.Credentials.from_service_account_info(sa_dict) instead.
Be careful if you upload the code to a public repo, though. One reason to keep the credentials in a separate JSON file is so you can exclude it from version control; with the dictionary inline, the private key ends up in your source code.
sa_dict = {
    "type": "xxxx",
    "project_id": "xxx",
    "private_key_id": "xxx",
    "private_key": "xxxxxx",
    "client_email": "xxxx@xxxxx.iam.gserviceaccount.com",
    "client_id": "xxxxxxxxxx",
    "auth_uri": "xxxxxx",
    "token_uri": "xxxxxx",
    "auth_provider_x509_cert_url": "xxxxx",
    "client_x509_cert_url": "xxxxx.iam.gserviceaccount.com"
}
cred = service_account.Credentials.from_service_account_info(sa_dict)
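To avoid hard-coding the secret, one common pattern is to store the JSON string in an environment variable and parse it at startup. A minimal sketch, assuming an environment variable named BQ_CREDENTIALS_JSON (the name and the fallback placeholder below are hypothetical, not part of the original answer):

```python
import json
import os

# Hypothetical env var name; the fallback string is only a placeholder so the
# snippet runs without the variable being set.
sa_json = os.environ.get(
    "BQ_CREDENTIALS_JSON",
    '{"type": "service_account", "project_id": "my-project"}',
)
sa_dict = json.loads(sa_json)

# The parsed dict plugs straight into the constructor shown above
# (requires google-auth / google-cloud-bigquery installed):
# cred = service_account.Credentials.from_service_account_info(sa_dict)
# client = bigquery.Client(credentials=cred, project=cred.project_id)
```

This keeps the key out of the repo entirely, since the secret lives in the deployment environment rather than in a tracked file or in the source.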