Tag: google-cloud-platform

Connecting to Cloud SQL from Google Cloud Function using Python and SQLAlchemy

I have read all the documentation on connecting to MySQL hosted in Cloud SQL from a Google Cloud Function and still can't connect. I also tried all the hints in the SQLAlchemy documentation related to this. I am using the following connection. The error I got was: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)") (Background on this error at: http://sqlalche.me/e/e3q8)
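
The "Connection refused" on localhost usually means the function is trying to reach MySQL over TCP rather than through the unix socket that Cloud Functions provides for Cloud SQL. A minimal sketch of the socket-based connection, assuming pymysql as the driver; the instance connection name, user, password, and database below are placeholders:

```python
import sqlalchemy

# Placeholders -- substitute your own instance connection name and credentials.
INSTANCE = "my-project:us-central1:my-instance"
DB_USER, DB_PASS, DB_NAME = "my-user", "my-pass", "my-db"

# Cloud Functions reach Cloud SQL through a unix socket mounted under
# /cloudsql/<instance-connection-name>; connecting to "localhost" over
# TCP is what produces [Errno 111] Connection refused.
engine = sqlalchemy.create_engine(
    f"mysql+pymysql://{DB_USER}:{DB_PASS}@/{DB_NAME}"
    f"?unix_socket=/cloudsql/{INSTANCE}"
)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())
```

The /cloudsql socket only exists inside the managed runtime, so the same code run locally needs the Cloud SQL Proxy instead.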

service account does not have storage.objects.get access for Google Cloud Storage

I have created a service account in the Google Cloud Console and selected the role Storage / Storage Admin (i.e. full control of GCS resources). gcloud projects get-iam-policy my_project seems to indicate that the role was actually selected. And the documentation clearly indicates that the role roles/storage.admin comprises the permissions storage.objects.* (as well as storage.buckets.*). But when I try using that service account in conjunction …
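
When the role looks correct but requests still fail, it helps to ask GCS directly what the credentials in use can actually do. A short sketch using the google-cloud-storage client; the key path, project, and bucket name are placeholders:

```python
from google.cloud import storage
from google.oauth2 import service_account

# Placeholders -- substitute your own key file, project, and bucket.
creds = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json"
)
client = storage.Client(project="my_project", credentials=creds)

bucket = client.bucket("my-bucket")
# Returns the subset of these permissions the caller actually holds.
granted = bucket.test_iam_permissions(
    ["storage.objects.get", "storage.objects.list"]
)
print(granted)  # an empty list means the grant is not in effect
```

Two common culprits are code silently picking up different credentials (e.g. via GOOGLE_APPLICATION_CREDENTIALS) than the account the role was granted to, and IAM changes that have not finished propagating yet.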

Add GCP credentials to airflow via command line

Airflow allows us to add connection information via the command-line airflow connections subcommand. This can help with automated deployment of Airflow installations via Ansible or other dev-ops tools. It is unclear how connections to Google Cloud Platform (service accounts) can be added to Airflow via the command line. Answer: Pre Airflow 1.9, the following example outlines how to use a DAG to add …
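
A sketch of what that pre-1.9 DAG-based workaround generally looks like: since the old CLI could not populate the GCP-specific extra fields, the connection is written through Airflow's ORM instead. The connection id, project, and key path are placeholders, and the extras follow the legacy google_cloud_platform hook's key naming:

```python
import json

from airflow import settings
from airflow.models import Connection

def add_gcp_connection():
    # Placeholders -- adjust conn_id, project, and key path.
    conn = Connection(
        conn_id="my_gcp_conn",
        conn_type="google_cloud_platform",
        extra=json.dumps({
            "extra__google_cloud_platform__project": "my-project",
            "extra__google_cloud_platform__key_path": "/path/to/key.json",
            "extra__google_cloud_platform__scope":
                "https://www.googleapis.com/auth/cloud-platform",
        }),
    )
    session = settings.Session()
    # Only insert if a connection with this id does not already exist.
    if not session.query(Connection).filter(
        Connection.conn_id == conn.conn_id
    ).first():
        session.add(conn)
        session.commit()
    session.close()
```

The function can be driven from a PythonOperator in a one-off DAG; later Airflow releases accept the same information directly through the connections CLI, which removes the need for this workaround.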

Python3 BigQuery or Google Cloud Python through HTTP Proxy

How do I route BigQuery client calls through an HTTP proxy? Before posting this, I tried the following, but it is still not routing through the HTTP proxy. The Google Cloud service credentials are set through the shell environment variable GOOGLE_APPLICATION_CREDENTIALS. Outgoing traffic (172.217.x.x belongs to googleapis.com) is not routing through the HTTP proxy. Answer: Answering the question myself, as I found …
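
One approach worth sketching: the client's underlying requests transport honors the standard proxy environment variables, and because all of the traffic (including the credential exchange) is HTTPS, it is HTTPS_PROXY rather than HTTP_PROXY that has to be set. The proxy address below is a placeholder:

```python
import os

from google.cloud import bigquery

# Placeholder proxy address -- substitute your own. HTTPS_PROXY is the
# variable that matters for HTTPS traffic; HTTP_PROXY is set as well
# only for completeness.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:3128"
os.environ["HTTP_PROXY"] = "http://proxy.example.com:3128"

# Credentials are still picked up from GOOGLE_APPLICATION_CREDENTIALS.
client = bigquery.Client()

for row in client.query("SELECT 1 AS x").result():
    print(row.x)
```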
