I am trying to deploy a Python Cloud Function on GCP, using a Python package pushed to the Artifact Registry of another project. I followed the instructions from the Google Cloud documentation: the service account that runs the Cloud Build has the Artifact Registry Reader role. The error in Cloud Build, the requirements.txt, the setup.py of the package, and the cloudbuild.yaml that deploys
Tag: google-cloud-functions
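For reference, a minimal sketch of the usual setup, assuming a Python repository named my-repo in project other-project, region europe-west1 (all hypothetical names): pip is pointed at the Artifact Registry index directly from requirements.txt, and the account that runs the build needs Artifact Registry Reader on that other project.

```text
# requirements.txt (hypothetical region/project/repo names; replace with yours)
--extra-index-url https://europe-west1-python.pkg.dev/other-project/my-repo/simple/
my-private-package==0.1.0
```

If the build still cannot resolve the package, the keyring-based authentication described in the Artifact Registry docs (the keyrings.google-artifactregistry-auth package) is often the missing piece.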
How to trigger async job from a Cloud Function
I have a Cloud Function (Python) which is triggered over HTTP from a web client; it has to calculate something and respond FAST. I would also like to save the HTTP request parameters into a database (for analytics), but if I just issue a WRITE to my PostgreSQL instance, the function will have to wait for it and will be slower. Using PubSub,
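One common pattern is to publish the request parameters to a Pub/Sub topic and let a separate subscriber do the PostgreSQL write. A minimal sketch, assuming a hypothetical topic named analytics-events:

```python
import json
import os

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "analytics-events" is a hypothetical topic; GCP_PROJECT is set on older runtimes
topic_path = publisher.topic_path(os.environ["GCP_PROJECT"], "analytics-events")

def main(request):
    params = request.args.to_dict()
    # publish() batches and returns a future, so the HTTP response is not
    # blocked by the database; a separate subscriber writes to PostgreSQL
    publisher.publish(topic_path, data=json.dumps(params).encode("utf-8"))
    return "fast response", 200
```

Strictly speaking the publish future should be resolved before the instance is torn down, but even resolving it is far cheaper than a synchronous database write.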
Google PubSub – Ack a message using ack_id
I have an architecture made of: PubSub topic ‘A’; subscription ‘B’ on topic ‘A’ that pushes messages to the endpoint ‘X’; Cloud Function ‘C’, triggered by the endpoint ‘X’, with a Python runtime. Every time a new message is published on topic ‘A’, subscription ‘B’ pushes it to endpoint ‘X’, which triggers Cloud Function ‘C’. The problem I’m
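Worth noting: with a push subscription there is no ack_id to use inside the function; returning a 2xx status acknowledges the delivery, and anything else triggers a redelivery. An explicit acknowledge call only exists on the pull API. A minimal sketch of both (project and subscription names are placeholders):

```python
from google.cloud import pubsub_v1

def push_handler(request):
    """Push endpoint 'X': a 2xx response acks the message, no ack_id involved."""
    envelope = request.get_json()
    message = envelope["message"]  # base64 data, attributes, messageId
    # ... process the message ...
    return "", 204

def ack_by_id(project_id: str, subscription_id: str, ack_id: str) -> None:
    """Pull API: the only place an ack_id is actually usable."""
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(project_id, subscription_id)
    subscriber.acknowledge(request={"subscription": sub_path, "ack_ids": [ack_id]})
```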
Type hints in a Python Google Cloud Function?
In a Python Google Cloud Function with a lot of sub-functions in “main.py”, I added type hints (return value annotations as part of the function annotations) in PEP 8 style like this: Union is taken from here; it is needed if there is more than one possible type. The function cannot be deployed, and there is no log about what is
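For comparison, a main.py with PEP 8-style hints that deploys fine might look like the sketch below; Union comes from the standard typing module, so the annotations themselves should not block deployment:

```python
from typing import Union

def classify(value: int) -> Union[str, int]:
    """Sub-function whose return type needs Union (two possible types)."""
    return "small" if value < 10 else value

def main(request) -> str:
    """HTTP entry point; a return annotation here is legal as well."""
    return str(classify(5))
```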
Callable Cloud Function error: Response is missing data field, when trying to call a Cloud Function written in Python from a Flutter app
I’ve been stuck for a few days trying to call, from my Flutter app, a cloud function written in Python that takes no parameters, but I keep getting an error that says ‘Response is missing data field’. This is confusing because the function takes no parameters, so I’m wondering if I’m missing something. This is what the
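The Flutter cloud_functions plugin speaks the callable protocol, and that protocol requires the JSON response to carry a top-level “data” field even when the function takes no parameters; a bare payload produces exactly this error. A minimal sketch of a Python function honoring that contract (the handler name is hypothetical):

```python
import json

def callable_handler(request):
    # Wrap the payload in "data" as the callable protocol requires; returning
    # {"message": ...} directly triggers "Response is missing data field"
    body = {"data": {"message": "hello from python"}}
    return json.dumps(body), 200, {"Content-Type": "application/json"}
```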
How to combine multiple files in GCS bucket with Cloud Function trigger
I have 3 files per date per name in this format: ‘nameXX_date’. Here’s an example: ‘nameXX_01-01-20’ ‘nameXY_01-01-20’ ‘nameXZ_01-01-20’, where ‘name’ can be anything and the date is whatever day the file was uploaded (almost every day). I need to write a Cloud Function that triggers whenever a new file lands in the bucket and combines the 3 XX, XY, XZ files into
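One way to sketch this is a storage-triggered background function that parses the nameXX_date convention and, once all three variants exist, concatenates them server-side with Blob.compose (assuming plain concatenation is the desired “combine”):

```python
from google.cloud import storage

client = storage.Client()

SUFFIXES = ("XX", "XY", "XZ")

def combine_on_upload(event, context):
    """Triggered by google.storage.object.finalize on the bucket."""
    bucket = client.bucket(event["bucket"])
    prefix, date = event["name"].rsplit("_", 1)  # e.g. 'nameXX', '01-01-20'
    base = prefix[:-2]                           # strip the XX/XY/XZ suffix
    sources = [bucket.blob(f"{base}{s}_{date}") for s in SUFFIXES]
    if all(blob.exists() for blob in sources):
        # compose() concatenates the objects without downloading them
        bucket.blob(f"{base}_combined_{date}").compose(sources)
```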
Cloud Build service account has no access to storage.objects.get
I am trying to get a Google Cloud Function to print something from a file in a storage bucket. I have the file stored in a bucket, an authenticated service account with Storage Admin, Cloud Run Admin, Service Account User and Cloud Functions Admin, and the following Python script. I try to deploy this with the following command: To which
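The read itself is short once permissions are in place; the role that actually carries storage.objects.get is roles/storage.objectViewer (Storage Admin includes it too, but it must sit on the account the build and the function run as). A minimal sketch with hypothetical bucket and object names:

```python
from google.cloud import storage

def main(request):
    client = storage.Client()
    # "my-bucket" and "hello.txt" are hypothetical names
    blob = client.bucket("my-bucket").blob("hello.txt")
    text = blob.download_as_text()  # requires storage.objects.get
    print(text)
    return text
```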
Content-Security-Policy problems Google Cloud Function
I’ve built a Google Cloud Function that takes an input and sends it to Firestore. I need to call it inside a Firefox WebExtension that I’ll use on Twitter. This isn’t the first time I’ve done this type of project, but this time I’m having issues regarding the “Content-Security-Policy”. I’ve tried to call my GC function with the GC testing tool,
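Alongside the page’s Content-Security-Policy, calls from an extension content script usually also hit CORS: the browser sends an OPTIONS preflight that the function must answer itself. A minimal sketch (the wildcard origin is for illustration only and should be tightened):

```python
def main(request):
    # Answer the preflight the browser sends before the real request
    if request.method == "OPTIONS":
        headers = {
            "Access-Control-Allow-Origin": "*",       # illustration only
            "Access-Control-Allow-Methods": "POST",
            "Access-Control-Allow-Headers": "Content-Type",
            "Access-Control-Max-Age": "3600",
        }
        return "", 204, headers
    # Real request: the CORS header must be on this response too
    return "ok", 200, {"Access-Control-Allow-Origin": "*"}
```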
Get environment variables in a cloud function
I have a Cloud Function in GCP that queries BigQuery in a specific project/environment. As I have multiple environments, I would like to get the current project/environment of the Cloud Function, so that I can access BigQuery in the corresponding environment. Of course I could just hardcode the project_id, but I would like to do this programmatically. According
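On newer Python runtimes the GCP_PROJECT environment variable is no longer populated automatically, so the programmatic fallbacks are google.auth.default() or the metadata server. A sketch combining the three options:

```python
import os
import urllib.request

import google.auth

def current_project() -> str:
    # Older runtimes expose the project id directly
    project = os.environ.get("GCP_PROJECT")
    if project:
        return project
    # Application default credentials usually know the project
    _, project = google.auth.default()
    if project:
        return project
    # Last resort: the metadata server, reachable from inside GCP
    req = urllib.request.Request(
        "http://metadata.google.internal/computeMetadata/v1/project/project-id",
        headers={"Metadata-Flavor": "Google"},
    )
    return urllib.request.urlopen(req).read().decode("utf-8")
```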
Unknown error has occurred in Cloud Functions
First, this looks like this thread, but it is not: An unknown error has occurred in Cloud Function: GCP Python. I have deployed Cloud Functions a couple of times and they are still working fine. Nevertheless, since last week, following the same procedure, I can deploy correctly, but when testing them I get the error “An unknown error has occurred in Cloud
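Because “An unknown error has occurred” hides the real traceback, a common first step is to wrap the entry point so the exception lands in Cloud Logging before propagating. This is a debugging sketch, not a fix for the root cause (do_work is a placeholder for the real logic):

```python
import logging
import traceback

def main(request):
    try:
        return do_work(request)
    except Exception:
        # The full traceback shows up in Cloud Logging even when the
        # testing UI only reports "An unknown error has occurred"
        logging.error("Unhandled exception:\n%s", traceback.format_exc())
        raise

def do_work(request):
    return "ok"  # placeholder
```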