I created a GCP service account and assigned it the permissions needed to list the projects inside the organization. When I use the gcloud CLI, everything works. But when I try the “same” with the Python Client for Google Cloud Resource Manager, I receive a 403 The caller does not have permission error message. Does the Python Client for Google Cloud …
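A likely cause is that gcloud and the Python client resolve credentials differently, so the library may not be acting as the service account at all. A minimal sketch that pins the client to the same key file explicitly; the key path and organization ID are placeholders:

```python
from google.cloud import resourcemanager_v3
from google.oauth2 import service_account

# Load the same service-account key that gcloud was configured with;
# "key.json" is a placeholder path.
credentials = service_account.Credentials.from_service_account_file("key.json")
client = resourcemanager_v3.ProjectsClient(credentials=credentials)

# Listing projects directly under an organization requires the
# resourcemanager.projects.list permission on that organization.
for project in client.list_projects(parent="organizations/123456789"):
    print(project.project_id)
```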
Apache Beam Python: returning conditional statement using ParDo class
I want to check whether the CSV file we read in the Apache Beam pipeline satisfies the format I expect it to be in [e.g. field check, type check, null value check, etc.] before performing any transformation. Performing these checks outside the pipeline for every file would take away the parallelism, so I just wanted to know …
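One common pattern for this is a ParDo with tagged outputs, so valid and invalid rows both stay inside the pipeline and keep its parallelism. A minimal sketch, assuming a comma-separated file with a fixed field count; the field count and file paths are assumptions:

```python
import apache_beam as beam

class ValidateRow(beam.DoFn):
    EXPECTED_FIELDS = 3  # assumed schema width

    def process(self, line):
        fields = line.split(",")
        if len(fields) != self.EXPECTED_FIELDS or any(f == "" for f in fields):
            # Route bad rows to a side output instead of failing the pipeline.
            yield beam.pvalue.TaggedOutput("invalid", line)
        else:
            yield line

with beam.Pipeline() as p:
    results = (
        p
        | beam.io.ReadFromText("input.csv")  # assumed input path
        | beam.ParDo(ValidateRow()).with_outputs("invalid", main="valid")
    )
    results.valid | "WriteValid" >> beam.io.WriteToText("valid_rows")
    results.invalid | "WriteInvalid" >> beam.io.WriteToText("invalid_rows")
```

Downstream transforms then consume only results.valid, while the invalid rows can be inspected separately.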
How to write BigTable table data into a pandas dataframe?
I am trying to read a GCP Bigtable table into a pandas DataFrame; currently, the function I am using to fetch rows from Bigtable is read_rows(), which returns PartialRowData. Code: … Output: <class ‘google.cloud.bigtable.row_data.PartialRowData’> Query: How do we read the values from a PartialRowData object? Answer There’s an example of how to call read_rows in this documentation: https://googleapis.dev/python/bigtable/latest/table.html#google.cloud.bigtable.table.Table.read_rows
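Building on that documentation, each PartialRowData exposes its values through the cells mapping (column family → qualifier → list of cells, newest first), and both keys and values arrive as bytes. A rough sketch of flattening that into a DataFrame; the project, instance, and table IDs are placeholders:

```python
import pandas as pd
from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=False)
table = client.instance("my-instance").table("my-table")

records = []
for row in table.read_rows():
    record = {"row_key": row.row_key.decode("utf-8")}
    # row.cells maps column family -> qualifier -> list of cells (newest first).
    for family, qualifiers in row.cells.items():
        for qualifier, cells in qualifiers.items():
            record[f"{family}:{qualifier.decode('utf-8')}"] = cells[0].value.decode("utf-8")
    records.append(record)

df = pd.DataFrame(records)
```

Note the .decode() calls assume UTF-8 payloads; binary cell values would need different handling.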
How do I correctly add worker nodes to my cluster?
I am trying to create a cluster with the following parameters on Google Cloud: 1 master, 7 worker nodes, each of them with 1 vCPU. The master node should get full SSD capacity and the worker nodes should get equal shares of standard disk capacity. This is my code: … This is my error: … Updated attempt: … I don’t follow what I …
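The question does not show the failing code, but the master/worker layout suggests a Dataproc cluster. A sketch of that shape with the Python client; the project ID, region, machine type (n1-standard-1 has 1 vCPU), and disk sizes are assumptions:

```python
from google.cloud import dataproc_v1

region = "us-central1"  # assumed region
client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": "my-project",
    "cluster_name": "my-cluster",
    "config": {
        "master_config": {
            "num_instances": 1,
            "machine_type_uri": "n1-standard-1",
            # Full SSD capacity on the single master.
            "disk_config": {"boot_disk_type": "pd-ssd", "boot_disk_size_gb": 500},
        },
        "worker_config": {
            "num_instances": 7,
            "machine_type_uri": "n1-standard-1",
            # Standard disk, split as equal shares across the workers.
            "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 100},
        },
    },
}

operation = client.create_cluster(
    request={"project_id": "my-project", "region": region, "cluster": cluster}
)
operation.result()  # block until the cluster is up
```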
Need help understanding ‘no attribute’ error message in Google Natural Language API
Situation: I’m trying to run Google’s Cloud NLP sentiment analysis on a text field pulled from the Twitter API with Tweepy and then turned into a pandas DataFrame. That DataFrame has a field called text, which is the tweet content on which I’d like to run the sentiment analysis. This is my reference code: https://cloud.google.com/natural-language/docs/reference/libraries Expectation: I was expecting …
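For reference, a frequent source of AttributeError with this library is the v2 API change that removed the enums and types modules; the sketch below uses the current language_v1 surface, with a stand-in DataFrame in place of the Tweepy data:

```python
import pandas as pd
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

def tweet_sentiment(text: str) -> float:
    # analyze_sentiment expects a Document, not the raw string.
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_sentiment(request={"document": document})
    return response.document_sentiment.score

df = pd.DataFrame({"text": ["I love this!", "This is terrible."]})  # stand-in data
df["sentiment"] = df["text"].apply(tweet_sentiment)
```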
Permanent authentication from cloud function
I am trying to host some code inside a Cloud Function. This code tracks and parses new e-mails and writes some information to a Realtime Database. It is almost 100% finished, but as I am quite a beginner, it has been hard for me to deal with authentication. From my PC, it all worked when I authenticated just like it is shown …
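Inside a Cloud Function, the usual approach is to rely on Application Default Credentials rather than the key file used on a local PC, so nothing ever has to be re-authenticated. A minimal sketch with firebase_admin; the database URL, path, and data are placeholders:

```python
import firebase_admin
from firebase_admin import db

# With no explicit credential, firebase_admin falls back to the
# Application Default Credentials of the function's service account.
firebase_admin.initialize_app(
    options={"databaseURL": "https://my-project-default-rtdb.firebaseio.com"}
)

def handler(request):
    # Placeholder write standing in for the parsed e-mail data.
    db.reference("emails/latest").set({"subject": "parsed subject"})
    return "ok"
```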
How do I list my scheduled queries via the Python google client API?
I have set up my service account and I can run queries on BigQuery using client.query(). I could just rewrite all my scheduled queries in this new client.query() format, but I already have many scheduled queries, so I was wondering if there is a way to get/list the scheduled queries and then use that information to run those queries …
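Scheduled queries are stored in the BigQuery Data Transfer Service, so they are listed through that client rather than through google-cloud-bigquery. A sketch, with the project ID and location assumed:

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = "projects/my-project/locations/us"  # assumed project and location

for config in client.list_transfer_configs(parent=parent):
    if config.data_source_id == "scheduled_query":
        print(config.display_name)
        print(config.params["query"])  # the SQL behind the schedule
```

The retrieved SQL can then be fed straight into client.query() on the regular BigQuery client.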
Google PubSub – Ack a message using ack_id
I have an architecture made of: a Pub/Sub topic ‘A’; a subscription ‘B’ on topic ‘A’ that pushes messages to the endpoint ‘X’; a Cloud Function ‘C’, triggered by the endpoint ‘X’, with a Python runtime. Every time a new message is published on the topic ‘A’, the subscription ‘B’ pushes it to the endpoint ‘X’, which triggers the Cloud Function ‘C’. The problem I’m …
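For context, explicit acknowledgement by ack_id only applies to messages obtained via pull; a push endpoint such as ‘X’ acknowledges implicitly by returning a 2xx response. A sketch of the pull-side variant, with placeholder project and subscription IDs:

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "B")  # assumed IDs

response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 10}
)
ack_ids = [msg.ack_id for msg in response.received_messages]
if ack_ids:
    # Acknowledge explicitly by ack_id so the messages are not redelivered.
    subscriber.acknowledge(
        request={"subscription": subscription_path, "ack_ids": ack_ids}
    )
```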
type hints in a Python Google Cloud Function?
In a Python Google Cloud Function with a lot of sub-functions in the “main.py”, I added type hints (i.e. return value annotations as part of the function annotations) in PEP 8 style like this: … Union is imported from typing; it is needed if there is more than one possible return type. The function cannot get deployed, and there is no log about what is …
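One plausible cause of such a silent deploy failure is an import error at module load time, e.g. annotating with Union without importing it, which breaks every function in main.py. A minimal sketch of the working shape; the helper and its logic are made up for illustration:

```python
from typing import Union

import flask

def _parse_amount(raw: str) -> Union[int, float]:
    # Union must be imported from typing; a missing import fails the
    # whole module at deploy time.
    value = float(raw)
    return int(value) if value.is_integer() else value

def main(request: flask.Request) -> str:
    return str(_parse_amount(request.args.get("amount", "0")))
```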
Airflow GCSFileTransformOperator source object filename wildcard
I am working on a DAG that should read an XML file, do some transformations to it, and land the result as a CSV. For this I am using GCSFileTransformOperator. Example: … My problem is that the filename ends with a 4-digit number that is different each day (File_20220119_4302). The next day the number will be different. I can …
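To my knowledge, source_object on GCSFileTransformOperator is templated but does not expand wildcards, so one workaround is to resolve the exact object name in an upstream task and feed it in via XCom. A sketch with assumed bucket, prefix, and script paths:

```python
import pendulum
from airflow import DAG
from airflow.decorators import task
from airflow.providers.google.cloud.hooks.gcs import GCSHook
from airflow.providers.google.cloud.operators.gcs import GCSFileTransformOperator

with DAG(
    dag_id="xml_to_csv",
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    @task
    def find_source_object(ds_nodash=None):
        # Resolve the unpredictable 4-digit suffix by listing the bucket
        # with the predictable date-based prefix (e.g. File_20220119_).
        hook = GCSHook()
        matches = hook.list("my-bucket", prefix=f"File_{ds_nodash}_")
        return matches[0]

    GCSFileTransformOperator(
        task_id="transform_xml_to_csv",
        source_bucket="my-bucket",
        source_object=find_source_object(),  # XComArg rendered into the templated field
        destination_bucket="my-bucket",
        destination_object="output/{{ ds_nodash }}.csv",
        transform_script=["python", "/opt/scripts/transform.py"],
    )
```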