I am writing a script that receives a Kubernetes context name as input and outputs the different elements of the cluster. It will support only the three main providers (GKE, EKS, AKS). My question is: what are the different elements of EKS and AKS context names? Answer You need to differenti…
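A minimal parsing sketch, assuming the default context names the three CLIs generate (gcloud container clusters get-credentials, aws eks update-kubeconfig, az aks get-credentials); renamed contexts would defeat it:

    def parse_context(name: str) -> dict:
        # GKE default: gke_<project>_<location>_<cluster>
        if name.startswith("gke_"):
            _, project, location, cluster = name.split("_", 3)
            return {"provider": "GKE", "project": project,
                    "location": location, "cluster": cluster}
        # EKS default: the cluster ARN, arn:aws:eks:<region>:<account>:cluster/<name>
        if name.startswith("arn:aws:eks:"):
            parts = name.split(":")
            return {"provider": "EKS", "region": parts[3], "account": parts[4],
                    "cluster": parts[5].split("/", 1)[1]}
        # AKS default: the bare cluster name, so there is nothing to split.
        return {"provider": "AKS (assumed)", "cluster": name}

    print(parse_context("gke_my-project_europe-west1-b_demo"))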
Airflow 2.2.2 remote worker logging getting 403 Forbidden
I have a setup where Airflow is running in Kubernetes (EKS) and a remote worker runs in docker-compose on a VM behind a firewall in a different location. Problem: the Airflow web server in EKS gets a 403 Forbidden error when trying to fetch logs from the remote worker. Build version: Airflow – 2.2.2, OS – L…
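Since the excerpt is truncated, a hedged pointer: in Airflow 2.x the webserver signs its log-fetch requests with [webserver] secret_key and the worker’s log server verifies that signature, so a 403 usually means the key differs between components (or the clocks are skewed enough to expire the token). A minimal sketch for generating one shared key:

    # Generate a single key and set it identically on the EKS deployment and
    # in the docker-compose VM, e.g. via this environment variable.
    import secrets

    print(f"AIRFLOW__WEBSERVER__SECRET_KEY={secrets.token_urlsafe(32)}")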
Configure volumes in the Airflow GKEStartPodOperator
I have a Google Cloud Composer environment. In my DAG I want to create a pod in GKE. When I deploy a simple app based on a Docker container that doesn’t need any volume configuration or secrets, everything works fine. But when I have an application that needs to access my GKE clus…
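A hedged sketch of the volume part, assuming a recent google provider where GKEStartPodOperator accepts the usual KubernetesPodOperator arguments; every name below (project, cluster, PVC, image) is hypothetical:

    from airflow.providers.google.cloud.operators.kubernetes_engine import (
        GKEStartPodOperator,
    )
    from kubernetes.client import models as k8s

    volume = k8s.V1Volume(
        name="data",
        persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(
            claim_name="my-pvc",  # hypothetical PVC in the target cluster
        ),
    )
    mount = k8s.V1VolumeMount(name="data", mount_path="/data")

    pod_task = GKEStartPodOperator(
        task_id="pod_with_volume",
        project_id="my-project",
        location="europe-west1-b",
        cluster_name="my-cluster",
        name="app",
        namespace="default",
        image="gcr.io/my-project/app:latest",
        volumes=[volume],
        volume_mounts=[mount],
    )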
Authentication using the Python Kubernetes API in GCP is not working
I would like to access a GKE (Kubernetes) cluster in GCP from the Python Kubernetes client. I can’t authenticate and connect to my cluster, and I can’t find the reason. Here is what I have tried so far. Answer “I’d like to get the configuration working” – I have it working where the code runs off-cluster a…
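A sketch of the off-cluster path, assuming Application Default Credentials are available (gcloud auth application-default login); the endpoint and CA file are hypothetical values taken from the cluster’s details page:

    import google.auth
    import google.auth.transport.requests
    from kubernetes import client

    # Refresh ADC and use the resulting OAuth token as the bearer token.
    creds, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    creds.refresh(google.auth.transport.requests.Request())

    conf = client.Configuration()
    conf.host = "https://34.0.0.1"        # the cluster endpoint (hypothetical)
    conf.ssl_ca_cert = "cluster-ca.pem"   # the cluster CA cert, saved locally
    conf.api_key = {"authorization": "Bearer " + creds.token}

    v1 = client.CoreV1Api(client.ApiClient(conf))
    for pod in v1.list_pod_for_all_namespaces().items:
        print(pod.metadata.name)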
Airflow on Kubernetes: how to install extra Python packages?
I installed Airflow using the Bitnami repo. To install extra Python packages I mounted an extra volume: I prepared my requirements.txt file, then created a ConfigMap using kubectl create -n airflow configmap requirements --from-file=requirements.txt, and after this I upgraded Airflow using helm upgrade…
Do we still use API endpoint health checks in Kubernetes?
I want to use API endpoint health checks like /healthz, /livez or /readyz. I recently heard about them and tried to search, but didn’t find any place that tells me how to use them. I see that /healthz is deprecated. Can you tell me if I have to use them in a pod definition file, or where we have to use them…
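For context, hedged to what the Kubernetes docs say: /livez and /readyz (which replace the deprecated /healthz) are endpoints of the API server itself, not something you reference in a pod definition; pods use livenessProbe/readinessProbe fields against their own application endpoints. A sketch of querying the API-server endpoint with the Python client:

    from kubernetes import client, config

    config.load_kube_config()  # uses the current kubeconfig context
    api = client.ApiClient()

    # Equivalent of `kubectl get --raw /livez`.
    status = api.call_api(
        "/livez", "GET",
        auth_settings=["BearerToken"],
        response_type="str",
        _return_http_data_only=True,
    )
    print(status)  # "ok" on a healthy API server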
How to access the content of a file passed as an input artifact to a script template in Argo Workflows
I am trying to access the content (JSON data) of a file which is passed as an input artifact to a script template. It is failing with the following error: NameError: name 'inputs' is not defined. Did you mean: 'input'? My artifacts are stored in an AWS S3 bucket. I’ve also tried u…
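A hedged sketch of the usual fix: inside a script template the artifact is just a file at whatever path the workflow declared for it (an inputs object only exists in template strings such as {{inputs.artifacts.name.path}}, not in the Python source). Assuming the artifact was declared with path: /tmp/data.json:

    import json

    # Read the file at the artifact's declared path (hypothetical path).
    with open("/tmp/data.json") as f:
        data = json.load(f)

    print(data)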
Create multiple virtual envs for multiple Kubernetes contexts
As of now, this is what I do to get pods in a specific cluster. Is there a Python way to set the Kubernetes context for a virtual env, so we can use multiple contexts at the same time? Example: this should be the equivalent of… Answer This probably isn’t a rational solution, but anyway… At some time I used different kubectl…
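One Python-side alternative to per-virtualenv state, sketched with hypothetical context names: the client can hold several contexts at once via new_client_from_config:

    from kubernetes import client, config

    # Each ApiClient is pinned to one kubeconfig context.
    prod = client.CoreV1Api(config.new_client_from_config(context="prod-cluster"))
    dev = client.CoreV1Api(config.new_client_from_config(context="dev-cluster"))

    print([p.metadata.name for p in prod.list_namespaced_pod("default").items])
    print([p.metadata.name for p in dev.list_namespaced_pod("default").items])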
How to write data to Delta Lake from Kubernetes
Our organisation runs Databricks on Azure, which is used by data scientists and analysts primarily for notebooks to do ad-hoc analysis and exploration. We also run Kubernetes clusters for ETL workflows that do not require Spark. We would like to use Delta Lake as our storage layer, where both Databricks an…
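One hedged option for the Kubernetes side is the delta-rs Python bindings (pip install deltalake), which write Delta tables without a Spark runtime; the ADLS path, account and key below are hypothetical:

    import pandas as pd
    from deltalake import write_deltalake

    df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

    # Append to a Delta table on ADLS Gen2 that Databricks can also read.
    write_deltalake(
        "abfss://lake@myaccount.dfs.core.windows.net/tables/events",
        df,
        mode="append",
        storage_options={
            "azure_storage_account_name": "myaccount",     # assumption: key
            "azure_storage_account_key": "<account-key>",  # names per delta-rs docs
        },
    )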
Unable to expand cluster with Dask
I am very new to Kubernetes and Dask, and I am trying to implement a kube cluster. I have created a minikube cluster with some services and now want to expand it with flexible Dask functionality. I am planning to deploy it to gcloud later, so I am trying to initialize a Dask cluster (scheduler and worker…
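A sketch using the classic dask-kubernetes API (pip install dask-kubernetes; newer releases moved to an operator-based KubeCluster), which starts the scheduler and workers as pods in the current kube context:

    from dask.distributed import Client
    from dask_kubernetes import KubeCluster, make_pod_spec

    # Worker pod template; image and resource sizes are illustrative.
    pod_spec = make_pod_spec(
        image="daskdev/dask:latest",
        memory_limit="2G",
        cpu_limit="1",
    )

    cluster = KubeCluster(pod_spec)
    cluster.scale(3)              # three worker pods
    client = Client(cluster)
    print(client.dashboard_link)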