Tag: airflow

Airflow 2.2.2 remote worker logging getting 403 Forbidden

I have a setup where Airflow runs in Kubernetes (EKS) and a remote worker runs in docker-compose on a VM behind a firewall in a different location. Problem: the Airflow web server in EKS gets a 403 Forbidden error when trying to fetch task logs from the remote worker. Build versions: Airflow 2.2.2, OS Linux (Ubuntu 20.04 LTS), Kubernetes 1.22
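In Airflow 2.x the webserver fetches remote task logs from each worker's log server, authenticating with a time-limited token signed by the webserver `secret_key`; a 403 here usually means that key differs between the webserver and the worker, or their clocks have drifted. A minimal sketch of the relevant config, assuming both deployments read an `airflow.cfg` (the value itself is a placeholder):

```ini
; airflow.cfg — this value must be identical on the webserver (EKS)
; and on the docker-compose worker, or log fetching returns 403
[webserver]
secret_key = <same-random-value-on-all-nodes>
```

Keeping both hosts synchronized via NTP also matters, since the signed token expires after a short window.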

Airflow – how to skip certain tasks

For a pipeline like the one below, the sensor was set to soft_fail=True. I'm trying to figure out how to skip only certain tasks when the sensor fails, for example have only B and D fail but still execute C and E. Many thanks for your help. Sensor A >> B >> C >> D >> E Answer I think you could
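One common approach (a sketch, not necessarily the answer the excerpt truncates) is to set `trigger_rule="none_failed"` on C and E, since that rule lets a task run when its upstreams succeeded *or* were skipped, while the default `all_success` propagates the skip. A plain-Python mini-model of the two rules, to show why that works:

```python
# Hypothetical mini-model of two Airflow trigger rules. In a real DAG you
# would pass trigger_rule="none_failed" to the operators for C and E.
def should_run(rule, upstream_states):
    """Decide whether a task runs given its upstream task states."""
    if rule == "all_success":
        # default rule: any skipped or failed upstream blocks the task
        return all(s == "success" for s in upstream_states)
    if rule == "none_failed":
        # runs as long as no upstream actually failed; skips are tolerated
        return all(s in ("success", "skipped") for s in upstream_states)
    raise ValueError(f"unknown trigger rule: {rule}")

# sensor soft-fails -> B is skipped; C with the default rule is skipped too,
# but C with none_failed still runs
print(should_run("all_success", ["skipped"]))  # False
print(should_run("none_failed", ["skipped"]))  # True
```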

Flask blueprint already registered due to setuptools returns duplicate distributions

When trying a new Airflow version, I got this error: With Apache Airflow you can define a plugin using an entry_point. I managed to track the error down to a call to importlib_metadata.distributions(), which returns the same object twice. Why does it return it twice? Answer The importlib_metadata.distributions() call uses your PYTHONPATH environment variable, accessible via sys.path in your Python project. When I
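A quick sketch of the mechanism: when the same directory appears twice on sys.path, distributions() yields the same distribution twice, and any plugin registered via an entry point gets loaded twice, hence the "blueprint already registered" error. Deduplicating sys.path (or the distributions by name, as below) avoids it; the dedup logic here is illustrative, not Airflow's actual code:

```python
import sys
from importlib.metadata import distributions

# Drop duplicate sys.path entries while keeping order; duplicates are the
# usual reason distributions() yields the same package more than once.
sys.path = list(dict.fromkeys(sys.path))

# Defensive dedup by distribution name, mirroring what a plugin loader
# would need to do to avoid registering the same entry point twice.
seen = set()
unique = []
for dist in distributions():
    name = dist.metadata["Name"]
    if name and name not in seen:
        seen.add(name)
        unique.append(dist)
```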

Airflow is failing my DAG when I use external scripts giving ModuleNotFoundError: No module named

I am new to Airflow, and I am trying to create a Python pipeline-scheduling automation process. My project youtubecollection01 uses custom-created modules, so when I run the DAG it fails with ModuleNotFoundError: No module named ‘Authentication’. This is how my project is structured: This is my dag file: I am importing the main function from the, however inside
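The usual cause is that the Airflow scheduler only puts the DAGs folder itself on the import path, not your project's package directories. A minimal sketch of one common workaround, placed at the top of the DAG file, assuming the `Authentication` package lives next to it (the layout is an assumption from the excerpt):

```python
import os
import sys

# Hypothetical fix for "No module named 'Authentication'": put the
# directory containing this DAG file (and, by assumption, the
# Authentication package) on sys.path before importing from it.
project_root = os.path.dirname(os.path.abspath(__file__))
if project_root not in sys.path:
    sys.path.insert(0, project_root)

# now `from Authentication import main` would resolve, if the package exists
```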

Airflow run python script connected via gcsfuse using PythonOperator

I want to run a Python script that is stored in this GCP directory: I used the BashOperator before to execute the script, which works in theory, but I'm getting errors from some functions in some Python libraries. Therefore I want to test whether the PythonOperator works. For the BashOperator I used the following code snippet: For
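One way a PythonOperator callable can execute a script file from a gcsfuse mount is with the standard-library runpy module, which runs the file the way `python script.py` would and returns its globals. A sketch under that assumption (the demo uses a throwaway temp file standing in for the mounted script path):

```python
import os
import runpy
import tempfile

def run_script(path):
    """Execute a script file as __main__ and return its global namespace.
    Could serve as the python_callable body of a PythonOperator."""
    return runpy.run_path(path, run_name="__main__")

# Demo: a throwaway script standing in for the gcsfuse-mounted .py file.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("result = 6 * 7\n")
    tmp = f.name
globals_out = run_script(tmp)
os.unlink(tmp)
```

Unlike shelling out via BashOperator, exceptions raised by the script surface directly in the task logs, which makes the library errors mentioned above easier to diagnose.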