
Tag: airflow

Docker Compose missing Python package

To preface, I’m fairly new to Docker, Airflow & Stack Overflow. I’ve got an instance of Airflow running in Docker on an Ubuntu (20.04.3) VM. I’m trying to get openpyxl installed at build time in order to use it as the engine for pd.read_excel. Here’s the Dockerfile with the install command: The requirements.txt file looks like this: And the docker-compose.yaml file looks
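The Dockerfile itself isn’t shown in this excerpt, but here is a minimal sketch of the usual pattern, assuming the stock apache/airflow base image (the tag is an assumption) and a requirements.txt that lists openpyxl:

```dockerfile
# Hypothetical Dockerfile; the base image tag is an assumption.
FROM apache/airflow:2.2.3

# Copy the project's requirements file into the image.
COPY requirements.txt /requirements.txt

# Install the extra Python packages (e.g. openpyxl) at build time,
# so pd.read_excel(..., engine="openpyxl") works inside tasks.
RUN pip install --no-cache-dir -r /requirements.txt
```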

Why is Airflow returning an error while requesting a REST API?

I have Python code that requests a REST API. The API has more than 5,000 pages, so I tried to page through it, but I always get an error at around the 2,000th request. The error is: “df = pd.json_normalize(json_data[“items”]) KeyError: ‘items'” How can I solve this problem? P.S. Locally, the code works fine. Answer I found a
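A sketch of one defensive fix, assuming each page normally returns an “items” list and that the failure around page 2,000 is rate limiting; the endpoint URL and page parameter are hypothetical:

```python
import time

import pandas as pd
import requests

BASE_URL = "https://api.example.com/v1/records"  # hypothetical endpoint

frames = []
for page in range(1, 5001):
    json_data = requests.get(BASE_URL, params={"page": page}).json()
    if "items" not in json_data:
        # Likely rate limiting or the end of the data: back off once
        # and retry instead of letting json_normalize raise KeyError.
        time.sleep(5)
        json_data = requests.get(BASE_URL, params={"page": page}).json()
        if "items" not in json_data:
            print(f"page {page} still has no 'items', stopping: {json_data}")
            break
    frames.append(pd.json_normalize(json_data["items"]))

df = pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()
```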

Airflow DAG script: print the value in the logs

I was passing the JSON {“Column”: “ABC123”} in Airflow before triggering it, and in the DAG script I have written the code as below. I want to print the value 123 in the Airflow logs, but it is not printing in the logs… the DAG runs successfully but is not able to print the value I passed
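A minimal sketch of how such a DAG could read the trigger-time JSON from dag_run.conf and log it; the DAG id and the digits-only extraction of “123” from “ABC123” are assumptions:

```python
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_conf_value(**context):
    # The JSON passed at trigger time ({"Column": "ABC123"}) arrives
    # in dag_run.conf; it is None when the DAG runs on a schedule.
    conf = context["dag_run"].conf or {}
    value = conf.get("Column", "")
    # Keeping only the digits turns "ABC123" into "123"; this
    # extraction rule is an assumption about the desired output.
    digits = "".join(ch for ch in value if ch.isdigit())
    logging.info("Column=%s, digits=%s", value, digits)


with DAG(
    dag_id="print_conf_example",  # hypothetical DAG id
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="print_value", python_callable=print_conf_value)
```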

How to run Airflow tasks synchronously

I have an Airflow DAG comprising 2-3 steps: a PythonOperator that runs a query on AWS Athena and stores the generated file at a specific S3 path; a BashOperator that increments an Airflow variable for tracking; and a BashOperator that takes the output (response) of task 1 and runs some code on top of it. What happens here is the DAG gets completed within
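A sketch of the three-step DAG with the tasks chained so they run strictly one after another; the task ids, the Athena stub, and the bash commands are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def run_athena_query():
    # Placeholder for the Athena query that writes its result to S3;
    # the return value is pushed to XCom automatically.
    return "s3://my-bucket/query-results/output.csv"  # hypothetical path


with DAG(
    dag_id="athena_pipeline",  # hypothetical DAG id
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    query = PythonOperator(task_id="run_query", python_callable=run_athena_query)
    track = BashOperator(task_id="increment_var", bash_command="echo tracking")
    process = BashOperator(
        task_id="process_output",
        # Pull task 1's return value via XCom templating.
        bash_command="echo processing {{ ti.xcom_pull(task_ids='run_query') }}",
    )

    # Chaining with >> forces sequential execution: each task starts
    # only after the previous one has finished.
    query >> track >> process
```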

Decode UTF-8 encoded XCom value from SSHOperator

I have two Airflow tasks that I want to communicate. The SSHOperator returns the last line printed, in this case “remote_IP”. However, the SSHOperator’s return value is encoded using UTF-8. How can the SSHOperator Read_remote_IP return the value non-encoded? Also, how can the BashOperator Read_SSH_Output decode the encoded value? Answer My current solution is to introduce another Python operator to convert
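A sketch of that intermediate Python operator. Note that with XCom pickling disabled (the default), the SSHOperator actually stores its output base64-encoded, so the decode is base64 followed by UTF-8; the task ids echo the ones in the question:

```python
import base64

from airflow.operators.python import PythonOperator


def decode_ssh_output(**context):
    # Pull the raw XCom pushed by the SSHOperator task "Read_remote_IP".
    raw = context["ti"].xcom_pull(task_ids="Read_remote_IP")
    # With enable_xcom_pickling=False (the default), the SSHOperator
    # stores its stdout base64-encoded; decode to a plain UTF-8 string.
    decoded = base64.b64decode(raw).decode("utf-8").strip()
    # Re-push the readable value for downstream tasks such as
    # the BashOperator Read_SSH_Output.
    context["ti"].xcom_push(key="remote_ip", value=decoded)
    return decoded


decode_task = PythonOperator(
    task_id="decode_ssh_output",  # hypothetical task id
    python_callable=decode_ssh_output,
)
```

A downstream BashOperator can then read the plain value with {{ ti.xcom_pull(task_ids='decode_ssh_output') }} in its bash_command.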

How to install packages in Airflow (docker-compose)?

The question is very similar to one already available. The only difference is that I run Airflow in Docker. Step by step: put docker-compose.yaml into the PyCharm project; put requirements.txt into the PyCharm project; run docker-compose up; run the DAG and receive a ModuleNotFoundError. I want to start Airflow using docker-compose with the dependencies from requirements.txt. These dependencies should be available by
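One commonly used approach with the official docker-compose.yaml (Airflow 2.1.1+ images) is the _PIP_ADDITIONAL_REQUIREMENTS environment variable; a sketch, where the image tag and package list are assumptions:

```yaml
# Excerpt of the x-airflow-common anchor in the official docker-compose.yaml;
# the image tag and packages shown here are assumptions.
x-airflow-common:
  &airflow-common
  image: apache/airflow:2.2.3
  environment:
    # Installs the packages in every container at start-up; fine for
    # local development, but slow and repeated on each restart.
    _PIP_ADDITIONAL_REQUIREMENTS: "openpyxl pandas"
```

For anything beyond quick experiments, baking the requirements into a custom image (as in the Dockerfile sketch above) is the more robust route.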
