
How to execute multiple SQL files in Airflow using PostgresOperator?

I have multiple SQL files in my sql folder. How can I execute all of them from within a DAG?

  - dags
    - sql
      - dummy1.sql
      - dummy2.sql

For a single file, the code below works:

sql_insert= PostgresOperator(task_id='sql_insert',
                             postgres_conn_id='postgres_conn',
                             sql='sql/dummy1.sql')


Answer

With a list:

sql_insert= PostgresOperator(task_id='sql_insert',
                             postgres_conn_id='postgres_conn',
                             sql=['sql/dummy1.sql', 'sql/dummy2.sql'])

Or you can make it dynamic:

import glob
sql_insert= PostgresOperator(task_id='sql_insert',
                             postgres_conn_id='postgres_conn',
                             sql=glob.glob("sql/*.sql"))
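Note that glob.glob resolves paths against the scheduler's working directory, while Airflow resolves the .sql entries as templates relative to the DAG folder, so it can be safer to build the list from the DAG file's own location. Below is a minimal sketch of a complete DAG doing that, assuming Airflow 2.x with the apache-airflow-providers-postgres package installed, a connection named postgres_conn, and the folder layout from the question (the DAG file sitting directly in dags/ next to sql/); the dag_id and start_date are placeholders.

import glob
import os
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Folder holding the .sql files, resolved relative to this DAG file
SQL_DIR = os.path.join(os.path.dirname(__file__), "sql")

with DAG(
    dag_id="sql_insert_example",   # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    sql_insert = PostgresOperator(
        task_id="sql_insert",
        postgres_conn_id="postgres_conn",
        # sorted() makes the execution order deterministic; the paths are
        # rewritten relative to the DAG folder because Airflow's template
        # loader looks them up there, not in the process's working directory.
        sql=[
            os.path.join("sql", os.path.basename(path))
            for path in sorted(glob.glob(os.path.join(SQL_DIR, "*.sql")))
        ],
    )

All files run in order within the single sql_insert task; if a file fails, the whole task fails.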