
Apache Airflow Dag Cannot Import Local Module

I do not seem to understand how to import modules into an Apache Airflow DAG definition file. I want to do this so that I can create a library which makes declaring tasks with similar settings less verbose.
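For context, a minimal sketch of the failing setup (file and module names are hypothetical): a DAG file tries to import a helper module that sits next to it, and the import fails while the scheduler parses the file because, as the answers below explain, the DAG file's own directory is not on the Python search path.

# Layout (hypothetical names):
#   dags/my_project/my_dag.py
#   dags/my_project/task_helpers.py
#
# dags/my_project/my_dag.py
from airflow import DAG

# Fails during DAG parsing, because the DAG file's own directory is not
# on the import path:
#   ImportError / ModuleNotFoundError: No module named 'task_helpers'
from task_helpers import default_args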

Solution 1:

Re-adding the DAG file's own directory to sys.path worked for me:

import os
import sys

sys.path.insert(0, os.path.abspath(os.path.dirname(__file__)))
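With that at the top of the DAG definition file, a module sitting next to it imports normally. A minimal sketch, where task_helpers and default_args are hypothetical names and the DAG arguments are only illustrative:

import os
import sys

# Make the DAG file's own directory importable.
sys.path.insert(0, os.path.abspath(os.path.dirname(__file__)))

from airflow import DAG
from task_helpers import default_args  # hypothetical module next to this DAG file

dag = DAG(dag_id="example_local_import", default_args=default_args,
          schedule_interval=None)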

Solution 2:

Are you using Airflow 1.9.0? This might be fixed there.

The issue is caused by the way Airflow loads DAGs: it doesn't just import them as normal Python modules, because it wants to be able to reload them without restarting the process. As a result, . (the DAG file's directory) isn't on the Python search path.

If 1.9.0 doesn't fix this, the easiest change is to put export PYTHONPATH=/home/airflow/airflow/:$PYTHONPATH in the startup scripts. The exact way to do that depends on how you run Airflow (systemd vs. init scripts, etc.).
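A quick way to confirm the variable actually reaches the scheduler is to inspect sys.path from the same environment it runs in; a small sketch, assuming the /home/airflow/airflow path used above:

import os
import sys

# Path from the export line above; adjust it to your installation.
expected = "/home/airflow/airflow"

print("PYTHONPATH =", os.environ.get("PYTHONPATH", "<not set>"))
print("on sys.path:", any(os.path.abspath(p).rstrip("/") == expected for p in sys.path))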

Solution 3:

If you're working with git-sync in Kubernetes and did not run it as an initContainer (only as a regular container, or not at all), then it is possible that the modules were never loaded into the webserver or scheduler pods.

Solution 4:

Simply put your local module in the Airflow plugins folder and it will start working. To find the location of your plugins folder, run the command: airflow info
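A minimal sketch of that (task_helpers and default_args are hypothetical names): run airflow info, note the plugins_folder path it prints, copy the module there, and import it by name from the DAG file, since Airflow adds the plugins folder to the Python search path.

# <plugins_folder>/task_helpers.py  <- your local module, copied into the plugins folder
# dags/my_dag.py:
from airflow import DAG
from task_helpers import default_args  # resolves via the plugins folder

dag = DAG(dag_id="example_plugins_import", default_args=default_args,
          schedule_interval=None)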
