For anyone who has a problem with Airflow linked to this issue:
In my case, I initialized Airflow in /root/airflow and ran its scheduler as root, using the run_as_user parameter to impersonate the web user when running task instances. However, Airflow always failed to trigger my DAG, with the following errors in the logs:
sqlite3.OperationalError: unable to open database file
...
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
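For reference, run_as_user is a BaseOperator argument, so it can be set per task or via default_args. A minimal sketch of the kind of setup I mean (the DAG and task names here are hypothetical, not my actual DAG):

from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Hypothetical DAG; "web" is the OS user to impersonate.
dag = DAG(
    dag_id="example_impersonation",
    start_date=datetime(2019, 1, 1),
    schedule_interval=None,  # trigger manually
)

whoami = BashOperator(
    task_id="whoami",
    bash_command="whoami",  # should print "web" when impersonation works
    run_as_user="web",
    dag=dag,
)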
I also found that once I triggered a DAG manually, a new Airflow resource directory was automatically created under /home/web (presumably because AIRFLOW_HOME defaults to ~/airflow, so the impersonated user resolves it against its own home directory). I'm not entirely clear on this behavior, but I made it work by removing the entire Airflow resources from /root, reinitializing the Airflow database under /home/web, and running the scheduler as web:
[root@host ~]# rm -rf airflow
[web@host ~]$ airflow initdb
[web@host ~]$ airflow scheduler -D
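Afterwards, you can confirm that the SQLite connection string in the regenerated airflow.cfg points at the new location. The exact path below is an assumption based on the default AIRFLOW_HOME of ~/airflow:

[web@host ~]$ grep sql_alchemy_conn ~/airflow/airflow.cfg
sql_alchemy_conn = sqlite:////home/web/airflow/airflow.db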
If you want to try this approach, you may need to back up your data before doing anything.