airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor
I have "postgresql" as "enabled: true" so it should be using postgresql not sqlite. This happened with any other cli command such as "airflow list_dags" as well.
After trial and error, it seems that explicitly setting AIRFLOW__CORE__SQL_ALCHEMY_CONN in the "airflow: config" values to its default end-result value of postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow allows Airflow CLI commands such as "airflow create_user" to work. But this shouldn't be necessary: Airflow CLI commands should work with the default chart values.
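For reference, a minimal sketch of that workaround in values.yaml (assuming the chart's default credentials and the in-cluster airflow-postgresql service name quoted above):

airflow:
  config:
    # explicitly pin the connection string the chart would otherwise template;
    # the credentials/host below are the default end-result values quoted above
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow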
What you expected to happen:
Running an Airflow CLI command such as "airflow list_dags" should not error, because AIRFLOW__CORE__SQL_ALCHEMY_CONN should be set as an actual environment variable of the container.
How to reproduce it (as minimally and precisely as possible):
Access the shell of the "airflow-web" container in the airflow-web pod:
kubectl exec -it AIRFLOW-WEB_POD_NAME -c airflow-web -- bash
enter the command:
airflow list_dags
The following traceback and error come up:
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 25, in <module>
from airflow.configuration import conf
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__init__.py", line 31, in <module>
from airflow.utils.log.logging_mixin import LoggingMixin
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/__init__.py", line 24, in <module>
from .decorators import apply_defaults as _apply_defaults
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/decorators.py", line 36, in <module>
from airflow import settings
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/settings.py", line 37, in <module>
from airflow.configuration import conf, AIRFLOW_HOME, WEBSERVER_CONFIG # NOQA F401
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 622, in <module>
conf.read(AIRFLOW_CONFIG)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 348, in read
self._validate()
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 204, in _validate
self.get('core', 'executor')))
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor
Anything else we need to know:
It appears the problem is that "export AIRFLOW__CORE__SQL_ALCHEMY_CONN=" is set as part of the entrypoint "args" in templates/deployments-web.yaml. I think a solution would be to have this set as part of the "env" list (see the sketch below), but I don't know enough about Helm charts to make the change myself. The same goes for AIRFLOW__CELERY__RESULT_BACKEND and AIRFLOW__CELERY__BROKER_URL as well.
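A hypothetical sketch of what that could look like in the container spec of templates/deployments-web.yaml. The literal values here are illustrative defaults, not the chart's actual template output; a real fix would presumably template them or reference the chart's secrets:

env:
  # set as real container env vars instead of exporting them in the entrypoint
  # args, so they are visible to kubectl exec sessions and probes as well
  - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
    value: postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow
  - name: AIRFLOW__CELERY__RESULT_BACKEND
    value: db+postgresql://postgres:airflow@airflow-postgresql:5432/airflow
  - name: AIRFLOW__CELERY__BROKER_URL
    value: redis://:airflow@airflow-redis-master:6379/0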
I get the same error when running:
kubectl exec -it --namespace tamagotchi-orchestration service/gita-temp1-web -- bash -c "airflow list_dags"
or "airflow create_user". I upgraded to 7.1.6, but I get the same error.
This is caused by kubectl exec not creating a login bash shell, and thus not sourcing .bashrc. However, we can fix this by replacing our existing approach of using a templated AIRFLOW__CORE__SQL_ALCHEMY_CONN with AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD.
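For context, a _CMD variable holds a shell command whose stdout Airflow uses as the config value, so the connection string is resolved at read time rather than depending on a .bashrc export. A rough sketch of what the env entry could look like (the file path is hypothetical):

env:
  - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD
    # Airflow runs this command and uses its stdout as the connection string
    value: "cat /opt/airflow/sql_alchemy_conn"  # hypothetical path to a mounted secret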
When will this change be included in an official release?
@sungchun12
FYI, you can just use the method documented in the README.md, and run the following to get an interactive bash shell:
# use this to run commands like: `airflow create_user`
kubectl exec \
  -it \
  --namespace airflow \
  --container airflow-scheduler \
  Deployment/airflow-scheduler \
  -- /bin/bash
But I will fix the environment variables at some point.
EDIT: chart version: airflow-7.8.0, app version: 1.10.12
I am having issues with the liveness probe related to this. The variable is properly set: it prints the correct postgresql endpoint when I kubectl exec -it into the airflow-scheduler container and run:
echo $AIRFLOW__CORE__SQL_ALCHEMY_CONN
But the airflow-scheduler container keeps restarting because it fails the liveness probe (a quick check that points at the cause is sketched after the traceback below). kubectl describe pod <airflow-scheduler-pod> gives:
Warning Unhealthy 94s (x11 over 16m) kubelet, kube-node-0-kubelet.kube-dev.mesos Liveness probe failed: Traceback (most recent call last):
File "<string>", line 4, in <module>
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__init__.py", line 31, in <module>
from airflow.utils.log.logging_mixin import LoggingMixin
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/__init__.py", line 24, in <module>
from .decorators import apply_defaults as _apply_defaults
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/decorators.py", line 36, in <module>
from airflow import settings
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/settings.py", line 37, in <module>
from airflow.configuration import conf, AIRFLOW_HOME, WEBSERVER_CONFIG # NOQA F401
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 731, in <module>
conf.read(AIRFLOW_CONFIG)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 421, in read
self._validate()
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 213, in _validate
self._validate_config_dependencies()
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/configuration.py", line 247, in _validate_config_dependencies
self.get('core', 'executor')))
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor
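One way to confirm this is the same root cause described above (a sketch; the pod name is a placeholder): a plain kubectl exec runs a non-login, non-interactive shell which, like the probe's exec, never sources .bashrc, while forcing an interactive shell does source it:

# non-interactive shell, .bashrc not sourced: prints an empty line
kubectl exec AIRFLOW-SCHEDULER_POD_NAME -c airflow-scheduler -- bash -c 'echo "$AIRFLOW__CORE__SQL_ALCHEMY_CONN"'

# interactive shell (-i), .bashrc sourced: prints the postgresql endpoint
kubectl exec -it AIRFLOW-SCHEDULER_POD_NAME -c airflow-scheduler -- bash -ic 'echo "$AIRFLOW__CORE__SQL_ALCHEMY_CONN"'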