Cannot run Apache Airflow unit tests: sqlalchemy.exc.OperationalError: unable to open database file
I am trying to run a single unit test in the Apache Airflow repository (airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py) for development purposes.
I followed the recommended setup steps, but the test consistently fails with two different errors, one on Windows and one on Linux. The Windows traceback is at the end of this post; the Linux run, on a clean Ubuntu VM in Google Cloud, is shown below (very long, but truncated):
pytest airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py -q
============================================= test session starts ==============================================
platform linux -- Python 3.11.2, pytest-9.0.1, pluggy-1.6.0 -- /home/midohany910/habitat/airflow/.venv/bin/python3
cachedir: .pytest_cache
rootdir: /home/midohany910/habitat/airflow
configfile: pyproject.toml
plugins: asyncio-1.3.0, anyio-4.12.0, time-machine-3.1.0
asyncio: mode=Mode.STRICT, debug=False, asyncio_default_fixture_loop_scope=function, asyncio_default_test_loop_scope=function
collected 4 items
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT ERROR [ 25%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT ERROR [ 50%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT ERROR [ 75%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT ERROR [100%]
==================================================== ERRORS ====================================================
_______________________ ERROR at setup of test_overflow_caps_to_configured_max__HABITAT ________________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:3301: in raw_connection
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/pool/base.py:447: in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
-------------------------------------------- Captured stdout setup ---------------------------------------------
========================= AIRFLOW ==========================
Home of the user: /home/midohany910
Airflow home /home/midohany910/airflow
Initializing the DB - first time after entering the container.
Initialization can be also forced by adding --with-db-init flag when running tests.
[2025-11-30T20:05:55.919+0000] {db.py:1146} INFO - Dropping Airflow tables that exist
---------------------------------------------- Captured log setup ----------------------------------------------
INFO airflow.utils.db:db.py:1146 Dropping Airflow tables that exist
________________________ ERROR at setup of test_overflow_caps_to_internal_max__HABITAT _________________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:3301: in raw_connection
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/pool/base.py:447: in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/create.py:661: in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
_______________________ ERROR at setup of test_non_exponential_policy_unchanged__HABITAT _______________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/create.py:661: in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
____________________ ERROR at setup of test_small_interval_large_attempts_overflow__HABITAT ____________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:3301: in raw_connection
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
===================================== Warning summary. Total: 1, Unique: 1 =====================================
other: total 1, unique 1
runtest: total 1, unique 1
Warnings saved into /home/midohany910/habitat/airflow/devel-common/warnings.txt file.
=========================================== short test summary info ============================================
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
========================================= 1 warning, 4 errors in 6.01s =========================================
Setup steps
I run these steps from the root of the cloned Apache Airflow repository:

1. Clone the repository and change into it:
git clone https://github.com/apache/airflow.git
cd airflow

2. Install the test tooling:
# Explicitly install testing tools that were missing
uv pip install pytest pytest-asyncio pyyaml

3. Set the environment variables (I tried several variants to fix import errors; see the sanity check after these steps):
# Final working PYTHONPATH export
export PYTHONPATH="./airflow-core/src:./airflow-core/tests:.:$PYTHONPATH"
# Win: $env:PYTHONPATH = "$(Get-Location)\airflow-core\src;$(Get-Location)\airflow-core\tests;$(Get-Location)\devel-common\src"
# Added the test mode flag (often required for Airflow tests)
export AIRFLOWCOREUNIT_TEST_MODE="true"

4. Run the test:
pytest -q airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py
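As a quick sanity check that the PYTHONPATH from step 3 is actually picked up, I would expect imports like these to succeed from the repository root (a sketch only; it assumes the venv is active and that devel-common/src is importable, since tests_common is the package that shows up in the Windows traceback):

# Sketch: confirm that the airflow sources and the shared test helpers resolve
# with the PYTHONPATH exported above (run from the repository root, venv active).
python3 -c "import airflow; print(airflow.__file__)"
python3 -c "import tests_common; print(tests_common.__file__)"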
The error
The test session fails outright on Windows; on Linux it starts, tries to load the test module, and then fails with the error shown above.
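All four setup errors are the same SQLite "unable to open database file", which I read as: the file behind the test database cannot be created or opened at the path Airflow resolves. A minimal check along these lines might narrow it down (a sketch only; it assumes the default SQLite backend under AIRFLOW_HOME and uses the standard airflow config get-value subcommand; the airflow.db filename is an assumption, the test harness may use a different file):

# Sketch: verify that AIRFLOW_HOME exists and is writable, and see which
# SQLAlchemy URL the configuration actually resolves to.
AIRFLOW_HOME="${AIRFLOW_HOME:-$HOME/airflow}"
echo "AIRFLOW_HOME is $AIRFLOW_HOME"
mkdir -p "$AIRFLOW_HOME" && touch "$AIRFLOW_HOME/.write_test" && rm "$AIRFLOW_HOME/.write_test" && echo "writable"
# Standard Airflow CLI call that prints the configured connection string:
airflow config get-value database sql_alchemy_conn
# Try opening the SQLite file directly (airflow.db is an assumed default name):
python3 -c "import sqlite3; sqlite3.connect('$AIRFLOW_HOME/airflow.db').close(); print('sqlite ok')"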
What am I doing wrong?
I shortened the error output to stay within the post length limit; I can provide the full output if needed.
The Windows run (long, but truncated):

(.venv) PS C:\Users\Mido Hany\VS code Projects\Habitat\airflow> pytest .\airflow-core\tests\unit\models\test_safe_exponential_backoff__HABITAT.py -q
Refreshed 97 providers with 1937 Python files.
Traceback (most recent call last):
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\update_providers_dependencies.py", line 188, in <module>
    check_if_different_provider_used(file)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\update_providers_dependencies.py", line 157, in check_if_different_provider_used
    imports = get_imports_from_file(file_path, only_top_level=False)
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\common_precommit_utils.py", line 379, in get_imports_from_file
    root = ast.parse(file_path.read_text(), file_path.name)
           ~~~~~~~~~~~~~~~~~~~^^
  File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\pathlib\_local.py", line 546, in read_text
    return PathBase.read_text(self, encoding, errors, newline)
           ~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\pathlib\_abc.py", line 633, in read_text
    return f.read()
           ~~~~~~^^
  File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
           ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 2148: character maps to <undefined>
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Scripts\pytest.exe\__main__.py", line 6, in <module>
    sys.exit(console_main())
    ~~~~~~~~~~~~^^
  ...
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Lib\site-packages\_pytest\config\__init__.py", line 876, in import_plugin
    __import__(importspec)
    ~~~~~~~~~~^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Lib\site-packages\_pytest\assertion\rewrite.py", line 197, in exec_module
    exec(co, module.__dict__)
    ~~~~^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\devel-common\src\tests_common\pytest_plugin.py", line 200, in <module>
    subprocess.check_call(["uv", "run", UPDATE_PROVIDER_DEPENDENCIES_SCRIPT.as_posix()])
    ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\subprocess.py", line 419, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['uv', 'run', 'C:/Users/Mido Hany/VS code Projects/Habitat/airflow/scripts/ci/pre_commit/update_providers_dependencies.py']' returned non-zero exit status 1.