Scheduler crashing when migrated with 2.10.5 DAGRUNS #48844

@vatsrahul1001

Description

Apache Airflow version

3.0.0

If "Other Airflow 2 version" selected, which one?

No response

What happened?

The scheduler is crashing with the error below:

Traceback (most recent call last):
  File "/opt/airflow/airflow-core/src/airflow/jobs/scheduler_job_runner.py", line 954, in _execute
    self._run_scheduler_loop()
  File "/opt/airflow/airflow-core/src/airflow/jobs/scheduler_job_runner.py", line 1235, in _run_scheduler_loop
    num_queued_tis = self._do_scheduling(session)
  File "/opt/airflow/airflow-core/src/airflow/jobs/scheduler_job_runner.py", line 1326, in _do_scheduling
    self._create_dagruns_for_dags(guard, session)
  File "/opt/airflow/airflow-core/src/airflow/utils/retries.py", line 93, in wrapped_function
    for attempt in run_with_db_retries(max_retries=retries, logger=logger, **retry_kwargs):
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 445, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 378, in iter
    result = action(retry_state)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 400, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 439, in result
    return self.__get_result()
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
    raise self._exception
  File "/opt/airflow/airflow-core/src/airflow/utils/retries.py", line 102, in wrapped_function
    return func(*args, **kwargs)
  File "/opt/airflow/airflow-core/src/airflow/jobs/scheduler_job_runner.py", line 1389, in _create_dagruns_for_dags
    query, triggered_date_by_dag = DagModel.dags_needing_dagruns(session)
  File "/opt/airflow/airflow-core/src/airflow/models/dag.py", line 2412, in dags_needing_dagruns
    if not dag_ready(dag_id, cond=ser_dag.dag.timetable.asset_condition, statuses=statuses):
  File "/opt/airflow/airflow-core/src/airflow/models/serialized_dag.py", line 529, in dag
    return SerializedDAG.from_dict(data)
  File "/opt/airflow/airflow-core/src/airflow/serialization/serialized_objects.py", line 1795, in from_dict
    return cls.deserialize_dag(serialized_obj["dag"])
  File "/opt/airflow/airflow-core/src/airflow/serialization/serialized_objects.py", line 1698, in deserialize_dag
    raise RuntimeError(
RuntimeError: Encoded dag object has no dag_id key.  You may need to run `airflow dags reserialize`.
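The final frame corresponds to a guard in `SerializedDAG.deserialize_dag` that rejects serialized payloads missing a `dag_id` key. A minimal sketch of that check (illustrative only, not the actual Airflow implementation):

```python
# Simplified sketch of the guard that raises in deserialize_dag.
# This is NOT the real Airflow source, just an illustration of the failure path.

def deserialize_dag(encoded_dag: dict) -> dict:
    """Reject serialized DAG payloads that lack a dag_id key."""
    if "dag_id" not in encoded_dag:
        raise RuntimeError(
            "Encoded dag object has no dag_id key.  "
            "You may need to run `airflow dags reserialize`."
        )
    return encoded_dag

# A stale pre-migration payload without dag_id triggers the crash path:
try:
    deserialize_dag({"tasks": []})
except RuntimeError as exc:
    print("raised:", exc)
```

Because `dags_needing_dagruns` deserializes every candidate DAG inside the scheduler loop, a single stale row left over from 2.10.5 is enough to crash the whole scheduler.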

What you think should happen instead?

No response

How to reproduce

  1. Create an Airflow deployment with 2.10.5.
  2. Create some DAG runs.
  3. Migrate to 3.0.0.
  4. The scheduler crashes with the error above.
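The recovery step named in the error message can be run after the database migration. A sketch of the post-upgrade sequence, assuming a standard CLI install (command names from the error message and the Airflow CLI):

```shell
# After upgrading to 3.0.0, re-run the metadata DB migration and then
# re-serialize all DAGs so the serialized_dag rows are rewritten in the
# Airflow 3 format instead of the stale 2.10.5 payloads.
airflow db migrate        # upgrade the metadata database schema
airflow dags reserialize  # regenerate serialized DAG records
```

Whether `reserialize` alone is sufficient here is the open question of this issue, since the scheduler crashes before the stale rows are replaced.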

Operating System

Linux

Versions of Apache Airflow Providers

No response

Deployment

Other

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
