Can someone provide me with the schema to recreate the dag_run table in airflow-db?

Submitted by 随声附和 on 2021-01-28 12:36:46

Question


I have a Google Cloud Composer environment on GCP, and I accidentally deleted the dag_run table, after which the airflow-scheduler kept crashing and the Airflow webserver would not come up. I was able to re-create the dag_run table in airflow-db, which stopped the crashing, but I don't think I got the schema right, because I get the error below when I manually trigger a DAG from the Airflow webserver.

Ooops.

                      ____/ (  (    )   )  \___
                     /( (  (  )   _    ))  )   )\
                   ((     (   )(    )  )   (   )  )
                 ((/  ( _(   )   (   _) ) (  () )  )
                ( (  ( (_)   ((    (   )  .((_ ) .  )_
               ( (  )    (      (  )    )   ) . ) (   )
              (  (   (  (   ) (  _  ( _) ).  ) . ) ) ( )
              ( (  (   ) (  )   (  ))     ) _)(   )  )  )
             ( (  ( \ ) (    (_  ( ) ( )  )   ) )  )) ( )
              (  (   (  (   (_ ( ) ( _    )  ) (  )  )   )
             ( (  ( (  (  )     (_  )  ) )  _)   ) _( ( )
              ((  (   )(    (     _    )   _) _(_ (  (_ )
               (_((__(_(__(( ( ( |  ) ) ) )_))__))_)___)
               ((__)        \\||lll|l||///          \_))
                        (   /(/ (  )  ) )\   )
                      (    ( ( ( | | ) ) )\   )
                       (   /(| / ( )) ) ) )) )
                     (     ( ((((_(|)_)))))     )
                      (      ||\(|(|)|/||     )
                    (        |(||(||)||||        )
                      (     //|/l|||)|\\ \     )

                      (/ / //  /|//||||\\ \ \ \ _)

Node: 38b47b3e06a1

Traceback (most recent call last):
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1988, in wsgi_app
    response = self.full_dispatch_request()
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1641, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1544, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/opt/python3.6/lib/python3.6/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1639, in full_dispatch_request
    rv = self.dispatch_request()
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1625, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_admin/base.py", line 69, in inner
    return self._run_view(f, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_admin/base.py", line 368, in _run_view
    return fn(self, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_login.py", line 755, in decorated_view
    return func(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/utils.py", line 262, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/utils.py", line 309, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/views.py", line 929, in trigger
    external_trigger=True
  File "/usr/local/lib/airflow/airflow/utils/db.py", line 50, in wrapper
    result = func(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/models.py", line 3781, in create_dagrun
    run.refresh_from_db()
  File "/usr/local/lib/airflow/airflow/utils/db.py", line 50, in wrapper
    result = func(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/models.py", line 4439, in refresh_from_db
    DR.run_id == self.run_id
  File "/opt/python3.6/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3077, in one
    raise orm_exc.NoResultFound("No row was found for one()")
sqlalchemy.orm.exc.NoResultFound: No row was found for one()
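The traceback bottoms out in `Query.one()`, which raises `NoResultFound` whenever the query matches zero rows: `refresh_from_db` creates the DagRun and then immediately queries it back, so any insert failure (e.g. a schema mismatch) surfaces this way. A minimal, self-contained sketch of that failure mode (hypothetical cut-down model and run_id, not Airflow's actual classes):

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session
from sqlalchemy.orm.exc import NoResultFound

Base = declarative_base()

class DagRun(Base):
    # Cut-down stand-in for Airflow's DagRun model, just for illustration.
    __tablename__ = "dag_run"
    id = Column(Integer, primary_key=True)
    dag_id = Column(String(250))
    run_id = Column(String(250))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    try:
        # The table is empty, so .one() has no row to return and raises.
        session.query(DagRun).filter(DagRun.run_id == "manual__2021-01-28").one()
    except NoResultFound as exc:
        print(exc)
```

So the error itself only says "the row I just wrote isn't there" — the root cause is whatever prevented the insert from landing in the re-created table.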

Answer 1:


The DagRun SQLAlchemy model gives a hint at the schema:

id = Column(Integer, primary_key=True)
dag_id = Column(String(ID_LEN))
execution_date = Column(UtcDateTime, default=timezone.utcnow)
start_date = Column(UtcDateTime, default=timezone.utcnow)
end_date = Column(UtcDateTime)
_state = Column('state', String(50), default=State.RUNNING)
run_id = Column(String(ID_LEN))
external_trigger = Column(Boolean, default=True)
conf = Column(PickleType)

but nevertheless, here's the MySQL DDL statement

mysql> SHOW CREATE TABLE `dag_run`;
...
CREATE TABLE `dag_run` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `dag_id` varchar(250) DEFAULT NULL,
  `execution_date` timestamp(6) NULL DEFAULT NULL,
  `state` varchar(50) DEFAULT NULL,
  `run_id` varchar(250) DEFAULT NULL,
  `external_trigger` tinyint(1) DEFAULT NULL,
  `conf` blob,
  `end_date` timestamp(6) NULL DEFAULT NULL,
  `start_date` timestamp(6) NULL DEFAULT NULL,
  PRIMARY KEY (`id`),
  UNIQUE KEY `dag_id` (`dag_id`,`execution_date`),
  UNIQUE KEY `dag_id_2` (`dag_id`,`run_id`),
  KEY `dag_id_state` (`dag_id`,`state`)
)
ENGINE=InnoDB
AUTO_INCREMENT=177
DEFAULT CHARSET=utf8mb4
COLLATE=utf8mb4_0900_ai_ci

and the table description

mysql> DESC dag_run;
+------------------+--------------+------+-----+---------+----------------+
| Field            | Type         | Null | Key | Default | Extra          |
+------------------+--------------+------+-----+---------+----------------+
| id               | int(11)      | NO   | PRI | NULL    | auto_increment |
| dag_id           | varchar(250) | YES  | MUL | NULL    |                |
| execution_date   | timestamp(6) | YES  |     | NULL    |                |
| state            | varchar(50)  | YES  |     | NULL    |                |
| run_id           | varchar(250) | YES  |     | NULL    |                |
| external_trigger | tinyint(1)   | YES  |     | NULL    |                |
| conf             | blob         | YES  |     | NULL    |                |
| end_date         | timestamp(6) | YES  |     | NULL    |                |
| start_date       | timestamp(6) | YES  |     | NULL    |                |
+------------------+--------------+------+-----+---------+----------------+
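If a MySQL client isn't handy, the same table can be sketched directly with SQLAlchemy (column names and lengths taken from the DDL above; plain `DateTime` and `LargeBinary` are approximations for Airflow's `UtcDateTime` and `PickleType`, and the SQLite URI is a placeholder for the real airflow-db connection string):

```python
from sqlalchemy import (Boolean, Column, DateTime, Integer, LargeBinary,
                        MetaData, String, Table, UniqueConstraint,
                        create_engine)

metadata = MetaData()

# Mirrors the SHOW CREATE TABLE output above, including both unique keys.
dag_run = Table(
    "dag_run", metadata,
    Column("id", Integer, primary_key=True, autoincrement=True),
    Column("dag_id", String(250)),
    Column("execution_date", DateTime),
    Column("state", String(50)),
    Column("run_id", String(250)),
    Column("external_trigger", Boolean),
    Column("conf", LargeBinary),
    Column("end_date", DateTime),
    Column("start_date", DateTime),
    UniqueConstraint("dag_id", "execution_date"),
    UniqueConstraint("dag_id", "run_id"),
)

engine = create_engine("sqlite://")  # swap in the real airflow-db URI
metadata.create_all(engine)
```

Letting SQLAlchemy emit the DDL sidesteps hand-typing the CREATE TABLE and keeps the types consistent with whatever dialect the backend uses.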

UPDATE-1

Courtesy of @AyushChauhan: if you are trying to fix this in a playground environment (where you don't particularly care about the Airflow backend-db's data on historical DagRuns, TaskInstances, etc.),

  • then the airflow resetdb CLI command can also be used to fix this,

  • but, in case you hadn't noticed already, beware:

    it will delete all entries from the metadata database. This includes all dag runs, Variables and Connections.



Source: https://stackoverflow.com/questions/55262545/can-some-provide-me-with-the-schema-to-recreate-dag-run-table-in-airflow-db
