Issues after Apache Airflow migration from 1.9.0 to 1.10.1



























I just upgraded my Airflow install from 1.9.0 to 1.10.1.



I use Docker to install and run Airflow, so I just updated my Dockerfile with these lines:



ENV SLUGIFY_USES_TEXT_UNIDECODE yes



RUN pip install apache-airflow[crypto,celery,postgres,hive,jdbc]==1.10.1
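For context, a minimal Dockerfile sketch with these two lines in place (the base image and everything else here are assumptions, since the original Dockerfile is not shown):

```dockerfile
# Base image is an assumption; the original setup uses Python 3.5
FROM python:3.5-slim

# Required when installing Airflow 1.10.x, due to the python-slugify/unidecode license change
ENV SLUGIFY_USES_TEXT_UNIDECODE yes

# Pin the new Airflow version with the same extras as before
RUN pip install apache-airflow[crypto,celery,postgres,hive,jdbc]==1.10.1
```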



Then I ran docker build, and docker-compose with the new image.



So far, so good.



In airflow.cfg, I added the line:



rbac = True



Because I want to create users with specific roles and allow them to access only their own DAGs.



The Docker containers run without errors, but an error occurs when I click on a DAG name in the UI, or when I try to launch a DAG:



Traceback (most recent call last):
File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.5/site-packages/flask_admin/base.py", line 69, in inner
return self._run_view(f, *args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/flask_admin/base.py", line 368, in _run_view
return fn(self, *args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/flask_login/utils.py", line 261, in decorated_view
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/airflow/www/utils.py", line 372, in view_func
return f(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/airflow/www/utils.py", line 278, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/airflow/www/views.py", line 1345, in tree
session, start_date=min_date, end_date=base_date)
File "/usr/local/lib/python3.5/site-packages/airflow/models.py", line 3753, in get_task_instances
tis = tis.order_by(TI.execution_date).all()
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2703, in all
return list(self)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2855, in __iter__
return self._execute_and_instances(context)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2878, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 945, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/sql/elements.py", line 263, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1053, in _execute_clauseelement
compiled_sql, distilled_params
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
context)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1402, in _handle_dbapi_exception
exc_info
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/util/compat.py", line 186, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
context)
File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/default.py", line 470, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) column task_instance.executor_config does not exist
LINE 1: ...ued_dttm, task_instance.pid AS task_instance_pid, task_insta...


Thanks for any help.







































      python docker airflow






      asked Nov 22 '18 at 18:01









Nicolas Dufaur

1 Answer
































Try running airflow upgradedb. This will create the missing columns in your metadata DB.
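In a Docker Compose setup like the one described, running the migration might look like this (the webserver service name and the Postgres connection details are assumptions, not from the original post; back up the metadata database first):

```shell
# Back up the metadata database before migrating (host, user, and db names are assumptions)
docker-compose exec postgres pg_dump -U airflow airflow > airflow_backup.sql

# Run the schema migration; in Airflow 1.10.x the subcommand is `upgradedb`
docker-compose run --rm webserver airflow upgradedb
```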






























• Thanks for the answer. But when I launch this command, I get another error: sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) could not create unique index "job_pkey" DETAIL: Key (id)=(128) is duplicated. [SQL: 'ALTER TABLE job ALTER COLUMN start_date TYPE TIMESTAMP WITH TIME ZONE ']

            – Nicolas Dufaur
            Nov 23 '18 at 10:02













• I see in the call stack that this new error comes from File "/usr/local/lib/python3.5/site-packages/airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py", line 110, in upgrade op.alter_column(table_name='job', column_name='start_date', type_=sa.TIMESTAMP(timezone=True))

            – Nicolas Dufaur
            Nov 23 '18 at 11:10











• And when I query the airflow database with 'select * from job where id = 128', I get only a single row...

            – Nicolas Dufaur
            Nov 23 '18 at 11:24











• I finally deleted this row (table job, id = 128) in the DB and everything was OK afterwards. Very weird.

            – Nicolas Dufaur
            Nov 23 '18 at 12:31
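For reference, the manual fix described in these comments could be done in psql along these lines (the id value 128 comes from the error message above; take a backup before deleting anything):

```sql
-- Inspect the row the unique-index build flags as duplicated (id from the error message)
SELECT * FROM job WHERE id = 128;

-- The asker resolved the migration failure by removing the offending row,
-- then re-running `airflow upgradedb`
DELETE FROM job WHERE id = 128;
```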













          answered Nov 22 '18 at 19:52









kaxil

