Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime

I am trying to run a Hive job from Airflow. I created a custom JDBC connection, which you can see in the screenshot below, and I can query Hive tables from the Airflow web UI (Data Profiling -> Ad Hoc Query). I also want to run this example DAG file I found on the internet:

#File Name: wf_incremental_load.py
from airflow import DAG
from airflow.operators import BashOperator, HiveOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2019, 3, 13),
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG('hive_test', default_args=default_args, schedule_interval='* */5 * * *')

touch_job = """
touch /root/hive.txt
"""

# Create a marker file on the local filesystem
task1 = BashOperator(
    task_id='make_file',
    bash_command=touch_job,
    dag=dag
)

# Create a Hive table from an existing table (CTAS)
task2 = HiveOperator(
    task_id='hive_table_create',
    hql='CREATE TABLE aaaaa AS SELECT * FROM ant_code;',
    hive_cli_conn_id='hive_jdbc',
    depends_on_past=True,
    dag=dag
)

# Define the job dependency
task2.set_upstream(task1)
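Side note: the log below warns that importing BashOperator and HiveOperator directly from airflow.operators is deprecated. Assuming Airflow 1.10.x, the module-path imports the warning asks for would be:

from airflow.operators.bash_operator import BashOperator
from airflow.operators.hive_operator import HiveOperator

This does not affect the error below; it only silences the DeprecationWarnings in the log.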

However, when I run this DAG in Airflow, I get the following errors:

ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection

 [2019-03-13 13:32:25,335] {models.py:1593} INFO - Executing  on 2019-03-13T00:00:00+00:00 [2019-03-13 13:32:25,336] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmphSGJhO'] [2019-03-13 13:32:27,130] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,129] {__init__.py:51} INFO - Using executor SequentialExecutor [2019-03-13 13:32:27,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,547] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py [2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0. [2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create DeprecationWarning) [2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0. [2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create DeprecationWarning) [2019-03-13 13:32:27,602] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,602] {cli.py:520} INFO - Running  on host name02.excard.co.kr [2019-03-13 13:32:27,625] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code; [2019-03-13 13:32:27,634] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,634] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. 
Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''} [2019-03-13 13:32:27,636] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'} [2019-03-13 13:32:27,637] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,637] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_rXXLyV/tmpdZYjMS [2019-03-13 13:32:32,323] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,323] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 [2019-03-13 13:32:32,738] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,738] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 [2019-03-13 13:32:32,813] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,813] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 [2019-03-13 13:32:32,830] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,830] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 [2019-03-13 13:32:32,895] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,895] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 [2019-03-13 13:32:32,941] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,941] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 [2019-03-13 13:32:32,959] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,959] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper [2019-03-13 13:32:32,967] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,967] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. 
It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) [2019-03-13 13:32:32,980] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,980] {hive_hooks.py:251} INFO - beeline> USE default; [2019-03-13 13:32:32,988] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,988] {hive_hooks.py:251} INFO - No current connection [2019-03-13 13:32:33,035] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task result = task_copy.execute(context=context) File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs) File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli raise AirflowException(stdout) AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. 
It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection [2019-03-13 13:32:33,037] {models.py:1817} INFO - All retries failed; marking task as FAILED [2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Traceback (most recent call last): [2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/bin/airflow", line 32, in  [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create args.func(args) [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/utils/cli.py", line 74, in wrapper [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create return f(*args, **kwargs) [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 526, in run [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create _run(args, dag, ti) [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 445, in _run [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create pool=args.pool, [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 73, in wrapper [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create return func(*args, **kwargs) [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create result = task_copy.execute(context=context) [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs) [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create raise AirflowException(stdout) [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create airflow.exceptions.AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN 
jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create beeline> USE default; [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create No current connection [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:35,201] {logging_mixin.py:95} INFO - [2019-03-13 13:32:35,201] {jobs.py:2527} INFO - Task exited with return code 1 

Please help me solve this problem.

(screenshot: the custom hive_jdbc connection configured in the Airflow UI)

Update:

I added hive.security.authorization.sqlstd.confwhitelist.append: mapred.job.name* in hive-site.xml.
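For reference, a minimal sketch of what that entry can look like in hive-site.xml (the property value is a Java regex, so the exact pattern below is an assumption rather than a confirmed copy of my config):

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapred\.job\.name</value>
</property>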

So now I get a slightly different error:

Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection Traceback (most recent call last):

 [2019-03-13 14:54:31,946] {models.py:1593} INFO - Executing  on 2019-03-13T00:00:00+00:00 [2019-03-13 14:54:31,947] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 11 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmpGDjT7j'] [2019-03-13 14:54:33,793] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:33,792] {__init__.py:51} INFO - Using executor SequentialExecutor [2019-03-13 14:54:34,189] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,189] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py [2019-03-13 14:54:34,192] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0. [2019-03-13 14:54:34,193] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create DeprecationWarning) [2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0. [2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create DeprecationWarning) [2019-03-13 14:54:34,219] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,218] {cli.py:520} INFO - Running  on host name02.excard.co.kr [2019-03-13 14:54:34,240] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code; [2019-03-13 14:54:34,249] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,249] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. 
Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''} [2019-03-13 14:54:34,251] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'} [2019-03-13 14:54:34,253] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,252] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_wNbQlL/tmpFN6MGy [2019-03-13 14:54:39,061] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,060] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 [2019-03-13 14:54:39,443] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,443] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 [2019-03-13 14:54:39,532] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,532] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 [2019-03-13 14:54:39,552] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,551] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 [2019-03-13 14:54:39,664] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,664] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 [2019-03-13 14:54:39,856] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,856] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 [2019-03-13 14:54:41,134] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,134] {hive_hooks.py:251} INFO - 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper [2019-03-13 14:54:41,147] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,146] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. 
It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) [2019-03-13 14:54:41,167] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,167] {hive_hooks.py:251} INFO - beeline> USE default; [2019-03-13 14:54:41,180] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,180] {hive_hooks.py:251} INFO - No current connection [2019-03-13 14:54:41,253] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task result = task_copy.execute(context=context) File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs) File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli raise AirflowException(stdout) AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. 
It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection 

I have the same problem. Did you find a solution? I guess the problem is the mapred.job.name part. As you can see in the following line, which was generated by Airflow, the mapred.job.name value looks strange. There are several ways to set your own mapred.job.name; one is sketched after the command below.

hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_rXXLyV/tmpdZYjMS
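One way to override that value is the mapred_job_name argument of HiveOperator (available in Airflow 1.10.x; there is also a [hive] mapred_job_name_template setting in airflow.cfg). A minimal sketch, reusing the dag object and task from your DAG file and assuming nothing else changes:

from airflow.operators.hive_operator import HiveOperator

task2 = HiveOperator(
    task_id='hive_table_create',
    hql='CREATE TABLE aaaaa AS SELECT * FROM ant_code;',
    hive_cli_conn_id='hive_jdbc',
    # Space-free name, so the -hiveconf mapred.job.name=... value stays a single token
    mapred_job_name='airflow_hive_test_hive_table_create',
    depends_on_past=True,
    dag=dag
)

This only controls the job name; if HiveServer2 rejects the property itself, the whitelist change described below is still needed.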

I also suggest reading these pages if you are still seeing this kind of message; you may need to change your Hive configuration.

“Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000”


https://community.hortonworks.com/content/supportkb/48746/changing-hive-properties-in-beeline-gives-error-it.html

https://community.hortonworks.com/content/supportkb/49437/how-to-setup-multiple-properties-in-hivesecurityau.html

Update either hive.security.authorization.sqlstd.confwhitelist.append or hive.security.authorization.sqlstd.confwhitelist to include the properties that users should be allowed to modify, as sketched below.
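A minimal sketch of the append entry, assuming the whitelist is edited directly in hive-site.xml (with Ambari the same value goes into the equivalent HiveServer2 property). It covers mapred.job.name plus the airflow.ctx.* variables that Airflow passes as -hiveconf arguments, which is why the second error complained about airflow.ctx.task_id:

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapred\.job\.name|airflow\.ctx\..*</value>
</property>

HiveServer2 typically has to be restarted for whitelist changes to take effect.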