Hive on Spark: fixing the error where only one job can be submitted at a time

When several jobs are submitted at the same time, Hive fails with:

FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 1e458098-9ef7-4709-a117-c1b0b6ea0eee_0: java.util.concurrent.TimeoutException: Client '1e458098-9ef7-4709-a117-c1b0b6ea0eee_0' timed out waiting for connection from the Remote Spark Driver

The message means the Hive-side client gave up waiting for the Remote Spark Driver to connect back, which tends to happen when concurrent sessions all launch Spark applications on YARN at once and driver startup runs past the default timeout. I tried several approaches:

  • Resource configuration (did not work; note that the yarn.* properties below are cluster-side settings read by the ResourceManager and NodeManagers, so a per-session set cannot actually change them)
set hive.execution.engine=spark;
set spark.executor.memory=4g;
set yarn.nodemanager.resource.memory-mb=12288;
set yarn.scheduler.maximum-allocation-mb=2048;
  • Dynamically setting the timeout properties per session (did not work; see the sketch after this list)
hive.spark.client.future.timeout 
hive.spark.client.connect.timeout
hive.spark.client.server.connect.timeout
  • Configuring hive-site.xml (worked)
<property>
  <name>hive.spark.client.connect.timeout</name>
  <value>300000ms</value>
</property>
<property>
  <name>hive.spark.client.server.connect.timeout</name>
  <value>300000ms</value>
</property>
Since hive-site.xml is only read at startup, restart HiveServer2 (or the Hive CLI) after the change so the new timeouts take effect.
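
For reference, the per-session attempt from the second bullet looked roughly like the following. This is a sketch: the 300000ms values mirror the hive-site.xml fix and are an assumption, since the post does not record the exact values tried.

-- Per-session attempt (sketch); values are assumed to match the hive-site.xml fix
set hive.spark.client.future.timeout=300000ms;
set hive.spark.client.connect.timeout=300000ms;
set hive.spark.client.server.connect.timeout=300000ms;

These statements run without error, but in this case they did not prevent the timeout; only the hive-site.xml change above did.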

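After restarting with the new configuration, the effective values can be confirmed from any Hive session; running set with just a property name prints its current value:

set hive.spark.client.connect.timeout;
set hive.spark.client.server.connect.timeout;

With the longer timeouts in place, concurrently submitted queries each get enough time for their Remote Spark Driver to start and connect back, and the return code 30041 error no longer appears.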