Error when starting Hive: org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark session

Error reported when starting Hive:

Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session a2d32dbe-d48c-486f-be8b-2f5c75ffe182

The Hive log shows:

java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.

Solution:

The default value of hive.spark.client.connect.timeout is 1000ms. If the exception above is thrown when running a Hive INSERT statement (or any other query that needs a Spark session), increase this parameter, for example to 10000ms.

Add the following property to Hive's configuration file hive-site.xml:

    <property>
        <name>hive.spark.client.connect.timeout</name>
        <value>10000ms</value>
    </property>
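
After editing hive-site.xml, restart the Hive session (or HiveServer2, if you connect through it) so the new value is picked up. As a quick check, the value can also be inspected or overridden from an interactive Hive CLI/Beeline session; this is a minimal sketch and assumes the property is not on your hive.conf.restricted.list:

    -- Print the current effective value:
    SET hive.spark.client.connect.timeout;

    -- Override it for the current session only; it applies to the next Spark session Hive creates:
    SET hive.spark.client.connect.timeout=10000ms;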

Note: also be sure to double-check your configuration file for mistyped directories or other incorrect settings!
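
As an additional check, you can dump the effective configuration from inside a Hive session and look over the Spark-related entries. The property names below are the usual Hive-on-Spark ones and may not all apply to your deployment:

    -- List all effective Hive and Hadoop properties to spot typos:
    SET -v;

    -- Or inspect individual settings (an unset key is reported as undefined):
    SET hive.execution.engine;
    SET spark.master;
    SET spark.eventLog.dir;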
