Deployment: Hive on Spark error "Failed to execute spark task (Failed to submit Spark work, please retry later)"

  1. Scenario
    Using Hive on Spark: after starting Hive, viewing databases and tables works fine, and creating tables works fine too, but as soon as an INSERT statement has to go through the Spark engine, it fails with the error below (screenshot follows; a minimal sketch of the statements involved comes after the screenshot):
    Failed to execute spark task, with exception 'java.lang.Exception(Failed to submit Spark work, please retry later)'
    java.lang.Exception: Failed to submit Spark work, please retry later
    [Figure 1: screenshot of the error]
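    A minimal sketch of the statements involved (the table name and data are hypothetical, only meant to show that the metadata-only statements succeed while the INSERT, which launches a Spark job, is the one that fails):
      -- Metadata-only statements go through the metastore and work fine
      SHOW DATABASES;
      CREATE TABLE t_demo (id INT, name STRING);
      -- This statement needs the Spark engine and is where the error shows up
      INSERT INTO TABLE t_demo VALUES (1, 'test');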

  2. Analysis
    show databases and create work fine, so Hive itself is most likely okay (there is no separate metastore service to start here; the metadata is kept in MySQL).
    The error says Spark failed to execute the task because the work could not be submitted, so the problem is in the step where Hive submits jobs to Spark. That points to the configuration Hive hands to Spark when submitting jobs; it contains memory settings, so insufficient executor memory is the likely cause (the current values can be checked as sketched below).
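    Before touching any files, the suspicion can be checked from the Hive CLI (a sketch; these are the standard Hive on Spark property names, and the values on your cluster may differ):
      -- Confirm Hive is actually using Spark as the execution engine
      SET hive.execution.engine;
      -- Print how much memory each Spark executor and the driver may request
      SET spark.executor.memory;
      SET spark.driver.memory;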

  3. Solution
    Open Hive's spark-defaults.conf file (this is the configuration Hive passes to Spark when submitting jobs) and be a bit generous: bump spark.executor.memory from 500 MB to 1 GB and try again (a sketch of the change follows).
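    A minimal sketch of the change, assuming the file sits in Hive's conf directory (e.g. $HIVE_HOME/conf/spark-defaults.conf; the exact path and any other properties in the file depend on your own setup and are not part of this fix):
      # spark-defaults.conf (only the changed line shown)
      # old value was roughly 500 MB, e.g. "spark.executor.memory  500m", which was too small here
      spark.executor.memory    1g
    A per-session override from the Hive CLI (SET spark.executor.memory=1g; issued before the session's first Spark job) should also work, but editing spark-defaults.conf is what fixed it here.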

  4. Result
    After restarting Hive, the INSERT runs successfully.
    [Figure 2: screenshot of the successful run]
