Spark startup error: Error initializing SparkContext

Error message

java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1298 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
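The numbers in the message add up as follows: YARN must allocate a container large enough for the executor heap (1024 MB) plus the memory overhead (384 MB, Spark's default minimum), and 1024 + 384 = 1408 MB exceeds the cluster's 1298 MB maximum allocation. A minimal sketch of that check (the constant names and function are illustrative, not Spark's actual API):

```python
# Sketch of the container-size check that triggers this error.
# MIN_OVERHEAD_MB and OVERHEAD_FACTOR mirror Spark's defaults;
# the function name is illustrative, not Spark's actual API.
MIN_OVERHEAD_MB = 384      # default floor for spark.executor.memoryOverhead
OVERHEAD_FACTOR = 0.10     # default overhead factor (10% of executor memory)

def required_container_mb(executor_mb, pyspark_mb=0):
    overhead = max(MIN_OVERHEAD_MB, int(executor_mb * OVERHEAD_FACTOR))
    return executor_mb + overhead + pyspark_mb

max_allocation_mb = 1298   # yarn.scheduler.maximum-allocation-mb on this cluster
needed = required_container_mb(1024)
print(needed, needed > max_allocation_mb)  # 1408 True -> request is rejected
```

Since 1408 > 1298, YARN refuses the container request and SparkContext initialization fails.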

Solution

Adjust the following YARN parameters, raising both to 2 GB:

  • yarn.scheduler.maximum-allocation-mb
  • yarn.nodemanager.resource.memory-mb
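On a CDH cluster these are normally changed in the Cloudera Manager YARN configuration UI; on a cluster where the config files are edited directly, the equivalent yarn-site.xml entries would look like this (a sketch; 2 GB = 2048 MB):

```xml
<!-- yarn-site.xml: raise both limits to 2 GB (2048 MB) -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>
```

With the maximum allocation at 2048 MB, the 1408 MB container request fits.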

Restart YARN for the change to take effect; spark-shell then starts without the error:

[root@master cloudera]# spark-shell 
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/09/23 16:12:22 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
19/09/23 16:12:22 WARN lineage.LineageWriter: Lineage directory /var/log/spark/lineage doesn't exist or is not writable. Lineage for this application will be disabled.
Spark context Web UI available at http://master.cdh:4040
Spark context available as 'sc' (master = yarn, app id = application_1569226132926_0001).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0-cdh6.3.0
      /_/
         
Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_211)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
