Running Spark locally throws java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200.

  • 1. The error
  • 2. Why it happens
  • 3. The fix

1. The error

java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
	at test.WordCount$.main(WordCount.scala:9)
	at test.WordCount.main(WordCount.scala)
21/01/18 20:08:03 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
	at test.WordCount$.main(WordCount.scala:9)
	at test.WordCount.main(WordCount.scala)
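
The original source is not shown in the log, but the trace points at test.WordCount creating its SparkContext on line 9 of WordCount.scala. A minimal local word count along these lines reproduces the error when the IDE's default heap is below 450 MB; the input path and the job body are assumptions for illustration, only the package and object names come from the trace.

package test

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // With the IDE's default heap (well below 450 MB) this constructor
    // throws the IllegalArgumentException shown above.
    val sparkConf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(sparkConf)

    sc.textFile("data/words.txt")          // hypothetical input path
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .collect()
      .foreach(println)

    sc.stop()
  }
}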

2. Why it happens

Spark's UnifiedMemoryManager requires the memory available to the JVM to be at least 1.5 times the 300 MB it reserves for itself, i.e. 471859200 bytes (450 MB). When the job runs locally from an IDE, the driver JVM often starts with a smaller default heap (here roughly 247 MB, i.e. 259522560 bytes), so the check fails before the SparkContext finishes initializing.
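
Here is a small standalone sketch of the arithmetic behind that check (my paraphrase, not the actual UnifiedMemoryManager source): the reserved amount is 300 MB, the minimum is 1.5 times that, and the available amount defaults to the JVM's max heap unless spark.testing.memory overrides it.

object MemoryCheckSketch {
  def main(args: Array[String]): Unit = {
    val reservedMemory  = 300L * 1024 * 1024            // 300 MB reserved by Spark
    val minSystemMemory = (reservedMemory * 1.5).toLong // 471859200, the number in the error
    val systemMemory    = Runtime.getRuntime.maxMemory  // what Spark sees unless spark.testing.memory is set

    println(s"systemMemory = $systemMemory, minSystemMemory = $minSystemMemory")
    if (systemMemory < minSystemMemory) {
      println("SparkContext creation would fail with the IllegalArgumentException above.")
    }
  }
}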

3. The fix

val sparkConf = new SparkConf().set("spark.testing.memory","2147480000")

Adding .set("spark.testing.memory", "2147480000") when creating the SparkConf (2147480000 bytes is roughly 2 GB) makes the check pass. Alternatively, as the error message suggests, you can raise the driver heap itself, for example with --driver-memory when using spark-submit, with spark.driver.memory in the Spark configuration, or with an -Xmx VM option in your IDE's run configuration for local runs.
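
In context, the fix looks like the sketch below; the app name, master, and the rest of the job are assumptions, only the .set call comes from the one-liner above. Note that spark.testing.memory only changes how much memory Spark believes it has, so for real workloads raising the actual driver heap (the alternatives above) is the more robust option.

package test

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setAppName("WordCount")
      .setMaster("local[*]")
      // Tell Spark roughly 2 GB is available, so the 450 MB minimum check passes.
      .set("spark.testing.memory", "2147480000")
    val sc = new SparkContext(sparkConf)

    // ... the rest of the word count job is unchanged

    sc.stop()
  }
}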
