ERROR SparkContext: Error initializing SparkContext. java.lang.IllegalArgumentException: System memory...

This applies to Spark 2.0 and later (i.e. Spark SQL), where the session is created through SparkSession. On versions before Spark 2.0, set the option on a SparkConf instead:

val conf = new SparkConf()
conf.set("spark.testing.memory", "2147480000") // any value larger than 512 MB will do

The error:

19/10/22 18:02:36 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:424)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at spark.day6.SparkSessionTest2$.main(SparkSessionTest2.scala:13)
    at spark.day6.SparkSessionTest2.main(SparkSessionTest2.scala)
19/10/22 18:02:36 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:424)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at spark.day6.SparkSessionTest2$.main(SparkSessionTest2.scala:13)
    at spark.day6.SparkSessionTest2.main(SparkSessionTest2.scala)

Process finished with exit code 1
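For reference, both numbers in the message are easy to reconstruct: 259522560 is roughly the JVM's default max heap for the IDE run, while the 471859200-byte minimum comes from Spark 2.x's UnifiedMemoryManager, which reserves 300 MB of system memory and requires the heap to be at least 1.5x that reservation. A quick check of that arithmetic (plain Scala, no Spark needed):

```scala
// Reconstruct Spark 2.x's minimum-memory threshold (UnifiedMemoryManager):
// 300 MB is reserved, and system memory must be at least 1.5x the reservation.
val reservedBytes   = 300L * 1024 * 1024            // 314572800
val minSystemMemory = (reservedBytes * 1.5).toLong  // 471859200, the value in the error
println(s"minimum system memory: $minSystemMemory bytes")
```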


Solutions (the first is from a reposted article):

Option 1:

[Repost] https://blog.csdn.net/yizheyouye/article/details/50676022

Option 2 (the key line is the spark.testing.memory setting, shown in red in the original post):

val ss = SparkSession
  .builder()
  .appName("spark2.0")
  .master("local")
  .config("spark.testing.memory", "2147480000") // must exceed the 471859200-byte minimum
  //.config("fs.defaultFS", "hdfs://admin:9000") // connect to HDFS
  .getOrCreate()
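Why the spark.testing.memory override works: in Spark 2.x, UnifiedMemoryManager reads system memory from spark.testing.memory when it is set, and falls back to Runtime.getRuntime.maxMemory otherwise, which is why the small default IDE heap (259522560 bytes above) trips the check. A minimal sketch of the comparison Spark performs, using the values from this error:

```scala
// Sketch of the check in UnifiedMemoryManager (values taken from the error above).
val minSystemMemory = (300L * 1024 * 1024 * 1.5).toLong  // 471859200
val defaultHeap     = 259522560L   // Runtime.getRuntime.maxMemory in the failing run
val testingMemory   = 2147480000L  // the spark.testing.memory value set above
assert(defaultHeap < minSystemMemory)     // this is why the job failed
assert(testingMemory >= minSystemMemory)  // and why the override passes the check
```

Setting --driver-memory (or spark.driver.memory via spark-submit) is the standard fix for a real deployment; spark.testing.memory is convenient for local IDE runs because it bypasses the heap-size check without restarting the JVM with a larger heap.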
