Spark 1.6 local run error in IDEA

After updating the Spark Maven dependency to 1.6 and running the existing program, the following error appeared:

java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:193)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:175)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:354)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    at final_paper_time.main(final_paper_time.java:153)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
16/03/10 00:35:36 INFO SparkContext: Successfully stopped SparkContext

Solution: edit the run configuration and add the following parameters to the VM options:

-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea
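The threshold `4.718592E8` in the error message comes from Spark 1.6's `UnifiedMemoryManager`, which reserves 300 MB of system memory and requires the heap to be at least 1.5 times that reservation. A minimal sketch of that arithmetic (the constant names are assumptions based on the error, not code copied from Spark):

```java
// Sketch: reproduces the minimum-heap check behind the
// "System memory ... must be at least 4.718592E8" error in Spark 1.6.
public class MinHeapCheck {
    public static void main(String[] args) {
        // Assumed reserved system memory in Spark 1.6's UnifiedMemoryManager: 300 MB
        long reserved = 300L * 1024 * 1024;
        // Spark requires the heap to be at least 1.5x the reserved memory
        long minSystemMemory = reserved * 3 / 2;
        System.out.println(minSystemMemory);       // 471859200 bytes = 4.718592E8

        // Heap size reported in the error message above
        long currentHeap = 259522560L;
        // true -> the IllegalArgumentException is thrown
        System.out.println(currentHeap < minSystemMemory);
    }
}
```

This is why `-Xmx512m` is enough: 512 MB is above the 450 MB (471859200-byte) minimum, while the IDEA default heap (about 247 MB here) is not.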
[Screenshot: IDEA run configuration dialog with the VM options set]

Reposted from: https://www.cnblogs.com/zhoudayang/p/5260366.html
