Spark on YARN fails with "Yarn application has already ended"

Running Spark on YARN fails with the following error:

Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
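
For reference, this failure appears as soon as the SparkContext starts up in yarn-client mode, for example when submitting the bundled SparkPi job. A sketch of the kind of command involved (the examples jar path is an assumption; on Spark 1.x it is typically under lib/, on 2.x under examples/jars/, so adjust to your installation):

    # Submit the SparkPi example in yarn-client mode; if the ApplicationMaster
    # fails to launch, the driver aborts with the error above.
    spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master yarn \
      --deploy-mode client \
      $SPARK_HOME/lib/spark-examples-*.jar 100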

I went through a lot of material without finding a solution. Eventually I came across a post suggesting it might be a compatibility problem with JDK 1.8, and following that author's method solved it. Thanks to that expert...
The fix is as follows:

Open yarn-site.xml and add the following configuration:


    <property>
        <name>yarn.nodemanager.pmem-check-enabled</name>
        <value>false</value>
    </property>
    <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
    </property>
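
Changes to yarn-site.xml only take effect after the YARN daemons are restarted. A minimal sketch, assuming the standard Hadoop sbin scripts and that the edited file has been copied to every node:

    # Restart the ResourceManager and all NodeManagers so the new settings are picked up.
    $HADOOP_HOME/sbin/stop-yarn.sh
    $HADOOP_HOME/sbin/start-yarn.sh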

---------------------------------- An explanation of these settings, found online (whether this is really a JDK-version compatibility issue is still unknown) ----------------------------------
yarn.nodemanager.pmem-check-enabled
Whether to check the amount of physical memory each task is using, and kill the container if it exceeds its allocation. Defaults to true.
yarn.nodemanager.vmem-check-enabled
Whether to check the amount of virtual memory each task is using, and kill the container if it exceeds its allocation. Defaults to true.
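
Disabling the checks outright removes a real safety net. An alternative worth noting (my addition, not part of the original fix) is to keep the vmem check enabled but raise yarn.nodemanager.vmem-pmem-ratio, which caps a container's virtual memory as a multiple of its physical allocation and defaults to 2.1:

    <!-- Allow up to 4x the physical allocation in virtual memory before the
         NodeManager kills the container. -->
    <property>
        <name>yarn.nodemanager.vmem-pmem-ratio</name>
        <value>4</value>
    </property>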
