Spark job error: Container [...] is running beyond physical memory limits. Current usage: 3.0 GB of 3 GB ...

Spark version: 1.6.0

Scala version: 2.10

Error log:

Application application_1562341921664_2123 failed 2 times due to AM Container for appattempt_1562341921664_2123_000002 exited with exitCode: -104
For more detailed output, check the application tracking page: http://winner-offline-namenode:8088/cluster/app/application_1562341921664_2123 Then click on links to logs of each attempt.
Diagnostics: Container [pid=8891,containerID=container_e12_1562341921664_2123_02_000001] is running beyond physical memory limits. Current usage: 3.0 GB of 3 GB physical memory used; 5.0 GB of 6.3 GB virtual memory used. Killing container.
Dump of the process-tree for container_e12_1562341921664_2123_02_000001 :
|- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
|- 8891 8889 8891 8891 (bash) 0 0 112881664 367 /bin/bash -c LD_LIBRARY_PATH=/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64: /usr/java/jdk1.8.0_162/bin/java -server -Xmx2048m -Djava.io.tmpdir=/disk1/hadoop/yarn/local/usercache/hadoop/appcache/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/tmp '-XX:+PrintGCDetails' '-XX:+PrintGCTimeStamps' -Dhdp.version=2.6.4.0-91 -Dspark.yarn.app.container.log.dir=/disk1/hadoop/yarn/log/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001 org.apache.spark.deploy.yarn.ApplicationMaster --class 'com.winner.wifi.WinWifiAPIncre' --jar file:/hadoop/datadir/guozy/jar/winnercloud-assembly.jar --arg '2019-07-02' --arg 'bg' --arg 'St_P00094#St_P00087#Xsj_00001P00016#Cqhm_P00091#Langold_00004P00001#Xcjt_00004P00010#Djzx_P00001#Xsj_00001P00035#Xsj_00001P00013#Whc_P00081#Sygc_00001P00001#Szc_P00017#Lyyysc_P00001#Longfor_00001P00013#Lgjt_P00040#Wfjjt_P00028#Qzkyss_00001P00001#Hfjt_P00002#Xxhjgjgc_P00001#Xsj_00001P00024#Wdl_P00103#Hualian_00008P00020#Xl_P00067#Bfc_P00114#Snjt_00001P00001#Lfjt_P00002#Xuhui_P00050#Bailian_00002P00051#Xinyuan_P00077#Wdl_P00105#Hualian_P00004#Wfjjt_P00095#Fxhyjt_00001P00001#Xsj_00001P00023#Xsj_00001P00020#Xhbh_P00089#Szc_P00001#Yxgw_00001P00001#Wfjjt_P00035#Mybh_P00001#Yhjt_00002P00001#Wdl_P00106#Nydtjt_00001P00001#Xsj_00001P00002#Xuhui_P00038#Xsj_00001P00017#Xsj_00001P00012#Xsj_00001P00027#Lfjt_P00001#Xsj_00001P00034#Xhbh_P00108#Wjjt_P00022#Hjgc_P00001#Zxtfgc_P00001#Xhzb_P00001#Txjt_P00001#Mybh_P00003#Hdjt_00002P00018#Bailian_00002P00026#Hrzd_00002P00001#Bailian_00002P00052#Hxmkl_P00007#Ayjt_P00002#Wdl_P00107#Bailian_00002P00004#Ayjt_00004P00013#Hxmkl_P00009#Xsj_00001P00006#Wfjjt_P00027' --arg '1' --arg '150' --executor-memory 2048m --executor-cores 1 --properties-file /disk1/hadoop/yarn/local/usercache/hadoop/appcache/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/__spark_conf__/__spark_conf__.properties 1> /disk1/hadoop/yarn/log/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/stdout 2> /disk1/hadoop/yarn/log/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/stderr
|- 8906 8891 8891 8891 (java) 51299 2993 5230977024 787116 /usr/java/jdk1.8.0_162/bin/java -server -Xmx2048m -Djava.io.tmpdir=/disk1/hadoop/yarn/local/usercache/hadoop/appcache/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/tmp -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dhdp.version=2.6.4.0-91 -Dspark.yarn.app.container.log.dir=/disk1/hadoop/yarn/log/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001 org.apache.spark.deploy.yarn.ApplicationMaster --class com.winner.wifi.WinWifiAPIncre --jar file:/hadoop/datadir/guozy/jar/winnercloud-assembly.jar --arg 2019-07-02 --arg bg --arg St_P00094#St_P00087#Xsj_00001P00016#Cqhm_P00091#Langold_00004P00001#Xcjt_00004P00010#Djzx_P00001#Xsj_00001P00035#Xsj_00001P00013#Whc_P00081#Sygc_00001P00001#Szc_P00017#Lyyysc_P00001#Longfor_00001P00013#Lgjt_P00040#Wfjjt_P00028#Qzkyss_00001P00001#Hfjt_P00002#Xxhjgjgc_P00001#Xsj_00001P00024#Wdl_P00103#Hualian_00008P00020#Xl_P00067#Bfc_P00114#Snjt_00001P00001#Lfjt_P00002#Xuhui_P00050#Bailian_00002P00051#Xinyuan_P00077#Wdl_P00105#Hualian_P00004#Wfjjt_P00095#Fxhyjt_00001P00001#Xsj_00001P00023#Xsj_00001P00020#Xhbh_P00089#Szc_P00001#Yxgw_00001P00001#Wfjjt_P00035#Mybh_P00001#Yhjt_00002P00001#Wdl_P00106#Nydtjt_00001P00001#Xsj_00001P00002#Xuhui_P00038#Xsj_00001P00017#Xsj_00001P00012#Xsj_00001P00027#Lfjt_P00001#Xsj_00001P00034#Xhbh_P00108#Wjjt_P00022#Hjgc_P00001#Zxtfgc_P00001#Xhzb_P00001#Txjt_P00001#Mybh_P00003#Hdjt_00002P00018#Bailian_00002P00026#Hrzd_00002P00001#Bailian_00002P00052#Hxmkl_P00007#Ayjt_P00002#Wdl_P00107#Bailian_00002P00004#Ayjt_00004P00013#Hxmkl_P00009#Xsj_00001P00006#Wfjjt_P00027 --arg 1 --arg 150 --executor-memory 2048m --executor-cores 1 --properties-file /disk1/hadoop/yarn/local/usercache/hadoop/appcache/application_1562341921664_2123/container_e12_1562341921664_2123_02_000001/__spark_conf__/__spark_conf__.properties
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
Failing this attempt. Failing the application. 

Error screenshot:

(The screenshot shows the same "running beyond physical memory limits" message as the log above and is not reproduced here.)

Main resource configuration:

  --driver-memory 2g
  --executor-memory 2g

Temporary workaround:

  Increase the driver-side memory:

  --driver-memory 3g
  --executor-memory 2g
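For reference, a submit command with the adjusted memory settings would look roughly like the sketch below. The class name, jar path, executor settings, and application arguments are taken from the container log above; the YARN cluster deploy mode is inferred from the fact that the ApplicationMaster hosts the driver, and the long site-list argument is abbreviated:

  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class com.winner.wifi.WinWifiAPIncre \
    --driver-memory 3g \
    --executor-memory 2g \
    --executor-cores 1 \
    /hadoop/datadir/guozy/jar/winnercloud-assembly.jar \
    2019-07-02 bg ... 1 150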

That resolved the problem.

Follow-up (root cause):

I had searched online for many solutions to this problem, and most of them simply suggested tuning parameters. After trying all kinds of parameter adjustments with no effect, I figured the problem had to be in the code itself. Going back and reviewing the code carefully, I found the cause.

  1. The code used a for loop and broadcast a large number of variables inside that loop. These broadcast variables were essentially just configuration files, so there was no need to broadcast them on every iteration; broadcasting them once is enough. In Spark, when a variable is broadcast, the driver first materializes the variable on the driver side and then distributes it to the worker nodes. So every loop iteration made the driver collect the variable again, and as the number of iterations grew, driver memory got tighter and tighter until it was finally exhausted. (A minimal sketch of broadcasting once, outside the loop, is given after this list.)

  2. Although from the start I did call unpersist after each use of a broadcast variable to release it, the driver-side cleanup actually relies on a system GC call (System.gc()), and by default the driver triggers this periodic GC only every half hour. So even though we call unpersist, the stale broadcast data is not necessarily cleaned up right away.

  3. Also, when submitting the job I had switched the GC algorithm to G1, but I do not think that was what caused the driver memory to fill up.

  4. Reflection: when I run into problems I still tend to think too little. In my opinion, tuning parameters only plays a supporting role; the root cause usually lies in the code we write. Once the code is tuned as well as it can be, there is basically nothing left to go wrong.
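To illustrate point 1 (and the unpersist behaviour from point 2), here is a minimal Scala sketch of broadcasting a configuration once, outside the loop, and releasing it explicitly afterwards. The variable names, the config contents, and the loop body are purely illustrative and are not the actual job code; the spark.cleaner.periodicGC.interval setting (30-minute default, available from Spark 1.6) is the knob behind the half-hour behaviour described in point 2:

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("broadcast-once-sketch")
    // Optional: make the driver's periodic System.gc() (ContextCleaner) run more often
    // than the default 30 minutes, so unpersisted broadcast blocks are reclaimed sooner.
    .set("spark.cleaner.periodicGC.interval", "10min")
  val sc = new SparkContext(conf)

  // Broadcast the read-only configuration ONCE, before the loop,
  // instead of re-broadcasting it on every iteration.
  val configMap: Map[String, String] = Map("some.key" -> "some.value")  // illustrative config
  val bcConfig = sc.broadcast(configMap)

  for (day <- Seq("2019-07-01", "2019-07-02")) {                        // illustrative loop
    val lines = sc.textFile(s"/input/$day")
    // Executors read bcConfig.value locally; the config is not shipped again per iteration.
    lines.filter(line => bcConfig.value.contains(line)).count()
  }

  // Release the broadcast explicitly once it is no longer needed.
  // blocking = true waits until the executor-side blocks are actually removed.
  bcConfig.unpersist(blocking = true)
  sc.stop()

Note that unpersist only removes the copies held by the executors; calling destroy() on the broadcast additionally removes the data on the driver. Neither call forces an immediate GC, which is why the periodic-GC interval shown above can matter for long-running drivers.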


Reposted from: https://www.cnblogs.com/Gxiaobai/p/11166986.html
