Spark fails to start: failed to launch org.apache.spark.deploy.master.Master

I ran into this error today when starting Spark. The startup output looks like this:

[root@master spark-2.0.2]# ./sbin/start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /export/service/spark-2.0.2/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
failed to launch org.apache.spark.deploy.master.Master:
full log in /export/service/spark-2.0.2/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
worker1: starting org.apache.spark.deploy.worker.Worker, logging to /export/service/spark-2.0.2/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-worker1.out
worker2: starting org.apache.spark.deploy.worker.Worker, logging to /export/service/spark-2.0.2/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-worker2.out
worker3: starting org.apache.spark.deploy.worker.Worker, logging to /export/service/spark-2.0.2/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-worker3.out

The Master itself came up normally, but only one Worker started cleanly. After googling the problem, I found the cause: the machine hosting the Master node was under heavy system load. In other words, the master node's hardware is under-provisioned for the job, so the launch check in the start scripts fires before the daemon has finished coming up.
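A minimal sketch of why this happens and one way around it: the Spark daemon launcher only waits a brief fixed moment before checking whether the process is up, so on an overloaded machine it can report "failed to launch" even though the daemon eventually starts. The snippet below (all names and timings are illustrative, not Spark's actual code) contrasts that with a more forgiving polling loop:

```shell
# Stand-in for a slowly starting Master: it only becomes "ready" after 3 s,
# which a single early check would misreport as a launch failure.
MARKER="/tmp/master_up.$$"
rm -f "$MARKER"
( sleep 3; touch "$MARKER" ) &

# Poll for up to ~10 s instead of checking once, so a slow machine still
# gets credit for a successful launch.
launched=no
for _ in 1 2 3 4 5 6 7 8 9 10; do
  if [ -e "$MARKER" ]; then
    launched=yes
    break
  fi
  sleep 1
done

rm -f "$MARKER"
echo "launched=$launched"
```

In the real cluster, the equivalent manual check is to run `jps` on the master node and look at the full log file that the error message points to; if the Master process is listed and the log shows no errors, the "failed to launch" message was a false alarm caused by the load.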

Reference:
https://www.ibm.com/support/knowledgecenter/en/SSCTFE_1.1.0/com.ibm.azk.v1r1.azka100/topics/azkic_u_troubleshooting-spark.htm
