failed to launch: nice -n 0 $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker

The error output is as follows:

$ startspark
org.apache.spark.deploy.master.Master running as process 14542.  Stop it first.
Desktop: org.apache.spark.deploy.worker.Worker running as process 14710.  Stop it first.
Laptop: starting org.apache.spark.deploy.worker.Worker, logging to /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
Laptop: failed to launch: nice -n 0 /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://Desktop:7077
Laptop:   	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
Laptop:   	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
Laptop:   	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
Laptop:   	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
Laptop:   	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
Laptop:   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
Laptop:   	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
Laptop:   	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
Laptop:   	at java.lang.Thread.run(Thread.java:748)
Laptop:   20/05/05 22:57:51 INFO util.ShutdownHookManager: Shutdown hook called
Laptop: full log in /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
org.apache.spark.deploy.history.HistoryServer running as process 14800.  Stop it first.
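
The "running as process ... Stop it first" lines above only mean those daemons are still alive from an earlier launch; the actual failure is the Worker on Laptop, whose stack trace shows a bind error inside netty. Before relaunching with a corrected configuration, the daemons that are still running can be shut down with the standard sbin scripts. A minimal sketch (the custom startspark command above presumably wraps the corresponding start scripts):

# On the master node (Desktop)
$SPARK_HOME/sbin/stop-all.sh               # stops the master and the workers listed in conf/slaves
$SPARK_HOME/sbin/stop-history-server.sh    # stops the HistoryServer (process 14800 above)
# ...edit spark-env.sh as described below, then relaunch:
$SPARK_HOME/sbin/start-all.sh
$SPARK_HOME/sbin/start-history-server.sh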

Cluster environment:

A desktop machine (hostname Desktop)

A laptop (hostname Laptop)

The configuration file involved is $SPARK_HOME/conf/spark-env.sh (one copy on each node).

The entries in /etc/hosts are:

192.168.0.103 Desktop
192.168.0.102 Laptop
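
Since the fix below sets SPARK_LOCAL_IP to a hostname, it is worth confirming on both machines that these names resolve to the addresses above. A small check, assuming standard Linux tools (not part of the original post):

# Run on both Desktop and Laptop
getent hosts Desktop Laptop   # should print 192.168.0.103 and 192.168.0.102
ping -c 1 Laptop              # from Desktop: confirms the worker host is reachable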

Two ways to fix it:

Option 1: in $SPARK_HOME/conf/spark-env.sh, set SPARK_LOCAL_IP to 0.0.0.0 on every node, master and slave alike.

Option 2: give each node its own value. On the desktop, set SPARK_LOCAL_IP=Desktop in $SPARK_HOME/conf/spark-env.sh; on the laptop, set SPARK_LOCAL_IP=Laptop. (A sketch of both variants follows below.)
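
A minimal sketch of the relevant line in spark-env.sh under each option; any other settings in the file are assumed to stay as they are:

# Option 1 -- identical line on Desktop and Laptop:
export SPARK_LOCAL_IP=0.0.0.0

# Option 2 -- per-node values:
# in $SPARK_HOME/conf/spark-env.sh on Desktop:
export SPARK_LOCAL_IP=Desktop
# in $SPARK_HOME/conf/spark-env.sh on Laptop:
export SPARK_LOCAL_IP=Laptop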

 

Root cause of the failure:

This usually happens when the master's configuration is copied wholesale to the slaves and SPARK_LOCAL_IP in the slave's $SPARK_HOME/conf/spark-env.sh is never updated, so the worker ends up trying to bind to an address that is not its own and fails with the netty bind error shown above.
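
A quick way to catch this after bulk-copying the configuration is to compare the value on every node. A sketch, assuming passwordless SSH between the two hosts and the same install path on both machines (the path shown in the Laptop log above):

for host in Desktop Laptop; do
    echo "== $host =="
    ssh "$host" grep SPARK_LOCAL_IP /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/conf/spark-env.sh
done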

 
