黑猴子的家: Spark fails on first startup with WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.

1. Log output

[ndadmin@bignode2 logs]$ tail -n 300 \
spark-ndadmin-org.apache.spark.deploy.worker.Worker-1-bignode2.out.1

18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
18/10/23 15:47:44 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
Exception in thread "main" java.net.BindException: 
Cannot assign requested address: Service 'sparkWorker' 
failed after 16 retries (starting from 0)! 
Consider explicitly setting the appropriate port for the service 'sparkWorker' 
(for example spark.ui.port for SparkUI) 
to an available port or increasing spark.port.maxRetries.

    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:748)

2. Key part of the log

Exception in thread "main" java.net.BindException: 
Cannot assign requested address: Service 'sparkWorker' 
failed after 16 retries (starting from 0)! 
Consider explicitly setting the appropriate port for the service 'sparkWorker' 
(for example spark.ui.port for SparkUI) 
to an available port or increasing spark.port.maxRetries.
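
The message itself only points at two generic knobs: pinning the service to a fixed port or allowing more bind retries. For reference, a minimal sketch of those two settings (the port and retry values below are example numbers, not from the original post) would look like this:

# conf/spark-env.sh: pin the worker RPC port to a fixed value (example)
SPARK_WORKER_PORT=7078

# conf/spark-defaults.conf: allow more bind retries (the default is 16,
# which matches "failed after 16 retries" in the log above)
spark.port.maxRetries    32

In this case, however, the failure is "Cannot assign requested address" rather than "Address already in use", so the problem is the bind address, not the port; that is what the fix in the next section addresses.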

3. Fix

"Cannot assign requested address" means the worker is trying to bind its 'sparkWorker' service to an address that is not assigned to any network interface on that machine (typically because the hostname resolves to the wrong IP), so retrying other ports cannot help. Explicitly setting SPARK_LOCAL_IP in conf/spark-env.sh on every node to that node's own IP makes the worker bind to a valid local address:

[victor@hadoop102 conf]$ vim spark-env.sh
SPARK_LOCAL_IP=192.168.2.102

[victor@hadoop103 conf]$ vim spark-env.sh
SPARK_LOCAL_IP=192.168.2.103

[victor@hadoop104 conf]$ vim spark-env.sh
SPARK_LOCAL_IP=192.168.2.104
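
Before restarting, it is worth confirming that each SPARK_LOCAL_IP really belongs to the node it is configured on, and then restarting the standalone cluster so the new spark-env.sh is picked up. A rough sketch, reusing the example hosts and IPs above (adjust the IP per node; hadoop102 is assumed here to be the master):

# On each node, check that the configured IP is bound to a local interface
[victor@hadoop102 spark]$ ip addr | grep 192.168.2.102

# On the master node, restart the standalone cluster to reload spark-env.sh
[victor@hadoop102 spark]$ sbin/stop-all.sh
[victor@hadoop102 spark]$ sbin/start-all.sh

After the restart, the worker log should no longer show the repeated "could not bind" warnings.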
