Handling "Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed"

1. The following error is reported while setting up a Spark HA environment

Spark Command: /root/training/jdk1.8.0_144/bin/java -cp /root/training/spark-2.1.0-bin-hadoop2.7/conf/:/root/training/spark-2.1.0-bin-hadoop2.7/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host bigdata12 --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/05/13 21:17:54 INFO Master: Started daemon with process name: 2696@bigdata13
18/05/13 21:17:54 INFO SignalUtils: Registered signal handler for TERM
18/05/13 21:17:54 INFO SignalUtils: Registered signal handler for HUP
18/05/13 21:17:54 INFO SignalUtils: Registered signal handler for INT
18/05/13 21:17:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/13 21:17:55 INFO SecurityManager: Changing view acls to: root
18/05/13 21:17:55 INFO SecurityManager: Changing modify acls to: root
18/05/13 21:17:55 INFO SecurityManager: Changing view acls groups to: 
18/05/13 21:17:55 INFO SecurityManager: Changing modify acls groups to: 
18/05/13 21:17:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7080. Attempting port 7081.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7081. Attempting port 7082.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7082. Attempting port 7083.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7083. Attempting port 7084.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7084. Attempting port 7085.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7085. Attempting port 7086.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7086. Attempting port 7087.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7087. Attempting port 7088.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7088. Attempting port 7089.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7089. Attempting port 7090.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7090. Attempting port 7091.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
18/05/13 21:17:55 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries (starting from 7077)! Consider explicitly setting the appropriate port for the service 'sparkMaster' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:748)
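"Cannot assign requested address" is the JVM's message when a socket is asked to bind to an IP address that does not belong to any local interface. That is exactly what happens above: the Master is started with --host bigdata12 but the process is running on bigdata13 (see the "Started daemon" line). A minimal Java sketch reproduces the same exception; 192.0.2.1 is a TEST-NET address that is normally not assigned to any local interface:

```java
import java.net.BindException;
import java.net.InetAddress;
import java.net.ServerSocket;

public class BindDemo {
    public static void main(String[] args) throws Exception {
        // An address that (on a typical host) is not local, standing in
        // for "bigdata12's address, attempted from bigdata13".
        InetAddress remoteAddr = InetAddress.getByName("192.0.2.1");
        try (ServerSocket s = new ServerSocket(7077, 50, remoteAddr)) {
            System.out.println("bound (unexpected on a typical host)");
        } catch (BindException e) {
            // Same failure mode as the Spark Master above.
            System.out.println("BindException: " + e.getMessage());
        }
    }
}
```

Retrying other ports (7078, 7079, ...) cannot help here, which is why all 16 retries in the log fail: the problem is the address, not the port.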

    The cause is that the Master was started with --host bigdata12 while actually running on bigdata13, so it tried to bind to an address that is not local to the machine. In an HA setup the active master is elected at runtime (e.g. via ZooKeeper), so the master host must not be hardcoded: comment out the SPARK_MASTER_HOST and SPARK_MASTER_PORT lines in spark-env.sh and restart the master.
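A minimal sketch of the fix, assuming the conf directory from the log above (paths and ZooKeeper hosts are illustrative):

```shell
# /root/training/spark-2.1.0-bin-hadoop2.7/conf/spark-env.sh
#
# In a ZooKeeper-based HA setup the active master is elected at runtime,
# so do NOT hardcode a master host/port -- comment these two lines out:
# export SPARK_MASTER_HOST=bigdata12
# export SPARK_MASTER_PORT=7077
#
# The HA recovery settings stay in place, for example:
# export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
#   -Dspark.deploy.zookeeper.url=bigdata12:2181,bigdata13:2181"
#
# Then restart the master on each node:
#   $SPARK_HOME/sbin/stop-master.sh
#   $SPARK_HOME/sbin/start-master.sh
```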
