【error】SparkUI port already in use

ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: bind: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:339)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:366)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:369)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:369)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2234)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2226)
	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:369)
	at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:839)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
	at flume_push_streaming$.main(flume_push_streaming.scala:11)
	at flume_push_streaming.main(flume_push_streaming.scala)

Cause

Every Spark application binds a port for its SparkUI, 4040 by default. If that port is already taken, Spark retries on the next port (4041, 4042, and so on), but only up to a default maximum of 16 retries. Once all 16 retries fail, Spark gives up and the application does not start.

Solution:

Raise the retry limit when creating the SparkConf object (see the sketch after this list):

  • key: spark.port.maxRetries
  • value: 100
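
A minimal sketch of what that looks like, using the object name from the stack trace above; the app name, master URL, and batch interval are assumptions for illustration:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object flume_push_streaming {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("flume_push_streaming") // assumed app name
      .setMaster("local[2]")              // assumed master URL
      // Allow up to 100 bind retries instead of the default 16,
      // so the SparkUI can fall back to any port from 4040 to 4140.
      .set("spark.port.maxRetries", "100")

    val ssc = new StreamingContext(conf, Seconds(5)) // assumed batch interval

    // ... define the Flume stream and its output operations here ...
    ssc.start()
    ssc.awaitTermination()
  }
}
```

The setting must be in place before the SparkContext starts, so it can also be passed at submit time instead of in code: spark-submit --conf spark.port.maxRetries=100 ... . Alternatively, as the error message suggests, spark.ui.port can pin the UI to a specific port that is known to be free.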
