Could not parse Master URL: 'loacl[12]'

Spark

The code sets Spark to run locally, with the master set to "local[12]", but at runtime Spark reports that it cannot recognize the master URL. This happened several times, and the exact cause was never pinned down. Note, though, that the error message reads 'loacl[12]' with the letters transposed, so the running build evidently contained a misspelled master string — possibly a stale compiled artifact.

Solution: it worked again after restarting IDEA.

  val conf = new SparkConf().setAppName("ALSCode2Recomm").setMaster("local[*]")

The error message:

18/10/24 11:46:10 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'loacl[12]'
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
	at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:522)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2281)
	at com.shandjj.itemCFRating$.&lt;init&gt;(itemCFRating.scala:17)
	at com.shandjj.itemCFRating$.&lt;clinit&gt;(itemCFRating.scala)
	at com.shandjj.itemCFRating.main(itemCFRating.scala)
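A typo like this can be caught before the SparkContext is even constructed. The sketch below is a hypothetical pre-flight check — `validate_master_url` and `MASTER_PATTERNS` are not part of Spark — that matches a master string against the common URL forms Spark accepts:

```python
import re

# Hypothetical helper, not part of Spark: checks a master string
# against the common URL forms (local, local[N], local[*],
# spark://host:port for standalone clusters, yarn).
MASTER_PATTERNS = [
    re.compile(r"^local(\[(\d+|\*)\])?$"),   # local, local[12], local[*]
    re.compile(r"^spark://[\w.-]+:\d+$"),    # standalone cluster master
    re.compile(r"^yarn$"),                   # run on YARN
]

def validate_master_url(master: str) -> bool:
    """Return True if the string matches a known master URL form."""
    return any(p.match(master) for p in MASTER_PATTERNS)

print(validate_master_url("local[12]"))  # True
print(validate_master_url("loacl[12]"))  # False -- the typo from the error above
```

Failing fast on the string itself makes the misspelling obvious, instead of waiting for `Could not parse Master URL` deep inside SparkContext initialization.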

The same kind of thing can happen in Python:

from pyspark import SparkConf,SparkContext

conf = SparkConf.setMaster('local[2]').set("spark.executor.memory","3g")
sc = SparkContext.getOrCreate(conf)
TypeError                                 Traceback (most recent call last)
in &lt;module&gt;()
      1 from pyspark import SparkConf,SparkContext
      2 
----> 3 conf = SparkConf.setMaster('local[2]').set("spark.executor.memory","3g")
      4 sc = SparkContext.getOrCreate(conf)
      5 

TypeError: setMaster() missing 1 required positional argument: 'value'

Cause: the parentheses after SparkConf are missing. `setMaster` is an instance method, so it must be called on a `SparkConf` object, not on the class itself; called on the class, the string `'local[2]'` is bound to `self` and the `value` argument is reported as missing.

conf = SparkConf().setMaster('local[2]')
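This TypeError can be reproduced without Spark at all, since it is plain Python behaviour: calling an instance method on the class itself binds the first positional argument to `self`, so `value` goes unfilled. A minimal sketch with a stand-in `Conf` class (not PySpark's real `SparkConf`):

```python
class Conf:
    """Stand-in for SparkConf, used only to illustrate the error."""
    def setMaster(self, value):
        self.master = value
        return self  # return self to allow method chaining

# Wrong: called on the class, 'local[2]' is bound to self,
# so 'value' is reported as the missing argument.
try:
    Conf.setMaster('local[2]')
except TypeError as e:
    print(e)  # prints the "missing 1 required positional argument" message

# Right: instantiate first, then chain calls on the instance.
conf = Conf().setMaster('local[2]')
print(conf.master)  # local[2]
```

The fix in the post — adding `()` after `SparkConf` — is exactly the move from the first call to the second.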
