Error: Only one SparkContext may be running in this JVM (see SPARK-2243).

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext
This error occurs because you created multiple SparkContexts in the same JVM. In the code below, the JavaSparkContext creates one SparkContext and the JavaStreamingContext (built from the SparkConf) internally creates a second; removing the JavaSparkContext is enough to fix it:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);                                    // creates the first SparkContext
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));  // creates a second one -> error

There are two ways to fix this:

Option 1: create only the JavaStreamingContext and let it build the SparkContext internally:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
Option 2: pass the existing JavaSparkContext to the JavaStreamingContext so both share a single SparkContext:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));
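Putting Option 2 together, a minimal self-contained sketch looks like this (assuming the spark-core and spark-streaming dependencies are on the classpath; the class name and app name are illustrative):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("myapplication")
                .setMaster("local[4]");

        // Create the one and only SparkContext for this JVM.
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Reuse it for streaming instead of letting JavaStreamingContext
        // construct a second SparkContext from the SparkConf.
        JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));

        // ... define batch jobs on jsc and streaming jobs on stream here ...

        // stop(true) also stops the underlying shared SparkContext.
        stream.stop(true);
    }
}
```

This pattern is useful when the same application mixes batch work (on jsc) with streaming work (on stream), since both must run against the single SparkContext allowed per JVM.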
