How to Set and Read Spark Configuration

Setting the configuration

Set properties on a SparkConf when constructing the SparkContext, e.g.:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local").setAppName("Myapplication").set("spark.executor.memory", "1g")
val sc = new SparkContext(conf)
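
A value set this way can be read straight back from the same SparkConf object; a quick sanity check:

println(conf.get("spark.executor.memory"))  // "1g"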

Alternatively, set JVM system properties before the SparkConf is created, e.g.:

System.setProperty("spark.executor.memory","14g") 
System.setProperty("spark.worker.memory","16g") 
val conf = new SparkConf().setAppName("Simple Application") 
val sc = new SparkContext(conf)
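
Note that the no-argument SparkConf constructor copies the spark.* system properties once, at construction time, so they must be set first. A minimal check, assuming the snippet above:

println(conf.get("spark.executor.memory"))  // "14g", absorbed from System.properties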

Reference: Spark配置参数详解 (Zhihu)

Reading the configuration

1. Via the Web UI

Open the application's Web UI (http://<driver>:4040 by default) and check the Environment tab, which lists the Spark properties in effect.

[Figure 1: Spark Web UI screenshot]

2. In spark-shell

Since the resolved spark.* settings are also exposed as JVM system properties on the driver, they can be read with System.getProperty:

scala> println(System.getProperty("spark.app.name"))
Spark shell

scala> println(System.getProperty("spark.driver.cores"))
1

scala> println(System.getProperty("spark.executor.memory"))
4G

scala> println(System.getProperty("spark.serializer"))
org.apache.spark.serializer.JavaSerializer

scala> println(System.getProperty("spark.default.parallelism"))
null
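
The same values can be read through the SparkConf attached to the running context; getOption avoids the null for unset keys. A minimal sketch (the printed values match whatever the shell was started with):

scala> println(sc.getConf.get("spark.app.name"))
Spark shell

scala> println(sc.getConf.getOption("spark.default.parallelism"))
None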

Configuration precedence

Properties set directly on a SparkConf take the highest priority; next come values passed on the command line to spark-submit or spark-shell; lowest are the values in the spark-defaults.conf file.

Order of precedence:

SparkConf > CLI > spark-defaults.conf
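
As an illustration of this chain (hypothetical values: assume spark-defaults.conf contains spark.executor.memory=512m and the job was launched with --conf spark.executor.memory=2g), the value set in code wins:

import org.apache.spark.{SparkConf, SparkContext}

// spark-defaults.conf:   spark.executor.memory=512m  (lowest priority)
// spark-submit --conf:   spark.executor.memory=2g    (middle priority)
val conf = new SparkConf().set("spark.executor.memory", "1g")  // highest priority
val sc = new SparkContext(conf)
println(sc.getConf.get("spark.executor.memory"))  // prints "1g"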
