Spark development: Java program throws an exception at runtime: System memory 107374182 must be at least 471859200

Cause: the JVM heap available to Spark is too small (this check exists in Spark 1.5/1.6 and later).
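The two numbers in the message come from Spark's unified memory manager: it reserves 300 MB and requires the usable memory to be at least 1.5 times that reserve, i.e. 471859200 bytes (450 MB). A quick sketch of that arithmetic (the 107374182 figure is taken from the error message above):

public class MemoryCheckDemo {
    public static void main(String[] args) {
        long reserved = 300L * 1024 * 1024;                       // Spark's reserved system memory, 300 MB
        long minSystemMemory = (long) Math.ceil(reserved * 1.5);  // 471859200 bytes = 450 MB
        long systemMemory = 107374182L;                           // roughly 102 MB, what this JVM offered
        System.out.println(systemMemory >= minSystemMemory);      // prints false -> Spark throws the exception
    }
}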
The fix is as follows.
Original code:
public static void main(String[] args) {
    try (final SparkSession spark = SparkSession
            .builder()
            .master("local")
            .appName("JavaLocalWordCount")
            .getOrCreate()) {
Change it to:
public static void main(String[] args) {
    SparkConf conf = new SparkConf();
    // value is in bytes and must be at least 471859200 (450 MB); 536870912 = 512 MB
    conf.set("spark.testing.memory", "536870912");
    try (final SparkSession spark = SparkSession
            .builder()
            .master("local")
            .appName("JavaLocalWordCount")
            .config(conf)
            .getOrCreate()) {
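Alternatively, the same property can be passed straight to the builder without constructing a SparkConf. A minimal, self-contained sketch (the class name and the 512 MB value here are illustrative, not from the original post):

import org.apache.spark.sql.SparkSession;

public class JavaLocalWordCount {
    public static void main(String[] args) {
        try (final SparkSession spark = SparkSession
                .builder()
                .master("local")
                .appName("JavaLocalWordCount")
                // same fix, set directly on the builder;
                // value in bytes, at least 471859200 (450 MB)
                .config("spark.testing.memory", "536870912")
                .getOrCreate()) {
            // word-count logic would go here
        }
    }
}

Since SparkConf also picks up JVM system properties that start with "spark.", adding the VM option -Dspark.testing.memory=536870912 to the run configuration in your IDE should achieve the same effect without any code change.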
