Spark "Kryo serialization failed: Buffer overflow" error

While writing a Spark job today I ran into the following error. My Spark version is 1.5.1.

Exception in thread "main" com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 124
    at com.esotericsoftware.kryo.io.Output.require(Output.java:138)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:220)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:206)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:29)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:18)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
    at carbonite.serializer$write_map.invoke(serializer.clj:69)

The message says the Kryo serialization buffer overflowed, causing the job to fail. Fine, then, let's just enlarge the buffer. A quick search turned up an answer.

It said you can set it like this:

SparkConf sparkConf = new SparkConf();
sparkConf.set("spark.kryoserializer.buffer.mb", "128");
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
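For context on what "Buffer overflow" means here: Kryo writes into a fixed-size output buffer that it grows only up to a configured cap, and a record larger than that cap fails with exactly this exception. Below is a rough stdlib sketch of that grow-until-cap idea, assuming only that the buffer grows by doubling; `requiredCapacity` is a hypothetical helper, not Kryo's actual code.

```java
import java.nio.BufferOverflowException;
import java.nio.ByteBuffer;

public class BufferDemo {
    // Grow a fixed-size buffer (by doubling) until `payload` fits or `max` is
    // exceeded: a rough model of how a serializer's output buffer grows toward
    // its cap. Hypothetical helper for illustration, not Kryo's real code.
    static int requiredCapacity(byte[] payload, int initial, int max) {
        int capacity = initial;
        while (capacity <= max) {
            try {
                ByteBuffer.allocate(capacity).put(payload);
                return capacity; // the payload fits at this size
            } catch (BufferOverflowException e) {
                capacity *= 2; // too small: double and retry
            }
        }
        // Past the cap: this is the "Buffer overflow" failure mode
        throw new BufferOverflowException();
    }

    public static void main(String[] args) {
        // A ~100 KB record overflows a 64 KB buffer but fits after one doubling
        System.out.println(requiredCapacity(new byte[100_000], 64 * 1024, 64 * 1024 * 1024));
        // prints 131072
    }
}
```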

But when I tested it, Spark warned that this setting has been deprecated since Spark 1.4, and suggested using the following instead:

spark.kryoserializer.buffer

So let's change it:

SparkConf sparkConf = new SparkConf();
// The new key takes a size string with a unit; "128m" matches the 128 MB intended above
// (a bare "64" would be read in KiB, which is just the default buffer size)
sparkConf.set("spark.kryoserializer.buffer", "128m");
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);

Problem solved.
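One more note: on Spark 1.4+ the hard cap on how far Kryo can grow the buffer is the separate `spark.kryoserializer.buffer.max` property (64m by default), so if the overflow persists, raising the max is what actually helps. A sketch of passing both settings via spark-submit; the class name and jar are placeholders, and the values are illustrative:

```shell
spark-submit \
  --conf spark.kryoserializer.buffer=128m \
  --conf spark.kryoserializer.buffer.max=512m \
  --class com.example.MyJob my-job.jar
```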

 

Reposted from: https://www.cnblogs.com/fillPv/p/5045928.html
