Spark 2 DataFrame map error

When calling map on a DataFrame in Spark 2, the following error is raised:

Error:(34, 20) Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
        mRecord.map(teenager => teenager(0)+"lina").show(false);


There are two ways to fix this:

First approach:

        val spark = SparkSession.builder
          .master("local[4]")
          .appName("test1")
          .getOrCreate()
        import spark.implicits._
That is, after creating the SparkSession and before any statement that uses map, add:

import spark.implicits._
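Putting the pieces together, a minimal sketch of the first fix might look like this (the column name `name` and the sample data are assumptions for illustration, standing in for the `mRecord` DataFrame from the error message):

```scala
import org.apache.spark.sql.SparkSession

object MapFixDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local[4]")
      .appName("test1")
      .getOrCreate()

    // The implicits bring Encoder[String] (and the other primitive and
    // Product-type encoders) into scope, which is what Dataset.map needs.
    import spark.implicits._

    // Assumed sample data standing in for mRecord.
    val mRecord = Seq("tom", "lucy").toDF("name")

    // teenager(0) returns Any; concatenating with a String yields a String,
    // so the implicit Encoder[String] makes this map compile.
    mRecord.map(teenager => teenager(0) + "lina").show(false)

    spark.stop()
  }
}
```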


Second approach:

// No pre-defined encoders for Dataset[Map[K,V]], define explicitly
implicit val mapEncoder = org.apache.spark.sql.Encoders.kryo[Map[String, Any]]
// Primitive types and case classes can also be defined as
// implicit val stringIntMapEncoder: Encoder[Map[String, Any]] = ExpressionEncoder()

// row.getValuesMap[T] retrieves multiple columns at once into a Map[String, T]
teenagersDF.map(teenager => teenager.getValuesMap[Any](List("name", "age"))).collect()
// Array(Map("name" -> "Justin", "age" -> 19))

Following the official example, you can register an encoder yourself. You generally only need to do this when the implicits from the first approach do not provide an encoder for your type.
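As a sketch, the same idea applies to any type without a built-in encoder, for example an ordinary (non-case) class; the class and field names below are assumptions for illustration:

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

// An ordinary class (not a case class), so spark.implicits._ provides
// no encoder for it and one must be registered explicitly.
class Person(val name: String, val age: Int) extends Serializable

object KryoEncoderDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local[4]")
      .appName("kryoEncoderDemo")
      .getOrCreate()
    import spark.implicits._

    // Explicit kryo-based encoder for the non-case class.
    implicit val personEncoder: Encoder[Person] = Encoders.kryo[Person]

    val df = Seq(("Justin", 19)).toDF("name", "age")

    // With personEncoder in scope, map to the custom type compiles.
    val people = df.map(row => new Person(row.getString(0), row.getInt(1)))
    people.foreach(p => println(p.name))

    spark.stop()
  }
}
```

Note that a kryo encoder stores the object as an opaque binary blob, so the resulting Dataset loses its columnar structure; prefer case classes with the built-in implicits when you can.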
