Checking Variable Types and DataFrame Column Types in Spark

  • Variable type
val x = 5
println(x.getClass)

Result:
int

(For a plain Scala Int, getClass returns the primitive class, printed as int; only a boxed value, e.g. val x: Any = 5, reports class java.lang.Integer.)
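The same getClass call works for other values too. A minimal sketch in plain Scala (no Spark needed); the variable names here are just for illustration:

```scala
// getClass on a reference type prints the full JVM class name
val s = "spark"
println(s.getClass)   // class java.lang.String

// getClass on an unboxed AnyVal prints the primitive class name
val d = 3.14
println(d.getClass)   // double
```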
  • Checking the type of each DataFrame column
val data = Array(("1", "2", "3", "4", "5"), ("6", "7", "8", "9", "10"))
val df = spark.createDataFrame(data).toDF("col1", "col2", "col3", "col4", "col5")

df.dtypes
Result:

 Array[(String, String)] = Array((col1,StringType), (col2,StringType), (col3,StringType), (col4,StringType), (col5,StringType))
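Besides dtypes, the schema itself can be inspected. A short sketch, assuming the df defined above is in scope:

```scala
// printSchema renders the schema as a tree, including nullability
df.printSchema()
// root
//  |-- col1: string (nullable = true)
//  ... (one line per column)

// df.schema returns a StructType whose fields carry full DataType objects,
// useful when you need the type programmatically rather than as a string
df.schema.fields.foreach(f => println(s"${f.name}: ${f.dataType}"))
```

dtypes is convenient for quick string comparisons; schema is the richer, structured form.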
  • Storing column names and types in a Map
val a = df.dtypes.toMap
println(a)

Result (the first line is the REPL's echo of a, the second is the println output):

a: scala.collection.immutable.Map[String,String] = Map(col3 -> StringType, col2 -> StringType, col5 -> StringType, col1 -> StringType, col4 -> StringType)
Map(col3 -> StringType, col2 -> StringType, col5 -> StringType, col1 -> StringType, col4 -> StringType)
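Once the name-to-type map exists, it can drive lookups or column selection. A sketch assuming the a and df from above; stringCols is a name introduced here for illustration:

```scala
// Look up one column's type string; falls back if the column is absent
val colType = a.getOrElse("col3", "unknown")   // "StringType"

// Collect the names of all string-typed columns and select only those
val stringCols = df.dtypes.collect { case (name, "StringType") => name }
df.select(stringCols.map(df(_)): _*).show()
```

Note that because the Map is unordered, iterate over df.dtypes instead when column order matters.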
