SparkException: Resolving the "Dynamic partition strict mode" Error

Problem scenario

In the spark-shell console, running testDF.write.mode("append").partitionBy("dt").saveAsTable("t_pgw_base_statistics_final_dy_test") fails with:

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict

Solution

Hive's strict mode requires at least one static partition column, but partitionBy makes every partition column dynamic. Switching the mode to nonstrict allows fully dynamic partitioning, after which the write succeeds:

sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");
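Putting the two steps together, a minimal spark-shell session might look like the sketch below. It assumes the Spark 1.x API used in the original post (a Hive-backed sqlContext and an existing DataFrame named testDF); on Spark 2.x you would use spark.conf.set and spark.sql instead.

```scala
// Allow fully dynamic partitioning: strict mode demands at least one
// static partition column, which DataFrameWriter.partitionBy cannot supply.
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// Hypothetical DataFrame and table names, following the original post.
// Each distinct value of "dt" in testDF becomes its own Hive partition,
// and "append" adds rows without overwriting existing partitions.
testDF.write
  .mode("append")
  .partitionBy("dt")
  .saveAsTable("t_pgw_base_statistics_final_dy_test")
```

Note that setConf only affects the current session; to apply it cluster-wide you would set the property in hive-site.xml or pass it via --conf at launch.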
