Spark write error: cannot overwrite a table that is also being read from

The error message is as follows:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot overwrite table dwd.dim_user_info that is also being read from
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:720)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:626)
    at GS_1.task2$.writeDwdTable(task2.scala:103)
    at GS_1.task2$.teast1(task2.scala:65)
    at GS_1.task2$.main(task2.scala:109)
    at GS_1.task2.main(task2.scala)
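
For context, Spark raises this AnalysisException at analysis time whenever the DataFrame being written still reads from the target table in its logical plan. Below is a minimal sketch of the pattern that triggers it; the SparkSession setup and the transformation are assumptions for illustration, not the original job:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.lit

    object OverwriteRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("overwrite-while-reading")
          .enableHiveSupport()
          .getOrCreate()

        // The DataFrame's plan reads from dwd.dim_user_info ...
        val frame = spark.table("dwd.dim_user_info")
          .withColumn("etl_date", lit("20240101"))  // hypothetical transformation

        // ... so overwriting the same table fails during analysis with:
        // "Cannot overwrite table dwd.dim_user_info that is also being read from"
        frame.write.mode("overwrite")
          .partitionBy("etl_date")
          .saveAsTable("dwd.dim_user_info")
      }
    }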

My workaround is as follows:

    println("""这里有个报错:Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot overwrite table dwd.dim_user_info that is also being read from""")
    println("""读写同时报错,我的解决办法是,创建临时表b,删除原表a,表b创建表a,表a删除。""")
    frame.write.mode("overwrite").partitionBy("etl_date").saveAsTable(s"dwd.${dwdTableName}_B")   // 1. write the result to temporary table B
    util.getSparkSession.sql(s"drop table dwd.${dwdTableName}")   // 2. drop the original table A
    util.getSparkSession.sql(s"select * from dwd.${dwdTableName}_B").write.mode("overwrite").partitionBy("etl_date").saveAsTable(s"dwd.${dwdTableName}")  // 3. recreate A by copying B
    util.getSparkSession.sql(s"drop table dwd.${dwdTableName}_B")   // 4. drop temporary table B

