Checking whether a Hive table exists in PySpark

Catalog.tableExists(tableName: str, dbName: Optional[str] = None) -> bool
'''
tableName: name of the table
dbName: name of the database (optional)
return: bool (True if the table exists, False otherwise)
'''
from pyspark.sql import SparkSession
# enableHiveSupport() is required so the catalog can see tables in the Hive metastore
spark = SparkSession \
        .builder \
        .appName('tableExists') \
        .config('spark.executor.instances', '6') \
        .config('spark.executor.memory', '12g') \
        .config('spark.driver.memory', '2g') \
        .config('spark.executor.cores', '4') \
        .config('spark.default.parallelism', '50') \
        .config('spark.executor.memoryOverhead', '2g') \
        .config('spark.task.maxFailures', '10') \
        .config("spark.dynamicAllocation.enabled", 'false') \
        .config("spark.sql.broadcastTimeout", "3600") \
        .enableHiveSupport() \
        .getOrCreate()
# Example 1
spark.catalog.tableExists("unexisting_table") # False
_ = spark.sql("DROP TABLE IF EXISTS tbl1")
_ = spark.sql("CREATE TABLE tbl1 (name STRING, age INT) USING parquet")
spark.catalog.tableExists("tbl1") # True

# Example 2: equivalent ways to reference the same table
spark.catalog.tableExists("default.tbl1")
spark.catalog.tableExists("spark_catalog.default.tbl1")
spark.catalog.tableExists("tbl1", "default")
_ = spark.sql("DROP TABLE tbl1")

See also: pyspark.sql.Catalog.tableExists in the official PySpark API documentation.
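
Catalog.tableExists was added to the Python API in Spark 3.3.0. On older PySpark versions, an equivalent check can be built from Catalog.listTables; the helper below is a sketch under that assumption and reuses the spark session created above.

# Fallback sketch for older PySpark: look the table up in the catalog listing
def table_exists(spark_session, table, db="default"):
    return any(t.name == table for t in spark_session.catalog.listTables(db))

table_exists(spark, "tbl1")  # same result as spark.catalog.tableExists("tbl1", "default")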
