spark-sql only shows the default database `default`

From the `hive` command line, running `show databases;`

lists the other databases in addition to `default` (as expected).

But starting `spark-sql` from the `bin` directory of the Spark install shows only `default`.

After checking many configuration files, the cause turned out to be that Hive's config file `hive-site.xml` had not been copied into Spark's `conf` directory, so spark-sql was using its own embedded metastore instead of Hive's.
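The fix can be sketched as the copy below. `HIVE_HOME` and `SPARK_HOME` are assumptions about the install layout; substitute your real paths. The `mktemp` fallbacks and the stand-in file only exist so the snippet runs stand-alone:

```shell
# Assumption: HIVE_HOME / SPARK_HOME point at your installs (e.g. /usr/local/hive).
# The mktemp fallbacks below are demo-only so this snippet runs anywhere.
HIVE_HOME=${HIVE_HOME:-$(mktemp -d)}
SPARK_HOME=${SPARK_HOME:-$(mktemp -d)}
mkdir -p "$HIVE_HOME/conf" "$SPARK_HOME/conf"

# Demo-only: create a stand-in hive-site.xml if none exists.
[ -f "$HIVE_HOME/conf/hive-site.xml" ] || echo '<configuration/>' > "$HIVE_HOME/conf/hive-site.xml"

# The actual fix: copy Hive's config into Spark's conf directory
# so spark-sql connects to the same metastore as the hive CLI.
cp "$HIVE_HOME/conf/hive-site.xml" "$SPARK_HOME/conf/hive-site.xml"
```

After the copy, restart spark-sql so it picks up the metastore settings.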

After copying it, the databases show up correctly:

spark-sql> show databases;
19/10/18 16:46:51 INFO execution.SparkSqlParser: Parsing command: show databases
19/10/18 16:46:53 INFO metastore.HiveMetaStore: 0: get_databases: *
19/10/18 16:46:53 INFO HiveMetaStore.audit: ugi=root    ip=unknown-ip-addr      cmd=get_databases: *
19/10/18 16:46:53 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
19/10/18 16:46:53 INFO metastore.ObjectStore: ObjectStore, initialize called
19/10/18 16:46:53 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
19/10/18 16:46:53 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
19/10/18 16:46:53 INFO metastore.ObjectStore: Initialized ObjectStore
19/10/18 16:46:54 INFO codegen.CodeGenerator: Code generated in 453.516638 ms
accounting
default
traffic
Time taken: 3.032 seconds, Fetched 3 row(s)
19/10/18 16:46:54 INFO CliDriver: Time taken: 3.032 seconds, Fetched 3 row(s)
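For reference, the log line "underlying DB is MYSQL" indicates a MySQL-backed metastore, so the `hive-site.xml` that has to sit in Spark's `conf` directory typically carries settings like the following. The host, database name, user, and password here are placeholders, not values from the original setup:

```xml
<configuration>
  <!-- JDBC connection to the metastore DB; all values are placeholders -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive_password</value>
  </property>
</configuration>
```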
