To get familiar with the features of the new release, I deployed Hadoop 3.0.3. The Hive compatibility note says:
"This release works with Hadoop 3.x.y.", so I chose Hive 3.0.0 for installation. Unfortunately the Hive documentation is really not detailed enough; following it step by step produced two errors that took quite a while to resolve, so this post summarizes the configuration process. Prerequisites: Hadoop 3.0.3 is running normally and the MySQL service is fully configured.
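A quick sanity check of both prerequisites from the shell can save time later; a minimal sketch (the host name matches the one used in hive-site.xml further down, and an administrative MySQL account is assumed):

$ hadoop version                                         # should report 3.0.3
$ hdfs dfsadmin -report | head                           # HDFS is up and reachable
$ mysql -h spark.bigdata.com -u root -p -e "SELECT VERSION();"   # MySQL accepts connections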
$ tar -xzvf hive-x.y.z.tar.gz
export HIVE_HOME=$(pwd)
export PATH=$HIVE_HOME/bin:$PATH
export HADOOP_HOME=/opt/modules/hadoop-3.0.3
export HIVE_CONF_DIR=/opt/modules/hive-3.0.0-bin/conf
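Before the first table is created, the Hive Getting Started guide also expects the HDFS directories used by Hive to exist and be group-writable; the warehouse path below matches hive.metastore.warehouse.dir in the configuration that follows:

$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /tmp
$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
$ $HADOOP_HOME/bin/hadoop fs -chmod g+w /tmp
$ $HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse

The hive-site.xml used for this setup (placed under $HIVE_CONF_DIR) is: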
<configuration>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
  <property>
    <name>hive.exec.mode.local.auto</name>
    <value>false</value>
    <description>Let Hive determine whether to run in local mode automatically</description>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://spark.bigdata.com:9083</value>
    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://spark.bigdata.com:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivemysql</value>
  </property>
  <property>
    <name>hive.cli.print.header</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.cli.print.current.db</name>
    <value>true</value>
  </property>
</configuration>
schematool -dbType mysql -initSchema
Here the first problem appears: editing hive-default.xml directly does not take effect. As the log below shows, I had already changed the user in hive-default.xml to hive and adjusted the URL and Driver accordingly, yet when running schematool -dbType mysql -initSchema the initialization still used the default Derby URL, Driver, and user APP. The settings have to go into hive-site.xml instead (renaming hive-default.xml to hive-site.xml made the initialization succeed), but the Hive website does not state this clearly (or I did not read it carefully enough). The fix is sketched right after the log.
[hadoop@spark conf]$ schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hive-3.0.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-3.0.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver : org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User: APP
Starting metastore schema initialization to 3.0.0
Initialization script hive-schema-3.0.0.mysql.sql
Error: Syntax error: Encountered "" at line 1, column 64. (state=42X01,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
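As noted above, the fix is to make sure the customized values live in hive-site.xml rather than hive-default.xml; a sketch of the steps (on a stock Hive 3.0.0 distribution the template file is named hive-default.xml.template):

$ cd $HIVE_CONF_DIR
$ mv hive-default.xml hive-site.xml        # or: cp hive-default.xml.template hive-site.xml and edit it
$ vi hive-site.xml                         # apply the MySQL URL, driver, user and password shown above
$ schematool -dbType mysql -initSchema     # now picks up the MySQL settings

Alternatively, the same schema can be loaded into MySQL by hand, as shown next.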
mysql> source /opt/modules/hive-3.0.0-bin/scripts/metastore/upgrade/mysql/hive-schema-3.0.0.mysql.sql
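Sourcing the schema script this way assumes the hive database and the hive account already exist on the MySQL side; if they do not, something along these lines creates them (credentials taken from hive-site.xml above, privileges kept broad for a test environment):

mysql> CREATE DATABASE IF NOT EXISTS hive;
mysql> CREATE USER 'hive'@'%' IDENTIFIED BY 'hivemysql';
mysql> GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
mysql> FLUSH PRIVILEGES;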
[hadoop@spark bin]$ ./hive --service metastore &
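Once started, the metastore should be listening on the port named in hive.metastore.uris (9083 here); a quick way to verify (exact tools depend on the OS):

$ jps                                      # the metastore shows up as a RunJar process
$ netstat -tlnp 2>/dev/null | grep 9083    # or: ss -lntp | grep 9083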
hive (default)> show databases;
OK
database_name
default
Time taken: 3.157 seconds, Fetched: 1 row(s)
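To double-check that metadata is really going into MySQL rather than an embedded Derby database, a small smoke test helps (the table name is arbitrary):

hive (default)> create table smoke_test(id int);
mysql> SELECT TBL_NAME FROM hive.TBLS;     -- the new table should be listed here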
Problem 1 in summary: the MySQL user name and password (together with the JDBC URL and driver) must be configured in hive-site.xml; configuring them in hive-default.xml does not take effect, which is why schemaTool fell back to the Derby defaults in the log above.
Problem 2: even with the schema initialized, the Hive CLI at first failed to connect to the metastore:
hive (default)> show databases;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
The fix: hive.metastore.schema.verification must be set to false, and after schemaTool initialization succeeds, the Hive metastore service has to be started: ./hive --service metastore &
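Condensed, the sequence that finally worked can be sketched as follows, assuming hive-site.xml already contains the MySQL settings shown above:

$ cd /opt/modules/hive-3.0.0-bin/bin
$ schematool -dbType mysql -initSchema     # must succeed before starting any service
$ nohup ./hive --service metastore &       # keep the metastore running in the background
$ ./hive -e "show databases;"              # should now return without the MetaStoreClient error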
Author: @Fighter10
Shenzhen, 2018-07-11 13:55