Detailed Hive Installation Steps

Basic Environment Preparation

Before installing Hive, a working Hadoop installation is required. This article uses MySQL as the Hive metastore database, so a MySQL instance must also be prepared in advance (a minimal preparation sketch follows the list below). For installing Hadoop and MySQL, see these two articles:

  • Building a fully distributed Hadoop cluster

  • Installing MySQL offline on CentOS
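
As a rough sketch of the MySQL side (the database name hive, the user hive, and the password zhaoty are taken from the hive-site.xml further below and are assumptions about your setup; adjust them as needed), the metastore account can be created like this:

# log in as root and create the metastore account; the user name and password
# must match javax.jdo.option.ConnectionUserName / ConnectionPassword in hive-site.xml.
# The hive database itself can be left to createDatabaseIfNotExist=true in the JDBC URL.
mysql -u root -p -e "CREATE USER 'hive'@'%' IDENTIFIED BY 'zhaoty';"
mysql -u root -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%'; FLUSH PRIVILEGES;"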

Download the Hive Package

Simply download the version you need from the official Hive website.

The version installed in this article is apache-hive-2.3.9.

Upload and Extract the Package

In this example Hive is installed under /app; after extracting, the directory is renamed to hive.

cd /app
tar -zxvf apache-hive-2.3.9-bin.tar.gz
mv apache-hive-2.3.9-bin hive

Configure Environment Variables

vim ~/.bashrc

# add the following lines
export HIVE_HOME=/app/hive
export PATH=$PATH:$HIVE_HOME/bin
export HADOOP_HOME=/app/hadoop-2.10.1

# reload to apply the environment variables
source ~/.bashrc

Check the Hive version; output like the following means everything is fine:

# command to check the Hive version
hive --version
# output:
Hive 2.3.9
Git git://chaos-mbp.lan/Users/chao/git/hive -r 92dd0159f440ca7863be3232f3a683a510a62b9d
Compiled by chao on Tue Jun 1 14:02:14 PDT 2021
From source with checksum 6715a3ba850b746eefbb0ec20d5a0187

Modify the Hive Configuration

The metastore and CLI settings below go into $HIVE_HOME/conf/hive-site.xml; create the file if it does not exist, and adjust the JDBC URL, user name, and password to match your own MySQL instance.

<configuration>

    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://199.188.166.113:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>username to use against metastore database</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>zhaoty</value>
        <description>password to use against metastore database</description>
    </property>

    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
        <description>location of default database for the warehouse</description>
    </property>

    <property>
        <name>hive.cli.print.current.db</name>
        <value>true</value>
        <description>Whether to include the current database in the Hive prompt.</description>
    </property>

    <property>
        <name>hive.cli.print.header</name>
        <value>true</value>
    </property>

    <property>
        <name>hive.exec.mode.local.auto</name>
        <value>true</value>
        <description>Let Hive determine whether to run in local mode automatically</description>
    </property>
</configuration>
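
Since hive.metastore.warehouse.dir points to a directory on HDFS, it helps to create that directory (and /tmp) up front and make them group-writable, as recommended in Hive's getting-started documentation; a minimal sketch using the path configured above:

# create the warehouse directory on HDFS and make it group-writable
hdfs dfs -mkdir -p /tmp
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod g+w /tmp
hdfs dfs -chmod g+w /user/hive/warehouse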

Upload the MySQL Driver

Upload the MySQL JDBC driver to the $HIVE_HOME/lib directory. If the jar already exists on the machine, simply copy it over; otherwise it can be downloaded from the official site: MySQL :: Download Connector/J
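
For example (the jar file name below is only an assumption for illustration; use whichever Connector/J version matches the driver class configured in hive-site.xml):

# copy the Connector/J jar into Hive's lib directory
cp /path/to/mysql-connector-java-5.1.49.jar $HIVE_HOME/lib/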

Initialize the Metastore Schema

[hadoop@node3 bin]$ schematool -dbType mysql -initSchema
# output:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/app/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/app/hadoop-2.10.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://199.188.166.113:3306/hive?allowPublicKeyRetrieval=true&createDatabaseIfNotExist=true&useSSL=false
Metastore Connection Driver :    com.mysql.cj.jdbc.Driver
Metastore connection User:       hive
Starting metastore schema initialization to 2.3.0
Initialization script hive-schema-2.3.0.mysql.sql
Initialization script completed
schemaTool completed
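
As a quick sanity check (assuming the hive MySQL account created earlier), you can confirm that the metastore tables were actually written:

# the hive database should now contain the metastore tables (DBS, TBLS, ...)
mysql -u hive -p -e "USE hive; SHOW TABLES;"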

Start Hive

# start the Hadoop cluster before starting the hive service
[hadoop@node3 ~]$ hive
# output as follows; Hive starts successfully
which: no hbase in (/home/hadoop/.local/bin:/home/hadoop/bin:/app/java/jdk1.8.0_181/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/app/mysql8/bin:/app/hadoop-2.10.1/bin:/app/hive/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/app/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/app/hadoop-2.10.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/app/hive/lib/hive-common-2.3.9.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive (default)> 
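
At this point a short smoke test confirms that the metastore connection works end to end; for example (test_db is just a throwaway database name used for illustration):

# run a few statements non-interactively to verify the installation
hive -e "SHOW DATABASES; CREATE DATABASE IF NOT EXISTS test_db; SHOW DATABASES; DROP DATABASE test_db;"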

Hive Logs

By default, Hive writes its logs under /tmp/hadoop (where hadoop is the current user name). This location can be changed or left alone, but it is worth knowing where it is. To change it, edit $HIVE_HOME/conf/hive-log4j2.properties (copy it from hive-log4j2.properties.template first if only the template is present):

vi $HIVE_HOME/conf/hive-log4j2.properties
# add (or modify) the following line:
property.hive.log.dir=/app/hive/logs
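
If the log directory is redirected as above, make sure it exists; a small sketch using the path from the property (the main log file, hive.log, will appear there after the next Hive start):

# create the new log directory
mkdir -p /app/hive/logs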
