Installing, Configuring, and Using Hive

Reference post: https://blog.csdn.net/jssg_tzw/article/details/72898635#

Installing MySQL

1. Download the MySQL yum repository package from the official site (for a yum install):

wget http://dev.mysql.com/get/mysql-community-release-el7-5.noarch.rpm

2. Install the repository package:

rpm -ivh mysql-community-release-el7-5.noarch.rpm

3. Install MySQL Server:

yum install mysql-community-server

4. Restart the MySQL service:

service mysqld restart

5. Log in to MySQL:

mysql -u root

6. Set the password for the root user (here, root):

mysql> set password for 'root'@'localhost' = password('root');

7. Grant the root user, connecting from any host, all privileges on every database and table:

mysql> grant all privileges on *.* to 'root'@'%' identified by 'root';
mysql> flush privileges;  -- reload the grant tables

8. To use a new user instead of root, create that user first:

mysql> create user 'username'@'%' identified by 'password';
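
For example, to use a dedicated metastore account instead of root (the user name hive and password hive below are placeholders; pick your own):

mysql> create user 'hive'@'%' identified by 'hive';
mysql> grant all privileges on *.* to 'hive'@'%' identified by 'hive';
mysql> flush privileges;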

Installing Apache Hive

1. Download and extract

# Extract the archive
 sudo tar -zxvf apache-hive-2.1.1-bin.tar.gz
# Move the extracted directory to /usr/local/
 sudo mv apache-hive-2.1.1-bin /usr/local/hive
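
The later steps refer to $HIVE_HOME and $HADOOP_HOME. If they are not set yet, a minimal sketch of exporting them (the paths are the ones used in this post; adjust to your own layout):

# Make the environment variables available in every shell session
echo 'export HIVE_HOME=/usr/local/hive' >> ~/.bashrc
echo 'export HADOOP_HOME=/home/hadoop/hadoop2.7.3' >> ~/.bashrc
echo 'export PATH=$PATH:$HIVE_HOME/bin:$HADOOP_HOME/bin' >> ~/.bashrc
source ~/.bashrc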

2. Configure Hive for Hadoop HDFS
① Configure hive-site.xml
Go to $HIVE_HOME/conf and copy hive-default.xml.template to a new file named hive-site.xml:

# Enter the Hive configuration directory $HIVE_HOME/conf
 cd $HIVE_HOME/conf
# Copy and rename the template
 cp hive-default.xml.template hive-site.xml

Create the HDFS directories

Run the following Hadoop commands to create /user/hive/warehouse and /tmp/hive:

# Create the directory /user/hive/warehouse
 $HADOOP_HOME/bin/hdfs dfs -mkdir -p /user/hive/warehouse
# Grant read/write permissions on the new directory
 $HADOOP_HOME/bin/hdfs dfs -chmod 777 /user/hive/warehouse
# Check the updated permissions
 $HADOOP_HOME/bin/hdfs dfs -ls /user/hive
# Create the directory /tmp/hive
 $HADOOP_HOME/bin/hdfs dfs -mkdir -p /tmp/hive
# Grant read/write permissions on /tmp/hive
 $HADOOP_HOME/bin/hdfs dfs -chmod 777 /tmp/hive
# Verify the newly created directory
 $HADOOP_HOME/bin/hdfs dfs -ls /tmp

② Update the temporary directories in $HIVE_HOME/conf/hive-site.xml
Replace ${system:java.io.tmpdir} in hive-site.xml with a local Hive temp directory. First create that directory:

cd $HIVE_HOME
mkdir tmp

Then edit hive-site.xml:
Replace every occurrence of ${system:java.io.tmpdir} with /usr/local/hive/tmp.
Replace every occurrence of ${system:user.name} with root (for example with sed, as shown below).
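
A minimal sketch of doing both replacements with sed (assuming hive-site.xml lives in $HIVE_HOME/conf and the tmp directory created above):

# Substitute the placeholders in place; back up hive-site.xml first if you prefer
sed -i 's#${system:java.io.tmpdir}#/usr/local/hive/tmp#g' $HIVE_HOME/conf/hive-site.xml
sed -i 's#${system:user.name}#root#g' $HIVE_HOME/conf/hive-site.xml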

3. Copy the MySQL JDBC driver into Hive's lib directory
①

# Copy the driver jar
sudo cp mysql-connector-java-5.1.36.jar $HIVE_HOME/lib
# Verify that the jar is now under $HIVE_HOME/lib
ls -la $HIVE_HOME/lib/ | grep mysql

② Update the database settings in hive-site.xml
Search for javax.jdo.option.ConnectionURL and change its value to the MySQL JDBC URL:


<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.56.181:3306/hive?createDatabaseIfNotExist=true</value>
  <description>
    JDBC connect string for a JDBC metastore.
    To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
    For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
  </description>
</property>

Search for javax.jdo.option.ConnectionDriverName and change its value to the MySQL driver class:


<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>


Search for javax.jdo.option.ConnectionUserName and change its value to the MySQL login user name:


<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>Username to use against metastore database</description>
</property>

Search for javax.jdo.option.ConnectionPassword and change its value to the MySQL login password:


<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>Love88me</value>
  <description>password to use against metastore database</description>
</property>

Search for hive.metastore.schema.verification and change its value to false:


<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
  <description>
    Enforce metastore schema version consistency.
    True: Verify that version information stored in is compatible with one from Hive jars.  Also disable automatic
          schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures
          proper metastore schema migration. (Default)
    False: Warn if the version information stored in metastore doesn't match with one from in Hive jars.
  </description>
</property>

③ Create hive-env.sh in $HIVE_HOME/conf

# Enter the configuration directory
 cd $HIVE_HOME/conf
# Copy hive-env.sh.template and rename the copy to hive-env.sh
 cp hive-env.sh.template hive-env.sh
# Open hive-env.sh and add the following lines
 vim hive-env.sh
export HADOOP_HOME=/home/hadoop/hadoop2.7.3
export HIVE_CONF_DIR=/usr/local/hive/conf
export HIVE_AUX_JARS_PATH=/usr/local/hive/lib

Startup and Testing

1. Initialize the metastore database

# Enter $HIVE_HOME/bin
cd $HIVE_HOME/bin
# Initialize the metastore database:
schematool -initSchema -dbType mysql

After it completes successfully, check the MySQL database; the metastore tables should have been created.
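
For example (assuming the metastore database is named hive, as in the JDBC URL configured above; run this on the MySQL host, or add -h with its address):

# The metastore tables created by schematool should now be listed
mysql -u root -p -e "use hive; show tables;"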

2. Start Hive

 ./hive

Starting and Using Hive

Start Hadoop.
Initialize the metastore schema: schematool -dbType mysql -initSchema
Start Hive with the command hive (this opens the Hive shell).
Hive example: wordcount
1. Create a data source file and upload it to the HDFS directory /user/input;
2. Create the source table t1: create table t1 (line string);
3. Load the data: load data inpath '/user/input' overwrite into table t1;
4. Write the HiveQL that implements wordcount and store the result in a table wct1 (see the sketch after this list);
5. View the wordcount result: select * from wct1;
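
A sketch of steps 1 and 4 (the sample file name words.txt and the single-space delimiter are assumptions; adjust them to your data):

# Step 1: create a small data file and upload it to HDFS (run from the shell)
echo "hello hive hello hadoop" > words.txt
$HADOOP_HOME/bin/hdfs dfs -mkdir -p /user/input
$HADOOP_HOME/bin/hdfs dfs -put words.txt /user/input

-- Step 4: in the Hive shell, split each line into words, count them, and save the result in wct1
create table wct1 as
select word, count(1) as cnt
from (select explode(split(line, ' ')) as word from t1) w
group by word;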
