Setting up Hive 3.1.2 with Hadoop 3.2.1

Preface

I previously wrote a tutorial on setting up Hive 1.2.1; see https://blog.csdn.net/qq_45415730/article/details/106128376. This post covers setting up Hive on Hadoop 3.x. Many of the steps are the same as in that earlier tutorial, so you may want to read it first; here I focus on what differs between Hive 3.x and earlier versions. The embedded mode is rarely used, so I will not go over it again; we start directly with local mode. Remote mode is no different from earlier versions and is not covered here either.

I. Local mode

1. Edit hive-site.xml

<configuration>
	<property>
		<name>hive.metastore.warehouse.dir</name>
		<value>/user/hive_remote/warehouse</value>
	</property>
	<property>
		<name>hive.exec.scratchdir</name>
		<value>/tmp/hive</value>
		<description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
	</property>
	<property>
		<name>hive.exec.local.scratchdir</name>
		<value>/opt/software/hive/temp/root</value>
	</property>
	<property>
		<name>hive.downloaded.resources.dir</name>
		<value>/opt/software/hive/temp/${hive.session.id}_resources</value>
	</property>
	<property>
		<name>hive.server2.logging.operation.log.location</name>
		<value>/opt/software/hive/temp/root/operation_logs</value>
	</property>
	<property>
		<name>hive.querylog.location</name>
		<value>/opt/software/hive/temp/root</value>
	</property>
	<property>
		<name>hive.metastore.local</name>
		<value>true</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		<value>jdbc:mysql://localhost/hive_meta?createDatabaseIfNotExist=true</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionDriverName</name>
		<value>com.mysql.jdbc.Driver</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionUserName</name>
		<value>hive</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionPassword</name>
		<value>123</value>
	</property>
</configuration>

2. Edit hive-env.sh

export HADOOP_HOME=/opt/software/hadoop

export HIVE_CONF_DIR=/opt/software/hive/conf

export HIVE_AUX_JARS_PATH=/opt/software/hive/lib
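If hive-env.sh does not exist yet, it is usually created from the template that ships in the conf directory, for example:

# create hive-env.sh from the bundled template, then add the exports above
cp /opt/software/hive/conf/hive-env.sh.template /opt/software/hive/conf/hive-env.sh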

3. Copy the guava-xx.jar from hadoop/share/hadoop/common/lib into hive/lib, and delete the guava-xx.jar that ships with Hive. The Guava bundled with Hive 3.1.2 is older than the one bundled with Hadoop 3.2.1, and leaving the old jar in place typically causes a java.lang.NoSuchMethodError when Hive starts; see the sketch below.
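As a concrete example (the exact Guava version numbers are an assumption and may differ in your downloads, so check both lib directories first):

# remove Hive's bundled Guava (guava-19.0.jar in the stock Hive 3.1.2 tarball)
rm /opt/software/hive/lib/guava-19.0.jar
# copy in Hadoop's newer Guava (guava-27.0-jre.jar in the stock Hadoop 3.2.1 tarball)
cp /opt/software/hadoop/share/hadoop/common/lib/guava-27.0-jre.jar /opt/software/hive/lib/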

4. All done, go ahead and try it out!
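A minimal smoke test, assuming hive/bin is on the PATH and the MySQL JDBC driver jar has already been placed in hive/lib; note that Hive 3.x requires the metastore schema to be initialized once with schematool before first use:

# one-time initialization of the MySQL metastore schema
schematool -dbType mysql -initSchema
# start the Hive CLI and run a quick sanity check
hive -e "show databases;"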
