Hive: Setting a Custom Username and Password for Connections

Introduction

  • Once the Hive client is set up, applications that connect to Hive remotely need to supply a username and password;
  • Hive's default username and password are both empty, so we need to define our own;

Practice

  • First, use a Java development tool to build a jar containing a utility class that validates the username and password. You can download it directly: hiveAuth.jar;
  • Alternatively, you can build the jar yourself from the following code:
package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;

public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

	private Logger LOG = org.slf4j.LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

	// Per-user password key in hive-site.xml: hive.jdbc_passwd.auth.<username>
	private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

	private Configuration conf = null;

	@Override
	public void Authenticate(String userName, String passwd)
			throws AuthenticationException {
		LOG.info("user: " + userName + " try login.");
		String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
		if (passwdConf == null) {
			String message = "user's ACL configuration is not found. user:" + userName;
			LOG.info(message);
			throw new AuthenticationException(message);
		}
		if (!passwd.equals(passwdConf)) {
			String message = "username and password do not match. user:" + userName;
			throw new AuthenticationException(message);
		}
	}

	public Configuration getConf() {
		if (conf == null) {
			this.conf = new Configuration(new HiveConf());
		}
		return conf;
	}

	public void setConf(Configuration conf) {
		this.conf = conf;
	}
}
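The core of Authenticate() is just a keyed lookup plus a comparison, and that logic can be exercised on its own. Below is a minimal sketch in which a plain Map stands in for HiveConf; the class and method names are illustrative, not part of Hive:

```java
import java.util.HashMap;
import java.util.Map;

public class AuthCheckDemo {

    // Same per-user key pattern as in the authenticator above
    private static final String PREFIX = "hive.jdbc_passwd.auth.%s";

    // Mirrors the lookup in Authenticate(): build the per-user key,
    // fetch the expected password, and compare it with the one supplied.
    static boolean authenticate(Map<String, String> conf, String user, String passwd) {
        String expected = conf.get(String.format(PREFIX, user));
        return expected != null && expected.equals(passwd);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("hive.jdbc_passwd.auth.muzili", "muzili");
        System.out.println(authenticate(conf, "muzili", "muzili")); // true
        System.out.println(authenticate(conf, "muzili", "wrong"));  // false: wrong password
        System.out.println(authenticate(conf, "nobody", "muzili")); // false: no ACL entry
    }
}
```

The real provider throws AuthenticationException instead of returning false, which lets HiveServer2 report a distinct message for "no ACL entry" versus "wrong password".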
  • Next, place the jar in the lib directory under the Hive root, and edit hive-site.xml under conf to add the following properties (the last property defines user muzili with password muzili):


<property>
	<name>hive.server2.authentication</name>
	<value>CUSTOM</value>
</property>

<property>
	<name>hive.server2.custom.authentication.class</name>
	<value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>

<property>
	<name>hive.jdbc_passwd.auth.muzili</name>
	<value>muzili</value>
</property>
  • Finally, you also need to edit Hadoop's core-site.xml (in the configuration directory hadoop/etc/hadoop), otherwise Java clients will be denied access to Hive. The second "hadoop" in the property names below is the user that runs HiveServer2; change it if yours differs:

<property>
	<name>hadoop.proxyuser.hadoop.hosts</name>
	<value>*</value>
</property>

<property>
	<name>hadoop.proxyuser.hadoop.groups</name>
	<value>*</value>
</property>
  • Restart Hadoop and Hive. You can test with the beeline command; here we test with a Java client instead;

1. Add the hive-jdbc dependency to the pom


<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-jdbc</artifactId>
	<version>2.1.1</version>
	<exclusions>
		<exclusion>
			<groupId>org.eclipse.jetty.aggregate</groupId>
			<artifactId>*</artifactId>
		</exclusion>
	</exclusions>
</dependency>

2. Create a main method and test the connection

package com.springboot.sixmonth.common.util;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/**
 * Hive connection test class
 * @author muzili
 * @Date 2021-09-03
 */
public class HiveTest {

	// 9019 is a custom remote port; the default is 10000
	private static final String URLHIVE = "jdbc:hive2://47.100.200.200:9019/default";
	private static Connection connection = null;

	public static Connection getHiveConnection() {
		if (null == connection) {
			synchronized (HiveTest.class) {
				if (null == connection) {
					try {
						Class.forName("org.apache.hive.jdbc.HiveDriver");
						connection = DriverManager.getConnection(URLHIVE, "muzili", "muzili");
						System.out.println("Hive connection established!");
					} catch (SQLException e) {
						e.printStackTrace();
					} catch (ClassNotFoundException e) {
						e.printStackTrace();
					}
				}
			}
		}
		return connection;
	}

	public static void main(String[] args) throws SQLException {

		String sql1 = "select * from sixmonth limit 1";
		PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
		ResultSet rs = pstm.executeQuery(sql1);

		while (rs.next()) {
			System.out.println(rs.getString(2));
		}
		// Close the ResultSet before the statement that produced it
		rs.close();
		pstm.close();
	}
}

3. Once it runs successfully, you can read the data in Hive

Summary

  1. Development tools can connect to Hive remotely on a custom port (the default is 10000); on an Alibaba Cloud server you also need to open that port in the security group;
  2. Practice is the sole criterion for testing truth. Do it yourself and you will reap the rewards. Keep at it!
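To make point 1 concrete, the URL hard-coded in HiveTest can be assembled by a small helper that falls back to HiveServer2's default port 10000 when no custom port is configured. The class and method names below are my own, not from the setup above:

```java
public class HiveUrlBuilder {

    // HiveServer2's default Thrift port
    static final int DEFAULT_HIVE_PORT = 10000;

    // Builds a HiveServer2 JDBC URL; any port <= 0 means "use the default".
    static String hiveUrl(String host, int port, String database) {
        int p = port > 0 ? port : DEFAULT_HIVE_PORT;
        return "jdbc:hive2://" + host + ":" + p + "/" + database;
    }

    public static void main(String[] args) {
        // Custom port, as used in HiveTest above
        System.out.println(hiveUrl("47.100.200.200", 9019, "default")); // jdbc:hive2://47.100.200.200:9019/default
        // No custom port: falls back to 10000
        System.out.println(hiveUrl("localhost", 0, "default"));         // jdbc:hive2://localhost:10000/default
    }
}
```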
