Hive with Kerberos enabled: connecting with beeline

1. Install Kerberos

For Kerberos installation, configuration, and usage, see: https://blog.csdn.net/qq_21383435/article/details/83625252

2. Generate the keytab

On the cdh1 node, i.e. the KDC server node, run the following commands:

cd /var/kerberos/krb5kdc/
 
kadmin.local -q "addprinc -randkey hive/[email protected]"
kadmin.local -q "addprinc -randkey hive/[email protected]"
kadmin.local -q "addprinc -randkey hive/[email protected]"

kadmin.local -q "xst -k hive.keytab hive/[email protected]"
kadmin.local -q "xst -k hive.keytab hive/[email protected]"
kadmin.local -q "xst -k hive.keytab hive/[email protected]"
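To confirm that all three principals and their keys actually landed in the keytab, a quick check (assuming the MIT Kerberos client tools are installed) is:

# list the principals and key versions stored in the generated keytab
klist -kt /var/kerberos/krb5kdc/hive.keytab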

Copy the hive.keytab file to the /etc/hive/conf directory on the other nodes:

$ scp hive.keytab cdh1:/etc/hive/conf
$ scp hive.keytab cdh2:/etc/hive/conf
$ scp hive.keytab cdh3:/etc/hive/conf

Then set the permissions by running the following for cdh1, cdh2, and cdh3:

$ ssh cdh1 "cd /etc/hive/conf/;chown hive:hadoop hive.keytab ;chmod 400 *.keytab"
$ ssh cdh2 "cd /etc/hive/conf/;chown hive:hadoop hive.keytab ;chmod 400 *.keytab"
$ ssh cdh3 "cd /etc/hive/conf/;chown hive:hadoop hive.keytab ;chmod 400 *.keytab"

Because a keytab is effectively a permanent credential that requires no password (if the principal's password is changed in the KDC, the keytab becomes invalid), any other user with read access to the file could impersonate the user named in the keytab when accessing Hadoop. The keytab file must therefore be readable only by its owner (mode 0400).
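A quick sanity check that the permissions are correct and that the keytab really grants a ticket without a password (a minimal sketch; run it on cdh1 and substitute the matching host principal on cdh2/cdh3):

ls -l /etc/hive/conf/hive.keytab        # should show hive:hadoop and -r--------
kinit -kt /etc/hive/conf/hive.keytab hive/[email protected]
klist                                    # the ticket cache should now hold a TGT for the hive principal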

3. Configure hive-site.xml


<property>
  <name>hive.server2.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/[email protected]</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/[email protected]</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
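After editing hive-site.xml, the metastore and HiveServer2 must be restarted for the Kerberos settings to take effect. A sketch assuming a CDH package install with init scripts; if you manage Hive differently (e.g. through Cloudera Manager), restart the services there instead:

sudo service hive-metastore restart
sudo service hive-server2 restart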

Finally, the beeline connection statement:

!connect jdbc:hive2://localhost:10001/default;principal=my_hive/[email protected] mysql_user mysql_passwd
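With Kerberos authentication, the principal in the JDBC URL is HiveServer2's own service principal, and the client identity comes from the Kerberos ticket cache rather than from a username/password. A minimal sketch matching the configuration above (assuming HiveServer2 runs on cdh1 on the default port 10000; adjust host, port, and principal to your deployment):

# obtain a ticket first, then connect
kinit -kt /etc/hive/conf/hive.keytab hive/[email protected]
beeline -u "jdbc:hive2://cdh1:10000/default;principal=hive/[email protected]"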

Add the following to core-site.xml:


<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.groups</name>
  <value>*</value>
</property>
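Changes to the hadoop.proxyuser.* settings normally require restarting the NameNode and ResourceManager, but they can also be reloaded on a running cluster; a sketch, assuming you run it as the HDFS superuser:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration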


Reference: https://blog.csdn.net/a118170653/article/details/43448133
