Explained: HiveServer2 JDBC connection error "Could not open client transport with JDBC Uri" and its solution


 [hadoop@hadoop001 bin]$ ./beeline -u jdbc:hive2://hadoop001:10000/default -n hadoop
ls: cannot access /home/hadoop/app/spark-2.4.0-bin-2.6.0-cdh5.7.0/lib/spark-assembly-*.jar: No such file or directory
which: no hbase in (/home/hadoop/app/sqoop-1.4.6-cdh5.7.0/bin:/home/hadoop/app/spark-2.4.0-bin-2.6.0-cdh5.7.0/bin:/home/hadoop/app/scala-2.11.8/bin:/home/hadoop/app/hive-1.1.0-cdh5.7.0/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin:/usr/java/jdk1.8.0_45/bin:/opt/software/protobuf/bin:/bin:/opt/software/apache-maven-3.3.9/bin:/usr/java/jdk1.8.0_45/bin:/opt/software/apache-maven-3.3.9/bin:/usr/java/jdk1.8.0_45/bin:/usr/java/jdk1.8.0_45/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
scan complete in 2ms
Connecting to jdbc:hive2://localhost:10000/default
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (state=08S01,code=0)
Beeline version 1.1.0-cdh5.7.0 by Apache Hive
0: jdbc:hive2://localhost:10000/default (closed)> 
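
A side note: "Connection refused" can also simply mean that HiveServer2 is not running or not yet listening on port 10000, so it is worth ruling that out before touching any configuration. A quick check, assuming the paths from the session above:

# is anything listening on HiveServer2's default port?
netstat -nltp | grep 10000

# if not, start HiveServer2 first (it runs in the foreground)
cd /home/hadoop/app/hive-1.1.0-cdh5.7.0/bin
./hiveserver2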


Solution:

Cause: HiveServer2 enforces permission control (user impersonation), and the corresponding settings must be added to Hadoop's configuration file.

Fix: add the following to Hadoop's core-site.xml, restart Hadoop, then connect with beeline again.

Official reference:

https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html

[hadoop@hadoop001 hadoop]$ pwd

/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop
[hadoop@hadoop001 hadoop]$ vi core-site.xml

<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>

(The "hadoop" segment in the property names is the superuser that HiveServer2 runs as; hosts controls which hosts it may connect from, and groups controls whose members it may impersonate.)
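
After saving the file (and refreshing or restarting the daemons, see below), the value can be read back with hdfs getconf to confirm the property is visible; a quick check, assuming the values above:

hdfs getconf -confKey hadoop.proxyuser.hadoop.hosts
# should print the configured value, i.e. * here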

Additional notes:

For each superuser, hosts must be configured, while at least one of groups and users must be configured.

All of these properties accept * as a value, meaning any host / any group / any user is allowed.

For example:

<property>
  <name>hadoop.proxyuser.userA.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.userA.users</name>
  <value>user1,user2</value>
</property>

This allows user userA, from any host, to impersonate the users user1 and user2.
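
To actually connect as one of the proxied users, the Hive JDBC URL accepts a hive.server2.proxy.user session parameter (available since Hive 0.13). A sketch based on the userA example above, authenticating as userA while impersonating user1:

./beeline -u "jdbc:hive2://hadoop001:10000/default;hive.server2.proxy.user=user1" -n userA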

Changes to the proxy-user permissions are made in core-site.xml. Unlike fair-scheduler.xml, whose changes are picked up automatically, this file is not reloaded by the servers on its own. After editing it, run the following commands to push the updated settings to the NameNode and the ResourceManager respectively:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration

yarn rmadmin -refreshSuperUserGroupsConfiguration

Once the configuration is in place, restart Hadoop:

[hadoop@hadoop001 sbin]$ pwd
/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/sbin
[hadoop@hadoop001 sbin]$ ./stop-all.sh

[hadoop@hadoop001 sbin]$ ./start-all.sh

Now connecting again works fine:

[hadoop@hadoop001 bin]$ pwd
/home/hadoop/app/hive-1.1.0-cdh5.7.0/bin
[hadoop@hadoop001 bin]$ ./beeline -u jdbc:hive2://hadoop001:10000/default -n hadoop
ls: cannot access /home/hadoop/app/spark-2.4.0-bin-2.6.0-cdh5.7.0/lib/spark-assembly-*.jar: No such file or directory
which: no hbase in (/home/hadoop/app/sqoop-1.4.6-cdh5.7.0/bin:/home/hadoop/app/spark-2.4.0-bin-2.6.0-cdh5.7.0/bin:/home/hadoop/app/scala-2.11.8/bin:/home/hadoop/app/hive-1.1.0-cdh5.7.0/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin:/usr/java/jdk1.8.0_45/bin:/opt/software/protobuf/bin:/bin:/opt/software/apache-maven-3.3.9/bin:/usr/java/jdk1.8.0_45/bin:/opt/software/apache-maven-3.3.9/bin:/usr/java/jdk1.8.0_45/bin:/usr/java/jdk1.8.0_45/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
scan complete in 2ms
Connecting to jdbc:hive2://hadoop001:10000/default
Connected to: Apache Hive (version 1.1.0-cdh5.7.0)
Driver: Hive JDBC (version 1.1.0-cdh5.7.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.7.0 by Apache Hive
0: jdbc:hive2://hadoop001:10000/default>
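
As a final sanity check, run a trivial statement from the new session to confirm the server is answering queries, e.g.:

show databases;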
