Pitfalls in the Hive2 Getting Started guide

I recently started learning Hive. While following the official guide at https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-RunningHiveServer2andBeeline, I hit a few pitfalls when starting HiveServer2. Here are my notes.


Exception 1: User: *** is not allowed to impersonate anonymous (state=08S01,code=0)

>beeline -u jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
18/01/25 09:45:37 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: *** is not allowed to impersonate anonymous (state=08S01,code=0)
Beeline version 2.3.2 by Apache Hive
The simplest fix for this exception is to edit $HADOOP_HOME/etc/hadoop/core-site.xml and add:


  <property>
    <name>hadoop.proxyuser.${username}.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.${username}.groups</name>
    <value>*</value>
  </property>
Replace ${username} with the username from your error message.
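After editing core-site.xml, the running daemons need to pick up the new proxyuser settings. A minimal sketch, assuming HADOOP_HOME is set and the cluster is running (fall back to a restart if the refresh commands are not available in your setup):

```shell
# Reload proxyuser settings on a running NameNode without a restart.
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
# In a YARN cluster, the ResourceManager caches the same settings:
yarn rmadmin -refreshSuperUserGroupsConfiguration
# If the refresh commands are unavailable, restart HDFS instead:
# $HADOOP_HOME/sbin/stop-dfs.sh && $HADOOP_HOME/sbin/start-dfs.sh
```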

This simply lets every group and every host access Hadoop through that proxy user, which some readers may find too permissive.

If you want a stricter configuration, two properties let you narrow it down:

hadoop.proxyuser.${username}.users for specific usernames
hadoop.proxyuser.${username}.groups for specific user groups

In the error message above, anonymous is the username, so the strict configuration is:


  <property>
    <name>hadoop.proxyuser.someone.users</name>
    <value>hadoop,hive,anonymous,someone</value>
    <description>Allow the superuser someone to impersonate the users hadoop, hive, anonymous</description>
  </property>

  <property>
    <name>hadoop.proxyuser.someone.hosts</name>
    <value>172.29.6.3,localhost,127.0.0.1</value>
    <description>The superuser can connect only from these hosts to impersonate a user</description>
  </property>


Exception 2: Connection Refused

Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

In my case, this exception appeared because the client's address (the external IP and localhost) was not listed in the value of the following property:

hadoop.proxyuser.adorechen.hosts
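A Connection refused often simply means nothing is listening on the target port at all, so before touching the config it is worth confirming that HiveServer2 is up. A quick check, assuming a Linux host and the default Thrift port 10000 (adjust if you changed hive.server2.thrift.port):

```shell
# Look for a listener on port 10000; print a hint if there is none.
ss -tln 2>/dev/null | grep ':10000' \
  || echo "nothing listening on 10000 - start hiveserver2 first"
```

If nothing is listening, start the server with $HIVE_HOME/bin/hiveserver2 and retry the beeline connection.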


Exception 3: Permission denied: user=anonymous, access=WRITE

Error: org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=anonymous, access=WRITE, inode="/user/hive/warehouse":someone:somegroup:drwxrwxr-x
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)


This error occurs because no username was given on the Beeline connection, so the default anonymous was used. The hint in the message,

inode="/user/hive/warehouse":someone:somegroup:drwxrwxr-x

shows that writing to this path requires username=someone (or a user belonging to somegroup), so connect as that user instead (and make sure this username is configured as a proxy user in your Hadoop core-site.xml):

beeline -u jdbc:hive2://localhost:10000 -n someone
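To find out which username to pass to -n on your own cluster, you can inspect the owner of the warehouse directory directly (a cluster-side command; the third and fourth columns of the listing are the owner and group):

```shell
# Show ownership of the warehouse directory itself (-d: do not descend).
hdfs dfs -ls -d /user/hive/warehouse
```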



References:


https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html
http://mangocool.com/1461549960187.html
