Analyzing a Hive "create temporary function" Exception

A Java project uses a JDBC connection pool to connect to HiveServer2 and create a temporary function. The first execution succeeds, but the second one fails with the following error:

ERROR org.apache.hadoop.hive.ql.Driver: [HiveServer2-Handler-Pool: Thread-82]: FAILED: SemanticException No valid privileges
 User dm_ds does not have privileges for CREATEFUNCTION
 The required privileges: Server=server1->URI=file:///tmp/023d5b17-76ff-48c4-b15b-6d9aca656892_resources/hive-udf-1.1.0.jar->action=*;
org.apache.hadoop.hive.ql.parse.SemanticException: No valid privileges
 User dm_ds does not have privileges for CREATEFUNCTION
 The required privileges: Server=server1->URI=file:///tmp/023d5b17-76ff-48c4-b15b-6d9aca656892_resources/hive-udf-1.1.0.jar->action=*;
    at org.apache.sentry.binding.hive.HiveAuthzBindingHook.postAnalyze(HiveAuthzBindingHook.java:573)
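
For reference, here is a minimal sketch of the client-side flow described above. The pool implementation (HikariCP), the JDBC URL, and the Kerberos principal are assumptions for illustration only; the CREATE statement is the one quoted in the analysis below.

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;
    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    public class CreateTempFunctionRepro {
        // Placeholder connection settings -- the original project's JDBC URL and security setup are not given.
        private static final String JDBC_URL =
                "jdbc:hive2://hiveserver2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM";
        // The statement from the analysis below, verbatim.
        private static final String CREATE_FUNC_SQL =
                "create temporary function decrypty as 'com.lkl.data.scrt.DeCryptyUDF' "
                + "using jar 'hdfs:///xxx/xxx.jar'";

        public static void main(String[] args) {
            HikariConfig config = new HikariConfig();
            config.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
            config.setJdbcUrl(JDBC_URL);
            config.setMaximumPoolSize(1); // one pooled connection, so both attempts reuse the same HiveServer2 session

            try (HikariDataSource ds = new HikariDataSource(config)) {
                for (int attempt = 1; attempt <= 2; attempt++) {
                    try (Connection conn = ds.getConnection();
                         Statement stmt = conn.createStatement()) {
                        stmt.execute(CREATE_FUNC_SQL);
                        System.out.println("attempt " + attempt + " succeeded");
                    } catch (SQLException e) {
                        // With Sentry enabled, the second attempt surfaces the SemanticException shown above.
                        System.out.println("attempt " + attempt + " failed: " + e.getMessage());
                    }
                }
            }
        }
    }

Because the pool keeps the connection alive, both statements run in the same HiveServer2 session, which is exactly what triggers the behaviour analysed below.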

The error says that Sentry has not granted privileges on the jar under the local /tmp directory. The statement that creates the temporary function is:

create temporary function decrypty as 'com.lkl.data.scrt.DeCryptyUDF' using jar 'hdfs:///xxx/xxx.jar'

It references a jar on HDFS, and that HDFS path has already been granted, so why would the local directory matter at all? Enable DEBUG logging on the HiveServer2 side, rerun the program, and analyze the log. When the SQL executes, Hive first creates a local temporary directory:

org.apache.hadoop.hive.ql.session.SessionState: [HiveServer2-Handler-Pool: Thread-177]: Created local directory: /tmp/c6450358-bf9d-4a44-984d-52cdae6985c8_resources

then downloads the function jar from HDFS into that local path and adds it to the class path:

Added [/tmp/c6450358-bf9d-4a44-984d-52cdae6985c8_resources/hive-udf-1.1.0.jar] to class path

On the first execution, the statement is checked by Sentry ("Going to authorize statement CREATEFUNCTION for subject xxx"):
2019-05-17 13:43:29,405 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: requiredInputPrivileges = {URI=[ALL]}
2019-05-17 13:43:29,405 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: inputHierarchyList = [[Server [name=server1], URI [name=hdfs:///xxxx/hive-udf-1.1.0.jar]]]
2019-05-17 13:43:29,405 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: requiredOuputPrivileges = {URI=[ALL]}

Here inputHierarchyList contains the HDFS path. On the second execution, after "Going to authorize statement CREATEFUNCTION for subject xxx", the log shows:
2019-05-17 13:44:43,148 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: requiredInputPrivileges = {URI=[ALL]}
2019-05-17 13:44:43,148 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: inputHierarchyList = [[Server [name=server1], URI [name=file:///tmp/c6450358-bf9d-4a44-984d-52cdae6985c8_resources/hive-udf-1.1.0.jar]]]
2019-05-17 13:44:43,148 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: requiredOuputPrivileges = {URI=[ALL]}
2019-05-17 13:44:43,148 DEBUG org.apache.sentry.binding.hive.authz.HiveAuthzBinding: [HiveServer2-Handler-Pool: Thread-177]: outputHierarchyList = [[Server [name=server1]], [Server [name=server1], URI [name=hdfs://

This time inputHierarchyList has switched to the local directory: the jar was already downloaded into the session's local resource cache, so on the second run HiveServer2 resolves the function jar from that local path, and the Sentry check is performed against the file:// URI, which has no grant, so it fails.

Fix: grant Sentry privileges on the local temporary directory.
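
One way to apply that grant is sketched below. This is illustrative only: the role name (dm_ds_role), the JDBC URL, and the choice of granting on file:///tmp (the parent of the per-session *_resources directories, whose location is controlled by hive.downloaded.resources.dir) are all assumptions for this environment. The same GRANT can equally be issued through Beeline by a Sentry admin.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class GrantLocalResourceUri {
        public static void main(String[] args) throws Exception {
            // Connect as a user whose role has Sentry admin privileges; URL and role name are placeholders.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://hiveserver2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM";
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement()) {
                // Grant the URI privilege on the local directory that HiveServer2 downloads UDF jars into
                // (per-session subdirectories under /tmp by default). Sentry URI privileges also cover child
                // paths, so the second CREATEFUNCTION check against file:///tmp/..._resources/... should pass.
                stmt.execute("GRANT ALL ON URI 'file:///tmp' TO ROLE dm_ds_role");
            }
        }
    }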
