Flink Troubleshooting (5): How to Read Kerberos-Authenticated Hadoop Data

Problem: How can Flink 1.8 read data from Kerberos-authenticated HDFS? Without any Kerberos setup, the job fails with:

org.apache.hadoop.security.AccessControlException: 
SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS] 

Solution: The exception means the client fell back to SIMPLE authentication, which the secured cluster rejects (it only accepts TOKEN or KERBEROS), so Flink must be configured to log in via Kerberos:

1. export HADOOP_CONF_DIR={directory containing core-site.xml and hdfs-site.xml}
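
For example, on a typical installation (the path below is illustrative, not from the original post):

export HADOOP_CONF_DIR=/etc/hadoop/conf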

2. Configure flink-conf.yaml:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: 
security.kerberos.login.principal: 
env.java.opts: -Djava.security.krb5.conf=
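
For illustration, a filled-in version could look like the following; the keytab path, principal, and krb5.conf location are placeholders for your own environment:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /etc/security/keytabs/flink.keytab
security.kerberos.login.principal: flink/worker1@EXAMPLE.COM
env.java.opts: -Djava.security.krb5.conf=/etc/krb5.conf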

3. Add the pre-bundled Hadoop uber jar matching your Hadoop version to Flink's lib directory; downloads are at https://flink.apache.org/downloads.html

4. The Flink project's build script must include the following dependencies (shown in Gradle syntax; a minimal job sketch follows below):

compile "org.apache.flink:flink-java:$flinkVersion"
compile "org.apache.flink:flink-clients_2.11:$flinkVersion"
compile 'org.apache.hadoop:hadoop-hdfs:$hadoopVersion'
compile 'org.apache.hadoop:hadoop-client:$hadoopVersion'
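
With the configuration above, Flink's security module performs the Kerberos login at startup, so the job code reads HDFS as usual. Below is a minimal sketch using the Flink 1.8 DataSet API; the namenode host and file path are placeholders, not values from the original post:

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class ReadSecureHdfsJob {
    public static void main(String[] args) throws Exception {
        // Kerberos credentials come from flink-conf.yaml
        // (security.kerberos.login.*); no explicit login code is needed here.
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Placeholder HDFS URI; replace with your namenode and path.
        DataSet<String> lines =
                env.readTextFile("hdfs://namenode.example.com:8020/data/input.txt");

        // print() triggers execution and prints the result records.
        lines.first(10).print();
    }
}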

Reference:

https://stackoverflow.com/questions/34596165/how-to-do-kerberos-authentication-on-a-flink-standalone-installation
