Accessing HDFS throws AccessControlException

Accessing HDFS fails with: org.apache.hadoop.security.AccessControlException: Permission denied

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.util.Properties;

public class demo01 {
    @Test
    public void mkdir01() throws Exception {
        Configuration configuration = new Configuration();
        // Point the client at the NameNode
        configuration.set("fs.defaultFS", "hdfs://hadoop101:9000");
        FileSystem fs = FileSystem.get(configuration);
        // Fails: the client authenticates as the local OS user (here "IBM"),
        // which has no write permission under "/" (owned by root:supergroup)
        boolean mkdirs = fs.mkdirs(new Path("/0829"));
        System.out.println(mkdirs);
        fs.close();
    }
}

org.apache.hadoop.security.AccessControlException: Permission denied: user=IBM, access=WRITE, inode="/0829":root:supergroup:drwxr-xr-x
When accessing HDFS, Hadoop performs a permission check. The username is resolved in this order:

1. Read the HADOOP_USER_NAME environment variable; if it is non-empty, use it as the username. Otherwise,

2. read the HADOOP_USER_NAME Java system property; if that is also empty,

3. get the username from an instance of com.sun.security.auth.NTUserPrincipal (Windows) or com.sun.security.auth.UnixPrincipal (Unix).

4. If all of the above fail, a LoginException("Can't find user name") is thrown.
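Steps 1 and 2 of this lookup can be sketched without a cluster. The snippet below (an illustration only, not Hadoop's actual code; the class name is made up) checks the same two sources in the same order Hadoop consults them, which is why setting the system property in code fixes the error:

```java
// Minimal sketch of lookup steps 1 and 2 (no Hadoop cluster needed):
// shows the two sources Hadoop consults, not Hadoop itself.
public class UserNameLookupDemo {
    public static void main(String[] args) {
        // Step 1: the HADOOP_USER_NAME environment variable wins if set
        String user = System.getenv("HADOOP_USER_NAME");
        // Step 2: otherwise fall back to the Java system property
        if (user == null || user.isEmpty()) {
            System.setProperty("HADOOP_USER_NAME", "root");
            user = System.getProperty("HADOOP_USER_NAME");
        }
        System.out.println(user);
    }
}
```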

 

Solution: set the HADOOP_USER_NAME Java system property to a user that has write permission on the target directory (here root, the owner of "/") before creating the FileSystem.

public class demo01 {
    @Test
    public void mkdir01() throws Exception {
        // Set the HADOOP_USER_NAME Java system property before the
        // FileSystem is created, so the permission check runs as root
        Properties properties = System.getProperties();
        properties.setProperty("HADOOP_USER_NAME", "root");

        Configuration configuration = new Configuration();
        configuration.set("fs.defaultFS", "hdfs://hadoop101:9000");
        FileSystem fs = FileSystem.get(configuration);
        boolean mkdirs = fs.mkdirs(new Path("/0828"));
        System.out.println(mkdirs);
        fs.close();
    }
}
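Setting the property in code is not the only option. A few equivalent fixes, sketched below under the assumption of a Unix shell (and, for the last one, a cluster account with superuser rights):

```shell
# Option 1: set the environment variable before launching the JVM
# (step 1 of the lookup, takes precedence over the system property)
export HADOOP_USER_NAME=root

# Option 2: pass the system property on the java command line instead of
# hard-coding it in the test:
#   java -DHADOOP_USER_NAME=root -cp <classpath> ...

# Option 3: on the cluster, change the directory's owner or permissions so
# the client user (here "IBM") is allowed to write:
#   hdfs dfs -chown IBM /0829
#   hdfs dfs -chmod 775 /0829
echo "$HADOOP_USER_NAME"
```

Hadoop also provides `FileSystem.get(URI uri, Configuration conf, String user)`, which passes the username directly when creating the client. Note that impersonating root this way only works because simple authentication trusts the client-supplied name; on a Kerberos-secured cluster these tricks do not apply.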
