Permission denied when running a job remotely from Eclipse on Windows

A while ago I set up a Hadoop development environment on Windows. Today, when submitting a job to the cluster remotely, the following exception was thrown:

14/02/26 10:33:52 WARN mapred.LocalJobRunner: job_local_0001
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=weixiang, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1428)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:332)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1126)
    at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:52)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:186)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=weixiang, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:180)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5468)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5442)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2209)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2178)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:857)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at $Proxy1.mkdirs(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at $Proxy1.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1426)
    ... 4 more

14/02/26 10:33:53 INFO mapred.JobClient:  map 0% reduce 0%
14/02/26 10:33:53 INFO mapred.JobClient: Job complete: job_local_0001
14/02/26 10:33:53 INFO mapred.JobClient: Counters: 0


The trace shows this is a permission problem: the client identifies itself to HDFS as the local Windows login user (weixiang), but the target directory is owned by root:supergroup with mode rwxr-xr-x, so only root may write to it. After searching online, I found a workaround that tested OK.
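Before changing any configuration, it can help to confirm which HDFS path is rejecting the write and which user the client is running as. A minimal sketch, assuming the `hadoop` client is on the PATH and configured to point at the cluster (the commands below are for inspection only and are cluster-dependent):

```shell
# List ownership and mode of the HDFS root. The error above shows
# inode="":root:supergroup:rwxr-xr-x, i.e. only "root" may write under "/".
hadoop fs -ls /

# On Windows, the Hadoop client typically picks up the local login name
# (here "weixiang") as the HDFS user; check what that name is.
whoami
```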

Solution: add the following property to the hdfs-site.xml file on the Hadoop cluster.

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

 

Then restart the Hadoop cluster and the job runs normally.
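The property change only takes effect after the NameNode re-reads its configuration, hence the restart. A sketch of the restart on a Hadoop 1.x cluster, assuming the stock control scripts in $HADOOP_HOME/bin are used (run on the master node):

```shell
# Stop and start all daemons (HDFS and MapReduce); these scripts
# ship with Hadoop 1.x.
stop-all.sh
start-all.sh
```

Note that dfs.permissions=false disables HDFS permission checking for the whole cluster, which is fine for a development setup but not for production. A narrower alternative is to grant the submitting user access to just the directories the job writes, e.g. `hadoop fs -chown -R weixiang /user/weixiang` (path illustrative), leaving permission checking on.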


