Error when running hadoop dfs -put

[root@hadoop1 Desktop]# hadoop dfs -put /home/hadoop/word.txt /tmp/wordcount/word5.txt
The command fails with the following error:
13/05/02 18:11:26 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadoop123/person.txt could only be replicated to 0 nodes, instead of 1
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1576)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:771)
    at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1439)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1435)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1433)

    at org.apache.hadoop.ipc.Client.call(Client.java:1150)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy0.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3773)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3640)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2400(DFSClient.java:2846)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:3041)

13/05/02 18:11:26 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/05/02 18:11:26 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hadoop/hadoop123/person.txt" - Aborting...
put: java.io.IOException: File /user/hadoop/hadoop123/person.txt could only be replicated to 0 nodes, instead of 1
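
The message "could only be replicated to 0 nodes, instead of 1" means the NameNode has no live DataNode to place the block on; this usually happens when the DataNode daemon is not running or never registered with the NameNode. A quick way to confirm this (a suggested check, not part of the original post) is:

# hadoop dfsadmin -report

If the report shows "Datanodes available: 0", no DataNode is currently serving the cluster and the fix below applies.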

Solution:

The current workaround is to start each daemon separately with # hadoop-daemon.sh start namenode and # hadoop-daemon.sh start datanode, as described in the two steps below.

1. Restart the namenode:

# hadoop-daemon.sh start namenode
starting namenode, logging to /usr/hadoop-0.21.0/bin/../logs/hadoop-root-namenode-www.keli.com.out
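
If the upload still fails after this step, check the log file referenced in the message above. A quick way to confirm that the NameNode process actually stayed up (an illustrative check, not part of the original steps) is:

# jps | grep NameNode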

2. Restart the datanode:

# hadoop-daemon.sh start datanode
starting datanode, logging to /usr/hadoop-0.21.0/bin/../logs/hadoop-root-datanode-www.keli.com.out
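
Once both daemons are up, it is worth re-running the report to make sure at least one DataNode is live and then retrying the upload. These follow-up commands are an illustrative sketch that simply reuses the paths from the failing command above; they are not part of the original write-up:

# hadoop dfsadmin -report
# hadoop dfs -put /home/hadoop/word.txt /tmp/wordcount/word5.txt
# hadoop dfs -ls /tmp/wordcount

The report should now show at least one available DataNode, and the -ls output should list word5.txt if the put succeeded.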
