HDFS Error: Unable to Resolve the Host

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    import java.net.URI;

    public class HdfsUploadTest {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Connect to DataNodes by hostname instead of the internal IP the NameNode reports
            conf.set("dfs.client.use.datanode.hostname", "true");
            FileSystem fileSystem = FileSystem.get(new URI("hdfs://xxx:9000/"), conf, "root");

            // Local file to upload and its destination path on HDFS
            Path src = new Path("C:\\Users\\Administrator\\Downloads\\idman633.exe");
            Path path = new Path("hdfs://xxx:9000/name");

            fileSystem.copyFromLocalFile(src, path);
            fileSystem.close();
        }
    }

Running it like this produces the following error:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

java.io.IOException: java.nio.channels.UnresolvedAddressException

at org.apache.hadoop.hdfs.DataStreamer$LastExceptionInStreamer.set(DataStreamer.java:299)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:820)
Caused by: java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Net.java:101)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)


Process finished with exit code -1

I spent a long time on this and asked quite a few people; in the end it was solved over a remote session. It was not a firewall problem, and the Hadoop configuration was not wrong either. The cause: when the cluster runs on a cloud server, you have to add a hostname mapping on the Windows client, i.e. an entry in C:\Windows\System32\drivers\etc\hosts:


129.xxx.xxx.xxx  bigdata
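
To confirm the mapping works before re-running the upload, it helps to check that the hostname actually resolves from the Windows machine (running "ping bigdata" in a command prompt does the same). Below is a minimal sketch, assuming the bigdata hostname from the hosts entry above:

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    public class ResolveCheck {
        public static void main(String[] args) {
            try {
                // Resolves through the local hosts file / DNS, just like the HDFS client does
                InetAddress addr = InetAddress.getByName("bigdata");
                System.out.println("bigdata resolves to " + addr.getHostAddress());
            } catch (UnknownHostException e) {
                System.out.println("bigdata does not resolve; check the hosts entry");
            }
        }
    }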

That was the fix. A really nasty pitfall; configuration work is the part I dislike the most.

In addition, when communicating with the remote host, one extra parameter needs to be set on the client, as sketched below.
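
Judging from the code at the top of the post, that extra parameter is most likely dfs.client.use.datanode.hostname: without it the client tries to reach the DataNodes through the internal addresses the NameNode hands back, which a machine outside the cloud network cannot connect to. A minimal client-side sketch (the xxx placeholder and the root user are taken from the code above):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    import java.net.URI;

    public class ClientHostnameConf {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Address DataNodes by hostname (resolvable thanks to the hosts entry above)
            // rather than by the internal IP reported by the NameNode.
            conf.set("dfs.client.use.datanode.hostname", "true");
            FileSystem fs = FileSystem.get(new URI("hdfs://xxx:9000/"), conf, "root");
            System.out.println("connected to " + fs.getUri());
            fs.close();
        }
    }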


All in all, big data still has plenty of pitfalls. Keep going!
