HADOOP Study Notes — Uploading a Local File to HDFS with the Java API

HDFS API in detail: https://www.cnblogs.com/alisande/archive/2012/06/06/2537903.html
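
To give a feel for the kind of calls the linked API guide walks through, here is a minimal sketch of a few basic FileSystem operations (exists, mkdirs, delete). It assumes the same hdfs://hadoop:9000 NameNode address used in the upload code further down; the /demo paths are made up for illustration only.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsBasicOps {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(new URI("hdfs://hadoop:9000"), conf);

        Path dir = new Path("/demo");
        if (!fs.exists(dir)) {
            fs.mkdirs(dir);                          // like: hdfs dfs -mkdir /demo
        }
        fs.delete(new Path("/demo/old.txt"), false); // false = non-recursive delete
        fs.close();
    }
}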

Analysis and fix for HDFS file permission problems that make Java web uploads to Hadoop fail: https://blog.csdn.net/bikun/article/details/25506489
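
The permission problem in that link usually shows up as an AccessControlException ("Permission denied: user=..., access=WRITE") when the client runs as a local OS user that has no write permission on the target HDFS directory. A common workaround, sketched below, is to pass the HDFS user explicitly to FileSystem.get; the user name "hadoop" and the local path here are only placeholders.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadAsUser {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Third argument: the user to act as on HDFS (placeholder value here).
        FileSystem fs = FileSystem.get(new URI("hdfs://hadoop:9000"), conf, "hadoop");
        fs.copyFromLocalFile(new Path("E:/uploadfiletext.txt"), new Path("/"));
        fs.close();
    }
}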

Fix for "Name node is in safe mode": just take the NameNode out of safe mode.

Run: hadoop dfsadmin -safemode leave (on newer releases: hdfs dfsadmin -safemode leave)
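
If you prefer to check (or leave) safe mode from Java rather than the shell, the sketch below uses the Hadoop 2.7.x DistributedFileSystem API. Leaving safe mode requires HDFS superuser rights, just like the shell command; the hdfs://hadoop:9000 address is the one used in the upload code further down.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.HdfsConstants;

public class SafeModeCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        DistributedFileSystem dfs =
                (DistributedFileSystem) FileSystem.get(new URI("hdfs://hadoop:9000"), conf);
        // SAFEMODE_GET only queries the current state, it does not change it.
        boolean inSafeMode = dfs.setSafeMode(HdfsConstants.SafeModeAction.SAFEMODE_GET);
        System.out.println("NameNode in safe mode: " + inSafeMode);
        // To leave safe mode programmatically (superuser only):
        // dfs.setSafeMode(HdfsConstants.SafeModeAction.SAFEMODE_LEAVE);
        dfs.close();
    }
}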

pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <groupId>HDFS0519</groupId>
    <artifactId>HDFSUpload</artifactId>
    <version>1.0-SNAPSHOT</version>

    <repositories>
        <repository>
            <id>apache</id>
            <url>http://maven.apache.org</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.1</version>
        </dependency>
    </dependencies>
</project>

Code:

package com.xy.uploadfile;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * @ClassName UploadFile
 * @Description Upload a local file to HDFS and list the target directory
 * @Date 2020-05-19 19:44
 * @Create By XinYan
 */
public class UploadFile {

    public static void main(String[] args) throws IOException, URISyntaxException {

        Configuration conf = new Configuration();

        URI uri = new URI("hdfs://hadoop:9000");

        FileSystem fs = FileSystem.get(uri, conf);

        // Local file
        Path src = new Path("E:/工作学习/学习/java-code/HDFSUpload/uploadfiletext.txt");

        // Destination directory in HDFS
        Path dst = new Path("/");

        fs.copyFromLocalFile(src, dst);

        System.out.println("Upload to " + fs.getUri());

        // Equivalent to: hdfs dfs -ls /
        FileStatus[] files = fs.listStatus(dst);

        for (FileStatus file : files) {
            System.out.println(file.getPath());
        }

        fs.close();
    }
}
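
As a small variation on the listing above, the same upload can be written with try-with-resources (FileSystem implements Closeable) and the four-argument copyFromLocalFile overload, which makes the delete-source and overwrite behavior explicit. This is only an alternative sketch:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadFileVariant {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // try-with-resources closes the FileSystem handle automatically
        try (FileSystem fs = FileSystem.get(new URI("hdfs://hadoop:9000"), conf)) {
            Path src = new Path("E:/工作学习/学习/java-code/HDFSUpload/uploadfiletext.txt");
            Path dst = new Path("/");
            // delSrc = false: keep the local file; overwrite = true: replace an existing copy
            fs.copyFromLocalFile(false, true, src, dst);
            for (FileStatus file : fs.listStatus(dst)) {
                System.out.println(file.getPath());
            }
        }
    }
}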

 

 
