Uploading and Downloading Files with Hadoop

Table of Contents

  • 1. Environment setup
    • 1) Add the dependency
    • 2) Copy the configuration files into resources
  • 2. Writing the code
    • 1) Create a directory
    • 2) Upload a file
    • 3) Download a file

1. Environment setup

Create a Maven project (not covered in detail here); see the previous post for setting up a single-node Hadoop cluster.

Add the client dependency; note that its version must match your Hadoop installation.

1) Add the dependency

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.5</version>
        </dependency>

2) Copy the configuration files into resources

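For reference, the relevant entry in core-site.xml (copied from the single-node setup in the previous post) points the client at the NameNode; the fs.defaultFS value must match the URI used in the code below. This is an illustrative fragment, not the full file:

```xml
<configuration>
    <!-- NameNode address; must match the URI passed to FileSystem.get() -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://node01:9000</value>
    </property>
</configuration>
```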

2. Writing the code

Code for opening and closing the client connection:

    public Configuration conf = null;
    public FileSystem fs = null;

    /**
     * Open the connection before each test
     * @throws Exception
     */
    @Before
    public void conn() throws Exception {
        conf = new Configuration(true); // true: load the config files from the classpath
        // Instead of passing the user here, you can set the HADOOP_USER_NAME environment variable to root
        fs = FileSystem.get(URI.create("hdfs://node01:9000/"), conf, "root");
    }

    @After
    public void close() throws Exception {
        fs.close();
    }

1) Create a directory

    /**
     * Test creating a directory
     * @throws Exception
     */
    @Test
    public void mkdir() throws Exception {
        Path dir = new Path("/elite");
        if (fs.exists(dir)) {
            fs.delete(dir, true); // delete recursively if it already exists
        }
        fs.mkdirs(dir);
    }


2) Upload a file

    /**
     * Test uploading a file
     * @throws Exception
     */
    @Test
    public void upload() throws Exception {
        BufferedInputStream input = new BufferedInputStream(new FileInputStream(new File("D:\\devproject\\devcode\\code\\bigdata\\src\\main\\resources\\test.txt")));
        Path outfile = new Path("/elite/test.txt");
        FSDataOutputStream output = fs.create(outfile);
        IOUtils.copyBytes(input, output, conf, true); // true: close both streams when done
    }

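As a side note, the same upload can be done without managing streams manually; FileSystem also provides a copyFromLocalFile convenience method. A minimal sketch, assuming the same `fs` handle and paths as the test above:

```java
// Sketch: upload via the convenience method instead of manual streams.
// Source and destination paths are the same as in the upload test.
fs.copyFromLocalFile(
        new Path("D:\\devproject\\devcode\\code\\bigdata\\src\\main\\resources\\test.txt"),
        new Path("/elite/test.txt"));
```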

3) Download a file

    /**
     * HDFS stores a file split into blocks; print the block
     * locations, then read the first few bytes of the file.
     * @throws Exception
     */
    @Test
    public void blocks() throws Exception {
        Path file = new Path("/elite/data.txt");
        FileStatus fss = fs.getFileStatus(file);
        // Print block location information
        BlockLocation[] blks = fs.getFileBlockLocations(fss, 0, fss.getLen());
        for (BlockLocation b : blks) {
            System.out.println(b);
        }
        // Read the first 8 bytes of the file
        FSDataInputStream in = fs.open(file);
        for (int i = 0; i < 8; i++) {
            System.out.print((char) in.readByte());
        }
    }

Output:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
0,9,node01
12312312
Process finished with exit code 0
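The test above only reads a few bytes; a full download can be sketched either with IOUtils.copyBytes in the opposite direction of the upload, or with the copyToLocalFile convenience method. This assumes the same `fs` and `conf` from the connection code; the local file name is illustrative:

```java
    /**
     * Sketch: download a file from HDFS to the local filesystem
     * @throws Exception
     */
    @Test
    public void download() throws Exception {
        Path infile = new Path("/elite/test.txt");
        // Variant 1: stream copy, mirroring the upload test
        FSDataInputStream input = fs.open(infile);
        OutputStream output = new BufferedOutputStream(
                new FileOutputStream(new File("test-download.txt")));
        IOUtils.copyBytes(input, output, conf, true); // true: close both streams

        // Variant 2: convenience method
        // fs.copyToLocalFile(infile, new Path("test-download-2.txt"));
    }
```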

