Hadoop 3.0 Java API Usage Guide

0. Development Environment Overview

Client environment

Windows 7 64-bit
Oracle JDK 8 64-bit
Eclipse 4.7

Server: Hadoop 3 installed and deployed in pseudo-distributed mode


1. Setting Up the Hadoop Client Runtime on Windows

- Download winutils
- Extract it to any folder
- Create a HADOOP_HOME environment variable pointing to that folder
- Add %HADOOP_HOME%\bin to the PATH environment variable (or set it in code, as sketched below)
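If you prefer not to modify the system environment, the Hadoop client also honors the hadoop.home.dir JVM system property. A minimal sketch, assuming winutils was extracted to C:\winutils (a hypothetical path; the class name is illustrative):

    // Assumption: winutils.exe lives under C:\winutils\bin. The property must
    // be set before the first Hadoop class is loaded, hence the static block.
    public class ClientBootstrap {
        static {
            System.setProperty("hadoop.home.dir", "C:\\winutils");
        }
    }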

2. Creating the Client Project

- Create a new Maven project; the dependencies in the POM are as follows:

    <dependencies>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.0.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs-client</artifactId>
            <version>3.0.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
- Copy core-site.xml and hdfs-site.xml from the server into the Maven project's src folder (so they end up on the classpath), then start coding.
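For reference, the one setting the client cannot do without from core-site.xml is fs.defaultFS. A minimal sketch, assuming the NameNode address used throughout this guide:

    <configuration>
        <!-- Must match the NameNode RPC address of your cluster. -->
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://192.168.1.111:9001</value>
        </property>
    </configuration>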

- Establish a connection to the server-side Hadoop

    import java.io.IOException;
    import java.net.URI;
    import java.net.URISyntaxException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.junit.Before;

    private static Configuration conf = new Configuration();
    private FileSystem hdfsDao;
    private final static String HDFSUri = "hdfs://192.168.1.111:9001";

    // Load the cluster configuration copied into src and open the FileSystem
    // handle used by all of the tests below.
    @Before
    public void initHDFS() throws URISyntaxException {
        URI uri = new URI(HDFSUri);
        conf.addResource("hdfs-site.xml");
        conf.addResource("core-site.xml");
        try {
            hdfsDao = FileSystem.get(uri, conf);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
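A matching teardown keeps the handle from leaking between test runs; a small sketch using JUnit's @After:

    import org.junit.After;

    // Close the FileSystem handle opened in initHDFS after each test.
    @After
    public void closeHDFS() throws IOException {
        if (hdfsDao != null) {
            hdfsDao.close();
        }
    }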

- Check whether a file exists

    /**
     * Check whether a file exists.
     * @throws IllegalArgumentException
     * @throws IOException
     */
    @Test
    public void testCheckFile() throws IllegalArgumentException, IOException {
        Path path = new Path("hdfs://192.168.1.111:9001/user/root/test.txt");
        boolean result = hdfsDao.exists(path);
        System.out.println(result ? "File exists" : "File does not exist");
    }

- Create a directory on HDFS

    /**
     * Create a directory.
     * @throws IOException
     */
    @Test
    public void testMkdir() throws IOException {
        Path folderPath = new Path("hdfs://192.168.1.111:9001/user/root/tiffData");
        boolean result = hdfsDao.mkdirs(folderPath);
        System.out.println(result ? "Directory created" : "Failed to create directory");
    }

- Delete a directory/file

    /**
     * Delete a directory or file; the second argument controls recursion.
     * @throws IOException
     */
    @Test
    public void testDeldir() throws IOException {
        Path folderPath = new Path("hdfs://192.168.1.111:9001/user/root/tiffData");
        boolean result = hdfsDao.delete(folderPath, false);
        System.out.println(result ? "Delete succeeded" : "Delete failed");
    }
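Note that with recursive set to false, deleting a non-empty directory fails (HDFS raises an IOException rather than returning false). Passing true removes the directory and everything beneath it:

    // Recursive variant: deletes the directory together with all of its contents.
    boolean removedAll = hdfsDao.delete(new Path("/user/root/tiffData"), true);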

- List all files under a directory (directories themselves are skipped)

    /**
     * List all files under a directory; listFiles returns files only,
     * recursing into subdirectories when the second argument is true.
     * @throws FileNotFoundException
     * @throws IllegalArgumentException
     * @throws IOException
     */
    @Test
    public void testGetAllFile() throws FileNotFoundException, IllegalArgumentException, IOException {
        RemoteIterator<LocatedFileStatus> listFiles = hdfsDao.listFiles(new Path("/"), true);
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        while (listFiles.hasNext()) {
            LocatedFileStatus fileStatus = listFiles.next();
            // Permissions
            FsPermission permission = fileStatus.getPermission();
            // Owner
            String owner = fileStatus.getOwner();
            // Group
            String group = fileStatus.getGroup();
            // File size in bytes
            long len = fileStatus.getLen();
            long modificationTime = fileStatus.getModificationTime();
            Path path = fileStatus.getPath();
            System.out.println("-------------------------------");
            System.out.println("permission:" + permission);
            System.out.println("owner:" + owner);
            System.out.println("group:" + group);
            System.out.println("len:" + len);
            System.out.println("modificationTime:" + sdf.format(new Date(modificationTime)));
            System.out.println("path:" + path);
        }
    }
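Since listFiles never yields directory entries, FileSystem.listStatus is the usual complement when you also want subdirectories (non-recursively). A small sketch:

    // Non-recursive listing of one directory, including subdirectories.
    FileStatus[] entries = hdfsDao.listStatus(new Path("/user/root"));
    for (FileStatus entry : entries) {
        System.out.println((entry.isDirectory() ? "dir  " : "file ") + entry.getPath());
    }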

- Copy a file from HDFS to the local filesystem

    /**
     * Copy a file from HDFS to the local filesystem.
     */
    @Test
    public void testDownLoadFile() {
        Path src = new Path("hdfs://192.168.1.111:9001/user/root/test.txt");
        Path des = new Path("D:\\workData\\test.txt");
        try {
            hdfsDao.copyToLocalFile(src, des);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
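On Windows, copyToLocalFile writes through the checksummed local filesystem and therefore needs the winutils setup from step 1. If the native libraries cause trouble, there is an overload that uses the pure-Java raw local filesystem instead; a sketch:

    // delSrc=false, useRawLocalFileSystem=true: avoids native IO and does not
    // write a .crc checksum file next to the downloaded copy.
    hdfsDao.copyToLocalFile(false, src, des, true);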

- Upload a local file

    /**
     * Upload a local file to HDFS.
     */
    @Test
    public void testFileUpload() {
        Path src = new Path("D:\\workData\\test1.txt");
        Path des = new Path("hdfs://192.168.1.111:9001/user/root/test1.txt");
        try {
            hdfsDao.copyFromLocalFile(src, des);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
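copyFromLocalFile also has an overload that makes the delete-source and overwrite flags explicit; a sketch:

    // delSrc=false, overwrite=true: keep the local file and replace the
    // HDFS copy if it already exists.
    hdfsDao.copyFromLocalFile(false, true, src, des);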
