Sqoop 1.4.7 Java Development

Java 1.8 + Sqoop 1.4.7

This post is mostly a working note: I have been doing this kind of work recently and found very little documentation online, so I am writing it down here.


Maven Dependencies

  • the JDBC driver jar for your database
  • commons-lang3
  • avro and avro-mapred
  • hadoop-hdfs and hadoop-common
  • the MapReduce client jars

pom.xml

  <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <scope>runtime</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.sqoop</groupId>
      <artifactId>sqoop</artifactId>
      <version>1.4.7</version>
  </dependency>

  <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.0</version>
  </dependency>

  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>2.8.4</version>
      <scope>test</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro-mapred</artifactId>
      <version>1.8.1</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-common</artifactId>
      <version>2.3.2</version>
  </dependency>
  <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.8.1</version>
  </dependency>

Example Code

The example code below is essentially the same as the test code found online and in the official documentation.

package com.example.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.tool.SqoopTool;

public class SqoopTest {
    public static void main(String[] args) {
        System.out.println("begin test sqoop");
        String[] arguments = new String[] {
                "--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
                "--username", "root",
                "--password", "root",
                "--table", "data_table",
                "--hive-import", "--hive-database", "testsqoop", "--hive-overwrite", "--create-hive-table",
                "--hive-table", "data_table",
                "--delete-target-dir",
        };
        // getTool returns the org.apache.sqoop.tool type; the Sqoop constructor still
        // expects the deprecated com.cloudera.sqoop.tool type, hence the cast.
        com.cloudera.sqoop.tool.SqoopTool sqoopTool =
                (com.cloudera.sqoop.tool.SqoopTool) SqoopTool.getTool("import");
        Configuration conf = new Configuration();
        // fs.defaultFS replaces the deprecated fs.default.name key
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        Sqoop sqoop = new Sqoop(sqoopTool, SqoopTool.loadPlugins(conf));
        int res = Sqoop.runSqoop(sqoop, arguments);
        System.out.println(res);
        System.out.println("sqoop run finished");
    }
}
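A long flat argument array like the one above is easy to get wrong: one missing value silently shifts every later flag by one position. A small builder helper can make the pairing of options and values explicit. This is only a sketch; `SqoopArgs`, `opt`, and `flag` are hypothetical names, not part of the Sqoop API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper (not part of Sqoop) that assembles the String[]
// eventually passed to Sqoop.runSqoop.
public class SqoopArgs {
    private final List<String> args = new ArrayList<>();

    // Options that take a value, e.g. opt("--table", "data_table")
    public SqoopArgs opt(String name, String value) {
        args.add(name);
        args.add(value);
        return this;
    }

    // Boolean flags, e.g. flag("--hive-import")
    public SqoopArgs flag(String name) {
        args.add(name);
        return this;
    }

    public String[] build() {
        return args.toArray(new String[0]);
    }

    public static void main(String[] unused) {
        String[] arguments = new SqoopArgs()
                .opt("--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false")
                .opt("--username", "root")
                .opt("--password", "root")
                .opt("--table", "data_table")
                .flag("--hive-import")
                .opt("--hive-database", "testsqoop")
                .flag("--hive-overwrite")
                .flag("--create-hive-table")
                .opt("--hive-table", "data_table")
                .flag("--delete-target-dir")
                .build();
        System.out.println(arguments.length); // 16, same as the hand-written array
    }
}
```

The resulting array is identical to the hand-written one above, so it can be passed straight to `Sqoop.runSqoop(sqoop, arguments)`.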

Tested both in a local environment and from a Spring Boot web service; the code above runs as expected.
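When the import is triggered from a web endpoint, such as the Spring Boot service mentioned above, it is worth running Sqoop off the request thread, since an import can take minutes. Below is a minimal sketch of that pattern; the actual Sqoop call is injected as an `IntSupplier` (standing in for `() -> Sqoop.runSqoop(sqoop, arguments)`) so the wrapper can be shown without a live cluster, and `AsyncSqoopRunner` is a hypothetical name:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.IntSupplier;

// Hypothetical wrapper: runs a Sqoop job asynchronously so the web request
// thread is not blocked for the duration of the import.
public class AsyncSqoopRunner {
    private final ExecutorService pool = Executors.newSingleThreadExecutor();

    // sqoopCall stands in for () -> Sqoop.runSqoop(sqoop, arguments)
    public CompletableFuture<Integer> submit(IntSupplier sqoopCall) {
        return CompletableFuture.supplyAsync(sqoopCall::getAsInt, pool);
    }

    public void shutdown() {
        pool.shutdown();
    }

    public static void main(String[] args) throws Exception {
        AsyncSqoopRunner runner = new AsyncSqoopRunner();
        // A stand-in job that returns 0, Sqoop's success exit code.
        CompletableFuture<Integer> result = runner.submit(() -> 0);
        System.out.println(result.get()); // 0
        runner.shutdown();
    }
}
```

The single-thread pool also serializes concurrent requests, which avoids launching several competing imports into the same Hive table at once.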
