Running a Sqoop Action in an Oozie Workflow

Copy the example app

[beifeng@hadoop-senior oozie-4.0.0-cdh5.3.6]$ cp -r examples/apps/sqoop oozie-apps/

Rename the example

[beifeng@hadoop-senior oozie-apps]$ mv sqoop sqoop-import

Copy the MySQL JDBC driver JAR into the app's lib directory

[beifeng@hadoop-senior oozie-apps]$ mkdir sqoop-import/lib
[beifeng@hadoop-senior oozie-apps]$ cp hive-select/lib/mysql-connector-java-5.1.27-bin.jar sqoop-import/lib/

Edit job.properties

nameNode=hdfs://hadoop-senior.beifeng.com:8020
jobTracker=hadoop-senior.beifeng.com:8032
queueName=default
examplesRoot=examples
oozieAppsRoot=user/beifeng/oozie-apps
oozieDataRoot=user/beifeng/oozie/datas

oozie.use.system.libpath=true

oozie.wf.application.path=${nameNode}/${oozieAppsRoot}/sqoop-import/workflow.xml
outputDir=sqoop-import-user/output
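
For orientation, this is how the properties above compose into the concrete HDFS paths used in the rest of this guide (a local sketch only; the values are the ones from job.properties):

```shell
# Compose the paths the same way Oozie resolves ${...} in job.properties
nameNode=hdfs://hadoop-senior.beifeng.com:8020
oozieAppsRoot=user/beifeng/oozie-apps
oozieDataRoot=user/beifeng/oozie/datas
outputDir=sqoop-import-user/output

app_path="${nameNode}/${oozieAppsRoot}/sqoop-import/workflow.xml"
target_dir="/${oozieDataRoot}/${outputDir}"

echo "$app_path"    # value of oozie.wf.application.path
echo "$target_dir"  # the --target-dir used in the Sqoop command below
```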

Edit the workflow file (workflow.xml)

<workflow-app xmlns="uri:oozie:workflow:0.5" name="sqoop-import-wf">
    <start to="sqoop-node"/>

    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.3">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>import --connect jdbc:mysql://hadoop-senior.beifeng.com:3306/test --username root --password 123456 --table my_user_0321 --num-mappers 1 --delete-target-dir --fields-terminated-by "\t" --target-dir /user/beifeng/oozie/datas/sqoop-import-user/output</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

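It can help to keep the exact string that goes into <command> in a shell variable, so it can be inspected (and, on a real cluster, reused with "bin/sqoop $cmd" as a quick standalone test before submitting through Oozie). This is a convenience sketch, not part of the original setup:

```shell
# The exact <command> string: subcommand first, no bin/sqoop prefix
cmd='import --connect jdbc:mysql://hadoop-senior.beifeng.com:3306/test --username root --password 123456 --table my_user_0321 --num-mappers 1 --delete-target-dir --fields-terminated-by "\t" --target-dir /user/beifeng/oozie/datas/sqoop-import-user/output'
echo "$cmd"
```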
Note: check whether your Hadoop build invokes the old or the new MapReduce API. mapred.job.queue.name is the old-API property name; its new-API equivalent is mapreduce.job.queuename. Use whichever name matches the API your cluster actually runs.
Note also: the <command> element must not begin with bin/sqoop; it begins directly with the Sqoop subcommand (import here, or export, etc.). The field delimiter must be given in double quotes: --fields-terminated-by "\t".
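The "no bin/sqoop prefix" rule can be expressed as a throwaway check (check_command is a hypothetical helper, not part of the setup):

```shell
# Reject <command> strings that wrongly include the bin/sqoop prefix
check_command() {
  case "$1" in
    bin/sqoop*) echo "BAD: drop the bin/sqoop prefix" ;;
    import*|export*|eval*|list-tables*) echo "OK" ;;
    *) echo "WARN: does not start with a known Sqoop subcommand" ;;
  esac
}

check_command "bin/sqoop import --connect ..."   # BAD: drop the bin/sqoop prefix
check_command "import --connect ..."             # OK
```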

Create the output directory in HDFS

bin/hdfs dfs -mkdir -p /user/beifeng/oozie/datas/sqoop-import-user

Upload the Sqoop workflow app to HDFS

bin/hdfs dfs -put ../oozie-4.0.0-cdh5.3.6/oozie-apps/sqoop-import  /user/beifeng/oozie-apps

Run the Sqoop workflow job

export OOZIE_URL=http://hadoop-senior.beifeng.com:11000/oozie
bin/oozie job -config oozie-apps/sqoop-import/job.properties  -run
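
The -run command prints the new workflow id on a line like "job: 0000016-180315133250705-oozie-beif-W". The id can be captured in a variable for the follow-up -info call (a sketch; the sample line below is the id from this run):

```shell
# "bin/oozie job ... -run" prints the new workflow id on a line like this:
run_output="job: 0000016-180315133250705-oozie-beif-W"

# Strip the "job: " prefix to get the bare id for later -info/-log calls
job_id=${run_output#job: }
echo "$job_id"
```

The bare id is what the next step passes to bin/oozie job -info.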

Check the result

bin/oozie job -info JobId

JobId (here 0000016-180315133250705-oozie-beif-W) comes from the output of the -run step above.
