Problems encountered installing Oozie on CDH and learning Oozie

Changing the Oozie database type to MySQL on CDH (see the sketch after these steps):
1. Put the MySQL JDBC driver into /var/lib/oozie
2. Create the oozie database
3. Initialize the database
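A minimal sketch of the three steps above. The connector jar name, MySQL credentials and the ooziedb.sh path are assumptions; on CDH the schema is normally created from Cloudera Manager rather than by hand.

```sh
# Step 1: copy the MySQL JDBC driver (jar name/version is a placeholder)
cp mysql-connector-java-5.1.46-bin.jar /var/lib/oozie/

# Step 2: create the oozie database and user (credentials are placeholders)
mysql -uroot -p -e "CREATE DATABASE oozie DEFAULT CHARACTER SET utf8;
GRANT ALL PRIVILEGES ON oozie.* TO 'oozie'@'%' IDENTIFIED BY 'oozie';
FLUSH PRIVILEGES;"

# Step 3: initialize the Oozie schema (when not using Cloudera Manager's
# "Create Oozie Database Tables" action)
/usr/lib/oozie/bin/ooziedb.sh create -sqlfile oozie.sql -run
```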
Problems encountered when configuring a MapReduce action in workflow.xml:

Problem 1:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper not found
Cause:
mapreduce.map.class was set to
             com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper
i.e. the inner class was referenced with a dot.
Solution: replace the dot before the inner class name with a $, so the value becomes com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper
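For reference, the corrected property (the same form appears in the full workflow.xml at the end of this post):

```xml
<property>
    <name>mapreduce.map.class</name>
    <!-- inner classes must be referenced with $ rather than . -->
    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper</value>
</property>
```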

Problem 2:
java.lang.NoClassDefFoundError: com/maxmind/geoip2/exception/AddressNotFoundException
Cause:
A third-party jar is missing.
Solution:
Put the jar into the lib directory next to workflow.xml (see the sketch below).
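For example, assuming the workflow application lives under /user/root/examples/apps/usercounter on HDFS; the path and the jar file name are placeholders:

```sh
hdfs dfs -mkdir -p /user/root/examples/apps/usercounter/lib
# the MaxMind GeoIP2 jar (and its dependencies) go next to workflow.xml, under lib/
hdfs dfs -put geoip2-*.jar /user/root/examples/apps/usercounter/lib/
```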

Problem 3:
How to fix ClassNotFoundException in Oozie
Just add oozie.use.system.libpath=true to job.properties (a sample job.properties follows).
Reference: http://jyd.me/nosql/oozie-classnotfoundexception-solution/
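A job.properties sketch that matches the variables used in the workflow.xml below; the NameNode/ResourceManager addresses and the application path are assumptions:

```properties
nameNode=hdfs://nameservice1
jobTracker=resourcemanager-host:8032
queueName=default
outputDir=usercounter-out

oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/root/examples/apps/usercounter
```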
Problem 4:
When running a shell action: [org.apache.oozie.action.hadoop.ShellMain], main() threw exception, Cannot run program
Cause: the workflow runs on the cluster, so the script has to be shipped to every node.
Solution: add a <file> element for the script (see the sketch after these paths):
/user/root/examples/apps/hive/create_mysql_table.sh
/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh
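A sketch of the shell action with the <file> element added; the action name and transitions are assumptions, the script path is the one above:

```xml
<action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>create_mysql_table.sh</exec>
        <!-- ships the script from HDFS to whichever node runs the action;
             the part after # is the local symlink name -->
        <file>/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>
```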

**workflow.xml used when running the MapReduce job**
```xml
<workflow-app xmlns="uri:oozie:workflow:0.2" name="usercounter-job-wf">
    <start to="mr-node"/>
    <action name="mr-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/zqc/264/tmp/${outputDir}"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>mapred.mapper.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.reducer.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.job.name</name>
                    <value>UserCounterJob</value>
                </property>
                <property>
                    <name>mapreduce.inputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat</value>
                </property>
                <property>
                    <name>mapreduce.map.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper</value>
                </property>
                <property>
                    <name>mapreduce.reduce.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterReducer</value>
                </property>
                <property>
                    <name>mapred.mapoutput.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapred.mapoutput.value.class</name>
                    <value>com.rytong.mdap.analytics.source.Message</value>
                </property>
                <property>
                    <name>mapred.output.key.class</name>
                    <value>org.apache.hadoop.hbase.io.ImmutableBytesWritable</value>
                </property>
                <property>
                    <name>mapred.output.value.class</name>
                    <value>org.apache.hadoop.io.LongWritable</value>
                </property>
                <property>
                    <name>mapreduce.outputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat</value>
                </property>
                <property>
                    <name>mapred.reduce.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.map.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/zqc/264/clean/session/20180415</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/zqc/264/tmp/${outputDir}</value>
                </property>
            </configuration>
            <file>/user/root/examples/apps/java-main/config</file>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```
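With the application directory and job.properties in place, the workflow can be submitted with the Oozie CLI; the server URL is a placeholder:

```sh
oozie job -oozie http://oozie-server:11000/oozie -config job.properties -run
```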
