Sqoop error when importing from MySQL into Hive: Could not load org.apache.hadoop.hive.conf.HiveConf.

Environment: CDH 5.7
Hadoop 2.6.0
Hive 1.1.0


Problem 1:
The following error occurs when importing a MySQL table into Hive with Sqoop:
# sqoop import --hive-import --connect jdbc:mysql://10.1.32.34:3306/dicts --username sqoop --password sqoop -m 1 --table nodist --create-hive-table

16/02/18 17:01:15 INFO mapreduce.Job: Running job: job_1455812803225_0020
16/02/18 17:01:24 INFO mapreduce.Job: Job job_1455812803225_0020 running in uber mode : false
16/02/18 17:01:24 INFO mapreduce.Job:  map 0% reduce 0%
16/02/18 17:01:33 INFO mapreduce.Job:  map 25% reduce 0%
16/02/18 17:01:34 INFO mapreduce.Job:  map 50% reduce 0%
16/02/18 17:01:41 INFO mapreduce.Job:  map 100% reduce 0%
16/02/18 17:01:41 INFO mapreduce.Job: Job job_1455812803225_0020 completed successfully
16/02/18 17:01:41 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=555640
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=473
                HDFS: Number of bytes written=8432
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters 
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=25664
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=25664
                Total vcore-seconds taken by all map tasks=25664
                Total megabyte-seconds taken by all map tasks=26279936
        Map-Reduce Framework
                Map input records=91
                Map output records=91
                Input split bytes=473
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=351
                CPU time spent (ms)=4830
                Physical memory (bytes) snapshot=802369536
                Virtual memory (bytes) snapshot=6319828992
                Total committed heap usage (bytes)=887095296
        File Input Format Counters 
                Bytes Read=0
        File Output Format Counters 
                Bytes Written=8432
16/02/18 17:01:41 INFO mapreduce.ImportJobBase: Transferred 8,2344 KB in 30,7491 seconds (274,219 bytes/sec)
16/02/18 17:01:41 INFO mapreduce.ImportJobBase: Retrieved 91 records.


16/02/18 17:01:41 WARN hive.TableDefWriter: Column last_updated had to be cast to a less precise type in Hive
16/02/18 17:01:41 INFO hive.HiveImport: Loading uploaded data into Hive
16/02/18 17:01:41 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
16/02/18 17:01:41 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:195)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more
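Note that the MapReduce job itself completes successfully and all 91 rows have already landed in HDFS; only the follow-up Hive load step fails. The root cause is that org.apache.hadoop.hive.conf.HiveConf lives in the hive-common jar, which is not on the classpath of the process Sqoop launches for that step. A quick diagnostic sketch, assuming the default CDH parcel layout used in this environment:

# ls /opt/cloudera/parcels/CDH/lib/hive/lib/hive-common*.jar
# hadoop classpath | tr ':' '\n' | grep -i hive

The first command confirms the jar exists in the parcel's Hive lib directory; if the second prints no Hive jars, Hadoop (and therefore Sqoop) cannot see HiveConf, which matches the ClassNotFoundException above.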
Solution:
Add the Hive lib directory to the HADOOP_CLASSPATH environment variable for the user that runs Sqoop (here, the hdfs user):
# vi ~/.bash_profile
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/cloudera/parcels/CDH/lib/hive/lib/*

# source ~/.bash_profile
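After reloading the profile, verify the variable and retry the same import command as above (if the failed first attempt left its staging directory in HDFS, it may need to be removed first, e.g. with hadoop fs -rm -r nodist):

# echo $HADOOP_CLASSPATH
# sqoop import --hive-import --connect jdbc:mysql://10.1.32.34:3306/dicts --username sqoop --password sqoop -m 1 --table nodist --create-hive-table

The wildcard entry puts hive-common and the rest of the Hive client jars on the classpath of the JVM Sqoop spawns for the Hive load step, which is exactly the class the ClassNotFoundException complained about.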


Reference: https://community.cloudera.com/t5/Batch-SQL-Apache-Hive/Error-with-quot-Make-sure-HIVE-CONF-DIR-is-set-correctly-quot/m-p/37865#M1140
