Sqoop import into a Hive table fails

Sqoop command:

/sqoop-1.4.6/bin/sqoop import --connect jdbc:oracle:thin:@10.100.100.100:1521:orcl --username aaa --password aaa --table tablename --hive-import -m 1 --fields-terminated-by '\t' --hive-overwrite --hive-table log.hivetablename -- --default-character-set=utf-8

Error: Move from: hdfs://XXX to: hdfs://YYY is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.

ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 44

Cause: the Hive table's location does not match default.fs.name.

  Check the table's location: desc extended hivetablename

  Check default.fs.name: it is set in the core-site.xml file under the Hadoop installation directory.
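As a rough illustration of these two checks (the hostnames, ports, and output layout below are only examples and will differ in your environment):

  hive> desc extended log.hivetablename;
  ... location:hdfs://10.100.100.99:9000/user/hive/warehouse/log/hivetablename ...

  <!-- core-site.xml under the Hadoop conf directory -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://10.100.111.1:9000</value>
  </property>

If the hdfs://host:port prefix in the table's location differs from the value of fs.default.name, Hive rejects the move and the import fails with the error above.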

Change the Hive table's location to use the value of fs.default.name.


Change the Hive table's location: alter table hivetablename set location 'hdfs://10.100.111.1:9000/user/hive/warehouse/log/hivetablename';
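After the alter you can confirm the change; this sketch assumes the table lives in the log database (as the --hive-table log.hivetablename option suggests), and desc formatted prints a labeled Location field on recent Hive versions:

  hive> use log;
  hive> desc formatted hivetablename;

Once the Location value carries the same hdfs://host:port prefix as fs.default.name, re-run the Sqoop import.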


If you then hit org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory XXX already exists, delete XXX, which sits under the HDFS /user/hadoop/ directory:

hadoop fs -rmr /user/hadoop/XXX
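Note: on Hadoop 2.x and later, hadoop fs -rmr is deprecated; the equivalent form is

hadoop fs -rm -r /user/hadoop/XXX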
