Hive exited with status 1

While using Sqoop to import MySQL data into Hadoop and then into Hive, the following error is reported:

19/08/03 15:20:24 INFO hive.HiveImport: Loading uploaded data into Hive
19/08/03 15:20:31 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/08/03 15:20:31 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/08/03 15:20:31 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hbase-1.4.10/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/08/03 15:20:31 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/08/03 15:20:31 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/08/03 15:20:31 INFO hive.HiveImport: 
19/08/03 15:20:31 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/usr/sqoop/lib/hive-exec-1.2.2.jar!/hive-log4j.properties
19/08/03 15:20:36 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. AlreadyExistsException(message:Table kuwo_music already exists)
19/08/03 15:20:36 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive exited with status 1
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

This happens because the data has already been imported from the master node, so the table already exists in Hive; when a worker node runs the same import again, it fails with this error.
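Before re-running the import, you can confirm the conflict by checking whether the table already exists in Hive (a quick check; kuwo_music is the table name taken from the log above):

    # list any Hive table matching the name from the error message
    hive -e "SHOW TABLES LIKE 'kuwo_music';"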

Solutions:

1. Change the name of the table imported into Hive so it does not collide with the one already created from the master node (see the sqoop sketch after this list).

2. Drop the table previously imported into Hive from the master node, and delete the staged HDFS files on Hadoop (see the commands after this list).
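For option 1, a minimal sqoop sketch. The connection details (jdbc:mysql://mysql-host:3306/music_db, user root) and the new target name kuwo_music_node2 are placeholders, not values from the original post; only --hive-table needs to differ from the table name used on the master node:

    sqoop import \
      --connect jdbc:mysql://mysql-host:3306/music_db \
      --username root -P \
      --table kuwo_music \
      --hive-import \
      --hive-table kuwo_music_node2   # distinct Hive table name for this node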
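For option 2, drop the existing Hive table and remove the staging directory left on HDFS, then re-run the import. The HDFS path below is an assumption: by default Sqoop stages imported data under the running user's home directory, /user/<username>/<table>; adjust it to your environment:

    # drop the table created by the earlier import from the master node
    hive -e "DROP TABLE IF EXISTS kuwo_music;"
    # remove the staging directory on HDFS (path assumed; adjust to your user)
    hdfs dfs -rm -r /user/hadoop/kuwo_music

If replacing the existing data is acceptable, Sqoop's --hive-overwrite flag overwrites the Hive table's contents instead, avoiding the manual cleanup.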
