Sqoop exception: ERROR tool.ImportTool: Imported Failed: column not found

A Sqoop import into Hive fails with a column-not-found error, as shown below:

20/07/22 13:44:42 INFO mapreduce.ImportJobBase: Transferred 111.0097 MB in 40.0756 seconds (2.77 MB/sec)
20/07/22 13:44:42 INFO mapreduce.ImportJobBase: Retrieved 1301733 records.
20/07/22 13:44:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [JOIN_UPLOADERROR] AS t WHERE 1=0
20/07/22 13:44:42 ERROR util.SqlTypeMap: It seems like you are looking up a column that does not
20/07/22 13:44:42 ERROR util.SqlTypeMap: exist in the table. Please ensure that you've specified
20/07/22 13:44:42 ERROR util.SqlTypeMap: correct column names in Sqoop options.
20/07/22 13:44:42 ERROR tool.ImportTool: Imported Failed: column not found: CANTONCODE

The error occurred while importing data from SQL Server into Hive with Sqoop; the command was:

sqoop import --hive-import --connect 'jdbc:sqlserver://ip:1433;username=xxxxxx;password=xxxxxx;database=xxxxxx' --table JOIN_UPLOADERROR -m 1 --columns "CANTONCODE,YEAR,PERSONCODE,HEDINGCODE" --where "Year = '2018'" --hive-table db_test.Join_UploadError --hive-drop-import-delims --hive-overwrite --hive-partition-key dt --hive-partition-value 20200615 --null-non-string '\N' --null-string '\N' --delete-target-dir

A fix suggested online is to uppercase both the column names and the table name, but testing showed this does not resolve the error.

Solution:

1. Use Navicat to view the CREATE TABLE statement:
CREATE TABLE [dbo].[Join_UploadError] (
[CantonCode] varchar(max) COLLATE Chinese_PRC_CI_AS NOT NULL,
[Year] varchar(max) COLLATE Chinese_PRC_CI_AS NOT NULL,
[PersonCode] varchar(max) COLLATE Chinese_PRC_CI_AS NOT NULL,
[hedingcode] varchar(max) COLLATE Chinese_PRC_CI_AS NOT NULL
)
2. In the Sqoop command, copy the column names exactly as they appear in the CREATE TABLE statement (matching case), as follows:
sqoop import --hive-import --connect 'jdbc:sqlserver://ip:1433;username=xxxxxx;password=xxxxxxx;database=xxxxxx' --table JOIN_UPLOADERROR -m 1 --columns "CantonCode,Year,PersonCode,hedingcode" --where "Year = '2018'" --hive-table db_test.JOIN_UPLOADERROR --hive-drop-import-delims --hive-overwrite --hive-partition-key dt --hive-partition-value 20200615 --null-non-string '\N' --null-string '\N' --delete-target-dir

The import then completed successfully.
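When the exact case of a table's columns is unknown, a small helper can resolve user-supplied names to the catalog's exact-case names before building the `--columns` list. This is a hypothetical utility, not part of Sqoop; `resolve_columns` and its arguments are names introduced here for illustration:

```python
def resolve_columns(requested, actual):
    """Map user-supplied column names to the exact-case names in the table.

    SQL Server compares identifiers case-insensitively, but Sqoop's
    column map does not, so we normalize to the catalog's casing.
    """
    # Index the real column names by their lowercase form.
    by_lower = {c.lower(): c for c in actual}
    resolved = []
    for name in requested:
        try:
            resolved.append(by_lower[name.lower()])
        except KeyError:
            raise ValueError(f"no such column: {name}")
    return resolved


# Columns as reported by the CREATE TABLE statement above:
actual = ["CantonCode", "Year", "PersonCode", "hedingcode"]
print(resolve_columns(["CANTONCODE", "YEAR"], actual))
# prints ['CantonCode', 'Year']
```

The corrected names can then be joined with commas and passed to `--columns` directly.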
