Sqoop import error: Hive does not support the SQL type for column col_name

Problem:
When importing a table from MySQL into Hive, the MapReduce transfer itself completes, but Sqoop then fails while generating the Hive table definition because one of the MySQL column types has no Hive mapping:


[hdfs@hadoop0 ~]$ sqoop import --connect jdbc:mysql://10.1.32.34:3306/dicts --username sqoop --password sqoop  -m 1 --table ua --hive-import --hive-overwrite --hive-table ua
Warning: /opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/06/08 11:05:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
......
16/06/08 11:05:47 INFO mapreduce.ImportJobBase: Transferred 1.0704 MB in 18.1975 seconds (60.2308 KB/sec)
16/06/08 11:05:47 INFO mapreduce.ImportJobBase: Retrieved 1245 records.
16/06/08 11:05:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `ua` AS t LIMIT 1
16/06/08 11:05:47 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column v
        at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:189)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)




Solution:
-- MySQL table schema
mysql> desc ua;
+---------+-------------+------+-----+-------------------+----------------+
| Field   | Type        | Null | Key | Default           | Extra          |
+---------+-------------+------+-----+-------------------+----------------+
| id      | int(11)     | NO   | PRI | NULL              | auto_increment | 
| k       | varchar(32) | YES  | UNI | NULL              |                | 
| v       | mediumblob  | YES  |     | NULL              |                | 
| updated | timestamp   | NO   | MUL | CURRENT_TIMESTAMP |                | 
+---------+-------------+------+-----+-------------------+----------------+
4 rows in set (0.00 sec)
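The culprit is the v column: mediumblob is a binary type for which Sqoop has no default Hive mapping, so TableDefWriter.getCreateTableStmt() throws the IOException shown above. With the --map-column-hive override applied, the table Sqoop creates in Hive should look roughly like the following sketch (the delimiters shown are Sqoop's defaults for --hive-import; the exact generated DDL may differ):

-- Approximate Hive DDL generated with --map-column-hive v=string (illustrative)
CREATE TABLE ua (
  id INT,
  k STRING,
  v STRING,        -- mediumblob overridden to string
  updated STRING   -- Sqoop maps SQL timestamps to Hive strings by default
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE;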


mysql> select * from ua limit 1\G
*************************** 1. row ***************************
     id: 1
      k: 0000000000007668
      v: {"old_model": "dopod p660_cmcc", "photo_fmt": "JPEG", "midi": "64", "audio_fmt": "MP3\u3001WMA\u3001WAV", "video_fmt": "WMV\u3001AVI\u30013GP(H.263)", "price": "1500-2499", "screen": "240*320", "model": "dopod P660", "wap": "wap 2.0", "os": "Windows", "brand": "\u591a\u666e\u8fbe", "midp": "MIDP 2.0", "manufacturer": "\u6b66\u6c49\u591a\u666e\u8fbe\u901a\u8baf\u6709\u9650\u516c\u53f8"}
updated: 2009-12-25 10:42:02
1 row in set (0.00 sec)




Map the offending MySQL column to Hive's string type.
Use the --map-column-hive v=string option to override the type mapping:
[hdfs@hadoop0 ~]$ sqoop import --connect jdbc:mysql://10.1.32.34:3306/dicts --username sqoop --password sqoop  -m 1 --table ua --hive-import --hive-overwrite --hive-table ua --map-column-hive v=string
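Once the import succeeds, the overridden column type can be checked from the Hive shell (an illustrative session; output formatting varies by Hive version):

hive> DESCRIBE ua;
id         int
k          string
v          string
updated    string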


Viewing the contents after the import into Hive:

(Screenshot: contents of the ua table queried in Hive after the import)
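Since the blob turns out to hold JSON text, storing it as a Hive string also makes it queryable with Hive's built-in get_json_object function, for example (a sketch using keys present in the sample row above):

-- Query fields inside the imported JSON string (illustrative)
SELECT k,
       get_json_object(v, '$.model') AS model,
       get_json_object(v, '$.brand') AS brand
FROM ua
LIMIT 5;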


Reference:
http://sdhsdhsdhsdh.iteye.com/blog/1944095
