Troubleshooting: When importing query results in parallel, you must specify --split-by.

Cause Analysis

The import command contained -m 4 \, which sets the number of map tasks to 4. Whenever the value of -m is greater than 1, --split-by must be set to a column (it should be an int-typed column). If the column is not an int type, the following parameter must also be added:
-Dorg.apache.sqoop.splitter.allow_text_splitter=true
Example:
sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
--connect ${conn_str} \
--username ${db_username} \
--password ${db_password} \
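To see why the split column matters: with --split-by set, Sqoop queries the minimum and maximum of that column and divides the range evenly among the map tasks, giving each mapper its own WHERE clause. The following is only a minimal sketch of that division with assumed values (min=1, max=100, 4 mappers, a hypothetical column id); the real splitting logic lives inside Sqoop itself.

```shell
# Sketch of how Sqoop partitions the --split-by column range among mappers.
# min/max/mappers are assumed example values, not taken from the real job.
min=1; max=100; mappers=4
size=$(( (max - min) / mappers ))
lo=$min
for i in $(seq 1 $mappers); do
  if [ "$i" -eq "$mappers" ]; then
    hi=$max                 # last mapper takes the remainder of the range
  else
    hi=$(( lo + size ))
  fi
  echo "mapper $i: WHERE id >= $lo AND id < $hi"
  lo=$hi
done
```

This also shows why a non-int column needs allow_text_splitter: computing evenly spaced boundaries over text values is unreliable, so Sqoop refuses to do it unless you opt in.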

Solution

For parallel imports, add the --split-by columnName \ option to the command.
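A complete invocation might then look like the sketch below. The connection variables are from the original snippet, but the table name (borrowed from the log) and the split column id are placeholders, not part of the original command:

```shell
# Hypothetical fixed command: user_base_delta and the int column "id"
# are assumed for illustration.
sqoop import \
  --connect ${conn_str} \
  --username ${db_username} \
  --password ${db_password} \
  --table user_base_delta \
  --split-by id \
  -m 4
```

With --split-by present, the four map tasks can each import a disjoint range of id values.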

Exception Log

18/09/17 14:23:58 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7-cdh6.0.0
18/09/17 14:23:58 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
When importing query results in parallel, you must specify --split-by.
Try --help for usage instructions.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.0-1.cdh6.0.0.p0.537114/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.0-1.cdh6.0.0.p0.537114/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Table not found user_base_delta

Reposted from: https://www.cnblogs.com/chwilliam85/p/9693268.html
