Sqoop export from Hive to MySQL fails

Exporting data from Hive into MySQL with Sqoop. The general shape of the command (placeholders in angle brackets):

sqoop export \
--connect 'jdbc:mysql://<MySQL host>:<port>/<database>?useUnicode=true&characterEncoding=utf-8' \
--username <MySQL username> \
--password <MySQL password> \
--table <MySQL table name> \
--export-dir "<HDFS path of the Hive table>" \
--input-fields-terminated-by '<field delimiter>' \
--update-mode allowinsert

The actual command used:

sqoop export \
--connect 'jdbc:mysql://ip:3306/safe_manager?useUnicode=true&characterEncoding=utf-8' \
--username root \
--password Free-Wi11 \
--table bigscreen_line1 \
--input-null-string '\\N' --input-null-non-string '\\N' \
--export-dir "/user/hive/warehouse/shanxi.db/line/batch_date=2021-01-27" \
--input-fields-terminated-by '\t' \
--update-mode allowinsert 
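During a text-file export, Sqoop splits each line on the delimiter given by --input-fields-terminated-by and treats the token given by --input-null-string / --input-null-non-string as SQL NULL. A minimal Python sketch of that interpretation (illustrative only, not Sqoop's actual parser):

```python
def parse_export_row(line, delimiter="\t", null_token="\\N"):
    """Split one Hive text-file line into column values, mapping the
    configured null token to Python's None (standing in for SQL NULL)."""
    return [None if field == null_token else field
            for field in line.split(delimiter)]

# A row whose first column is NULL ('\N' on disk), as in the data below.
row = parse_export_row("\\N\ta7feedfb0ab348b3ba503d9ffc9c5f99\t2021-01-29 11:37:46")
print(row)  # [None, 'a7feedfb0ab348b3ba503d9ffc9c5f99', '2021-01-29 11:37:46']
```

Each parsed field must then be convertible to the corresponding MySQL column type; that conversion step is where the export below fails.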

Error log:

21/01/29 13:10:31 INFO mapreduce.Job:  map 100% reduce 0%
21/01/29 13:10:31 INFO mapreduce.Job: Job job_1609311981015_1856 failed with state FAILED due to: Task failed task_1609311981015_1856_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0

21/01/29 13:10:31 INFO mapreduce.Job: Counters: 9
	Job Counters 
		Failed map tasks=3
		Launched map tasks=3
		Data-local map tasks=1
		Rack-local map tasks=2
		Total time spent by all maps in occupied slots (ms)=8171
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=8171
		Total vcore-milliseconds taken by all map tasks=8171
		Total megabyte-milliseconds taken by all map tasks=8367104
21/01/29 13:10:31 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/01/29 13:10:31 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 17.0784 seconds (0 bytes/sec)
21/01/29 13:10:31 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/01/29 13:10:31 INFO mapreduce.ExportJobBase: Exported 0 records.
21/01/29 13:10:31 ERROR mapreduce.ExportJobBase: Export job failed!
21/01/29 13:10:31 ERROR tool.ExportTool: Error during export: 
Export job failed!
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:444)
	at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:93)

1. The id column in our Hive table was NULL. We first tried to populate id with a generated UUID, but a UUID is a string while the MySQL id column is an int, so the value cannot be written.

set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
insert into table app_bigscreen_line partition (batch_date)
SELECT 
'',
regexp_replace(reflect("java.util.UUID", "randomUUID"), "-", "")  as id,
'',
'',
'',
'',
'',
'',
from_unixtime(unix_timestamp(),'yyyy-MM-dd HH:mm:ss'),
'',
from_unixtime(unix_timestamp(),'yyyy-MM-dd HH:mm:ss'),
'',
date_sub(CURRENT_DATE,1) 
from app_bigscreen_line;
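To see why the string UUID cannot go into an int column: even reinterpreting the 32 hex characters as a number overflows MySQL's largest integer type, BIGINT (signed 64-bit). A quick check, using one of the sample values shown below:

```python
# One UUID value from the query output (hyphens already stripped).
uuid_hex = "a7feedfb0ab348b3ba503d9ffc9c5f99"

# A UUID is 128 bits; MySQL BIGINT holds at most 2**63 - 1.
as_int = int(uuid_hex, 16)
bigint_max = 2**63 - 1
print(as_int > bigint_max)  # True: a UUID does not fit in int or bigint
```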

Sample rows with the generated UUIDs:

NULL	a7feedfb0ab348b3ba503d9ffc9c5f99		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	d9010d36dfc048279fbf523f2633ae6a		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	83591d306f164fdfbd3f896307daca9a		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	62c217436a084db1803b6baabb9f5b92		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	18172ffa96904bff84b3628135969a93		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	603d13e37a22459e9daf2f351f2a0c85		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	c49f1d6e44d545e4bde2d7b7f9e2bdaa		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	4e65a71a2c00450fa27782f8229c84e8		NULL				NULL	2021-01-29 11:37:46	NULL	2021-01-29 11:37:46	NULL	2021-01-28
NULL	0fe6de7c976948b58be1f1098f6f56b7	

2. Sqoop runs as a MapReduce job under the hood, so the Hadoop logs can be inspected through the YARN web UI at ip:8088, or from the command line:

hdfs dfs -ls /tmp/logs/root/logs/application_1609311981015_1858
yarn logs -applicationId application_1609311981015_1858

Inspecting the error log shows that a timestamp value has the wrong format:

2021-01-29 15:05:31,712 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: 
2021-01-29 15:05:31,712 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception raised during data export
2021-01-29 15:05:31,712 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: 
2021-01-29 15:05:31,712 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception: 
java.lang.RuntimeException: Can't parse input data: '2020-02'
	at bigscreen_line1.__loadFromFields(bigscreen_line1.java:814)
	at bigscreen_line1.parse(bigscreen_line1.java:659)
	at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:88)
	at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
	at java.sql.Timestamp.valueOf(Timestamp.java:204)
	at bigscreen_line1.__loadFromFields(bigscreen_line1.java:730)
	... 12 more
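The failing check is java.sql.Timestamp.valueOf, which only accepts values in the form yyyy-MM-dd HH:mm:ss[.fffffffff]. The month-only value '2020-02' in the source data lacks the day and time components, so parsing fails. A rough Python stand-in for that check (not the JDK's exact parser, which also allows fractional seconds):

```python
from datetime import datetime

def parses_as_timestamp(value, fmt="%Y-%m-%d %H:%M:%S"):
    """Rough stand-in for java.sql.Timestamp.valueOf: accept only values
    matching yyyy-MM-dd HH:mm:ss (fractional seconds ignored here)."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(parses_as_timestamp("2021-01-29 11:37:46"))  # True
print(parses_as_timestamp("2020-02"))              # False: month-only value rejected
```

So the fix is to make sure every value exported into a MySQL TIMESTAMP/DATETIME column is a full yyyy-MM-dd HH:mm:ss string on the Hive side.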

Reference: https://blog.csdn.net/huozhanfeng/article/details/10502675
