Importing a MySQL table into Hive with Sqoop

1. Test the MySQL connection

[hdfs@ws1dn1 root]$ sqoop list-databases --connect jdbc:mysql://192.168.1.65:3306/ --username root -P
Warning: /usr/hdp/2.4.2.0-258/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/10/13 11:06:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
Enter password: 
16/10/13 11:06:58 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
ambari
hive
mysql
performance_schema
test
v3
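
If the connection works, you can also list the tables in a specific database before importing. A minimal sketch; the database name v3 comes from the listing above:

# list tables in the v3 database (same credentials as above)
[hdfs@ws1dn1 root]$ sqoop list-tables --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P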

2. Check that a query executes successfully

[hdfs@ws1dn1 root]$ sqoop eval --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P --query "select * from t_log_2016 limit 5"
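
sqoop eval runs arbitrary SQL on the source database and prints the result, so it is also handy for a quick sanity check such as counting rows before kicking off a long import. A sketch using the same table:

# count the rows that will be imported (illustrative check, same table as above)
[hdfs@ws1dn1 root]$ sqoop eval --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P --query "select count(*) from t_log_2016"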

3. If the steps above all succeed, you can import and export.
Import a MySQL table into Hive (by default it lands in the default database)

[hdfs@ws1dn1 root]$ sqoop import --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P --table t_log_2016 --hive-import
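
To control where the data lands, you can name the target Hive database and table explicitly, overwrite any existing data, and set the parallelism. A hedged sketch; the target names and the mapper count here are illustrative, adjust them to your cluster:

# assumption: target database/table names and -m 4 are examples, not requirements
sqoop import --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P \
  --table t_log_2016 \
  --hive-import --hive-database default --hive-table t_log_2016 \
  --hive-overwrite \
  -m 4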

Export a Hive table to MySQL

sqoop export --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P --table t_log_2016 --hcatalog-database default --hcatalog-table t_log_2016
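
Instead of going through HCatalog, export can also read the table's files directly from HDFS with --export-dir; in that case you must tell Sqoop the field delimiter Hive used when writing the data. A sketch, assuming the default HDP warehouse path and Hive's default \001 delimiter (both are assumptions, check your cluster):

# assumption: /apps/hive/warehouse/t_log_2016 is the table's HDFS location
# assumption: the table was stored as text with Hive's default \001 delimiter
sqoop export --connect jdbc:mysql://192.168.1.65:3306/v3 --username root -P \
  --table t_log_2016 \
  --export-dir /apps/hive/warehouse/t_log_2016 \
  --input-fields-terminated-by '\001'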
