Importing MySQL data into Hive with Sqoop

1. Create the table in Hive

Note that Hive does not support MySQL modifiers such as AUTO_INCREMENT, UNSIGNED, NOT NULL, or DEFAULT, and has no DATETIME type, so the MySQL column definitions are reduced to plain Hive types:

CREATE TABLE `piwik_log_link_visit_action` (
  `idlink_va` INT,
  `idsite` INT,
  `idvisitor` BINARY,
  `idvisit` INT,
  `idaction_url_ref` INT,
  `idaction_name_ref` INT,
  `custom_float` FLOAT,
  `server_time` TIMESTAMP,
  `idaction_name` INT,
  `idaction_url` INT,
  `time_spent_ref_action` INT
) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
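Because of ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t', Hive stores each row of this table as one line of plain text with tab-separated columns, so the backing files can be inspected with standard tools. A minimal sketch (the sample values below are made up for illustration):

```shell
# One hypothetical row as it would appear in the table's warehouse file:
# 11 tab-separated fields, one per column declared above.
printf '1\t5\tabc123\t42\t0\t7\t0.5\t2015-01-01 00:00:00\t3\t9\t12\n' > row.tsv

awk -F'\t' '{print NF}' row.tsv   # field count matches the 11 columns
cut -f2 row.tsv                   # extract the 2nd column, idsite
```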


2. Use Sqoop to import the specified MySQL table into Hive's piwik_log_link_visit_action table

sqoop import --connect jdbc:mysql://192.168.1.10:3306/itcast --username root --password 123 --table piwik_log_link_visit_action --hive-import --hive-overwrite --hive-table piwik_log_link_visit_action --fields-terminated-by '\t'
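A side note on the --fields-terminated-by '\t' argument: the single quotes keep the shell from touching the backslash, so Sqoop receives the literal two characters backslash and t, and performs the escape expansion to an actual tab itself. This can be checked directly:

```shell
# Inside single quotes, \t is NOT expanded by the shell: it is passed
# along as two characters (backslash, t), which Sqoop then interprets
# as a tab delimiter.
printf '%s' '\t' | wc -c   # 2 bytes, not 1
```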

     

Note: to copy only the structure of a MySQL table into Hive (for example, replicating the schema of MySQL's piwik_log_link_visit_action table as a Hive table of the same name), use Sqoop's create-hive-table tool instead of a full import:

sqoop create-hive-table --connect jdbc:mysql://192.168.1.10:3306/itcast --table piwik_log_link_visit_action --username root --password 123 --hive-table piwik_log_link_visit_action --fields-terminated-by '\t'






