Importing a MySQL table into Hive with Sqoop 1

1. Environment: Hadoop 2.7.7, Hive 2.3.4

Download Sqoop 1 (version 1.4.7): http://mirror.bit.edu.cn/apache/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz

Extract it under /opt.
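
For reference, a minimal sketch of the download and extraction steps; the final directory name /opt/sqoop is an assumption, adjust it to your own layout:

# download the tarball, extract it to /opt, and rename the directory (assumption: /opt/sqoop)
wget http://mirror.bit.edu.cn/apache/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
tar -zxf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /opt
mv /opt/sqoop-1.4.7.bin__hadoop-2.6.0 /opt/sqoop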

Configure the environment variables (a sketch follows the excerpt below), then edit $SQOOP_HOME/conf/sqoop-env.sh:

#Set path to where bin/hadoop is available
#export HADOOP_COMMON_HOME=
export HADOOP_COMMON_HOME=/opt/hadoop
#Set path to where hadoop-*-core.jar is available
#export HADOOP_MAPRED_HOME=
export HADOOP_MAPRED_HOME=/opt/hadoop

#set the path to where bin/hbase is available
#export HBASE_HOME=
export HBASE_HOME=/opt/hbase

#Set the path to where bin/hive is available
#export HIVE_HOME=
export HIVE_HOME=/opt/hive

#Set the path for where zookeeper config dir is
#export ZOOCFGDIR=
export ZOOCFGDIR=/opt/zookeeper
export ZOOKEEPER_HOME=/opt/zookeeper
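
The environment-variable step above usually means adding SQOOP_HOME to the shell profile; a minimal sketch, assuming Sqoop was extracted to /opt/sqoop:

# append to ~/.bashrc (or /etc/profile), then reload with: source ~/.bashrc
export SQOOP_HOME=/opt/sqoop
export PATH=$PATH:$SQOOP_HOME/bin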

Copy the MySQL JDBC driver (mysql-connector-java-5.1.31-bin.jar) and the Hive jar (/opt/hive/lib/hive-common-2.3.4.jar) into $SQOOP_HOME/lib.
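
A sketch of the copy step; the source location of the MySQL connector jar is an assumption, the Hive jar path is the one given above:

# source path of the connector jar is an assumption; adjust to where you downloaded it
cp mysql-connector-java-5.1.31-bin.jar $SQOOP_HOME/lib/
cp /opt/hive/lib/hive-common-2.3.4.jar $SQOOP_HOME/lib/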

Switch to the $SQOOP_HOME/bin directory and run:

./sqoop-import \
  --connect jdbc:mysql://192.168.3.8:3306/xinfang \
  --username root --password 123456 \
  --table yfxiaoqu \
  --hive-table default.yfxiaoqu \
  --hive-import --hive-overwrite
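
Once the job finishes, you can spot-check the result from the Hive CLI; a minimal sketch using the database and table names from the command above:

# count the imported rows and peek at a few of them
hive -e "select count(*) from default.yfxiaoqu;"
hive -e "select * from default.yfxiaoqu limit 10;"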


 
