1. Install Java 1.6 or later on all nodes
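To confirm the Java install on every host at once, Greenplum's gpssh utility can run the check cluster-wide (a minimal sketch; all_hosts is an assumed file listing every node, one hostname per line):
[gpadmin@db10 ~]$ gpssh -f all_hosts -e 'java -version'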
2. Install Hadoop on the segment hosts. The following versions are supported:
| Hadoop Distribution | Version | gp_hadoop_target_version |
| --- | --- | --- |
| Pivotal HD | Pivotal HD 3.0, 3.0.1 | gphd-3.0 |
| Pivotal HD | Pivotal HD 2.0, 2.1, Pivotal HD 1.0 | gphd-2.0 |
| Greenplum HD | Greenplum HD 1.2 | gphd-1.2 |
| Greenplum HD | Greenplum HD 1.1 | gphd-1.1 (default) |
| Cloudera | CDH 5.2, 5.3, 5.4.x, 5.5.x | cdh5 |
| Cloudera | CDH 5.0, 5.1 | cdh4.1 |
| Cloudera | CDH 4.1 - CDH 4.7 | cdh4.1 |
| Hortonworks Data Platform | HDP 2.1, 2.2, 2.3 | hdp2 |
| MapR | MapR 4.x | gpmr-1.2 |
| MapR | MapR 1.x, 2.x, 3.x | gpmr-1.0 |
| Apache Hadoop | 2.x | hadoop2 |
The Apache Hadoop 2.7.2 cluster used below therefore maps to hadoop2, which is what we set in step 4.
3. Configure the Greenplum environment on all nodes
Verify JAVA_HOME, HADOOP_HOME, and the Hadoop version, then add the exports below to gpadmin's .bashrc:
[gpadmin@db10 ~]$ echo $JAVA_HOME
/usr/java/default
[gpadmin@db10 ~]$ echo $HADOOP_HOME
/opt/hadoop
[gpadmin@db10 ~]$ hadoop version
Hadoop 2.7.2
[gpadmin@db10 ~]$ vim .bashrc
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
# User specific aliases and functions
source /opt/greenplum-db/greenplum_path.sh
export MASTER_DATA_DIRECTORY=/hdd1/master/gpseg-1
export JAVA_HOME=/usr/java/default
export HADOOP_HOME=/opt/hadoop
export HADOOP_VERSION=2.7.2
export PATH=$PATH:$JAVA_HOME/bin/:$HADOOP_HOME/bin
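Since every node needs the same settings, one option is to push the file out with gpscp and spot-check it with gpssh (a sketch; all_hosts is the same assumed hostfile as above):
[gpadmin@db10 ~]$ gpscp -f all_hosts ~/.bashrc =:~/.bashrc
[gpadmin@db10 ~]$ gpssh -f all_hosts -e 'source ~/.bashrc; hadoop version'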
4. Configure the Greenplum parameters
Set gp_hadoop_home and gp_hadoop_target_version on the master, then reload the configuration; gpstop -u re-reads runtime parameters without restarting the cluster:
[gpadmin@db9 ~]$ gpconfig -c gp_hadoop_home -v "'/opt/hadoop'"
[gpadmin@db9 ~]$ gpconfig -c gp_hadoop_target_version -v "'hadoop2'"
[gpadmin@db9 ~]$ gpstop -u
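To confirm the new values are active, gpconfig -s prints the current setting on the master and segments:
[gpadmin@db9 ~]$ gpconfig -s gp_hadoop_home
[gpadmin@db9 ~]$ gpconfig -s gp_hadoop_target_version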
5. Create a table in Hive
Create a test table and insert a few rows. With no storage clause, Hive uses its default TEXTFILE format, which gphdfs can read as plain text:
create table wdbd_dm.date_test1(id int, name string);
insert into wdbd_dm.date_test1 values(1,'wjian');
insert into wdbd_dm.date_test1 values(2,'wuj');
insert into wdbd_dm.date_test1 values(3,'mike');
On the Greenplum side, list the HDFS files Hive generated; each INSERT produced a separate file, so there are three:
[gpadmin@db10 ~]$ hdfs dfs -ls hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/
……
-rwxrwxr-x 3 root root 8 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000
-rwxrwxr-x 3 root root 6 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000_copy_1
-rwxrwxr-x 3 root root 7 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000_copy_2
Note: 172.20.5.8 is the NameNode address of the target Hadoop cluster, and 8020 is its RPC port.
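To double-check the raw bytes (including the \001 delimiter discussed below), one of the part files can be piped through od; a sketch, where the exact output depends on the data:
[gpadmin@db10 ~]$ hdfs dfs -cat hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000 | od -c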
6. Create a gphdfs external table
create external table hdfs_test(id int, name varchar(32))
location ('gphdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part*')
format 'TEXT' (DELIMITER '\001');
Note: Hive's default field delimiter is the non-printing ASCII control character \001 (Ctrl-A), so the external table must declare it explicitly.
edw=# select * from hdfs_test;
id | name
----+-------
2 | wuj
3 | mike
1 | wjian
(3 rows)
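If the data should live in Greenplum rather than be re-read from HDFS on every query, a CTAS over the external table loads it into a regular table (hdfs_test_local is a hypothetical name):
edw=# create table hdfs_test_local as select * from hdfs_test distributed by (id);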