This post is the second part of the Hive hands-on project series. Start HiveServer2 first and then connect with Beeline; bin/hiveserver2 runs in the foreground, so launch it in one terminal and open Beeline in a second one:
[bigdata@hadoop002 hive]$ bin/hiveserver2
[bigdata@hadoop002 hive]$ bin/beeline
Beeline version 1.2.1 by Apache Hive
beeline>
beeline> !connect jdbc:hive2://hadoop002:10000    (press Enter)
Connecting to jdbc:hive2://hadoop002:10000
Enter username for jdbc:hive2://hadoop002:10000: bigdata    (press Enter)
Enter password for jdbc:hive2://hadoop002:10000:    (just press Enter; with the default NONE authentication an empty password is accepted)
Connected to: Apache Hive (version 1.2.1)
Driver: Hive JDBC (version 1.2.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
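As a side note, Beeline can also be started with the connection details on the command line, which skips the interactive !connect step; the -u and -n flags are standard Beeline options, shown here only as an optional shortcut:
[bigdata@hadoop002 hive]$ bin/beeline -u jdbc:hive2://hadoop002:10000 -n bigdata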
0: jdbc:hive2://hadoop002:10000> create database guli;
0: jdbc:hive2://hadoop002:10000> use guli;
0: jdbc:hive2://hadoop002:10000> show tables;
+-----------+--+
| tab_name |
+-----------+--+
+-----------+--+
No rows selected (0.036 seconds)
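The external tables below only attach schemas to data that should already be in HDFS under the location paths they point at (presumably the output of the previous part of this series; that is an assumption here). Hive's dfs command, which HiveServer2 also accepts from Beeline, gives a quick way to confirm the files are in place before creating the tables:
0: jdbc:hive2://hadoop002:10000> dfs -ls /guli/user;
0: jdbc:hive2://hadoop002:10000> dfs -ls /guli/video_etc;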
create external table user_text(
uploader string,
videos int,
friends int)
row format delimited fields terminated by '\t'
collection items terminated by '&'
location '/guli/user';
// Preview the first five rows
0: jdbc:hive2://hadoop002:10000> select * from user_text limit 5;
// The video table
create external table video_text(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>
)
row format delimited fields terminated by '\t'
collection items terminated by '&'
location '/guli/video_etc';
// Query the table
select * from video_text limit 5;
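Because category and relatedId are declared as array<string> and split on '&', they can be used with Hive's array syntax; the query below is just an illustrative sketch, not part of the original walkthrough:
0: jdbc:hive2://hadoop002:10000> select videoId, category[0], size(relatedId) from video_text limit 5;
Next, ORC-backed copies of both tables are created for the analysis queries; ORC is a columnar storage format, so those queries will generally run faster against it than against the raw text files.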
create table video_orc(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>
)
row format delimited fields terminated by '\t'
collection items terminated by '&'
stored as orc;
If video_orc was accidentally created as an EXTERNAL table, run the following command to convert it to a managed table, then re-run desc formatted and confirm that Table Type now reads MANAGED_TABLE:
0: jdbc:hive2://hadoop002:10000> alter table video_orc set tblproperties("EXTERNAL"="FALSE");
0: jdbc:hive2://hadoop002:10000> desc formatted video_orc;
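If you only want the table properties rather than the full desc formatted output, show tblproperties is an alternative quick check (optional, not part of the original steps):
0: jdbc:hive2://hadoop002:10000> show tblproperties video_orc;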
create table user_orc(
uploader string,
videos int,
friends int)
row format delimited fields terminated by '\t'
collection items terminated by '&'
stored as orc;
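// Copy the data from the text tables into the ORC tables; each insert ... select runs as a MapReduce job on this setup, so it may take a little while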
0: jdbc:hive2://hadoop002:10000> insert into user_orc select * from user_text;
0: jdbc:hive2://hadoop002:10000> insert into video_orc select * from video_text;
0: jdbc:hive2://hadoop002:10000> select * from user_orc limit 5;
0: jdbc:hive2://hadoop002:10000> select * from video_orc limit 5;
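Beyond the limit 5 previews, comparing row counts between each text table and its ORC copy is a simple way to confirm nothing was dropped during the conversion (an extra check suggested here, not part of the original post):
0: jdbc:hive2://hadoop002:10000> select count(*) from user_text;
0: jdbc:hive2://hadoop002:10000> select count(*) from user_orc;
0: jdbc:hive2://hadoop002:10000> select count(*) from video_text;
0: jdbc:hive2://hadoop002:10000> select count(*) from video_orc;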
That's it. At this point, the data we need for the analysis is ready.
If you made it this far, please give this post a like and make it a habit! Writing these up takes effort, and your support is what keeps me going. Don't forget to follow me after liking!