Beeline: The Next-Generation Hive Client Tool

Beeline will replace the Hive CLI as Hive's client tool going forward, and later Hive releases will deprecate the Hive CLI.

 

Beeline is Hive's new command-line client tool.

It was introduced in Hive 0.11.

 

HiveServer2 supports a new command-line shell called Beeline, a JDBC client based on the SQLLine CLI.

Beeline supports an embedded mode and a remote mode. In embedded mode it runs an embedded Hive (similar to the Hive CLI), while in remote mode it connects over Thrift to a standalone HiveServer2 process. Starting with Hive 0.14, when Beeline works with HiveServer2 it also prints HiveServer2's log messages to STDERR.
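The two modes differ only in the JDBC URL given to Beeline. A minimal sketch (the host, port, and database are simply the ones used in the sessions below; embedded mode assumes the Hive libraries are available locally):

# Embedded mode: no host or port in the URL; Beeline starts an embedded Hive in the same process
beeline -u jdbc:hive2://

# Remote mode: connect over Thrift to a standalone HiveServer2
beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive

An interactive remote-mode session looks like this: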

 

[root@SZB-L0007970 bin]# hive --service beeline

Beeline version 0.13.1-cdh5.3.4 by Apache Hive

beeline> !connect jdbc:hive2://SZB-L0007970:10000/annuity_safe

scan complete in 3ms

Connecting to jdbc:hive2://SZB-L0007970:10000/annuity_safe

Enter username for jdbc:hive2://SZB-L0007970:10000/annuity_safe: hive

Enter password for jdbc:hive2://SZB-L0007970:10000/annuity_safe: ****

Connected to: Apache Hive (version 0.13.1-cdh5.3.4)

Driver: Hive JDBC (version 0.13.1-cdh5.3.4)

Transaction isolation: TRANSACTION_REPEATABLE_READ

0: jdbc:hive2://SZB-L0007970:10000/annuity_sa> show tables;

+---------------------------------+--+

|            tab_name             |

+---------------------------------+--+

| orc_table                       |

| parquet_t                       |

| parquet_test                    |

| simple_table                    |

| spark_create                    |

| src                             |

| test_t                          |

| ylx_hbase_hive                  |

| ylx_hivehbase_cid               |

| ylx_map                         |

| ylx_multi_mapping               |

| ylx_pops                        |

+---------------------------------+--+

16 rows selected (0.227 seconds)

 

We can also execute SQL statements or scripts directly from the command line:

[root@SZB-L0007970 bin]# beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe --color=true --silent=false -e "show tables"

scan complete in 3ms

Connecting to jdbc:hive2://SZB-L0007970:10000/annuity_safe

Connected to: Apache Hive (version 0.13.1-cdh5.3.4)

Driver: Hive JDBC (version 0.13.1-cdh5.3.4)

Transaction isolation: TRANSACTION_REPEATABLE_READ

+---------------------------------+--+

|            tab_name             |

+---------------------------------+--+

| orc_table                       |

| parquet_t                       |

| parquet_test                    |

| simple_table                    |

| spark_create                    |

| src                             |

| test_t                          |

| ylx_hbase_hive                  |

| ylx_hivehbase_cid               |

| ylx_map                         |

| ylx_multi_mapping               |

| ylx_pops                        |

+---------------------------------+--+

16 rows selected (0.151 seconds)

Beeline version 0.13.1-cdh5.3.4 by Apache Hive

Closing: 0: jdbc:hive2://SZB-L0007970:10000/annuity_safe
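Because -e prints the result set and exits, it combines naturally with --outputformat and shell redirection when other tools need to consume the result. A hedged sketch (tables.csv is only an illustrative file name; --silent=true keeps the informational banner out of the output, though exactly which messages are suppressed can vary by Beeline version):

beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive --silent=true --outputformat=csv2 -e "show tables" > tables.csv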

 

We can also execute a SQL script:

[root@SZB-L0007970 ~]# beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe  -f  beeline.sql

scan complete in 3ms

Connecting to jdbc:hive2://SZB-L0007970:10000/annuity_safe

Connected to: Apache Hive (version 0.13.1-cdh5.3.4)

Driver: Hive JDBC (version 0.13.1-cdh5.3.4)

Transaction isolation: TRANSACTION_REPEATABLE_READ

0: jdbc:hive2://SZB-L0007970:10000/annuity_sa> use annuity_safe;

No rows affected (0.052 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> show tables;

+---------------------------------+--+

|            tab_name             |

+---------------------------------+--+

| orc_table                       |

| parquet_t                       |

| parquet_test                    |

| simple_table                    |

| spark_create                    |

| src                             |

| test_t                          |

| ylx_hbase_hive                  |

| ylx_hivehbase_cid               |

| ylx_map                         |

| ylx_multi_mapping               |

| ylx_pops                        |

+---------------------------------+--+

16 rows selected (0.132 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa>

Closing: 0: jdbc:hive2://SZB-L0007970:10000/annuity_safe

[root@SZB-L0007970 ~]#

 

Executing an initialization SQL file:

[root@SZB-L0007970 ~]# beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -i beeline.sql

scan complete in 2ms

Connecting to jdbc:hive2://SZB-L0007970:10000/annuity_safe

Connected to: Apache Hive (version 0.13.1-cdh5.3.4)

Driver: Hive JDBC (version 0.13.1-cdh5.3.4)

Transaction isolation: TRANSACTION_REPEATABLE_READ

Running init script beeline.sql

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> use annuity_safe;

No rows affected (0.052 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> show tables;

+---------------------------------+--+

|            tab_name             |

+---------------------------------+--+

| orc_table                       |

| parquet_t                       |

| parquet_test                    |

| simple_table                    |

| spark_create                    |

| src                             |

| test_t                          |

| ylx_hbase_hive                  |

| ylx_hivehbase_cid               |

| ylx_map                         |

| ylx_multi_mapping               |

| ylx_pops                        |

+---------------------------------+--+

16 rows selected (0.136 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa>

Beeline version 0.13.1-cdh5.3.4 by Apache Hive

0: jdbc:hive2://SZB-L0007970:10000/annuity_sa> select max(key) from test_t;

Error: Error while processing statement: FAILED: ExecutionError, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask(state=08S01,code=1)

Explanation: this error occurs because we connected to Beeline without supplying a username and password.

Below we connect to the Hive server through Beeline again, this time specifying the username and password.

[root@SZB-L0007970 ~]# beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive -i beeline.sql

scan complete in 2ms

Connecting to jdbc:hive2://SZB-L0007970:10000/annuity_safe

Connected to: Apache Hive (version 0.13.1-cdh5.3.4)

Driver: Hive JDBC (version 0.13.1-cdh5.3.4)

Transaction isolation: TRANSACTION_REPEATABLE_READ

Running init script beeline.sql

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> use annuity_safe;

No rows affected (0.043 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> show tables;

+---------------------------------+--+

|            tab_name             |

+---------------------------------+--+

| orc_table                       |

| parquet_t                       |

| parquet_test                    |

| simple_table                    |

| spark_create                    |

| src                             |

| test_t                          |

| ylx_hbase_hive                  |

| ylx_hivehbase_cid               |

| ylx_map                         |

| ylx_multi_mapping               |

| ylx_pops                        |

+---------------------------------+--+

16 rows selected (0.145 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> select * from test_t;

+-------------+---------------+--+

| test_t.key  | test_t.value  |

+-------------+---------------+--+

| 1           | pingan        |

| 2           | huazhong      |

+-------------+---------------+--+

2 rows selected (0.227 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> select max(key) from test_t;

+------+--+

| _c0  |

+------+--+

| 2    |

+------+--+

1 row selected (30.606 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa>

With that, we can carry out Hive operations from within Beeline.

 

A note on the difference between the -f and -i options:

-f executes the SQL file and then exits the Beeline client,

while -i executes the SQL file and then stays in the interactive Beeline client.
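To make the difference concrete, a minimal sketch using the same beeline.sql script as above:

# -f: run beeline.sql, print the results, then close the connection and exit
beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive -f beeline.sql

# -i: run beeline.sql as an init script, then stay at the interactive prompt
beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive -i beeline.sql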

 

To see Beeline's help information, run the following command:

[root@SZB-L0007970 bin]# beeline --help

Usage: java org.apache.hive.cli.beeline.BeeLine

   -u <database url>               the JDBC URL to connect to

   -n <username>                   the username to connect as

   -p <password>                   the password to connect as

   -d <driver class>               the driver class to use

   -i <init file>                  script file for initialization

   -e <query>                      query that should be executed

   -f <exec file>                  script file that should be executed

  --hiveconf property=value      Use value for given property

  --hivevar name=value           hive variable name and value

                                   This is Hive specific settings in which variables

                                   can be set at session level and referenced in Hive

                                   commands or queries.

  --color=[true/false]           control whether color is used for display

  --showHeader=[true/false]      show column names in query results

  --headerInterval=ROWS;         the interval between which heades are displayed

  --fastConnect=[true/false]     skip building table/column list for tab-completion

  --autoCommit=[true/false]      enable/disable automatic transaction commit

  --verbose=[true/false]         show verbose error messages and debug info

  --showWarnings=[true/false]    display connection warnings

  --showNestedErrs=[true/false]  display nested errors

  --numberFormat=[pattern]       format numbers using DecimalFormat pattern

  --force=[true/false]           continue running script even after errors

  --maxWidth=MAXWIDTH            the maximum width of the terminal

  --maxColumnWidth=MAXCOLWIDTH   the maximum width to use when displaying columns

  --silent=[true/false]          be more silent

  --autosave=[true/false]        automatically save preferences

  --outputformat=[table/vertical/csv2/tsv2/dsv/csv/tsv]  format mode for result display

                                   Note that csv, and tsv are deprecated - use csv2, tsv2 instead

 --truncateTable=[true/false]   truncate table column when it exceeds length

  --delimiterForDSV=DELIMITER    specify the delimiter for delimiter-separated values output format (default: |)

  --isolation=LEVEL              set the transaction isolation level

                                 Set the transaction isolation level to TRANSACTION_READ_COMMITTED or TRANSACTION_SERIALIZABLE.

 

  --nullemptystring=[true/false] set to true to get historic behavior of printing null as empty string

  --help                         display this message

Beeline version 0.13.1-cdh5.3.4 by Apache Hive
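Of these options, --hivevar is particularly handy for parameterizing queries and scripts: the variable set on the command line can be referenced with ${hivevar:name} substitution. A minimal sketch (the variable name tab is purely illustrative; the query is single-quoted so the shell does not try to expand ${hivevar:tab} itself):

beeline -u jdbc:hive2://SZB-L0007970:10000/annuity_safe -n hive -p hive --hivevar tab=test_t -e 'select * from ${hivevar:tab}'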

 

A summary of some commonly used commands:

(1) reset

Reverts Hive configuration parameters that were changed in the current Beeline session back to their default values.
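For example, inside a Beeline session (hive.exec.parallel is just an illustrative property; the output is omitted):

-- override a Hive setting for this session
set hive.exec.parallel=true;
-- check the current value of that setting
set hive.exec.parallel;
-- drop the session-level overrides and return to the defaults
reset;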

 

(2) dfs

Runs HDFS filesystem commands from within the Beeline session:

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa> dfs -ls /apps-data;

+-----------------------------------------------------------------------+--+
|                              DFS Output                                |
+-----------------------------------------------------------------------+--+
| Found 1 items                                                          |
| drwxr-xr-x - hdfs supergroup 0 2015-10-22 16:13 /apps-data/hduser0103  |
+-----------------------------------------------------------------------+--+

2 rows selected (0.008 seconds)

0:jdbc:hive2://SZB-L0007970:10000/annuity_sa>

 
