Reference link 1: http://www.iteblog.com/archives/846
Reference code for the actual steps: http://sunhs.sinaapp.com/?p=343
Configuring Eclipse took a whole day; no matter how I imported the jars, it kept throwing a "DriverManager not found" exception:
Note 1: one of the jars on the import list, $HIVE_HOME/lib/slf4j-api-1.6.1.jar, does not ship with Hive 1.2.1; import the following two jars from hadoop/share/hadoop/common/lib in its place:
/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar
/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar
Note 2: although the metastore database that Hive created in MySQL is named hive, the database written in the JDBC URL here must be default, not hive; otherwise, no matter how you adjust MySQL, it keeps reporting that the database does not exist. The database segment of the URL names a Hive database (namespace), not the MySQL metastore database, which is why default is the right value.
After switching to default, the testhivedrivertable table created by the code below does show up in the TBLS table of the hive database in MySQL.
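To confirm the table really landed in the metastore, you can also query MySQL directly from Java. A minimal sketch, assuming mysql-connector-java is on the classpath, the metastore database is named hive, and hypothetical MySQL credentials root/root (adjust all three to your setup):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MetastoreCheck {
    public static void main(String[] args) throws Exception {
        // hypothetical host/credentials -- replace with your MySQL setup
        try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/hive", "root", "root");
             Statement stmt = con.createStatement();
             // TBLS is the metastore table that lists every Hive table
             ResultSet res = stmt.executeQuery("SELECT TBL_NAME FROM TBLS")) {
            while (res.next()) {
                System.out.println(res.getString(1)); // should include testhivedrivertable
            }
        }
    }
}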
Aside: while the code runs in Eclipse, the terminal running hive --service hiveserver2 shows the job progress, e.g. the map tasks.
The reason is as follows:
The old code connected to HiveServer via Java, and HiveServer itself has many problems (e.g. security and concurrency). To address them, Hive 0.11.0 introduced a brand-new service, HiveServer2, which solves HiveServer's security and concurrency issues nicely. Its launcher lives in ${HIVE_HOME}/bin/hiveserver2, and you can start the HiveServer2 service like this:
$HIVE_HOME/bin/hiveserver2
You can also start HiveServer2 this way:
$HIVE_HOME/bin/hive --service hiveserver2
The two ways have exactly the same effect, but programs written against the old HiveServer need two changes, as follows:
private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";
  becomes
private static String driverName = "org.apache.hive.jdbc.HiveDriver";

Connection con = DriverManager.getConnection("jdbc:hive://localhost:10002/default", "wyp", "");
  becomes
Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10002/default", "wyp", "");
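Before running the full client, a connection smoke test makes it easy to confirm the new driver class and the jdbc:hive2 URL are right. A minimal sketch, reusing the host, port, and credentials of the full client further down (adjust to your HiveServer2):

import java.sql.Connection;
import java.sql.DriverManager;

public class PingHiveServer2 {
    public static void main(String[] args) throws Exception {
        // same driver class and URL scheme as the fixed code above
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.2.35:10000/default", "hadoop", "1234")) {
            System.out.println("HiveServer2 connection OK: " + !con.isClosed());
        }
    }
}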
While we are at it, the jars this program depends on are listed below (versions may differ as long as the set matches; per Note 1, swap the slf4j jar for the copies under Hadoop):
hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar
$HIVE_HOME/lib/hive-exec-0.11.0.jar
$HIVE_HOME/lib/hive-jdbc-0.11.0.jar
$HIVE_HOME/lib/hive-metastore-0.11.0.jar
$HIVE_HOME/lib/hive-service-0.11.0.jar
$HIVE_HOME/lib/libfb303-0.9.0.jar
$HIVE_HOME/lib/commons-logging-1.0.4.jar
$HIVE_HOME/lib/slf4j-api-1.6.1.jar
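If the build path is in doubt, a throwaway check like the sketch below reports exactly which jar is missing; each probe class is believed to live in the corresponding jar listed above (class names taken from Hive 0.11 / Hadoop 2.2 era releases, so treat them as assumptions for other versions):

public class DependencyCheck {
    // one representative class per required jar
    private static final String[] PROBES = {
        "org.apache.hadoop.conf.Configuration",         // hadoop-common
        "org.apache.hadoop.hive.ql.exec.Utilities",     // hive-exec
        "org.apache.hive.jdbc.HiveDriver",              // hive-jdbc
        "org.apache.hadoop.hive.metastore.api.Table",   // hive-metastore
        "org.apache.hive.service.cli.HiveSQLException", // hive-service
        "com.facebook.fb303.FacebookService",           // libfb303
        "org.apache.commons.logging.Log",               // commons-logging
        "org.slf4j.Logger"                              // slf4j-api
    };

    public static void main(String[] args) {
        for (String cls : PROBES) {
            try {
                Class.forName(cls);
                System.out.println("OK      " + cls);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING " + cls);
            }
        }
    }
}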
Code:
package getConnect;

import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        // replace "hadoop" here with the name of the user the queries should run as
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.2.35:10000/default", "hadoop", "1234");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        stmt.execute("drop table if exists " + tableName);
        stmt.execute("create table " + tableName + " (key int, value string)");
        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        ResultSet res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }
        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: kv1.txt is a ctrl-A separated file with two fields per line
        String filepath = "/usr/local/hive/examples/files/kv1.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        stmt.execute(sql);
        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }
        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}