Hive Study Notes: Hive JDBC + Java API

Environment: CentOS 7
             hive-1.1.0-cdh5.14.0
             hadoop-2.6.0-cdh5.14.0

Hive JDBC Configuration and Setup

Edit hive-site.xml under $HIVE_HOME/conf and add the following:


<property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
</property>
<property>
    <name>hive.server2.thrift.bind.host</name>
    <value>0.0.0.0</value>
</property>
This sets the HiveServer2 Thrift port and host (binding to 0.0.0.0 lets the server accept connections from any client address).

Start the metastore and hiveserver2 with nohup (HDFS and YARN must be up first):
nohup hive --service metastore > metastore.log 2>&1 &
nohup hive --service hiveserver2 > hiveserver2.log 2>&1 &
(The two commands can go into a single shell script, which saves typing whenever you bring Hive up in a test environment.)
Run jps and check that two RunJar processes are present; alternatively, inspect the generated log files to confirm there are no errors.
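Besides jps, you can check that HiveServer2 is actually listening on its Thrift port with a quick socket probe. A minimal sketch; the host "centos" and port 10000 are the values configured above, adjust them to your environment:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isListening(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // connection refused, timeout, or unresolved host
            return false;
        }
    }

    public static void main(String[] args) {
        // "centos" and 10000 match the hive-site.xml settings above
        System.out.println("hiveserver2 listening: " + isListening("centos", 10000, 2000));
    }
}
```

If this prints false while jps shows both RunJar processes, recheck the bind.host setting and any firewall rules.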

Then connect to Hive with beeline:

[root@centos bin]# beeline
Beeline version 1.1.0-cdh5.14.0 by Apache Hive
beeline> !connect jdbc:hive2://centos:10000
scan complete in 1ms
Connecting to jdbc:hive2://centos:10000
Enter username for jdbc:hive2://centos:10000: root
Enter password for jdbc:hive2://centos:10000: ******
Connected to: Apache Hive (version 1.1.0-cdh5.14.0)
Driver: Hive JDBC (version 1.1.0-cdh5.14.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://centos:10000> 

Java API Usage

Create a new Maven project. The pom.xml used here is shown below; copy everything between the <project> tags into your own project's pom.xml and update the Maven project.


<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>06HiveJDBCJava</groupId>
  <artifactId>06HiveJDBCJava</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <build>
    <pluginManagement>
      <plugins>
        <!-- m2e lifecycle mapping: stops Eclipse from flagging the compile goal -->
        <plugin>
          <groupId>org.eclipse.m2e</groupId>
          <artifactId>lifecycle-mapping</artifactId>
          <version>1.0.0</version>
          <configuration>
            <lifecycleMappingMetadata>
              <pluginExecutions>
                <pluginExecution>
                  <pluginExecutionFilter>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <versionRange>[3.1,)</versionRange>
                    <goals>
                      <goal>compile</goal>
                    </goals>
                  </pluginExecutionFilter>
                  <action>
                    <ignore />
                  </action>
                </pluginExecution>
              </pluginExecutions>
            </lifecycleMappingMetadata>
          </configuration>
        </plugin>
      </plugins>
    </pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <version>2.5</version>
      </plugin>
    </plugins>
  </build>

  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
    <repository>
      <id>maven</id>
      <url>http://central.maven.org/maven2/</url>
    </repository>
    <repository>
      <id>alimaven</id>
      <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-maven-plugins</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-server-resourcemanager</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-server-nodemanager</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-common</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-app</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-api</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-client</artifactId>
      <version>2.6.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-exec</artifactId>
      <version>1.1.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-common</artifactId>
      <version>1.1.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>1.1.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-metastore</artifactId>
      <version>1.1.0-cdh5.14.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.mrunit</groupId>
      <artifactId>mrunit</artifactId>
      <version>1.1.0</version>
      <classifier>hadoop2</classifier>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-all</artifactId>
      <version>1.9.5</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.10</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
      <version>1.8</version>
      <scope>system</scope>
      <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
    </dependency>
  </dependencies>
</project>
 

Implementation

package com.jdbc.test;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HiveJDBCTest 
{
	// driver class, URL, user name, password
	public static final String HIVE_DRIVER="org.apache.hive.jdbc.HiveDriver";
	public static final String HIVE_URL="jdbc:hive2://centos:10000";
	public static final String USER_NAME="root";
	public static final String PASSWORD="123456";
	
	public static void main(String[] args) throws SQLException 
	{
		Connection conn=null;
		PreparedStatement pstmt=null,pstmt1=null;
		ResultSet rs=null;
		
		try {
			// load the driver and open a connection
			Class.forName(HIVE_DRIVER);
			conn = DriverManager.getConnection(HIVE_URL, USER_NAME, PASSWORD);
			
			// With PreparedStatement, queries use executeQuery (which returns a result
			// set); other statements such as LOAD DATA, CREATE TABLE or CREATE DATABASE
			// use execute
			pstmt1 = conn.prepareStatement("load data local inpath '/student.txt' overwrite into table student");
			pstmt1.execute();
			
			pstmt = conn.prepareStatement("select * from student");
			rs=pstmt.executeQuery();
			
			// print every row of the result set; each getter must match the column type
			while (rs.next())
			{
				System.out.println(rs.getInt(1)+" "+rs.getString(2)+" "+rs.getString(3)+" "+rs.getInt(4));
			}
			
		} catch (ClassNotFoundException e) {
			e.printStackTrace();
		} catch (SQLException e) {
			e.printStackTrace();
		}

		// release resources (null-checked in case the connection failed)
		if (rs != null) rs.close();
		if (pstmt != null) pstmt.close();
		if (pstmt1 != null) pstmt1.close();
		if (conn != null) conn.close();
		
	}
}
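On Java 7 and later, the manual close() calls above can be replaced with try-with-resources, which closes Connection, PreparedStatement and ResultSet automatically, in reverse order of declaration, even if an exception is thrown partway through. A sketch of those semantics using a stand-in AutoCloseable (the JDBC classes implement the same interface):

```java
import java.util.ArrayList;
import java.util.List;

public class TryWithResourcesDemo {
    // Records open/use/close events so the ordering is visible.
    static final List<String> events = new ArrayList<>();

    // Stand-in for Connection/PreparedStatement/ResultSet, all of which
    // implement AutoCloseable.
    static class Resource implements AutoCloseable {
        private final String name;
        Resource(String name) { this.name = name; events.add("open " + name); }
        @Override public void close() { events.add("close " + name); }
    }

    public static void main(String[] args) {
        // Closed automatically in reverse order: rs, then stmt, then conn.
        try (Resource conn = new Resource("conn");
             Resource stmt = new Resource("stmt");
             Resource rs = new Resource("rs")) {
            events.add("use");
        }
        System.out.println(events);
        // [open conn, open stmt, open rs, use, close rs, close stmt, close conn]
    }
}
```

Applied to the example above, the Connection, PreparedStatement and ResultSet declarations would move into the try(...) header and the whole null-checked cleanup block disappears.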

It is worth mentioning that this example uses PreparedStatement rather than plain Statement; there is a detailed write-up on the difference, linked here:
http://www.cnblogs.com/zhizhuwang/p/3513372.html
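One practical point is that hand-concatenated SQL strings are easy to get wrong: the LOAD DATA statement above needs its spaces and the quotes around the path to be exactly right. For values that cannot be passed as ? parameters (Hive DDL/DML clauses like file paths in LOAD DATA), keeping the string assembly in one small helper makes the quoting auditable; a minimal sketch (the helper name is hypothetical, and identifiers are not validated here):

```java
public class SqlBuilder {
    // Builds the LOAD DATA statement used above, keeping spacing and
    // path quoting in one place. (Hypothetical helper for illustration;
    // the table name and path are not escaped or validated.)
    static String buildLoadDataSql(String localPath, String table) {
        return "load data local inpath '" + localPath + "' overwrite into table " + table;
    }

    public static void main(String[] args) {
        System.out.println(buildLoadDataSql("/student.txt", "student"));
        // load data local inpath '/student.txt' overwrite into table student
    }
}
```

For ordinary query values (e.g. a WHERE clause), prefer ? placeholders with setInt/setString, which is exactly what PreparedStatement is for.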
