Calling Spark SQL via JDBC

Copy hive-site.xml into the conf folder under the Spark directory.
Local mode:

spark-sql --driver-class-path /usr/local/hive-1.2.1/lib/mysql-connector-java-5.1.31-bin.jar

Alternatively, append the JDBC driver jar to SPARK_CLASSPATH in $SPARK_HOME/conf/spark-env.sh:

export SPARK_CLASSPATH=$SPARK_CLASSPATH:/usr/local/hive-1.2.1/lib/mysql-connector-java-5.1.31-bin.jar

Connecting to a cluster:

spark-sql --master spark://10.8.2.100:7077  --driver-class-path /usr/local/hive-1.2.1/lib/mysql-connector-java-5.1.31-bin.jar

Start the Thrift server, using hadoop-master as the host.

For internal (LAN) connections:
sbin/start-thriftserver.sh --master spark://10.9.2.100:7077  --driver-class-path /usr/local/hive-1.2.1/lib/mysql-connector-java-5.1.31-bin.jar
For external connections:
sbin/start-thriftserver.sh   --hiveconf hive.server2.thrift.port=10000  --hiveconf hive.server2.thrift.bind.host=hadoop-master     --master spark://10.9.2.100:7077  --driver-class-path /usr/local/hive-1.2.1/lib/mysql-connector-java-5.1.31-bin.jar

Stopping the Thrift server:

sbin/stop-thriftserver.sh 

Access the Thrift server with the beeline client; the two IPs below are hadoop-master's internal and external addresses:

beeline -u jdbc:hive2://10.9.2.100:10000

beeline -u jdbc:hive2://122.23.368.32:10000

Java connection test program:

import java.sql.*; 



/**
 * Note: when using JavaHiveContext:
 * 1. Three config files must be on the classpath: hive-site.xml, core-site.xml, hdfs-site.xml.
 * 2. The PostgreSQL or MySQL driver jar must be added as a dependency.
 * 3. hive-jdbc and hive-exec must be added as dependencies.
 */

public class SimpleDemo {

    public static void main(String[] args) {
        String jdbcDriver = "org.apache.hive.jdbc.HiveDriver";
        String jdbcUrl = "jdbc:hive2://122.23.368.32:10000";
        String username = "hive";
        String password = "hive";

        try {
            // Register the Hive JDBC driver (optional with JDBC 4+, harmless otherwise)
            Class.forName(jdbcDriver);

            try (Connection c = DriverManager.getConnection(jdbcUrl, username, password);
                 Statement st = c.createStatement();
                 ResultSet rs = st.executeQuery("select count(*) from test")) {
                print("num should be 1", rs);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Print a result set: a header line of column names, then one line per row
    static void print(String name, ResultSet res) throws SQLException {
        System.out.println(name);
        ResultSetMetaData meta = res.getMetaData();

        StringBuilder header = new StringBuilder();
        for (int i = 1; i <= meta.getColumnCount(); i++) {
            header.append(meta.getColumnName(i)).append(' ');
        }
        System.out.println("\t" + header);

        while (res.next()) {
            StringBuilder row = new StringBuilder();
            for (int i = 1; i <= meta.getColumnCount(); i++) {
                row.append(res.getString(i)).append(' ');
            }
            System.out.println("\t" + row);
        }
    }
}
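
The hive-jdbc and hive-exec dependencies mentioned in the note above could be declared in Maven roughly as follows. This is a sketch: the versions shown are assumptions matching the Hive 1.2.1 installation and MySQL driver jar used in the commands above; adjust them to your cluster.

```xml
<!-- Hive JDBC client and execution engine (versions assumed from the setup above) -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.2.1</version>
</dependency>
<!-- MySQL driver, matching the jar passed via --driver-class-path on the server side -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.31</version>
</dependency>
```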
