Connecting to a remote Hive from a local Spark application

1 Download the three files core-site.xml, hdfs-site.xml, and hive-site.xml from the server and place them in the project's resources directory.
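Of the three files, hive-site.xml is the one that tells Spark where the Hive metastore lives. A minimal sketch of the relevant property, assuming the metastore Thrift service runs on a host named hive-server at the default port 9083 (host name is a placeholder; check your cluster's actual value):

```xml
<configuration>
  <!-- Address of the Hive metastore Thrift service; the host here is hypothetical -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://hive-server:9083</value>
  </property>
</configuration>
```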

2 Add the Maven dependencies. Note that the version numbers must match the Spark version on the cluster.


    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.39</version>
    </dependency>
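To keep the Spark version and Scala binary version consistent across all the Spark dependencies, they can be factored into Maven properties. This is a sketch; the property names `spark.version` and `scala.binary.version` are common conventions, not required names:

```xml
<properties>
    <spark.version>2.2.0</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<!-- Each Spark dependency then references the shared values -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
```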

3 Test code

import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Build a local SparkSession with Hive support; the *.xml files on the
// classpath tell it how to reach HDFS and the Hive metastore.
SparkSession sparkSession = SparkSession.builder()
        .appName("app_1")
        .master("local[*]")
        .enableHiveSupport()
        .getOrCreate();
sparkSession.sparkContext().setLogLevel("ERROR");

// A JavaSparkContext is only needed if you also want to work with RDDs.
SparkContext sparkContext = sparkSession.sparkContext();
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkContext);

// List the databases visible through the metastore, then query a table.
sparkSession.catalog().listDatabases().show(50, false);
Dataset<Row> sql = sparkSession.sql("select * from test02.deliver");
sql.show();
sql.printSchema();

 
