Using Spark with Hive

1. Download and extract Spark

wget http://mirrors.tuna.tsinghua.edu.cn/apache/spark/spark-2.3.4/spark-2.3.4-bin-hadoop2.6.tgz

tar zxvf spark-2.3.4-bin-hadoop2.6.tgz

2. Configuration

1) Copy hive-site.xml into the $SPARK_HOME/conf directory so Spark can find the Hive metastore configuration.
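Since HIVE_CONF_DIR is set to /etc/hive/conf later in this setup, the copy would look like the following (adjust the source path if your cluster keeps Hive's config elsewhere):

```shell
# Copy the Hive client configuration into Spark's conf directory
# so spark-sql connects to the same metastore as the Hive CLI.
cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/
```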

2) Add the following to spark-env.sh:

export JAVA_HOME=/usr/java/jdk1.8.0
export CLASSPATH=$CLASSPATH:/opt/cloudera/parcels/CDH/lib/hive/lib/*
export SCALA_HOME=/data/scala-2.12.8
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HIVE_CONF_DIR=/etc/hive/conf
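Before testing, it is worth a quick sanity check that the copied hive-site.xml actually points at a metastore. hive.metastore.uris is the standard Hive property name; the thrift URL it contains will be specific to your cluster:

```shell
# Print the metastore URI from the config Spark will use.
# If this prints nothing, hive-site.xml was not copied or the cluster
# uses an embedded metastore instead of a remote thrift service.
grep -A1 'hive.metastore.uris' $SPARK_HOME/conf/hive-site.xml
```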

3. Test

$SPARK_HOME/bin/spark-sql
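If the shell starts cleanly, you can confirm it sees the Hive metastore. spark-sql also accepts one-off statements via -e, which is handy for a quick non-interactive check (mydb.mytable below is a hypothetical example table):

```shell
# List the databases defined in the Hive metastore.
$SPARK_HOME/bin/spark-sql -e "SHOW DATABASES;"

# Query an existing Hive table directly (replace mydb.mytable with a real table).
$SPARK_HOME/bin/spark-sql -e "SELECT COUNT(*) FROM mydb.mytable;"
```

If SHOW DATABASES only returns `default` when Hive has more databases, spark-sql is most likely running against a local embedded metastore rather than the one described in hive-site.xml.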

(Screenshot: the spark-sql shell starting up.)
