When Spark fails to access Hive, add the following configuration.

Error: Failed to start database 'metastore_db' with class loader org.apache.spark.sql



Add the following to SPARK_HOME/conf/spark-env.sh:

export HIVE_CONF_DIR=/opt/modules/hive-1.0.1/conf
export CLASSPATH=$CLASSPATH:/opt/modules/hive-1.0.1/lib
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/modules/hive-1.0.1/lib/mysql-connector-java-5.1.37-bin.jar
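Note that SPARK_CLASSPATH has been deprecated since Spark 1.0; on Spark 1.6 it still works but prints a warning. As a sketch of the alternative (using the same paths as above, which are specific to this setup), the MySQL connector JAR can instead be passed at launch time:

```shell
# Alternative to SPARK_CLASSPATH (deprecated since Spark 1.0):
# supply the JDBC driver JAR when starting the shell.
# Paths below match the versions used in this post; adjust for your install.
/opt/modules/spark-1.6.0-bin-hadoop2.6/bin/spark-shell \
  --jars /opt/modules/hive-1.0.1/lib/mysql-connector-java-5.1.37-bin.jar \
  --driver-class-path /opt/modules/hive-1.0.1/lib/mysql-connector-java-5.1.37-bin.jar
```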

Copy hive-site.xml into SPARK_HOME/conf:
cp hive-site.xml /opt/modules/spark-1.6.0-bin-hadoop2.6/conf/
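After copying the file and restarting the shell, a quick sanity check is to query the metastore. This is a sketch assuming a Spark 1.6 build with Hive support, where the pre-created sqlContext is a HiveContext:

```shell
# Verify that Spark can now reach the Hive metastore
# (environment-specific path; adjust for your install).
echo 'sqlContext.sql("show tables").show()' \
  | /opt/modules/spark-1.6.0-bin-hadoop2.6/bin/spark-shell
```

If the configuration is correct, this prints the Hive tables instead of failing with the metastore_db error above.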
