黑猴子的家: Fixing the Error on First Hive Startup

1. Reproducing the error

[victor@hadoop102 hive]$ bin/hive shell
ls: cannot access /home/hadoop/spark-2.2.0-bin-hadoop2.6/lib/spark-assembly-*.jar: 
No such file or directory
which: no hbase in (/home/hadoop/hive110/bin:
/home/hadoop/spark-2.2.0-bin-hadoop2.6/bin:
/home/hadoop/scala-2.11.0/bin:
/home/hadoop/protobuf250/bin:/home/hadoop/hadoop260/bin:
/home/hadoop/zookeeper345/bin:
/home/hadoop/maven339/bin:
/home/hadoop/jdk1.8.0_144/bin:
/home/hadoop/spark-2.2.0-bin-hadoop2.6/bin:
/home/hadoop/scala-2.11.0/bin:
/home/hadoop/protobuf250/bin:
/home/hadoop/hadoop260/bin:
/home/hadoop/zookeeper345/bin:
/home/hadoop/maven339/bin:
/home/hadoop/jdk1.8.0_144/bin:
/usr/local/bin:
/bin:/usr/bin:/usr/local/sbin:
/usr/sbin:/sbin:/home/hadoop/bin)

Logging initialized using configuration in file:
/home/hadoop/hive110/conf/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.

2. Cause of the problem

After upgrading to Spark 2, the single large assembly JAR that Spark 1.x shipped under lib/ was split into many smaller JARs under jars/. The old spark-assembly-*.jar no longer exists, so the ls call in Hive's launcher script fails to find it. (The "which: no hbase" line is a separate, harmless message: bin/hive probes the PATH for an HBase installation and simply skips the HBase integration when none is found.)
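
You can confirm the layout change directly (a minimal check; the exact jar names under jars/ vary with the Spark build):

 # Spark 1.x shipped one big assembly jar under lib/; in Spark 2.x it is gone:
 [victor@hadoop102 hive]$ ls ${SPARK_HOME}/lib/spark-assembly-*.jar
 ls: cannot access /home/hadoop/spark-2.2.0-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory

 # The same classes are now split across many small jars in jars/:
 [victor@hadoop102 hive]$ ls ${SPARK_HOME}/jars | head -3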

3. Solution

Edit the hive launcher script in the bin directory of the Hive installation and change the sparkAssemblyPath variable so that it picks up Spark 2's jars/ directory instead:

[victor@hadoop102 hive]$ vim bin/hive
 # add Spark assembly jar to the classpath
 if [[ -n "$SPARK_HOME" ]]
 then
  # original Spark 1.x line, now commented out: the assembly jar no longer exists
  #sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
  # glob all of the small jars under Spark 2.x's jars/ directory instead
  sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
  CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"
 fi
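
After saving, relaunch the CLI to verify the fix; the ls error should be gone, leaving only the expected (and unrelated) Beeline deprecation warning:

 # the "ls: cannot access .../spark-assembly-*.jar" line should no longer appear
 [victor@hadoop102 hive]$ bin/hive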


Problem solved.
