hive ls: cannot access /usr/local/src/spark-2.1.3-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory

Running hive fails with the error:
ls: cannot access /usr/local/src/spark-2.1.3-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory

[root@server ~]# hive
ls: cannot access /usr/local/src/spark-2.1.3-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory

Logging initialized using configuration in jar:file:/usr/local/src/hive-1.2.2-bin/lib/hive-common-1.2.2.jar!/hive-log4j.properties
hive> 
Cause:

Starting with Spark 2.0, the single spark-assembly-*.jar that used to live in the lib directory was split into many smaller jars, which are now stored in Spark's jars directory.
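You can see the effect of this layout change directly from the shell. The sketch below simulates a Spark 2.x install in a temporary directory (the jar names are illustrative; point SPARK_HOME at your real install to check your own layout): the old Spark 1.x glob finds nothing, while the new jars/ glob succeeds.

```shell
# Simulate the Spark 2.x directory layout (illustrative names, temp dir)
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/jars"
touch "$SPARK_HOME/jars/spark-core_2.11-2.1.3.jar" \
      "$SPARK_HOME/jars/spark-sql_2.11-2.1.3.jar"

# Spark 1.x glob: lib/spark-assembly-*.jar no longer exists, so ls fails
ls "$SPARK_HOME"/lib/spark-assembly-*.jar 2>/dev/null \
    || echo "no assembly jar (expected on Spark 2.x)"

# Spark 2.x glob: the split jars are all under jars/
ls "$SPARK_HOME"/jars/*.jar
```

This is exactly why the hive launcher's hard-coded `lib/spark-assembly-*.jar` glob prints the "No such file or directory" message.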

Fix: edit the hive launcher script and update the line that builds this jar path.

Go to Hive's installation directory and open the hive script:

cd /usr/local/src/hive-1.2.2-bin/bin
vim hive

In vim, type /spark-assembly and press Enter to jump to the spark-assembly reference, then press i to enter insert mode.

# Change this line:
sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
# to this:
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`

Press Esc, then type :wq to save and quit.
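If you prefer not to edit the script interactively, the same substitution can be applied with sed. The sketch below works on a temporary copy so nothing real is touched; when applying it for real, point it at your actual hive script and keep a backup first.

```shell
# Non-interactive alternative to the vim edit (a sketch, demonstrated on a
# temporary copy -- replace $WORK/hive with your real hive script).
WORK=$(mktemp -d)
printf 'sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`\n' > "$WORK/hive"

cp "$WORK/hive" "$WORK/hive.bak"                       # keep a backup first
sed -i 's|lib/spark-assembly-\*\.jar|jars/*.jar|' "$WORK/hive"

grep sparkAssemblyPath "$WORK/hive"
# sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
```

Using | as the sed delimiter avoids having to escape the slashes in the path.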
(You can also keep the original line in the script, commented out with a leading #, for comparison.)


end
