spark-shell Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with ....

The main error messages are as follows:
Caused by: org.apache.derby.iapi.error.StandardException: Container Container(0, 401) cannot be opened; it either has been dropped or does not exist.

Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@65d9e72a, see the next exception for details.

Caused by: ERROR 40XD2: Container Container(0, 401) cannot be opened; it either has been dropped or does not exist.
......
Cause: your environment was probably integrated with Hive before, leaving a stale embedded Derby metastore behind.
Solution:
Delete the metastore_db directory under the Spark installation directory.
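A hedged example of the cleanup, assuming spark-shell was launched from a Spark installation at /root/training/spark-2.1.0-bin-hadoop2.7 (this path is an assumption; adjust it to your environment). Derby creates metastore_db (and a derby.log) in the directory from which spark-shell was started:
  cd /root/training/spark-2.1.0-bin-hadoop2.7    # assumed install dir; use your own path
  rm -rf metastore_db derby.log                  # remove the stale embedded Derby metastore
Starting spark-shell again then reports the following error: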
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
Caused by: java.net.ConnectException: Call From bigdata131/192.168.137.131 to bigdata131:9000

Cause: Spark was probably integrated with YARN/Hadoop before; HADOOP_CONF_DIR still points at the Hadoop configuration, so Spark tries to reach HDFS at bigdata131:9000, which is not running.
Solution:
Method 1: start HDFS
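  A hedged sketch for Method 1, assuming Hadoop lives under /root/training/hadoop-2.8.5 (derived from the HADOOP_CONF_DIR value shown below; adjust to your cluster):
  jps                                             # if no NameNode process is listed, HDFS is down
  /root/training/hadoop-2.8.5/sbin/start-dfs.sh   # start HDFS so bigdata131:9000 becomes reachable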
Method 2: change the HADOOP-related settings in the Spark configuration file
  vim ./conf/spark-env.sh
  # export HADOOP_CONF_DIR=/root/training/hadoop-2.8.5/etc/hadoop    <- change this line to:
  export HADOOP_CONF_DIR=    # note: leave the value empty, but the variable must still exist
  Start spark-shell again:
  ./bin/spark-shell --master spark://bigdata131:7077
  ......
  Spark session available as 'spark'.
  Welcome to
        ____              __
       / __/__  ___ _____/ /__
      _\ \/ _ \/ _ `/ __/  '_/
     /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
        /_/

  Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144)
  Type in expressions to have them evaluated.
  Type :help for more information.
  scala>
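
  As a quick check that the restored session works (a minimal sketch; the expression is only an illustration), run a trivial query at the prompt:
  scala> spark.range(10).count()
  res0: Long = 10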

 
