Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration when starting Spark

When starting the CDH build of Spark, the following error is reported:

$ sbin/start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/spark-1.6.0-cdh5.13.2/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-bigdata.out
failed to launch org.apache.spark.deploy.master.Master:
  	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  	... 7 more
full log in /home/hadoop/spark-1.6.0-cdh5.13.2/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-bigdata.out
bigdata: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-1.6.0-cdh5.13.2/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-bigdata.out
bigdata: failed to launch org.apache.spark.deploy.worker.Worker:
bigdata:   	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
bigdata:   	... 7 more
bigdata: full log in /home/hadoop/spark-1.6.0-cdh5.13.2/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-bigdata.out

First, check the full log to get the actual error:
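For example, tail the Master log whose path is printed in the startup output above:

$ tail -n 50 /home/hadoop/spark-1.6.0-cdh5.13.2/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-bigdata.out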

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 7 more

The cause: starting with version 1.4, the CDH builds of Spark are compiled without the Hadoop classpath, so the Hadoop jars must be supplied explicitly via spark-env.sh.
The fix: add one line to spark-env.sh to pull the Hadoop classpath in:

export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)
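Note that this requires HADOOP_HOME to be set by the time spark-env.sh runs. A minimal sketch of the relevant spark-env.sh lines, assuming a CDH Hadoop install under /home/hadoop (the install path is an assumption; adjust it to your layout):

# spark-env.sh -- assumed Hadoop install path, change to match your cluster
export HADOOP_HOME=/home/hadoop/hadoop-2.6.0-cdh5.13.2
# let the hadoop CLI compute the full classpath at startup
export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)

You can sanity-check the value by running ${HADOOP_HOME}/bin/hadoop classpath yourself; it prints a colon-separated list of Hadoop configuration directories and jar globs, which is exactly what Spark will prepend to its own classpath.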

After this, you may still hit the following error:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 7 more

This generally means jars are missing: download the slf4j jars and add them to Spark's lib directory. You also need to make sure Scala is installed on the cluster.
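A minimal sketch of that fix, assuming the slf4j jars already ship inside your Hadoop distribution (the source path below is an assumption; locate the jars with find ${HADOOP_HOME} -name 'slf4j*.jar' if yours differ):

# copy the slf4j jars into Spark 1.6's lib directory (assumed locations)
$ cp ${HADOOP_HOME}/share/hadoop/common/lib/slf4j-api-*.jar ${SPARK_HOME}/lib/
$ cp ${HADOOP_HOME}/share/hadoop/common/lib/slf4j-log4j12-*.jar ${SPARK_HOME}/lib/

# confirm Scala is present on this node (repeat on every node in the cluster)
$ scala -version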
