Spark error in Scala: Error initializing SparkContext

While doing some log analysis, Spark hit a rather baffling problem: after a code update, running in the local environment failed with the error below, while the same job ran fine on the YARN cluster.

2017-08-29 09:46:30 [org.apache.hadoop.util.NativeCodeLoader]-[WARN] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-08-29 09:46:32 []-[WARN] Your hostname, USER-20151116VQ resolves to a loopback/non-reachable address: fe80:0:0:0:e08b:ef97:c992:73ae%18, but we couldn't find any external IP address!
2017-08-29 09:46:33 [org.apache.spark.SparkContext]-[ERROR] Error initializing SparkContext.
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:944)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:658)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:786)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:129)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:116)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:79)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:63)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:63)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:63)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:208)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:150)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:484)
    at org.richinfo.se.process.clean.ClientServiceLogClean$.run(ClientServiceLogClean.scala:136)
    at org.richinfo.se.process.clean.ClientServiceLogClean$.main(ClientServiceLogClean.scala:112)
    at org.richinfo.se.process.clean.ClientServiceLogClean.main(ClientServiceLogClean.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
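
What the SecurityException means is that two JARs on the classpath both contain classes from the javax.servlet package, and their signer (certificate) information differs, so the JVM refuses to load the second one. As a quick check, the snippet below (a hypothetical diagnostic, not part of the original project) lists every classpath entry that provides the conflicting class:

import scala.collection.JavaConverters._

// Hypothetical diagnostic: print every jar on the classpath that contains
// the conflicting class, to spot the duplicated servlet API.
object FindServletJar {
  def main(args: Array[String]): Unit = {
    getClass.getClassLoader
      .getResources("javax/servlet/FilterRegistration.class")
      .asScala                 // Enumeration[URL] -> Iterator[URL]
      .foreach(println)        // each URL names the jar providing the class
  }
}

If this prints more than one jar, those jars are the candidates for the conflict.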

The exception points to a JAR conflict that prevents SparkContext from initializing. These are my dependencies:

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.richinfo.se</groupId>
        <artifactId>se-common-usertransforms</artifactId>
        <version>0.0.2-RELEASES</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.30</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpmime</artifactId>
        <version>4.5.2</version>
    </dependency>
    <dependency>
        <groupId>org.noggit</groupId>
        <artifactId>noggit</artifactId>
        <version>0.5</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0-cdh5.8.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.solr</groupId>
        <artifactId>solr-solrj</artifactId>
        <version>4.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.0-cdh5.8.0</version>
    </dependency>
</dependencies>
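
Nothing in this list looks wrong at first glance. A general way to hunt for a duplicated class is to dump the full dependency tree and search it for the servlet API; this is the standard Maven command rather than a step I recorded at the time:

mvn dependency:tree -Dverbose | grep -i servlet

(On Windows, pipe into findstr /i servlet instead.) The -Dverbose flag also shows the transitive dependencies that Maven omitted because of conflicts.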

However I checked, I couldn't pin down which JAR was at fault. My guess was that something pulled in by solr was clashing with the hadoop dependencies that come with spark. I eventually found a fairly similar case on Stack Overflow:

https://stackoverflow.com/questions/29742480/spark-runtime-error-spark-metrics-sink-metricsservlet-cannot-be-instantialized
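
Answers to that kind of question usually suggest two routes: reorder the dependencies, or exclude the duplicate servlet API outright. A minimal sketch of the exclusion route, which I did not end up using; which artifact actually drags in the unsigned copy is an assumption here and should be confirmed with dependency:tree first:

<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>4.6.1</version>
    <exclusions>
        <!-- Assumed offender: drop the unsigned servlet classes so that only
             the copy bundled with Spark's Jetty stays on the classpath. -->
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>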
I tried changing the Maven dependency order, moving the Spark dependencies up to sit right below the scala dependency, and the program ran through without a hitch (see the sketch below).
Takeaway: even the order in which dependencies are declared can cause a whole series of problems, and they are hard to track down; the smallest detail can produce a very convoluted error, so patience and care are essential.
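
For reference, a sketch of the reordered dependencies block, assuming "below the scala dependency" means listing both Spark artifacts right after scala-library and ahead of solr-solrj. The point is that Maven builds the runtime classpath in declaration order, so whichever jar declares the javax.servlet classes first is the one the classloader sees, and every class in that package then carries consistent signer information:

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- Spark moved up: its servlet classes now win the classpath race -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0-cdh5.8.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.0-cdh5.8.0</version>
    </dependency>
    <!-- solr-solrj and the remaining dependencies follow unchanged -->
    ...
</dependencies>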
