Spark Troubleshooting - scala.Predef$.$scope()Lscala/xml/TopScope$ and "not found: type Application" exceptions

Running a program with IntelliJ IDEA + Scala + Spark produces the errors below.

Problem 1: java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/03 22:35:16 INFO SparkContext: Running Spark version 2.1.0
17/10/03 22:35:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/03 22:35:17 INFO SecurityManager: Changing view acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing view acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Administrator); groups with view permissions: Set(); users  with modify permissions: Set(Administrator); groups with modify permissions: Set()
17/10/03 22:35:18 INFO Utils: Successfully started service 'sparkDriver' on port 63233.
17/10/03 22:35:18 INFO SparkEnv: Registering MapOutputTracker
17/10/03 22:35:18 INFO SparkEnv: Registering BlockManagerMaster
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/10/03 22:35:18 INFO DiskBlockManager: Created local directory at C:\Users\Administrator\AppData\Local\Temp\blockmgr-7d37f54c-7f7d-4452-bbe1-edd74a1b3cef
17/10/03 22:35:18 INFO MemoryStore: MemoryStore started with capacity 908.1 MB
17/10/03 22:35:18 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:65)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:82)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:220)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at cn.hadron.JoinDemo$.main(JoinDemo.scala:10)
    at cn.hadron.JoinDemo.main(JoinDemo.scala)
17/10/03 22:35:18 INFO DiskBlockManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877\userFiles-e03aaa25-fd89-45a5-9917-bde095172ac8
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877

Process finished with exit code 1

Solution:

The cause is a Scala version mismatch in the dependencies: the project uses Scala 2.11, but spark-core_2.10 was built against Scala 2.10, whose Predef still provided the $scope XML helper that no longer exists in 2.11, hence the NoSuchMethodError. Change the original dependency in pom.xml from:

 
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.1.0</version>
</dependency>
 

to:


<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.1</version>
</dependency>

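To double-check that the Scala version on the runtime classpath really matches the _2.11 suffix of the artifact, a small throwaway object can print the library version. This is only an illustrative sketch (the VersionCheck object is not part of the original project):

object VersionCheck extends App {
  // Prints something like "version 2.11.8"; it should agree with the _2.11 artifact suffix
  println(scala.util.Properties.versionString)
}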
Run the program again: the problem above is gone, but the following one appears.

Problem 2: not found: type Application

Error:(7, 20) not found: type Application
object App extends Application {

Solution:

Reference:
http://stackoverflow.com/questions/26176509/why-does-2-11-1-fail-with-error-not-found-type-application 
Application has been deprecated since Scala 2.9 and appears to have been removed in Scala 2.11 (it still exists in Scala 2.10); use App instead.

The Scala 2.11 branch on GitHub contains only App.scala, while the 2.10 branch has both App.scala and Application.scala, the latter carrying a deprecation warning.

Since the Application trait no longer exists in Scala 2.11, the template-generated App.scala that extends it can simply be deleted, or rewritten to extend App instead, as sketched below.
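If the entry point is still needed rather than deleted, a minimal sketch of the rewritten object looks like this (the Spark body is illustrative only, not the original JoinDemo code):

import org.apache.spark.{SparkConf, SparkContext}

object App extends App {
  // In Scala 2.11 the App trait replaces the removed Application trait;
  // the object body becomes the program's main method.
  val conf = new SparkConf().setAppName("demo").setMaster("local[*]")
  val sc = new SparkContext(conf)
  println(sc.parallelize(1 to 10).sum())
  sc.stop()
}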

Run again, and everything works.
