Spark won't start

I'm running Spark code from Eclipse on Windows, but it won't start at all. My programs ran fine before and this problem appeared suddenly. Could anyone take a look? Here is the log output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/04/16 17:11:48 INFO SparkContext: Running Spark version 2.1.0
19/04/16 17:11:49 INFO SecurityManager: Changing view acls to: lenovo
19/04/16 17:11:49 INFO SecurityManager: Changing modify acls to: lenovo
19/04/16 17:11:49 INFO SecurityManager: Changing view acls groups to:
19/04/16 17:11:49 INFO SecurityManager: Changing modify acls groups to:
19/04/16 17:11:49 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(lenovo); groups with view permissions: Set(); users with modify permissions: Set(lenovo); groups with modify permissions: Set()
19/04/16 17:11:49 INFO Utils: Successfully started service 'sparkDriver' on port 63671.
19/04/16 17:11:49 INFO SparkEnv: Registering MapOutputTracker
19/04/16 17:11:49 INFO SparkEnv: Registering BlockManagerMaster
19/04/16 17:11:49 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/04/16 17:11:49 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/04/16 17:11:49 INFO DiskBlockManager: Created local directory at C:\Users\lenovo\AppData\Local\Temp\blockmgr-50c1a90c-9700-4308-a4e1-de054070d22e
19/04/16 17:11:49 INFO MemoryStore: MemoryStore started with capacity 1992.9 MB
19/04/16 17:11:49 INFO SparkEnv: Registering OutputCommitCoordinator
19/04/16 17:11:49 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/04/16 17:11:49 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://198.198.1.224:4040
19/04/16 17:11:49 INFO Executor: Starting executor ID driver on host localhost
19/04/16 17:11:49 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 63712.
19/04/16 17:11:49 INFO NettyBlockTransferService: Server created on 198.198.1.224:63712
19/04/16 17:11:49 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/04/16 17:11:49 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 198.198.1.224, 63712, None)
19/04/16 17:11:49 INFO BlockManagerMasterEndpoint: Registering block manager 198.198.1.224:63712 with 1992.9 MB RAM, BlockManagerId(driver, 198.198.1.224, 63712, None)
19/04/16 17:11:49 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 198.198.1.224, 63712, None)
19/04/16 17:11:49 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 198.198.1.224, 63712, None)
19/04/16 17:11:50 INFO SparkContext: Invoking stop() from shutdown hook
19/04/16 17:11:50 INFO SparkUI: Stopped Spark web UI at http://198.198.1.224:4040
19/04/16 17:11:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/04/16 17:11:50 INFO MemoryStore: MemoryStore cleared
19/04/16 17:11:50 INFO BlockManager: BlockManager stopped
19/04/16 17:11:50 INFO BlockManagerMaster: BlockManagerMaster stopped
19/04/16 17:11:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/04/16 17:11:50 INFO SparkContext: Successfully stopped SparkContext
19/04/16 17:11:50 INFO ShutdownHookManager: Shutdown hook called
19/04/16 17:11:50 INFO ShutdownHookManager: Deleting directory C:\Users\lenovo\AppData\Local\Temp\spark-9a002292-f6e0-4aab-bfa0-e690556a327c

Here is my code:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("split").setMaster("local[*]")
    // Point Spark at the local Hadoop installation (winutils) before the context starts
    System.setProperty("hadoop.home.dir", "D:/Hadoop/hadoop/")

    Logger.getRootLogger.setLevel(Level.WARN)
    val sc = new SparkContext(conf)
  }
}
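For comparison, here is a minimal sketch of what I would expect to work on the same setup. The object name MainCheck, the app name "split-check", and the trivial count job are just placeholders for illustration, not part of my real program:

import org.apache.spark.{SparkConf, SparkContext}

object MainCheck {
  def main(args: Array[String]): Unit = {
    // Same Windows Hadoop home as in my code above
    System.setProperty("hadoop.home.dir", "D:/Hadoop/hadoop/")

    val conf = new SparkConf().setAppName("split-check").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Run a trivial job so the context actually does some work before main exits
    val count = sc.parallelize(1 to 100).count()
    println(s"count = $count")

    // Stop the context explicitly instead of relying on the shutdown hook
    sc.stop()
  }
}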
