Debugging a Spark error: Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:

Here is the error output from my debugging session:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/08/28 15:07:48 INFO SparkContext: Running Spark version 2.0.0
17/08/28 15:07:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/28 15:07:49 INFO SecurityManager: Changing view acls to: WG
17/08/28 15:07:49 INFO SecurityManager: Changing modify acls to: WG
17/08/28 15:07:49 INFO SecurityManager: Changing view acls groups to: 
17/08/28 15:07:49 INFO SecurityManager: Changing modify acls groups to: 
17/08/28 15:07:49 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(WG); groups with view permissions: Set(); users  with modify permissions: Set(WG); groups with modify permissions: Set()
17/08/28 15:07:50 INFO Utils: Successfully started service 'sparkDriver' on port 51589.
17/08/28 15:07:50 INFO SparkEnv: Registering MapOutputTracker
17/08/28 15:07:50 INFO SparkEnv: Registering BlockManagerMaster
17/08/28 15:07:50 INFO DiskBlockManager: Created local directory at C:\Users\WG\AppData\Local\Temp\blockmgr-0cc8b1c7-db14-4baa-abc4-88deb8e3f74a
17/08/28 15:07:50 INFO MemoryStore: MemoryStore started with capacity 900.6 MB
17/08/28 15:07:50 INFO SparkEnv: Registering OutputCommitCoordinator
17/08/28 15:07:50 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/08/28 15:07:50 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.10.10.92:4040
17/08/28 15:07:50 INFO Executor: Starting executor ID driver on host localhost
17/08/28 15:07:50 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51599.
17/08/28 15:07:50 INFO NettyBlockTransferService: Server created on 10.10.10.92:51599
17/08/28 15:07:50 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.10.10.92, 51599)
17/08/28 15:07:50 INFO BlockManagerMasterEndpoint: Registering block manager 10.10.10.92:51599 with 900.6 MB RAM, BlockManagerId(driver, 10.10.10.92, 51599)
17/08/28 15:07:50 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.10.10.92, 51599)
17/08/28 15:07:51 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
17/08/28 15:07:51 INFO SharedState: Warehouse path is 'file:E:\spark\GPSline/spark-warehouse'.
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:E:/spark/GPSline/spark-warehouse
	at org.apache.hadoop.fs.Path.initialize(Path.java:205)
	at org.apache.hadoop.fs.Path.&lt;init&gt;(Path.java:171)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.&lt;init&gt;(SessionCatalog.scala:89)
	at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
	at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
	at org.apache.spark.sql.internal.SessionState$$anon$1.&lt;init&gt;(SessionState.scala:112)
	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
	at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:492)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:528)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:501)
	at test$.main(test.scala:15)
	at test.main(test.scala)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:E:/spark/GPSline/spark-warehouse
	at java.net.URI.checkPath(URI.java:1823)
	at java.net.URI.&lt;init&gt;(URI.java:745)
	at org.apache.hadoop.fs.Path.initialize(Path.java:202)
	... 18 more
17/08/28 15:07:53 INFO SparkContext: Invoking stop() from shutdown hook
17/08/28 15:07:53 INFO SparkUI: Stopped Spark web UI at http://10.10.10.92:4040
17/08/28 15:07:53 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/08/28 15:07:53 INFO MemoryStore: MemoryStore cleared
17/08/28 15:07:53 INFO BlockManager: BlockManager stopped
17/08/28 15:07:53 INFO BlockManagerMaster: BlockManagerMaster stopped
17/08/28 15:07:53 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/08/28 15:07:53 INFO SparkContext: Successfully stopped SparkContext
17/08/28 15:07:53 INFO ShutdownHookManager: Shutdown hook called
17/08/28 15:07:53 INFO ShutdownHookManager: Deleting directory C:\Users\WG\AppData\Local\Temp\spark-3eb5885c-1b8d-4de2-b7ee-405195e83793


Process finished with exit code 1

The key is actually this one line:

Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:E:/spark/GPSline/spark-warehouse

In fact, this folder does not exist on my machine. Without an explicit setting, Spark derives the warehouse location from user.dir, which on Windows yields a drive-letter path like E:\spark\GPSline\spark-warehouse; when that path is turned into a file: URI it has no leading slash, and java.net.URI rejects it. The fix is to add the spark.sql.warehouse.dir configuration explicitly:
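The rejection can be reproduced without Spark at all. This is a minimal sketch using only java.net.URI, mirroring what Hadoop's Path.initialize does with the warehouse path (the path strings are just illustrative values taken from the log above):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class WarehouseUriDemo {
    public static void main(String[] args) throws URISyntaxException {
        // Hadoop's Path builds a URI from (scheme, authority, path).
        // On Windows the path part is "E:/spark/GPSline/spark-warehouse" --
        // no leading slash -- so java.net.URI treats it as a relative
        // path inside an absolute ("file:") URI and rejects it.
        try {
            new URI("file", null, "E:/spark/GPSline/spark-warehouse", null);
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
            // -> Relative path in absolute URI: file:E:/spark/GPSline/spark-warehouse
        }

        // With a leading slash, the same drive-letter path parses fine:
        URI ok = new URI("file", null, "/E:/spark/GPSline/spark-warehouse", null);
        System.out.println(ok.getPath());
    }
}
```

This is why the error appears only on Windows: a Unix-style warehouse path already starts with "/", so the URI check passes there.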

val spark = SparkSession.builder()
  .appName("yy")
  .master("local")
  .config("spark.sql.warehouse.dir", "E:/spark/GPSline/spark-warehouse")
  .getOrCreate()

After this change, the program runs successfully.


