Resolving the "jdbc.DefaultSource not found" problem in Spark

Today I was working on a small utility project whose job is to read data from Elasticsearch, process it, and write the result into MySQL. It ran fine inside IDEA, but after packaging it into a jar and running it with the java command, it failed with the error shown below.

Implementation code: importing Elasticsearch data into MySQL.
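
For context, a minimal sketch of that flow might look like the following. This is not the original project's code: the hosts, index, table, and credentials are placeholders, and the "es" format assumes the elasticsearch-spark connector and the MySQL JDBC driver are on the classpath.

// Minimal sketch only, assuming Spark 2.x with the elasticsearch-spark
// connector and the MySQL JDBC driver available; all names and
// credentials below are placeholders.
import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class EsToMysql {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("EsToMysql")
                .config("es.nodes", "127.0.0.1")   // Elasticsearch node(s)
                .config("es.port", "9200")
                .getOrCreate();

        // Read the source index through the ES data source.
        Dataset<Row> esData = spark.read()
                .format("es")
                .load("my_index/my_type");

        // ... transformations / cleaning would happen here ...

        // Write the result to MySQL through Spark's built-in "jdbc" data source.
        // This is the kind of call that fails with "Failed to find data source: jdbc"
        // when the jar is launched with plain java instead of spark-submit.
        Properties props = new Properties();
        props.put("user", "root");
        props.put("password", "123456");
        props.put("driver", "com.mysql.jdbc.Driver");

        esData.write()
                .mode(SaveMode.Append)
                .jdbc("jdbc:mysql://127.0.0.1:3306/test?useSSL=false", "my_table", props);

        spark.stop();
    }
}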

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: jdbc. Please find packages at http://spark.apache.org/third-party-projects.html
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:241)
        at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:499)
        at com.example.tool.EsToMysql.main(EsToMysql.java:93)
Caused by: java.lang.ClassNotFoundException: jdbc.DefaultSource
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.spark.sql.execution.datasources.DataSource
        at org.apache.spark.sql.execution.datasources.DataSource
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
        at scala.util.Try.orElse(Try.scala:84)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
        ... 3 more
I searched online for a long time without finding an answer. In the end the cause turned out to be how the program was launched: it cannot be run with a plain java -jar xxx; it has to be submitted with spark-submit. A likely explanation is that when the jar is launched directly with java, Spark cannot resolve the "jdbc" short name through its registered data sources (the DataSourceRegister service entries from the Spark SQL jars are missing or were lost during packaging), so it falls back to looking for a class literally named jdbc.DefaultSource, which does not exist. Submitting through spark-submit puts the Spark distribution's own jars on the classpath, and the built-in JDBC source is found again. The command looks like this:

bin/spark-submit --class xxx.xx.xx.mainObject  --master local[2]   /opt/xxx.jar
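
If the MySQL JDBC driver or the Elasticsearch connector is not bundled into the application jar, they can also be passed along at submit time with --jars. The jar names below are only examples; use the versions your project actually depends on:

bin/spark-submit --class xxx.xx.xx.mainObject --master local[2] \
  --jars /opt/libs/mysql-connector-java-5.1.47.jar,/opt/libs/elasticsearch-spark-20_2.11-6.2.4.jar \
  /opt/xxx.jar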
