On a Spark 1.6.1 DataFrame-to-MySQL write problem

Recently, calling s_web_url.write.jdbc("jdbc:mysql://xxxxxxxx:3306", "fqzwz.zwd_test", "append", {"user": "xxx", "password": "xxxxx"}) kept failing with the following error:

java.lang.IllegalStateException: Did not find registered driver with class org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper


The explanation for this error found online is:

It occurs when we:

  • Are using Oracle's ojdbc
  • Spark wraps ojdbc in a DriverWrapper because the driver was added via the Spark class loader.
  • We don't specify an explicit "driver" property

Then in org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala (createConnectionFactory):

The driver will get the driverClass as:

val driverClass: String = userSpecifiedDriverClass.getOrElse {
  DriverManager.getDriver(url).getClass.getCanonicalName
}

Since the driver is wrapped by a DriverWrapper, this resolves to "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper".

That gets passed to the Executor, which will attempt to find a matching driver with the name "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper". However, the Executor is aware of the wrapping and compares against the wrapped class name instead:

case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d

I think the fix is just to change the initialization of driverClass to also be aware that there might be a wrapper and if so pass the wrapped classname.
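The mismatch described above can be illustrated with a small Python sketch. This is plain Python, not Spark: the classes and names below are stand-ins for the JVM classes involved, chosen only to mirror the logic.

```python
# Stand-in for the real JDBC driver class (e.g. oracle.jdbc.OracleDriver).
class OracleDriver:
    pass

# Stand-in for Spark's DriverWrapper: it holds the real driver inside.
class DriverWrapper:
    def __init__(self, wrapped):
        self.wrapped = wrapped

def canonical_name(obj):
    # Mimics getClass.getCanonicalName on the JVM.
    return type(obj).__qualname__

registered = DriverWrapper(OracleDriver())

# Driver side: when no explicit "driver" property is given, the class name
# is taken from the registered driver, which yields the wrapper's name.
driver_class = canonical_name(registered)   # "DriverWrapper"

# Executor side: the lookup is wrapper-aware, so it compares against the
# *wrapped* class name and therefore never matches "DriverWrapper".
def find_driver(drivers, name):
    for d in drivers:
        if isinstance(d, DriverWrapper) and canonical_name(d.wrapped) == name:
            return d
    return None

assert driver_class == "DriverWrapper"
assert find_driver([registered], driver_class) is None          # lookup fails
assert find_driver([registered], "OracleDriver") is registered  # wrapped name matches
```

The two sides use inconsistent naming conventions for the same driver, which is exactly why passing the real driver class name explicitly sidesteps the problem.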

The problem can be worked around by setting the driver property for the jdbc call:

val props = new java.util.Properties()
props.put("driver", "oracle.jdbc.OracleDriver")
val result = sqlContext.read.jdbc(connectionString, query, props)





Rewriting the command to specify the driver explicitly:

s_web_url.write.jdbc("jdbc:mysql://xxxxxxxx:3306", "fqzwz.zwd_test", "append", {"driver": "com.mysql.jdbc.Driver", "user": "xxx", "password": "xxxxx"})

solved the problem.


The official documentation for the jdbc call is not very detailed either; for the properties argument it only lists the user name and password.
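For reference, a fuller properties dict for the PySpark jdbc call might look like the sketch below. Besides the driver/user/password keys that Spark itself reads, other keys are forwarded to the JDBC driver as connection properties; the extra keys shown are MySQL Connector/J options and are assumptions here, not part of Spark's API, so check your driver version's documentation before relying on them.

```python
# Keys Spark's jdbc() reads directly in 1.6.x:
#   driver   - JDBC driver class to load explicitly (the fix above)
#   user     - database user name
#   password - database password
# Remaining keys are passed through to the JDBC driver itself.
# The two extras below are MySQL Connector/J options (an assumption,
# verify against your Connector/J version):
props = {
    "driver": "com.mysql.jdbc.Driver",
    "user": "xxx",
    "password": "xxxxx",
    "characterEncoding": "UTF-8",        # avoid mojibake with non-ASCII data
    "rewriteBatchedStatements": "true",  # batch inserts for faster appends
}

# Usage (placeholders kept from the post):
# s_web_url.write.jdbc("jdbc:mysql://xxxxxxxx:3306", "fqzwz.zwd_test", "append", props)
```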




