Registering multiple MySQL tables in Spark SQL

1. Write a utility class, Utility.scala, that registers the MySQL tables as temporary views

import java.util.Properties

import org.apache.spark.sql.SparkSession

/**
  * Registers a set of MySQL tables as Spark temporary views.
  *
  * @param url      MySQL JDBC URL
  * @param userName MySQL user name
  * @param password MySQL password
  * @param tables   names of the tables to register
  * @param spark    SparkSession
  */
def createMysqlTempView(url: String, userName: String, password: String, tables: List[String], spark: SparkSession): Unit = {
  val proBasicData = new Properties()
  proBasicData.put("user", userName)
  proBasicData.put("password", password)

  // Read each table over JDBC and register it as a temp view under its own name
  for (table <- tables) {
    spark.read.jdbc(url, table, proBasicData).createOrReplaceTempView(table)
  }
}
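
The call in step 2 references a MySQLConnetionUtility object holding the connection constants. A minimal sketch, assuming such an object with placeholder values, might look like this:

// Hypothetical sketch only: the constant names match the call below, the values are placeholders.
object MySQLConnetionUtility {
  val MOVIEBASIC_URL: String = "jdbc:mysql://localhost:3306/movie_basic?useSSL=false"
  val MOVIEBASIC_USERNAME: String = "root"
  val MOVIEBASIC_PASSWORD: String = "password"
}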

2. Call it from main.scala

    Utility.createMysqlTempView(MySQLConnetionUtility.MOVIEBASIC_URL, MySQLConnetionUtility.MOVIEBASIC_USERNAME, MySQLConnetionUtility.MOVIEBASIC_PASSWORD,
      List("tableName"), spark)

 
