Contents
I. Methods for controlling log output
II. Code tests
I. Methods for controlling log output

1. Set the log output level via Spark's default log4j configuration file
Logger.getLogger("org.apache.spark").setLevel(Level.ERROR)
Logger.getLogger("org").setLevel(Level.ERROR)
Note: place these calls on the very first lines of the main program. If they come later, they have no effect on the log level of output produced by the code above them.
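The heading above refers to Spark's bundled default configuration, org/apache/spark/log4j-defaults.properties. If you prefer not to touch code at all, the level can instead be set in a project-local configuration file; a minimal sketch for Spark 2.x (which bundles log4j 1.x), placed at src/main/resources/log4j.properties and modeled on that default file (the path and the appender name are only assumptions for this example):

# Print only ERROR and above to the console; modeled on Spark's log4j-defaults.properties
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n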
2. Set the log output level via sparkContext
sparkContext.setLogLevel("ERROR")
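For example, immediately after the session is created (a minimal sketch; the appName is only illustrative):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("app").getOrCreate()
// Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
spark.sparkContext.setLogLevel("ERROR")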
3. You can also use an SLF4J logger for your application's own log output
val logger: org.slf4j.Logger = org.slf4j.LoggerFactory.getLogger(this.getClass)
logger.info("the log message to output")
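A minimal self-contained sketch combining this with method 1 (the object name LogDemo and the log messages are only illustrative; Spark already ships an SLF4J binding, so normally no extra dependency is needed):

package com.gl.test

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession
import org.slf4j.LoggerFactory

object LogDemo {
  def main(args: Array[String]): Unit = {
    // Method 1: silence Spark's own INFO output before the session is created
    Logger.getLogger("org").setLevel(Level.ERROR)

    // Method 3: a logger named after this class, for the application's own messages
    val logger = LoggerFactory.getLogger(this.getClass)

    val spark = SparkSession.builder().master("local[*]").appName("logDemo").getOrCreate()

    logger.info("application started")    // still printed: this logger is not under "org", so the root INFO level applies
    logger.warn("something worth noting")

    spark.close()
  }
}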
II. Code tests

1. Set the log output level via Spark's default log4j configuration file
package com.gl.test

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.{DataFrame, SparkSession}

object Test {

  case class DateT(name: String, birthday: String)

  def main(args: Array[String]): Unit = {
    // Set the level before the SparkSession is created, so none of the startup INFO logs appear
    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession.builder().master("local[*]").appName("testDate").getOrCreate()
    import spark.implicits._
    import org.apache.spark.sql.functions._

    val DateTest: DataFrame = Seq(
      DateT("aa", "1995-12-11 12:12:13"),
      DateT("bb", "2000-01-14 10:10:57")
    ).toDF()

    DateTest.withColumn("current_date", current_date())
      .withColumn("current_timestamp", current_timestamp())
      .withColumn("date", date_format(current_timestamp(), "yyyy-MM-dd HH:mm:ss"))
      .withColumn("date1", date_format(current_timestamp(), "yyyy-MM-dd hh:mm:ss"))
      .withColumn("date2", date_format(col("birthday"), "yyyy-MM-dd"))
      .show(false)

    spark.close()
  }
}
Output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
|name|birthday |current_date|current_timestamp |date |date1 |date2 |
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
|aa |1995-12-11 12:12:13|2021-09-07 |2021-09-07 14:24:30.421|2021-09-07 14:24:30|2021-09-07 02:24:30|1995-12-11|
|bb |2000-01-14 10:10:57|2021-09-07 |2021-09-07 14:24:30.421|2021-09-07 14:24:30|2021-09-07 02:24:30|2000-01-14|
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
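A side note on the output: date uses the pattern HH (24-hour clock) while date1 uses hh (12-hour clock), which is why 14:24:30 shows up as 02:24:30. A quick illustrative check on a fixed value (the literal below is not from the tests; spark.implicits._ and org.apache.spark.sql.functions._ are assumed to be imported as in the program above):

Seq("2021-09-07 14:24:30").toDF("ts")
  .select(
    date_format(col("ts"), "HH:mm:ss").as("h24"),  // 24-hour clock: 14:24:30
    date_format(col("ts"), "hh:mm:ss").as("h12")   // 12-hour clock: 02:24:30
  ).show(false)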
2. Set the log output level via sparkContext
package com.gl.test

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.{DataFrame, SparkSession}

object Test {

  case class DateT(name: String, birthday: String)

  def main(args: Array[String]): Unit = {
    // The SparkSession is created first, so its startup logs are emitted at the default INFO level
    val spark = SparkSession.builder().master("local[*]").appName("testDate").getOrCreate()
    import spark.implicits._
    import org.apache.spark.sql.functions._

    // From this point on, only ERROR-level (and above) log output is shown
    val sc = spark.sparkContext
    sc.setLogLevel("ERROR")

    val DateTest: DataFrame = Seq(
      DateT("aa", "1995-12-11 12:12:13"),
      DateT("bb", "2000-01-14 10:10:57")
    ).toDF()

    DateTest.withColumn("current_date", current_date())
      .withColumn("current_timestamp", current_timestamp())
      .withColumn("date", date_format(current_timestamp(), "yyyy-MM-dd HH:mm:ss"))
      .withColumn("date1", date_format(current_timestamp(), "yyyy-MM-dd hh:mm:ss"))
      .withColumn("date2", date_format(col("birthday"), "yyyy-MM-dd"))
      .show(false)

    spark.close()
  }
}
Output (note the INFO startup messages that appear before the table):
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/09/07 14:25:46 INFO SparkContext: Running Spark version 2.2.1
21/09/07 14:25:46 INFO SparkContext: Submitted application: testDate
21/09/07 14:25:46 INFO SecurityManager: Changing view acls to: lenovo
21/09/07 14:25:46 INFO SecurityManager: Changing modify acls to: lenovo
21/09/07 14:25:46 INFO SecurityManager: Changing view acls groups to:
21/09/07 14:25:46 INFO SecurityManager: Changing modify acls groups to:
21/09/07 14:25:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(lenovo); groups with view permissions: Set(); users with modify permissions: Set(lenovo); groups with modify permissions: Set()
21/09/07 14:25:47 INFO Utils: Successfully started service 'sparkDriver' on port 54804.
21/09/07 14:25:47 INFO SparkEnv: Registering MapOutputTracker
21/09/07 14:25:47 INFO SparkEnv: Registering BlockManagerMaster
21/09/07 14:25:47 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/09/07 14:25:47 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/09/07 14:25:47 INFO DiskBlockManager: Created local directory at C:\Users\lenovo\AppData\Local\Temp\blockmgr-483269e3-e891-4117-bf05-bf95ace77941
21/09/07 14:25:47 INFO MemoryStore: MemoryStore started with capacity 1990.8 MB
21/09/07 14:25:47 INFO SparkEnv: Registering OutputCommitCoordinator
21/09/07 14:25:47 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/09/07 14:25:47 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.110.70:4040
21/09/07 14:25:47 INFO Executor: Starting executor ID driver on host localhost
21/09/07 14:25:47 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54845.
21/09/07 14:25:47 INFO NettyBlockTransferService: Server created on 192.168.110.70:54845
21/09/07 14:25:47 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/09/07 14:25:47 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.110.70, 54845, None)
21/09/07 14:25:47 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.110.70:54845 with 1990.8 MB RAM, BlockManagerId(driver, 192.168.110.70, 54845, None)
21/09/07 14:25:47 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.110.70, 54845, None)
21/09/07 14:25:47 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.110.70, 54845, None)
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
|name|birthday |current_date|current_timestamp |date |date1 |date2 |
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
|aa |1995-12-11 12:12:13|2021-09-07 |2021-09-07 14:25:49.728|2021-09-07 14:25:49|2021-09-07 02:25:49|1995-12-11|
|bb |2000-01-14 10:10:57|2021-09-07 |2021-09-07 14:25:49.728|2021-09-07 14:25:49|2021-09-07 02:25:49|2000-01-14|
+----+-------------------+------------+-----------------------+-------------------+-------------------+----------+
Because sc.setLogLevel("ERROR") can only run once the SparkSession (and with it the SparkContext) already exists, the startup INFO messages above have already been written; only log output produced after the call is limited to ERROR. If a completely clean console is required, use method 1 and set the level on the first lines of main.