Scala Logging: SLF4J Configuration

Scala Logging Configuration

Logging in Scala is essentially the same as in Java. For a thorough beginner-friendly walkthrough of Java logging, see the article "java日志使用记录log4j使用总结" by 习翔宇.

Spark's default log configuration file lives under the spark-2.1.1-bin-hadoop2.7 folder, e.g. D:\ProgramData\spark-2.1.1-bin-hadoop2.7\conf\log4j.properties (the conf directory ships a log4j.properties.template that you can copy and adapt).

We can place our own log4j.properties under the project's src tree, but it only takes effect if it ends up on the classpath, i.e. the folder must be marked as a source/resources folder (src/main/resources in a standard Maven layout). First add the Maven dependencies for the well-known SLF4J:

<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.21</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.21</version>
</dependency>
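
If the project is built with sbt instead of Maven, the equivalent dependencies would look like the sketch below (same versions assumed; slf4j-log4j12 is the binding that routes SLF4J calls to Log4j 1.2):

// build.sbt: the same two artifacts as the Maven snippet above
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api"     % "1.7.21",
  "org.slf4j" % "slf4j-log4j12" % "1.7.21"  // SLF4J -> Log4j 1.2 binding
)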

Import the relevant package and obtain a logger; the log level and output destinations are all controlled from the configuration file:

import org.slf4j.LoggerFactory

object DataFrameTest {
  val logger = LoggerFactory.getLogger(this.getClass)

  def main(args: Array[String]): Unit = {
    // The {} placeholder form is very efficient: the message is only built
    // when the level is enabled, avoiding needless string concatenation
    logger.info("time stamp: {}", System.currentTimeMillis())
  }
}
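
To see what the placeholder form buys you, compare it with eager concatenation (a minimal sketch; expensiveReport is a hypothetical stand-in for any costly value you might log):

import org.slf4j.LoggerFactory

object LoggingCostDemo {
  val logger = LoggerFactory.getLogger(this.getClass)

  // Hypothetical stand-in for an expensive computation
  def expensiveReport(): String = (1 to 1000000).sum.toString

  def main(args: Array[String]): Unit = {
    // Eager: the string is concatenated even when DEBUG is disabled
    logger.debug("report=" + expensiveReport())

    // Placeholder: formatting is deferred until the level check passes,
    // though the argument expression itself is still evaluated
    logger.debug("report={}", expensiveReport())

    // For truly costly arguments, guard the call explicitly
    if (logger.isDebugEnabled) logger.debug("report={}", expensiveReport())
  }
}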

Now for the configuration file itself. The first value after log4j.rootLogger is the root log level; the values that follow name the output destinations (appenders) that receive events at that level. Appender names are user-defined: here the console appender is named console and the file appender logFile.

log4j.rootLogger=DEBUG,console,logFile
log4j.additivity.org.apache=true
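# Note: additivity is true by default. With it, events from loggers under
# org.apache also bubble up to the root logger's appenders; setting this
# to false would stop that duplication.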


# Console (console)
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.Target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %p [%c] %m%n


# Log file (logFile)
log4j.appender.logFile=org.apache.log4j.FileAppender
log4j.appender.logFile.Threshold=DEBUG
log4j.appender.logFile.ImmediateFlush=true
log4j.appender.logFile.Append=true
log4j.appender.logFile.File=D:/logs/log.logFile
log4j.appender.logFile.layout=org.apache.log4j.PatternLayout
log4j.appender.logFile.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} [%p] %m%n
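# PatternLayout tokens: %d = timestamp, %p = level, %c = logger name, %m = message, %n = newline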


# Size-based rolling file (rollingFile)
log4j.appender.rollingFile=org.apache.log4j.RollingFileAppender
log4j.appender.rollingFile.Threshold=DEBUG
log4j.appender.rollingFile.ImmediateFlush=true
log4j.appender.rollingFile.Append=true
log4j.appender.rollingFile.File=D:/logs/log.rollingFile
log4j.appender.rollingFile.MaxFileSize=200KB
log4j.appender.rollingFile.MaxBackupIndex=50
log4j.appender.rollingFile.layout=org.apache.log4j.PatternLayout
log4j.appender.rollingFile.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} [%p] %m%n
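# With MaxFileSize=200KB and MaxBackupIndex=50, the file rolls over to
# log.rollingFile.1 .. log.rollingFile.50; once all backups exist, the oldest is discarded.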


# Daily rolling file (dailyFile)
log4j.appender.dailyFile=org.apache.log4j.DailyRollingFileAppender
log4j.appender.dailyFile.Threshold=DEBUG
log4j.appender.dailyFile.ImmediateFlush=true
log4j.appender.dailyFile.Append=true
log4j.appender.dailyFile.File=D:/logs/log.dailyFile
log4j.appender.dailyFile.DatePattern='.'yyyy-MM-dd
log4j.appender.dailyFile.layout=org.apache.log4j.PatternLayout
log4j.appender.dailyFile.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} [%p] %m%n
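# DatePattern='.'yyyy-MM-dd rolls once per day: at rollover the current file
# is renamed with the date suffix (e.g. log.dailyFile.2017-06-01) and logging
# continues in a fresh file.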

# Silence noisy logs from Spark-related dependencies
## Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
#
## SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

## Set the default spark-shell log level to WARN. When running the spark-shell, the
## log level for this class is used to overwrite the root logger's log level, so that
## the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN
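
Besides the properties file, log levels can also be tweaked programmatically through the Log4j 1.x API that slf4j-log4j12 delegates to, which is handy for quickly silencing Spark's startup chatter in a local test (a minimal sketch):

import org.apache.log4j.{Level, Logger}

object QuietSpark {
  def main(args: Array[String]): Unit = {
    // Raise the threshold for the whole org.apache subtree at runtime;
    // equivalent to log4j.logger.org.apache=WARN in the properties file
    Logger.getLogger("org.apache").setLevel(Level.WARN)
  }
}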
