Pushing Flink 1.17 logs to Kafka with Log4j2

Add the following to flink-conf.yaml so that the JobManager and TaskManager JVMs receive YARN's container ID as a system property:

env.java.opts.taskmanager: -DyarnContainerId=$CONTAINER_ID
env.java.opts.jobmanager: -DyarnContainerId=$CONTAINER_ID
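YARN exports the container ID in each container's `CONTAINER_ID` environment variable, and Flink expands `$CONTAINER_ID` in `env.java.opts.*` when it launches the JVM, so the value becomes readable through the `yarnContainerId` system property (which the `${sys:yarnContainerId}` lookup below relies on). A minimal sketch of that substitution chain, using the container-ID value from the sample log record later in this post:

```python
import os

# YARN sets CONTAINER_ID in every container's environment
# (sample value taken from the log record shown later in this post).
os.environ["CONTAINER_ID"] = "container_1689663742364_0066_01_000001"

# Flink expands $CONTAINER_ID in env.java.opts.* before starting the JVM,
# so the container ID ends up as the yarnContainerId system property.
java_opts = "-DyarnContainerId=$CONTAINER_ID"
resolved = java_opts.replace("$CONTAINER_ID", os.environ["CONTAINER_ID"])
print(resolved)
# -DyarnContainerId=container_1689663742364_0066_01_000001
```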

Add the following to log4j.properties:

# kafka appender config
rootLogger.appenderRef.kafka.ref=Kafka
appender.kafka.type=Kafka
appender.kafka.name=Kafka
appender.kafka.syncSend=true
appender.kafka.ignoreExceptions=false
appender.kafka.topic=flink_logs
appender.kafka.property.type=Property
appender.kafka.property.name=bootstrap.servers
appender.kafka.property.value=localhost:9092
appender.kafka.layout.type=JSONLayout
appender.kafka.layout.compact=true
appender.kafka.layout.complete=false
appender.kafka.layout.additionalField1.type=KeyValuePair
appender.kafka.layout.additionalField1.key=logdir
appender.kafka.layout.additionalField1.value=${sys:log.file}
appender.kafka.layout.additionalField2.type=KeyValuePair
appender.kafka.layout.additionalField2.key=flink_job_name
appender.kafka.layout.additionalField2.value=${sys:flink_job_name}
appender.kafka.layout.additionalField3.type=KeyValuePair
appender.kafka.layout.additionalField3.key=yarnContainerId
appender.kafka.layout.additionalField3.value=${sys:yarnContainerId}
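With JSONLayout, every record arrives on the Kafka topic as one JSON object, and the three KeyValuePair entries above show up as extra top-level keys. A hedged sketch of what a downstream consumer sees (the record below is an assumed sample in the shape `compact=true` JSONLayout produces; the `logdir` path is made up):

```python
import json

# Assumed sample record shaped like compact JSONLayout output,
# including the three additional fields configured above.
record = ('{"instant":{"epochSecond":1692695259,"nanoOfSecond":525000000},'
          '"level":"INFO","message":"Marking checkpoint 19 as completed",'
          '"logdir":"/path/to/taskmanager.log",'      # hypothetical path
          '"flink_job_name":"test1",'
          '"yarnContainerId":"container_1689663742364_0066_01_000001"}')

event = json.loads(record)
# The additional fields are ordinary top-level keys, easy to filter on.
print(event["flink_job_name"], event["yarnContainerId"])
```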
 
# Alternatively, a custom JSON-style format via PatternLayout
# (use this instead of the JSONLayout above; an appender takes only one layout)
appender.kafka.layout.type=PatternLayout
appender.kafka.layout.pattern={"log_level":"%p","yarnContainerId":"${sys:yarnContainerId}","log_timestamp":"%d{ISO8601}","log_thread":"%t","log_file":"%F", "log_line":"%L","log_message":"'%m'","log_path":"%X{log_path}","job_name":"${sys:flink_job_name}"}%n
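One caveat when hand-rolling JSON through PatternLayout: `%m` is not JSON-escaped, so a log message that itself contains a double quote yields an unparseable record (JSONLayout escapes properly and avoids this). A quick illustration with the message field of the pattern above:

```python
import json

# The PatternLayout template above, reduced to its message field.
template = '{"log_message":"\'%s\'"}'

ok = template % "checkpoint completed"
# A quote-free message renders as valid JSON.
assert json.loads(ok)["log_message"] == "'checkpoint completed'"

broken = template % 'value was "42"'   # message contains double quotes
try:
    json.loads(broken)
    parsed = True
except json.JSONDecodeError:
    # The unescaped quotes terminate the JSON string early.
    parsed = False
print(parsed)
# False
```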

Pass job_name on the launch command:

-yD env.java.opts="-Dflink_job_name=test1"

A sample log record:

{"log_level":"INFO","yarnContainerId":"container_1689663742364_0066_01_000001","log_timestamp":"2023-08-22T09:07:39,525","log_thread":"SourceCoordinator-Source: kafka source","log_file":"SourceCoordinator.java", "log_line":"383","log_message":"'Marking checkpoint 19 as completed for source Source: kafka source.'","log_path":"","job_name":"test1"}
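The record above parses as ordinary JSON; the one wrinkle is that `%d{ISO8601}` separates milliseconds with a comma, so the timestamp needs an explicit format string. A small parsing sketch over the sample line:

```python
import json
from datetime import datetime

# The sample record from above, verbatim.
line = ('{"log_level":"INFO","yarnContainerId":"container_1689663742364_0066_01_000001",'
        '"log_timestamp":"2023-08-22T09:07:39,525","log_thread":"SourceCoordinator-Source: kafka source",'
        '"log_file":"SourceCoordinator.java", "log_line":"383",'
        '"log_message":"\'Marking checkpoint 19 as completed for source Source: kafka source.\'",'
        '"log_path":"","job_name":"test1"}')

event = json.loads(line)

# %d{ISO8601} uses a comma before the milliseconds, so parse it explicitly.
ts = datetime.strptime(event["log_timestamp"], "%Y-%m-%dT%H:%M:%S,%f")
print(event["job_name"], event["log_level"], ts.year)
# test1 INFO 2023
```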
