Hi,

I am currently working on syncing my Flink logs to Kafka via the log4j2 Kafka 
appender. I have a log4j2.properties file which works fine locally, i.e. when I run 
my Flink fat jar from the terminal via the following command:
      PS D:\repo> java -cp .\reconciliation-1.0-SNAPSHOT.jar <myMainClass>
The logs are synced to Kafka successfully when run locally.

The contents of the log4j2.properties file are pasted below:
rootLogger.level = INFO
rootLogger.appenderRef.kafka.ref = KafkaLog
appender.kafka.type = Kafka
appender.kafka.name = KafkaLog

appender.kafka.topic = topicName
appender.kafka.properties[0].type=Property
appender.kafka.properties[0].name=bootstrap.servers
appender.kafka.properties[0].value=<bootstrap.servers>
appender.kafka.properties[1].type=Property
appender.kafka.properties[1].name=sasl.mechanism
appender.kafka.properties[1].value=PLAIN
appender.kafka.properties[2].type=Property
appender.kafka.properties[2].name=sasl.jaas.config
appender.kafka.properties[2].value=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="${env:log_event_hub_connection_string}";
appender.kafka.properties[3].type=Property
appender.kafka.properties[3].name=security.protocol
appender.kafka.properties[3].value=SASL_SSL

appender.kafka.layout.type = JsonTemplateLayout
appender.kafka.layout.eventTemplateUri = classpath:kusto-applogv2-layout.json
appender.kafka.layout.eventTemplateAdditionalField[0].type = EventTemplateAdditionalField
appender.kafka.layout.eventTemplateAdditionalField[0].key = Application
appender.kafka.layout.eventTemplateAdditionalField[0].value = reconciliation
appender.kafka.layout.eventTemplateAdditionalField[0].format = String
appender.kafka.layout.eventTemplateAdditionalField[1].type = EventTemplateAdditionalField
appender.kafka.layout.eventTemplateAdditionalField[1].key = Language
appender.kafka.layout.eventTemplateAdditionalField[1].value = Java
appender.kafka.layout.eventTemplateAdditionalField[1].format = String


I am now deploying Flink via the Flink Kubernetes operator. However, after I copied 
the contents of the log4j2.properties file into log4j-console.properties under the 
logConfiguration section of the FlinkDeployment yaml, the Kafka appender failed to 
initialize with the following error messages:

      2023-12-05 10:12:36,903 main ERROR Unable to locate plugin type for JsonTemplateLayout
      2023-12-05 10:12:36,991 main ERROR Unable to locate plugin for EventTemplateAdditionalField
      2023-12-05 10:12:36,991 main ERROR Unable to locate plugin for EventTemplateAdditionalField
      2023-12-05 10:12:37,047 main ERROR Unable to locate plugin for JsonTemplateLayout


My question is: does the Flink Kubernetes operator support Kafka appender 
configuration in log4j-console.properties? If so, can anyone provide me with an 
example?
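
For reference, here is a rough sketch of how I placed the properties under 
logConfiguration in my FlinkDeployment yaml (name, image, and version values below 
are placeholders, and I have trimmed the appender properties for brevity):

      apiVersion: flink.apache.org/v1beta1
      kind: FlinkDeployment
      metadata:
        name: reconciliation
      spec:
        image: <my-flink-image>
        flinkVersion: v1_17
        logConfiguration:
          # contents copied verbatim from the local log4j2.properties file
          log4j-console.properties: |
            rootLogger.level = INFO
            rootLogger.appenderRef.kafka.ref = KafkaLog
            appender.kafka.type = Kafka
            appender.kafka.name = KafkaLog
            appender.kafka.topic = topicName
            # ... remaining appender.kafka.* properties exactly as in the file above ...
            appender.kafka.layout.type = JsonTemplateLayout
            appender.kafka.layout.eventTemplateUri = classpath:kusto-applogv2-layout.json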


PS: a similar error message once showed up when running locally. I fixed that issue 
with the solution posted here, by adding 
com.github.edwgiz.mavenShadePlugin.log4j2CacheTransformer.PluginsCacheFileTransformer
 to the pom file:

java - Console contains an invalid element or attribute "JsonTemplateLayout" 
even after adding dependency - Stack 
Overflow<https://stackoverflow.com/questions/75838785/console-contains-an-invalid-element-or-attribute-jsontemplatelayout-even-after>
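
For completeness, the shade-plugin section of my pom looks roughly like this (the 
plugin and transformer version numbers below are approximate, not necessarily the 
exact ones I used):

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.4.1</version>
        <dependencies>
          <!-- provides the PluginsCacheFileTransformer referenced below -->
          <dependency>
            <groupId>com.github.edwgiz</groupId>
            <artifactId>maven-shade-plugin.log4j2-cachefile-transformer</artifactId>
            <version>2.15</version>
          </dependency>
        </dependencies>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <!-- merges Log4j2Plugins.dat files so plugins such as JsonTemplateLayout
                     are still discoverable inside the fat jar -->
                <transformer implementation="com.github.edwgiz.mavenShadePlugin.log4j2CacheTransformer.PluginsCacheFileTransformer"/>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>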


Thanks,

Chosen
