Hi,

I'm afraid passing these options is not supported in SQL yet. I created FLINK-21229 [1] to add support for it.
In a regular job you can construct a schema registry client manually:

RegistryAvroDeserializationSchema<GenericRecord> deserializationSchema =
    new RegistryAvroDeserializationSchema<>(
        GenericRecord.class, // or a SpecificRecord and a null schema
        schema,
        () -> new ConfluentSchemaRegistryCoder(
            new CachedSchemaRegistryClient(/* configure with ssl */)));

Best,
Dawid

[1] https://issues.apache.org/jira/browse/FLINK-21229

On 28/01/2021 17:39, Laurent Exsteens wrote:
> Hello,
>
> I'm trying to use Flink SQL (on Ververica Platform, so no other options
> than pure Flink SQL) to read Confluent Avro messages from Kafka, when
> the schema registry is secured via SSL.
>
> Would you know the correct properties to set in the Kafka consumer
> config?
>
> The following options work for a simple Java Kafka producer/consumer
> (not a Flink job):
> - schema.registry.ssl.truststore.location
> - schema.registry.ssl.truststore.password
> - schema.registry.ssl.keystore.location
> - schema.registry.ssl.keystore.password
>
> However, they don't seem to be taken into account in my query (and
> also not when I tried them in a Flink job), even when prefixed with
> 'properties.'.
>
> I'm using Flink 1.11 for the SQL query (Ververica Platform 2.3), and
> Flink 1.10 in my job.
>
> Would you have an idea how I can tell my Flink SQL Kafka connector to
> connect to that SR via SSL? Or a normal Flink job?
>
> Thanks in advance for your help.
>
> Best Regards,
>
> Laurent.
>
>
> --
> *Laurent Exsteens*
> Data Engineer
> (M) +32 (0) 486 20 48 36
>
> *EURA NOVA*
> Rue Emile Francqui, 4
> 1435 Mont-Saint-Guibert
> (T) +32 10 75 02 00
>
> euranova.eu
> research.euranova.eu
>
> ♻ Be green, keep it on the screen
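To make the "configure with ssl" step above concrete: CachedSchemaRegistryClient has a constructor that accepts a config map, and the SSL settings can be supplied there. This is a minimal sketch; the property keys are an assumption, mirroring the keys that work for a plain Kafka client, so verify them against the Confluent client version on your classpath. The URL and paths are placeholders.

```java
import java.util.HashMap;
import java.util.Map;

public class RegistryClientSslConfig {

    // Builds the config map to pass to
    // new CachedSchemaRegistryClient(url, identityMapCapacity, configs).
    // The key names are assumed; check them against your Confluent client docs.
    static Map<String, Object> sslConfigs(String truststorePath, String truststorePw,
                                          String keystorePath, String keystorePw) {
        Map<String, Object> cfg = new HashMap<>();
        cfg.put("schema.registry.ssl.truststore.location", truststorePath);
        cfg.put("schema.registry.ssl.truststore.password", truststorePw);
        cfg.put("schema.registry.ssl.keystore.location", keystorePath);
        cfg.put("schema.registry.ssl.keystore.password", keystorePw);
        return cfg;
    }

    public static void main(String[] args) {
        Map<String, Object> cfg = sslConfigs(
                "/etc/ssl/truststore.jks", "changeit",
                "/etc/ssl/keystore.jks", "changeit");

        // Hypothetical usage (requires the Confluent client on the classpath):
        // SchemaRegistryClient client =
        //     new CachedSchemaRegistryClient("https://registry:8081", 1000, cfg);

        System.out.println(cfg.size()); // prints 4
    }
}
```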