Hi everyone,
Before digging into what it would take to implement a general solution, I
narrowed the scope to a fix that makes the query mentioned in the
thread work. Here are some findings:
- For the temporal join logic, it's not the watermark that matters but having a
TimeIndic
Hi Till,
That solved my issue! Many thanks for the solution and for the useful
StackOverflow link! ☺️
Cheers,
Sébastien
> On 30 March 2021 at 18:16, Till Rohrmann wrote:
>
> Hi Sebastien,
>
> I think the Scala compiler infers the most specific type for deepCop
issed something
even if it's a really basic use case and code; I'm a beginner in Scala.
Thanks in advance for your help!
Sebastien
> this question was answered in quite some detail:
>> https://flink.apache.org/news/2020/01/15/demo-fraud-detection.html
>> Best regards
>> Theo
>> Original message
>> From: Eduardo Winpenny Tejedor
>> Date: Mon, 17 Feb 2020, 21:07
>> To: Lehuede s
Hi all,
I'm currently working on a Flink application where I match events against a
set of rules. At the beginning I wanted to dynamically create streams
based on the category of the events (events are JSON formatted and I have a field
like "category":"foo" in each event), but I'm stuck by the impossibil
/flink/blob/release-1.5.3/flink-runtime/src/main/java/org/apache/flink/runtime/security/modules/JaasModule.java#L74
With `sasl.mechanism=GSSAPI` the connection to Kafka with Kerberos
authentication succeeds.
Regards,
Sebastien
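
For reference, a Kafka client configured for Kerberos typically carries properties along these lines (a minimal sketch; the service name and protocol are placeholders that depend on the cluster setup):

```properties
# Kerberos-authenticated Kafka client (values are illustrative placeholders)
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
```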
> On September 11, 2018 at 8:08 AM Sebastien Pereira
>
- We strongly suspect the UnsupportedCallbackException is caused by missing
content in the generated JAAS file.
Thanks,
Sebastien Pereira
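
As a point of comparison, a hand-written JAAS file for Kafka's Kerberos login usually contains a `KafkaClient` section along these lines (keytab path and principal below are placeholders, not values from this thread):

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/kafka-client.keytab"
    principal="kafka-client@EXAMPLE.COM";
};
```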
logs/events in CEF format, you
can just use 'split' in the flatMap function, for example.
Hope this helps.
Regards,
Sebastien.
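
To illustrate the split idea, here is a minimal plain-Java sketch (no Flink dependencies; the class name, sample event, and field handling are illustrative assumptions, this is roughly what the body of a flatMap would do). CEF header fields are pipe-delimited, with the extension as the last field:

```java
// Hypothetical sketch: splitting a CEF-formatted log line, as one might do
// inside a Flink flatMap function. Names and the sample event are illustrative.
public class CefSplitExample {

    // CEF header layout:
    // CEF:Version|Device Vendor|Device Product|Device Version|Signature ID|Name|Severity|Extension
    static String[] splitCefHeader(String line) {
        // limit of 8 keeps any pipes inside the extension field intact
        return line.split("\\|", 8);
    }

    public static void main(String[] args) {
        String event = "CEF:0|Vendor|Product|1.0|100|detected an event|5|src=10.0.0.1 dst=10.0.0.2";
        String[] fields = splitCefHeader(event);
        System.out.println(fields.length); // 8
        System.out.println(fields[5]);     // detected an event
    }
}
```

In an actual flatMap, each resulting field (or each line of a multi-event record) would be emitted via the collector.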
" generated file:
import com.nybble.alpha.AvroDeserializationSchema;
import com.nybble.alpha.toKafka;
Am I missing something?
Regards,
Sebastien.
2018-04-25 11:32 GMT+02:00 Timo Walther :
> Hi Sebastien,
>
> for me this seems more like an Avro issue than a Flink issue. You can ignore
> the shaded excepti
Avro deserialization?
I can't find much information about "
avro.shaded.com.google.common.util.concurrent.UncheckedExecutionException"
Regards,
Sebastien.
onctionnalities. But when I was looking for a solution to obtain my final
result, I came across KafkaJsonTableSource.
Does anyone think this could be a good solution for my use case?
I think I would be able to read JSON from Kafka, process the data, then modify
the table and send the data to another Kafka topic. Is that correct?
Regards,
Sebastien
afka-0.11_${scala.binary.version}
${flink.version}
And here is the import line in my Java file:
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011;
Can anyone help me with this issue?
Regards,
Sebastien
4/dev/connectors/kafka.html
Can you confirm that I won't be able to use the latest version of Kafka (1.1.0)
with Flink 1.4 and this connector for my test? Is the connector compatible
with Kafka 0.11 and earlier only?
Regards,
Sebastien.