Hi Zhipeng,
Good that you've reached out; I wasn't aware that Gelly is being used in
Alink. Are you proposing to write a new graph library as a successor of
Gelly and bundle that with Alink?
Best regards,
Martijn
On Tue, 4 Jan 2022 at 02:57, Zhipeng Zhang wrote:
> Hi everyone,
>
> Thanks for
Hi Arvid,
I spent the last few days reading through the existing KafkaSource-related
code and thinking about the best possible solution. I no longer think it is
a good idea to let the user specify this logic in the deserializer and pass
this information via the Collector. I also thought
more
Hi everyone,
Thanks for starting the discussion :)
We (the Alink team [1]) are actually using part of the Gelly library to
support graph algorithms (connected components, single-source shortest
path, etc.) for users at Alibaba Inc.
As DataSet API is going to be dropped, shall we also provide a new gr
Hi,
I was recently looking at the Flink native Kubernetes integration [1]
to get an idea how it relates to existing operator based solutions
[2], [3].
Part of the native integration's motivation was simplicity (no extra
component to install), but arguably that is also a shortcoming. The
k8s oper
Most of the inquiries I've had about Gelly in recent memory have been from
folks looking for a streaming solution, and it's only been a handful.
+1 for dropping Gelly
David
On Mon, Jan 3, 2022 at 2:41 PM Till Rohrmann wrote:
> I haven't seen any changes or requests to/for Gelly in ages. Hence,
I definitely do, and you can see in my initial post that this is the first
thing I tried, but I got warnings and it doesn't use the credentials I
supplied. Though you are right that I did find a solution: using a
credentialProvider object and injecting the keys as Java env variables
through:
-yd "env.java.o
Hi Mariam,
a quick mailing list and Jira query didn't reveal any pointers for
Flink with Milvus, unfortunately. But have you had a look at Flink's
AsyncIO API [1]? I haven't worked with it yet, but it sounds like
something that might help you access an external system.
Matthias
[1]
http
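A minimal sketch of the callback pattern that AsyncIO is built around, in plain Java with no Flink dependency (`fetchFromRemote` is a hypothetical stand-in for the call to the external system; in Flink you would put this logic inside an `AsyncFunction#asyncInvoke` and complete the provided `ResultFuture` instead of a plain callback):

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.Consumer;

public class AsyncLookupSketch {

    // Hypothetical stand-in for a request to an external system such as Milvus.
    static CompletableFuture<String> fetchFromRemote(String key) {
        return CompletableFuture.supplyAsync(() -> "value-for-" + key);
    }

    // Flink's AsyncFunction#asyncInvoke(input, resultFuture) follows this shape:
    // fire off the request without blocking, and complete the callback when
    // the answer arrives.
    static void asyncInvoke(String input, Consumer<String> resultCallback) {
        fetchFromRemote(input).thenAccept(resultCallback);
    }

    public static void main(String[] args) {
        CompletableFuture<String> done = new CompletableFuture<>();
        asyncInvoke("k1", done::complete);
        System.out.println(done.join()); // prints "value-for-k1"
    }
}
```

In an actual job you would wire such a function in with `AsyncDataStream.unorderedWait(...)`, which also gives you a timeout and a cap on in-flight requests.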
Hi Daniel,
I'm assuming you already looked into the Flink documentation for this topic
[1]? I'm going to add Fabian to this thread. Maybe he's able to help out here.
Matthias
[1]
https://nightlies.apache.org/flink/flink-docs-release-1.12/dev/connectors/kinesis.html#kinesis-producer
On Fri, Dec 31,
For documentation purposes: Surendra started a discussion in FLINK-25411
[1].
[1] https://issues.apache.org/jira/browse/FLINK-25411
On Wed, Dec 22, 2021 at 9:51 AM Surendra Lalwani
wrote:
>
> Hi Team,
>
> JsonRowSerializationSchema is unable to parse fields with type
> TIMESTAMP_LTZ, seems like
Hi Puneet,
Flink logs things like the job name, which can be specified by the user.
Hence, a user could (as far as I understand) add a job name containing
malicious content. This is where the Flink cluster's log4j version comes
into play. Therefore, it's not enough to provide only an updated log4j
d
Dear All,
I have a question regarding contacting a remote server and receiving
responses in Flink functions. What is the best approach to do so? Also, in
case other users have used Flink with a Milvus server: I have trouble
running the job on a Flink cluster, although it works locally.
I would really ap
I haven't seen any changes or requests to/for Gelly in ages. Hence, I would
assume that it is not really used and can be removed.
+1 for dropping Gelly.
Cheers,
Till
On Mon, Jan 3, 2022 at 2:20 PM Martijn Visser wrote:
> Hi everyone,
>
> Flink is bundled with Gelly, a Graph API library [1]. Th
Hi everyone,
Flink is bundled with Gelly, a Graph API library [1]. This has been marked
as approaching end-of-life for quite some time [2].
Gelly is built on top of Flink's DataSet API, which is deprecated and
slowly being phased out [3]. It only works on batch jobs. Based on the
activity in the
Hi,
Can you provide a reproducer? Sounds like a bug, but I might be wrong.
I wonder why you need List, can't you infer the type?
In any case, you can work around this issue by overriding the method
UserDefinedFunction#getTypeInference to return a custom TypeInference,
which you can build with your o
Hey team,
We are migrating our Flink code from Flink 1.9 to Flink 1.14, and as part
of this we are updating a bunch of UDFs. I wanted to understand how to
provide *data type hints for UDFs which return Object[]*.
For example, if the return type is simply Object, something like this works.
*
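For the Object[] case specifically, a hedged sketch using Flink's type-hint annotations (this assumes the Table API annotations available in Flink 1.14; the function name and the `ARRAY<STRING>` element type are illustrative, not taken from the original mail, and the snippet needs the flink-table dependencies on the classpath):

```java
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.FunctionHint;
import org.apache.flink.table.functions.ScalarFunction;

// Sketch only: the @FunctionHint declares the output as ARRAY<STRING>,
// so the planner does not have to infer a data type from Object[].
@FunctionHint(output = @DataTypeHint("ARRAY<STRING>"))
public class SplitFunction extends ScalarFunction {
    public Object[] eval(String input) {
        return input.split(",");
    }
}
```

When no annotation fits, overriding `getTypeInference` (as suggested elsewhere in this thread) is the fully programmatic alternative.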
As there were no strong objections, we'll proceed with bumping the Hadoop
version to 2.8.5 and removing the safeguards and the CI for any earlier
versions. This will effectively make Hadoop 2.8.5 the minimum supported
version in Flink 1.15.
Best,
D.
On Thu, Dec 23, 2021 at 11:03 AM Till Rohrman