https://github.com/apache/flink-connector-kafka/blob/main/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/source/enumerator/KafkaSourceEnumerator.java#L286
Best Regards
Ahmed Hamdy

On Sun, 6 Oct 2024 at 06:41, Anil Dasari wrote:

> Hello,
> I have implemented a custom source that reads tables in parallel, with each
> split corresponding to a table; the custom source implementation can be
> found here -
> https://github.com/adasari/mastering-flink/blob/main/app/src/main/java/org/example/paralleljdbc/DatabaseSource.java
Hello,
I have implemented a custom source that reads tables in parallel, with each
split corresponding to a table; the custom source implementation can be
found here -
https://github.com/adasari/mastering-flink/blob/main/app/src/main/java/org/example/paralleljdbc/DatabaseSource.java
However, it
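For context, a minimal sketch of the split-per-table enumerator pattern under
discussion, in the spirit of the linked KafkaSourceEnumerator code (TableSplit
and TableSplitEnumerator are illustrative names, not part of the linked
DatabaseSource):

import org.apache.flink.api.connector.source.SourceSplit;
import org.apache.flink.api.connector.source.SplitEnumerator;
import org.apache.flink.api.connector.source.SplitEnumeratorContext;

import javax.annotation.Nullable;
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

// Hypothetical split type: one split per table name.
class TableSplit implements SourceSplit {
    final String table;
    TableSplit(String table) { this.table = table; }
    @Override public String splitId() { return table; }
}

class TableSplitEnumerator implements SplitEnumerator<TableSplit, Void> {
    private final SplitEnumeratorContext<TableSplit> context;
    private final Queue<TableSplit> remaining = new ArrayDeque<>();

    TableSplitEnumerator(SplitEnumeratorContext<TableSplit> context, List<String> tables) {
        this.context = context;
        tables.forEach(t -> remaining.add(new TableSplit(t)));
    }

    @Override public void start() {}

    // Readers pull work: hand out the next table, or tell the reader it is done.
    @Override
    public void handleSplitRequest(int subtaskId, @Nullable String hostname) {
        TableSplit next = remaining.poll();
        if (next != null) {
            context.assignSplit(next, subtaskId);
        } else {
            context.signalNoMoreSplits(subtaskId);
        }
    }

    // Splits owned by a failed reader go back into the pool.
    @Override public void addSplitsBack(List<TableSplit> splits, int subtaskId) {
        remaining.addAll(splits);
    }

    @Override public void addReader(int subtaskId) {}
    @Override public Void snapshotState(long checkpointId) { return null; }
    @Override public void close() {}
}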
Hey folks,
I'm looking for an example of creating a custom source in PyFlink. The one
that I found in the tests is a wrapper around a Java class:

    def test_add_custom_source(self):
        custom_source = SourceFunction(
            "org.apache.flink.python.util.MyCustomSourceFunction")
        ds = self
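On the Java side, a wrapper like that just points at an ordinary SourceFunction
class; a minimal sketch of what such a class might look like (the class body is
an assumption, only the fully-qualified name above comes from the test):

import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class MyCustomSourceFunction implements SourceFunction<String> {
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        long i = 0;
        while (running) {
            // Emit under the checkpoint lock so records and checkpoints don't interleave.
            synchronized (ctx.getCheckpointLock()) {
                ctx.collect("record-" + i++);
            }
            Thread.sleep(100);
        }
    }

    @Override
    public void cancel() { running = false; }
}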
TCP socket stream and for
> the same I have written a custom source function reading data from the TCP
> socket.
>
> The job is running successfully, but while trying to take a savepoint, an
> error occurs saying the savepoint cannot be taken.
>
> Is there any limitation that TCP streams sta
>> Hello,
>>
>> Thanks for your suggestion, but please confirm the point below:
>>
>> Is it the case that a TCP socket source job can't be restored from the last
>> checkpoint?
>>
>> Rgds,
>> Kamal
>>
>> From: Jan Lukavský
>> To: user@flink.apache.org
>> Subject: Re: Flink TCP custom source - secured server socket
> ... a state backward in (processing) time ...
(of course not processing, I meant to say event time)
On 6/29/23 14:45, Jan Lukavský wrote:
Hi Kamal,
you probably have several options:
a) bundle you
Hello Community,
I have a requirement to read data coming over a TCP socket stream, and for the
same I have written a custom source function reading data from the TCP socket.
The job is running successfully, but while trying to take a savepoint, an error
occurs saying the savepoint cannot be taken.
Is there any
Hello,
Thanks for your suggestion, but please confirm the point below:
Is it the case that a TCP socket source job can't be restored from the last
checkpoint?
Rgds,
Kamal
From: Jan Lukavský
Sent: 29 June 2023 06:18 PM
To: user@flink.apache.org
Subject: Re: Flink TCP custom source - secured server socket
a plain TCP socket. It is likely you
will experience data-loss issues with this solution (regardless of
security). This might be okay for you; I just felt it would be good to
stress this point.
Best,
Jan
On 6/29/23 12:53, Kamal Mittal via user wrote:
> Hello Community,
> I have created a TCP
Hello Community,
I have created a custom TCP stream source and am reading data from it.
But this TCP connection needs to be secured, i.e. SSL based. The query is: how
do I configure/provide certificates via Flink for a client-server secured TCP
connection?
Rgds,
Kamal
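Flink's own SSL options cover its internal and REST endpoints, not sockets
opened by user code, so the certificates have to be wired up inside the source
itself. A minimal sketch, assuming a JKS keystore readable on every TaskManager
(the path, password handling, and helper name are illustrative):

import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLServerSocket;
import java.io.FileInputStream;
import java.security.KeyStore;

// Hypothetical helper: build an SSL server socket from a keystore, for use in
// the source's open()/run() methods instead of a plain ServerSocket.
final class SecureSocketFactory {
    static SSLServerSocket open(String keystorePath, char[] password, int port)
            throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(keystorePath)) {
            ks.load(in, password);
        }
        KeyManagerFactory kmf =
                KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(ks, password);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(kmf.getKeyManagers(), null, null);
        return (SSLServerSocket) ctx.getServerSocketFactory().createServerSocket(port);
    }
}

The keystore file itself has to be available on every TaskManager, e.g. shipped
with the job or placed on a mounted path.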
Thanks Lasse, I think you can create an issue and update the document if
there is anything wrong.
Best,
Shammon FY
On Wed, Mar 29, 2023 at 3:48 AM Lasse Nedergaard <
lassenedergaardfl...@gmail.com> wrote:
> Hi.
> I have figured it out. The documentation is wrong in both places.
>
> Best regards
Hi Lasse
Is your job using Table/SQL or DataStream? Here's the doc [1] for customized
sources in Table, and there is an example of a socket source.
[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sourcessinks/
Best,
Shammon FY
On Mon, Mar 27, 2023 at 8:02 PM Lasse Nedergaard <
Hi.
I have to use data from a REST API that changes very slowly. Instead of doing an
async I/O request, I would like to create a source function that reads from the
REST API, so I can connect and enrich another stream without a lookup for each
incoming row.
The Flink docs provide an explanation of how to do i
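One way to sketch such a source with the legacy RichSourceFunction (the
endpoint, class name, and poll interval are illustrative; the emitted payload
can then be broadcast and connected to the main stream for enrichment):

import org.apache.flink.streaming.api.functions.source.RichSourceFunction;

import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical polling source: re-reads a slow-changing REST endpoint
// periodically and emits the raw payload for downstream parsing.
public class RestPollingSource extends RichSourceFunction<String> {
    private final String endpoint;
    private final long pollIntervalMs;
    private volatile boolean running = true;

    public RestPollingSource(String endpoint, long pollIntervalMs) {
        this.endpoint = endpoint;
        this.pollIntervalMs = pollIntervalMs;
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        while (running) {
            try (InputStream in = new URL(endpoint).openStream()) {
                String body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                synchronized (ctx.getCheckpointLock()) {
                    ctx.collect(body);
                }
            }
            Thread.sleep(pollIntervalMs);
        }
    }

    @Override
    public void cancel() { running = false; }
}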
Thanks Fabian,
I was looking forward to using the unified Source interface in my use case.
The implementation was very intuitive with this new design.
I will try TableFunction then.
Best,
Krzysztof Chmielewski
On Fri, 5 Nov 2021 at 14:20, Fabian Paul wrote:
> Hi Krzysztof,
>
> The blog post
Hi Krzysztof,
The blog post is not building a lookup source but only a scan source. For scan
sources you can choose between the old RichSourceFunction
and the new unified Source interface.
For lookup sources you need to implement either a TableFunction or an
AsyncTableFunction; there are current
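A minimal sketch of the TableFunction route (the class, key type, and row
layout are illustrative; the planner calls eval() once per incoming join key):

import org.apache.flink.table.data.GenericRowData;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.data.StringData;
import org.apache.flink.table.functions.FunctionContext;
import org.apache.flink.table.functions.TableFunction;

// Hypothetical lookup function: fetches the row(s) matching the join key from
// the external system and emits them via collect().
public class MyLookupFunction extends TableFunction<RowData> {

    @Override
    public void open(FunctionContext context) throws Exception {
        // open connections or caches to the external system here
    }

    public void eval(Integer key) {
        // a hard-coded row stands in for the external fetch
        GenericRowData row = new GenericRowData(2);
        row.setField(0, key);
        row.setField(1, StringData.fromString("value-for-" + key));
        collect(row);
    }
}

The LookupTableSource would then hand this to the planner, e.g. via
TableFunctionProvider.of(new MyLookupFunction()).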
Ok,
I think there is some misunderstanding here.
As presented in [1] for implementing a custom source connector for the
Table API and SQL:
"You first need to have a source connector which can be used in Flink's
runtime system (...)
For complex connectors, you may want to implemen
I think neither Source nor RichSourceFunction is correct in this case. You can
have a look at the JDBC lookup source [1][2]. Your function needs to
implement TableFunction.
Best,
Fabian
[1]
https://github.com/apache/flink/blob/master/flink-connectors/flink-connector-jdbc/src/main/java/org/apac
Thank you Fabian,
What if I rewrote my custom source to use the old RichSourceFunction
instead of the unified Source interface?
Would it work as a lookup then?
Best,
Krzysztof
On Fri, 5 Nov 2021 at 11:18, Fabian Paul wrote:
> Hi Krzysztof,
>
> The unified Source is meant to be used for the D
Hi Krzysztof,
The unified Source is meant to be used for the DataStream API and Table API.
Currently, we have no definition of lookup sources in the DataStream API,
therefore the new sources do not work as lookups, only as scan sources.
Maybe in the future we will also want to define lookups
and the source will get data for the given
>>> parameters, so I don't need to scan the entire resource.
>>>
>>> Cheers,
>>>
>>> On Thu, 4 Nov 2021 at 15:48, Krzysztof Chmielewski <
>>> krzysiek.chmielew...@gmail.com> wrote:
Hi Krzysztof,
It is great to hear that you have implemented your source with the unified
Source interface. To integrate your source in the Table API you can use the
SourceProvider. You can take a look at how our FileSource does it [1].
Btw, I think you forgot to add your references ;)
Best,
Fabi
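A minimal sketch of that integration point (MyDynamicTableSource is an
illustrative name; the constructor takes whatever unified Source the connector
builds):

import org.apache.flink.api.connector.source.Source;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.table.connector.source.DynamicTableSource;
import org.apache.flink.table.connector.source.ScanTableSource;
import org.apache.flink.table.connector.source.SourceProvider;
import org.apache.flink.table.data.RowData;

// Hypothetical scan table source bridging a unified (FLIP-27) Source into the
// Table API via SourceProvider, the same way FileSource is integrated.
public class MyDynamicTableSource implements ScanTableSource {
    private final Source<RowData, ?, ?> source;  // your custom unified Source

    public MyDynamicTableSource(Source<RowData, ?, ?> source) {
        this.source = source;
    }

    @Override
    public ChangelogMode getChangelogMode() {
        return ChangelogMode.insertOnly();
    }

    @Override
    public ScanRuntimeProvider getScanRuntimeProvider(ScanContext ctx) {
        return SourceProvider.of(source);
    }

    @Override
    public DynamicTableSource copy() {
        return new MyDynamicTableSource(source);
    }

    @Override
    public String asSummaryString() {
        return "my-custom-source";
    }
}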
Hi,
I was wondering if it is possible to implement a Source Table connector,
as described in [1][2], with a custom source that implements the new
Source interface [3] rather than a SourceFunction.
I already have my custom source, but when I'm trying to implement a Table
Source from LookupTableS
Hi Bin,
We could try the following methods to cover source/sink tests.
Unit test: To verify whether the behavior of each method in the custom source
or sink is as expected. You could mock interactions with external storage
(database, IO, etc.) in this part.
Integration test: To test whether the source
enough.
On Tue, 10 Aug 2021 at 04:22, Xinbin Huang wrote:
> Hi team,
>
> I'm currently implementing a custom source and sink, and I'm trying to
> find a way to test these implementations. The testing section
> <https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/data
Hi team,
I'm currently implementing a custom source and sink, and I'm trying to find
a way to test these implementations. The testing section
<https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/testing/#unit-testing-stateful-or-timely-udfs--custom-oper
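For the integration-test half, the linked testing docs describe running the job
on a local MiniCluster; a sketch of that setup (JUnit 4, with the pipeline body
as a stand-in for one built on the custom source):

import org.apache.flink.runtime.testutils.MiniClusterResourceConfiguration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.DiscardingSink;
import org.apache.flink.test.util.MiniClusterWithClientResource;
import org.junit.ClassRule;
import org.junit.Test;

// Runs a pipeline against a real local cluster instead of mocks.
public class MyCustomSourceITCase {

    @ClassRule
    public static final MiniClusterWithClientResource FLINK =
            new MiniClusterWithClientResource(
                    new MiniClusterResourceConfiguration.Builder()
                            .setNumberTaskManagers(1)
                            .setNumberSlotsPerTaskManager(2)
                            .build());

    @Test
    public void pipelineRuns() throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(2);
        // Replace fromElements(...) with env.addSource(new MyCustomSource())
        // and assert on collected output in a real test.
        env.fromElements(1, 2, 3).addSink(new DiscardingSink<>());
        env.execute("custom-source-it");
    }
}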
Hi team,
I'm trying to develop a custom source using the new Data Source API, but I'm
having a hard time finding examples for it. Can you point me to some
existing sources implemented with the new Data Source API? It would be
ideal if the source is a pull-based unbounded source (i.e. Kafka).
T
Hi Roman,
In general, the use of inconsistent types is discouraged, but there is
little that you can do on your end.
I think your approach with SourceFunction is good, but I'd probably not use
Row already, rather some POJO or source-format record. Note that I have
never seen side-outputs in a s
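A sketch of the side-output routing being discussed (Envelope and the tags are
illustrative; the source emits one wrapper type and a ProcessFunction fans it
out per message type):

import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

// Hypothetical envelope POJO emitted by the source: type discriminator plus payload.
class Envelope {
    public String type;
    public byte[] payload;
}

class RouteByType extends ProcessFunction<Envelope, Envelope> {
    // One side-output tag per message type known at connection time.
    static final OutputTag<Envelope> ORDERS = new OutputTag<Envelope>("orders") {};
    static final OutputTag<Envelope> USERS = new OutputTag<Envelope>("users") {};

    @Override
    public void processElement(Envelope value, Context ctx, Collector<Envelope> out) {
        switch (value.type) {
            case "order": ctx.output(ORDERS, value); break;
            case "user":  ctx.output(USERS, value);  break;
            default:      out.collect(value);  // unknown types stay on the main output
        }
    }
}

Each getSideOutput(...) stream can then be mapped to the schema known for that
message type.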
Hi everyone,
I want to connect to a proprietary data stream which sends different types
of messages (maybe interpreted as tables), intertwined in the stream.
Every type of message (or table) can have a different schema, but for each
type this schema is known when connecting (i.e., at runtime) an
ll simply go nowhere.
Then you call "DataStreamUtils.collect(datastream);", which internally
calls "execute" again.
In short: remove the first call to "env.execute()"; that should do the
trick.
Stephan
On Thu, May 26, 2016 at 5:09 PM, Ahmed Nader wrote:
> Hello,
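The corrected pattern Stephan describes, sketched (the pipeline is a stand-in;
note the DataStreamUtils package has moved between Flink versions, e.g. it was
org.apache.flink.contrib.streaming in older releases):

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamUtils;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

import java.util.Iterator;

public class CollectExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> transformed =
                env.fromElements("a", "b", "c")  // stand-in for the custom source
                   .map(String::toUpperCase)
                   .returns(Types.STRING);

        // No env.execute() before this point: collect() triggers execution itself.
        Iterator<String> results = DataStreamUtils.collect(transformed);
        while (results.hasNext()) {
            System.out.println(results.next());
        }
    }
}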
Hello,
I have defined a custom source function for an infinite stream source,
where in my overridden run method I have a while(true) loop to keep
listening for the input. I want to apply some transformations on the
resulting datastream from my source and collect the output so far of these
Hi!
I have the class; I want to create objects using the values extracted from
the JSON text, my fault haha.
Sorry for that.
2016-05-15 19:10 GMT+02:00 Gábor Horváth:
> Hi!
>
> On 14 May 2016 at 23:47, iñaki williams wrote:
>
>> Hi Flink Community!
>>
>> I am new to Apache Flink and I have a
Hi Flink Community!
I am new to Apache Flink and I have a problem reading a JSON.
I am using a JSON from a webpage; this JSON changes continuously, so I
decided to use my own SourceFunction in order to grab the JSON using a URL
that is always the same: http://www.example/blabla/this.json
I
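A sketch of that pattern, assuming Jackson for parsing (the Quote POJO and its
fields are illustrative; only the URL comes from the mail):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

import java.net.URL;

// Hypothetical POJO mirroring the JSON document's fields.
class Quote {
    public String symbol;
    public double price;
}

// Polls the fixed URL and emits one freshly parsed object per poll.
class JsonUrlSource implements SourceFunction<Quote> {
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<Quote> ctx) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        URL url = new URL("http://www.example/blabla/this.json");
        while (running) {
            Quote quote = mapper.readValue(url, Quote.class);  // fetch and parse in one call
            synchronized (ctx.getCheckpointLock()) {
                ctx.collect(quote);
            }
            Thread.sleep(1000);  // poll interval; tune to how often the JSON changes
        }
    }

    @Override
    public void cancel() { running = false; }
}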