Hi, Yuxin Tan,
Thank you very much. My problem has been resolved.
Best,
Zbz
Hi Zhao,
IIRC, the error occurred because you ran the tests before packaging, so while
running in flink-clients/pom.xml, Maven cannot find the packages.
Based on your command, it seems that you want to run the tests.
Have you tried running "mvn clean package" or "mv
192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
```
The command executed is:
```
./mvnw clean test -e -DforkCount=2 -DreuseForks=true
-Dmaven.test.failure.ignore=true -pl '!flink-docs' -Dflink.forkCount=4
-Dflink.forkCountTestPackage=4 -B -nsu -Dskip.npm -Dscala-2.12
-Ddeprecated.check.skip=t
Hello Experts,
I have developed a couple of modules where one of the modules is getting
data from Kafka, applying the process window function, and getting the
required form of data. However, I went through the official documentation,
and there is no approach described for implementing the unit test
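For what it's worth, one workable approach is sketched below (my own illustration, not from the docs; MyProcessWindowFunction and the Tuple2 schema are assumed names): run the function in a small bounded pipeline, since a bounded source emits a final watermark that fires event-time windows, and collect the output with executeAndCollect():
```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class MyProcessWindowFunctionTest {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        List<String> results = new ArrayList<>();
        env.fromElements(Tuple2.of("key", 1000L), Tuple2.of("key", 2000L))
            .assignTimestampsAndWatermarks(
                WatermarkStrategy.<Tuple2<String, Long>>forMonotonousTimestamps()
                    .withTimestampAssigner((event, ts) -> event.f1))
            .keyBy(event -> event.f0)
            .window(TumblingEventTimeWindows.of(Time.seconds(5)))
            .process(new MyProcessWindowFunction()) // assumed name; assumed to emit Strings
            .executeAndCollect()
            .forEachRemaining(results::add);

        // assert on `results` here
    }
}
```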
Community,
please forgive me for this message. This is a test, because all day, my
replies to my other user thread have been rejected by the email server.
Sincere apologies
Krzysztof
Hi Matt,
I've checked, and your job is pretty simple. I've CC'ed Xingbo, who is a PyFlink
expert, to help take a quick look.
Best,
Leonard
> On Oct 31, 2022, at 11:47 AM, Matt Fysh wrote:
>
> Hi there,
>
> I am running a local test with:
> * source = env.from_collection
> * sink = datastrea
Hi there,
I am running a local test with:
* source = env.from_collection
* sink = datastream.execute_and_collect
with a map function between, and two very small data points in the
collection
I'm able to generate an OutOfMemoryError, and due to the nature of this
test using simple source and sink
When I run mvn clean install, it runs the Flink test cases.
However, I get this error:
[ERROR] Failures:
[ERROR]
KubernetesClusterDescriptorTest.testDeployApplicationClusterWithNonLocalSchema:155
Previous method call should have failed but it returned
Hi,
you don't have to do that. Next time you can try "Invalidate Caches..."
under the File menu in IntelliJ IDEA.
Best regards,
Jing
On Wed, Jul 13, 2022 at 7:21 PM Min Tu via user
wrote:
> Thanks a lot !! I have removed the .idea folder and the unit test works.
>
> On Mon
Thanks a lot !! I have removed the .idea folder and the unit test works.
On Mon, Jul 11, 2022 at 2:44 PM Alexander Fedulov
wrote:
> Hi Min Tu,
>
> try clean install to make sure the build starts from scratch. Refresh
> maven modules in IntelliJ after the build. If that doesn
>
>> Hi,
>>
>> I have downloaded the flink code and the build works fine with following
>> command
>>
>> mvnw install -DskipTests -Dcheckstyle.skip
>>
>> Then I try to run the unit test code in IntelliJ, but got following error:
>>
>>
Hi,
I have downloaded the Flink code and the build works fine with the following
command
mvnw install -DskipTests -Dcheckstyle.skip
Then I try to run the unit test code in IntelliJ, but got the following error:
/Users/mintu/ApacheProjects/flink/flink-scala/src/test/scala/org/apache/flink/api/scala
I register my job parameters as Flink global parameters, and I need to get those parameters in the UDF's open method like:
I know the DataStream API has test harnesses for testing user-defined functions, as shown in the docs. So I wonder: is there a similar way in the Table/SQL API?
Hi,
What kind of parameters do you want to get, Flink global job parameters, or
some other parameters?
Best,
Zhanghao Chen
From: zhouhaifengmath
Sent: Thursday, May 12, 2022 14:33
To: user@flink.apache.org
Subject: How to test Flink SQL UDF with open method
Hi,
I am trying to test a Flink SQL UDF which has an open method to get some parameters with Flink 1.14, but I can't find an example of setting those parameters in a test. Can someone give me an example? Thanks for your help~
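For reference, a rough sketch of the usual approach (my own illustration with assumed names, not from this thread; assumes a Flink version where TableConfig#addJobParameter is available, as in 1.14): the UDF reads the parameter via FunctionContext#getJobParameter in open(), and the test supplies it as a global job parameter:
```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.FunctionContext;
import org.apache.flink.table.functions.ScalarFunction;

public class SuffixUdfExample {

    // hypothetical UDF that reads a global job parameter in open()
    public static class SuffixUdf extends ScalarFunction {
        private String suffix;

        @Override
        public void open(FunctionContext context) throws Exception {
            suffix = context.getJobParameter("my.suffix", "-default");
        }

        public String eval(String s) {
            return s + suffix;
        }
    }

    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
            EnvironmentSettings.newInstance().inStreamingMode().build());

        // make the parameter visible to FunctionContext#getJobParameter
        tEnv.getConfig().addJobParameter("my.suffix", "-test");

        tEnv.createTemporarySystemFunction("SUFFIX", SuffixUdf.class);
        tEnv.executeSql("SELECT SUFFIX('a')").print(); // expect "a-test"
    }
}
```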
Thanks & Regards,
Samir Vasani
Thank you for the information!
From: Farouk
Sent: Thursday, April 21, 2022 1:14:00 AM
To: Aeden Jameson
Cc: Alexey Trenikhun ; Flink User Mail List
Subject: Re: Integration Test for Kafka Streaming job
Hi
I would recommend using kafka-junit5 from Salesforce.
Hello,
We have a Flink job that reads data from multiple Kafka topics, transforms the
data, and writes to output Kafka topics. We want to write an integration test for
it. I've looked at KafkaTableITCase; we can do a similar setup of Kafka topics and
prepopulate data, but since in our case it is an endless stream, we
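One way to make such a test terminate by itself (a sketch under assumptions: it requires the newer KafkaSource, Flink 1.14+; the broker and topic names are placeholders): declare the source bounded so the job finishes once it has consumed up to the offsets that existed at startup:
```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;

public class BoundedKafkaSourceExample {
    // placeholders: adjust bootstrap servers and topic to the test setup
    public static KafkaSource<String> boundedSource() {
        return KafkaSource.<String>builder()
            .setBootstrapServers("localhost:9092")
            .setTopics("input-topic")
            .setStartingOffsets(OffsetsInitializer.earliest())
            // bounded mode: the job stops once the prepopulated records are consumed
            .setBounded(OffsetsInitializer.latest())
            .setValueOnlyDeserializer(new SimpleStringSchema())
            .build();
    }
}
```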
Hi all,
I’m using org.apache.flink.statefun.flink.harness.Harness in some unit test
code, where I control the sources so that they are finite.
This is similar to what I found at
https://stackoverflow.com/questions/61939681/is-it-possible-to-write-a-unit-test-which-terminates-using-flink
Thanks for the reply. If I upgrade my legacy Sources to use the new split-based
Sources, is there a better unit test harness for that?
Thanks,
James.
Sent from my iPhone
On 15 Feb 2022, at 13:24, Chesnay Schepler wrote:
I don't think there is anything of the sort for the legacy sources. I would
run the source in a job and either collect the results or verify it within the job.
On 14/02/2022 18:06, James Sandys-Lumsdaine wrote:
Hi all,
I've been using the test harness classes to unit test my stateful 1
and 2 stream functions. But I also have some stateful legacy Source
classes I would like to unit test and can't find any
Hi all,
I've been using the test harness classes to unit test my stateful 1 and 2
stream functions. But I also have some stateful legacy Source classes I would
like to unit test and can't find any documentation or example for that - is
this possible?
Thanks,
James.
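In the absence of a harness for legacy sources, a workable pattern (a sketch under assumptions: MyLegacySource and MyEvent are placeholder names) is to run the source in a real mini pipeline and collect its output:
```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.CloseableIterator;

public class LegacySourceTest {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // MyLegacySource is the stateful SourceFunction under test (assumed name);
        // it must be finite (or cancelled) for the iterator to finish
        try (CloseableIterator<MyEvent> it =
                env.addSource(new MyLegacySource()).executeAndCollect()) {
            while (it.hasNext()) {
                MyEvent event = it.next();
                // assert on each emitted event here
            }
        }
    }
}
```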
lateness. The whole pipeline and input/output are defined outside of that
function.
In the Beam API the test might look like:
words = testStreamOf[String]
.addElementsAtTime("00:00:00", "foo")
.addElementsAtTime("00:00:30", "bar")
.advanceWatermar
the expected result?
(The background here is that I want to write tests for some event time
processing and assert that the SideOutput will be used for late events
and these are being
processed correctly. In production the Kafka source will not always emit
data continuously, hence I want to tes
So I went ahead and put some logging in the WatermarkGenerator.onEvent and
.onPeriodicEmit functions in the test source watermark generator, and I do
see the watermarks come by with values through those functions. They're
just not being returned as expected via the REST API.
On Tue, Nov 30, 2021
Hi!
I see. So to test your watermark strategy you would like to fetch the
watermarks downstream.
I would suggest taking a look at
org.apache.flink.streaming.api.operators.AbstractStreamOperator. This class
has a processWatermark method, which is called when a watermark flows
through
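To make that concrete, here is a rough sketch (my own illustration, not from the thread; names are assumed) of a pass-through operator that overrides processWatermark to record every watermark flowing through it:
```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
import org.apache.flink.streaming.api.watermark.Watermark;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;

// pass-through operator that records the watermarks flowing through it
public class WatermarkCapture<T> extends AbstractStreamOperator<T>
        implements OneInputStreamOperator<T, T> {

    // static so the test can read it after the job finishes (single-JVM tests only)
    public static final List<Long> SEEN_WATERMARKS = new CopyOnWriteArrayList<>();

    @Override
    public void processElement(StreamRecord<T> element) {
        output.collect(element);
    }

    @Override
    public void processWatermark(Watermark mark) throws Exception {
        SEEN_WATERMARKS.add(mark.getTimestamp());
        super.processWatermark(mark);
    }
}
// attach it downstream of the watermark strategy under test, e.g.:
// stream.transform("wm-capture", stream.getType(), new WatermarkCapture<>());
```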
Thanks for the reply, Caizhi!
We're on Flink 1.12.3. In the test, I'm using a custom watermark strategy
derived from BoundedOutOfOrdernessWatermarks that emits watermarks
using processing time after a period of no events, to keep the timer-reliant
operators happy. Basically, it's using
the output watermarks
just for stopping the source. You can control the records and the watermark
strategy from the source. From my point of view, constructing some test
data with some specific row time would be enough.
On Tue, Nov 30, 2021 at 3:34 AM, Jin Yi wrote:
> bump. a more general question is what
Bump. A more general question: what do people do for more end-to-end,
full integration tests to test event-time-based jobs with timers?
On Tue, Nov 23, 2021 at 11:26 AM Jin Yi wrote:
> i am writing an integration test where i execute a streaming flink job
> using faked, "unbou
Dear flink community,
I'm trying to migrate from Flink 1.10 to 1.14. I
replaced FlinkKafkaConsumer011 -> KafkaSource and BucketingSink -> FileSink
without problems. I want to test the whole application pipeline but can't find
any Source/Sink classes suitable to substitute for KafkaSource/FileSink.
Prev
I am writing an integration test where I execute a streaming Flink job
using faked, "unbounded" input, where I want to control when the source
function(s) complete by triggering them once the job's operators' maximum
output watermarks are beyond some job completion watermark that'
ould be useful
>> to run on different CPUs.
>>
>> Hope those help,
>> Austin
>>
>> [1]: https://github.com/knaufk/flink-faker
>> [2]: https://github.com/apache/flink-benchmarks
>>
>> On Fri, Nov 5, 2021 at 5:14 PM Vijay Balakrishnan
>> wrote:
>>
Hi Dian,
I got it.
A few days ago, I also found some test cases implemented with Python here
<https://github.com/apache/flink/tree/master/flink-python/pyflink/table/tests>
in Flink's official repository. I took a look at them and it seems like many
internal functions are used and since
Hi Long,
I agree with Fabian that currently you have to test it with an e2e job.
There are still no such test harnesses for PyFlink jobs.
Regards,
Dian
On Fri, Nov 5, 2021 at 5:22 PM Long Nguyễn
wrote:
> Thanks, Fabian. I'll check it out.
>
> Hope that Dian can also give me so
Hi,
I am a newbie to running a performance benchmark load test of Flink on new
CPUs.
Is there an *existing workload generator* that I can use with Kafka and
then ingest it with the Flink KafkaConnector & test the performance against
various new chips on servers?
Measuring CPU performance etc.,
Thanks, Fabian. I'll check it out.
Hope that Dian can also give me some advice.
Best,
Long
On Fri, Nov 5, 2021 at 3:48 PM Fabian Paul wrote:
> Hi,
>
> Since you want to use Table API you probably can write a more high-level
> test around executing the complete program. A g
Hi,
Since you want to use the Table API you probably can write a more high-level test
around executing the complete program. Good examples are the
pyflink example programs [1].
I also could not find something similar to the testing harness from Java. I
cc'ed Dian, maybe he knows more about
erators>
section
in Flink's docs and it seems that the best way is to use the test harnesses.
However, test harnesses are only available with apps written in Java and I
have not found anything similar for Python yet.
So what is the proper way to unit test stateful operators, i.e. window, in
a P
Hi,
I am using the KeyedOneInputStreamOperatorTestHarness. With that, I can take a
snapshot and then use OperatorSnapshotUtil to write and read it. I am wondering
if I can take a savepoint using the test harness or write the snapshot as a
savepoint in order to use the ExistingSavepoint API
Hi Bin,
You could try the following methods to cover the source/sink tests.
Unit test: To verify whether the behavior of each method in custom source
or sink is expected. You could mock interactions with external storage
(database, IO, etc.) in this part.
Integration test: To test whether the source
be enough.
On Tue, Aug 10, 2021 at 4:22 AM, Xinbin Huang wrote:
> Hi team,
>
> I'm currently implementing a custom source and sink, and I'm trying to
> find a way to test these implementations. The testing section
> <https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/tes
Hi team,
I'm currently implementing a custom source and sink, and I'm trying to find
a way to test these implementations. The testing section
<https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/datastream/testing/#unit-testing-stateful-or-timely-udfs--custom-operat
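As a sketch of the unit-test part of that advice (my own illustration; MySink, StorageClient, and the use of Mockito/JUnit are assumptions, not from the thread): inject a mockable client into the custom sink and verify the interaction with external storage:
```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.junit.Test;

public class MySinkTest {

    // assumed abstraction over the external storage so it can be mocked
    interface StorageClient {
        void write(String record);
    }

    // hypothetical custom sink that delegates writes to the injected client
    static class MySink implements SinkFunction<String> {
        private final StorageClient client;
        MySink(StorageClient client) { this.client = client; }

        @Override
        public void invoke(String value, Context context) {
            client.write(value);
        }
    }

    @Test
    public void writesEachRecordToStorage() throws Exception {
        StorageClient client = mock(StorageClient.class);
        MySink sink = new MySink(client);
        sink.invoke("a", null); // Context is unused by this sink
        verify(client).write("a");
    }
}
```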
ect order. We
> will change internals and your test will stop working in newer Flink
> versions.
> - Harness is meant as a way for Flink devs to perform unit tests of
> operators or parts thereof. A unit test for user-defined function should
> not use any Flink classes (e.g. you really jus
changes with each minor release.
- You need to know internal implementation details, especially with the
threading model, to invoke the harness methods in the correct order. We
will change internals and your test will stop working in newer Flink
versions.
- Harness is meant as a way for Flink devs
MiniClusterResourceConfiguration.Builder()
.setNumberTaskManagers(1)
.setNumberSlotsPerTaskManager(PARALLELISM)
.build());
//
@Test
public void testAsyncFunction() throws Exception
Hi,
I am trying to use RichAsyncFunction with Flink's test harness. My code
looks like below:
final MetadataEnrichment.AsyncFlowLookup fn = new
MetadataEnrichment.AsyncFlowLookup();
final AsyncWaitOperatorFactory> operator = new AsyncWaitOperatorFactory<>(fn
method that
> adds the KeyProcessFunction into the job? Many thanks!
>
> Best,
> Yun
>
> --
> From: Chirag Dewan
> Send Time: 2021 Jun. 9 (Wed.) 15:15
> To: User ; Yun Gao
> Subject: Re: Multiple Exceptions duri
--
From: Chirag Dewan
Send Time: 2021 Jun. 9 (Wed.) 15:15
To: User ; Yun Gao
Subject: Re: Multiple Exceptions during Load Test in State Access APIs with
RocksDB
Thanks for the reply Yun.
The key is an Integer type. Do you think there can be hash collisions for
Integers
think you might first check the key type used and
the key type should have a stable hashCode method.
Best,
Yun
------ Original Mail ------
Sender: Chirag Dewan
Send Date: Tue Jun 8 18:06:07 2021
Recipients: User , Yun Gao
Subject: Re: Multiple Exceptions during Load Test
here?
Best,
Yun
[1] https://issues.apache.org/jira/browse/FLINK-18587
------ Original Mail ------
Sender: Chirag Dewan
Send Date: Sat Jun 5 20:29:37 2021
Recipients: User
Subject: Multiple Exceptions during Load Test in State
Access APIs with RocksDB
Hi,
I am getting
Thanks Yangze.
Nexmark is useful to me.
Best regards
> On Jun 6, 2021, at 8:08 PM, Yangze Guo wrote:
>
> Hi, Luck,
>
> I may not fully understand your requirements. If you just want to test
> the performance of typical streaming jobs with the Flink, you can
> refer to t
Hi, Luck,
I may not fully understand your requirements. If you just want to test
the performance of typical streaming jobs with Flink, you can
refer to Nexmark [1]. If you just care about the performance
regression of your specific production jobs, I don't know of
such a framework
Hi flink community,
Is there any test framework that we can use to test Flink job performance?
We would like to automate the process for regression tests during Flink version
upgrades and job performance tests when rolling out new changes to prod.
Any suggestions would be appreciated!
Thank you
Subject: Multiple Exceptions during Load Test in State Access APIs with RocksDB
Hi,
I am getting multiple exceptions while trying to use RocksDB as a state backend.
I have 2 Task Managers with 2 task slots and 4 cores each.
Below is our setup:
Kafka(Topic with 2 partitions) ---> FlinkKafkaConsu
close() throws Exception {
valueState.clear();
}
}
While doing a load test, I get a NullPointerException in valueState.value(),
which seems strange as we would have updated the value state above.
Another strange thing is that this is observed only in load conditions and
works fine otherwise.
We al
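For what it's worth, ValueState#value() returns null whenever nothing has been stored yet for the current key, so state access code usually guards against that. A minimal sketch (names assumed):
```java
import org.apache.flink.api.common.state.ValueState;

public class StateGuardSketch {
    // hedged sketch: ValueState#value() returns null for a key that has
    // no stored value yet, so fall back to an initial value first
    static void increment(ValueState<Long> valueState) throws Exception {
        Long current = valueState.value();
        if (current == null) {
            current = 0L; // first event seen for this key
        }
        valueState.update(current + 1);
    }
}
```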
!
For future reference, the snippet below allows you to create a SQL table with
a nested field and a watermark, filled with hard-coded values, which
is all I need in order to test SQL expressions.
It's quite a mouthful though, is there a more succinct way to express the
same thing?
var testData = List
I found an answer to my own question!
For future reference, the snippet below allows you to create a SQL table with a
nested field and a watermark, filled with hard-coded values, which is all I
need in order to test SQL expressions.
It's quite a mouthful though, is there a more succinct way
I'm trying to write a Java unit test for a Flink SQL application using the Flink mini
cluster, but I have not managed to create an input table with nested fields and
time characteristics.
I had a look at the documentation and examples below, although I'm still
struggling:
https://ci.apache.org/projects
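For reference, a sketch of one way to build such an input table (my own illustration, assuming Flink 1.13+ where fromDataStream accepts a Schema; field names and values are made up):
```java
import java.time.Instant;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class NestedInputTableExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // hard-coded rows: a nested payload plus an event-time instant
        DataStream<Row> rows = env
            .fromElements(
                Row.of(Row.of("foo", 42), Instant.parse("2021-05-01T00:00:00Z")),
                Row.of(Row.of("bar", 7), Instant.parse("2021-05-01T00:00:30Z")))
            .returns(Types.ROW_NAMED(
                new String[] {"payload", "ts"},
                Types.ROW_NAMED(new String[] {"name", "score"}, Types.STRING, Types.INT),
                Types.INSTANT));

        // declare the event-time column and the watermark on conversion
        Table input = tEnv.fromDataStream(
            rows,
            Schema.newBuilder()
                .columnByExpression("rowtime", "CAST(ts AS TIMESTAMP_LTZ(3))")
                .watermark("rowtime", "rowtime - INTERVAL '5' SECOND")
                .build());

        tEnv.createTemporaryView("input_table", input);
        tEnv.sqlQuery("SELECT payload.name, rowtime FROM input_table").execute().print();
    }
}
```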
>> Regards,
>> Dian
>>
>> On Mar 23, 2021, at 2:39 PM, Yik San Chan wrote:
>>
>> Hi Dian,
>>
>> However users do want to unit test their UDFs, as supported in
>> https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/testing.html#testing-user-define
On Mar 23, 2021, at 2:39 PM, Yik San Chan <evan.chanyik...@gmail.com> wrote:
>>
>> Hi Dian,
>>
>> However users do want to unit test their UDFs, as supported in
>> https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/testing.html#testing-user-defined
> Hi Dian,
>
> However users do want to unit test their UDFs, as supported in
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/testing.html#testing-user-defined-functions
>
> Even though the examples are for Flink, I believe PyFlink should ideally
> be no differ
As I replied in the previous email, it doesn't block users from writing tests for
PyFlink UDFs. Users can use ._func to access the original Python function if
they want.
Regards,
Dian
> On Mar 23, 2021, at 2:39 PM, Yik San Chan wrote:
>
> Hi Dian,
>
> However users do want to unit test their UDF
Hi Dian,
However users do want to unit test their UDFs, as supported in
https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/testing.html#testing-user-defined-functions
Even though the examples are for Flink, I believe PyFlink should ideally be
no different.
What do you think
add some
> public-facing method to make it more intuitive for use in unit test?
> What do you think?
>
> Best,
> Yik San
>
> On Tue, Mar 23, 2021 at 2:02 PM Yik San Chan
> wrote:
>
>> Hi Dian,
>>
>> Thanks! It solves my problem.
>>
>> Bes
Hi Dian,
The ._func method seems to be internal only. Maybe we can add some
public-facing method to make it more intuitive for use in unit tests?
What do you think?
Best,
Yik San
On Tue, Mar 23, 2021 at 2:02 PM Yik San Chan
wrote:
> Hi Dian,
>
> Thanks! It solves my problem.
>
&
nc returns the original Python function.
Regards,
Dian
On Tue, Mar 23, 2021 at 10:56 AM Yik San Chan
wrote:
> (This question is cross-posted on StackOverflow
> https://stackoverflow.com/questions/66756612/failed-to-unit-test-pyflink-udf
> )
>
> I am using PyFlink and I want to
(This question is cross-posted on StackOverflow
https://stackoverflow.com/questions/66756612/failed-to-unit-test-pyflink-udf
)
I am using PyFlink and I want to unit test my UDF written in Python.
To test the simple udf below:
```python
# tasks/helloworld/udf.py
from pyflink.table import DataTypes
from pyflink.table.udf import udf
Hi Smile,
Thanks for your clarification, it helped.
Thanks,
Vijay
> On Feb 28, 2021, at 7:06 PM, Smile wrote:
>
> Hi Vijay,
>
> Since version 1.7 Flink builds with Scala version 2.11 (default) and 2.12.
> Flink has APIs, libraries, and runtime modules written in Scala. Users of
> the Scala
Hi Vijay,
Since version 1.7 Flink builds with Scala version 2.11 (default) and 2.12.
Flink has APIs, libraries, and runtime modules written in Scala. Users of
the Scala API and libraries may have to match the Scala version of Flink
with the Scala version of their projects (because Scala is not
Hi Team,
While running a Java Flink project locally, I am facing the following issue: *Could
not create actor system; Caused by: java.lang.NoSuchMethodError:
scala.Product.$init$(Lscala/Product;)V*
Could you suggest: does a Flink Java project need Scala at run time? What
versions might be
Thank you! I had scala-library 2.12.8 in my dependency tree (Probably a
remnant from when I was testing with Scala 2.12.8).
I did the following to fix this issue.
Removed scala-library 2.12.8 from my dependency tree and added the below
dependency.
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
Could you check your dependency tree for the version of scala-library?
On 2/24/2021 7:28 AM, soumoks wrote:
Thank you for the response but this error continues to happen with Scala
2.12.7.
The app itself continues to compile without errors but the test cases fail
with the same error.
Seems
Thank you for the response but this error continues to happen with Scala
2.12.7.
The app itself continues to compile without errors but the test cases fail
with the same error.
Seems to be related to
https://issues.apache.org/jira/browse/FLINK-12461
I have set the Scala version in the pom.xml file
mpiles and the test cases pass as well.
Is this a known compatibility issue between Flink 1.11.2 and Scala
2.12.12?
Thanks,
Sourabh
/narrow down the issue, I upgraded the Flink
package from 1.9.1 to 1.11.2 while keeping the Scala version the same,
i.e. 2.11.12 instead of 2.12.12, and this seems to have resolved the issue.
The app compiles and the test cases pass as well.
Is this a known compatibility issue between Flink 1.11.2
believe it's a local issue.
>>
>> Are you sure that the build is failing when you build Flink from the root
>> directory (not calling maven from within a maven module?)
>>
>> On Tue, Jan 19, 2021 at 11:19 AM Smile@LETTers
>> wrote:
>>
>>>
The same piece of code runs fine in main, but exits immediately when run in a Test
StreamExecutionEnvironment bsEnv =
StreamExecutionEnvironment.getExecutionEnvironment();
EnvironmentSettings bsSettings =
EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
StreamTableEnvironment bsTableEnv
Hi,
I got an error when I tried to compile & package Flink (version 1.12 & current
master).
It can be reproduced by running 'mvn clean test' under
flink-end-to-end-tests/flink-end-to-end-tests-common-kafka.
It seems that a necessary dependency for test scope was missing and some
cla
This may help you out.
https://github.com/apache/flink/blob/master/flink-libraries/flink-cep/src/test/java/org/apache/flink/cep/CEPITCase.java
On Sun, Jan 17, 2021 at 10:32 AM narasimha wrote:
>
> Hi,
>
> I'm using Flink CEP, but couldn't find any examples for writin
Hi,
I'm using Flink CEP, but couldn't find any examples of writing test cases
for streams with CEP.
Can someone help with how to write test cases for streams with CEP applied to
them?
--
A.Narasimha Swamy
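A minimal self-contained sketch of a CEP test (my own illustration, assuming a bounded input so the collected iterator terminates; the pattern and values are made up):
```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CepSmokeTest {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // bounded input, so executeAndCollect() terminates
        DataStream<String> input = env.fromElements("a", "b", "c", "a", "b");

        // made-up pattern: "a" directly followed by "b"
        Pattern<String, String> pattern = Pattern
            .<String>begin("start")
            .where(new SimpleCondition<String>() {
                @Override
                public boolean filter(String s) { return s.equals("a"); }
            })
            .next("middle")
            .where(new SimpleCondition<String>() {
                @Override
                public boolean filter(String s) { return s.equals("b"); }
            });

        DataStream<String> matches = CEP.pattern(input, pattern)
            .select(new PatternSelectFunction<String, String>() {
                @Override
                public String select(Map<String, List<String>> match) {
                    return match.get("start").get(0) + match.get("middle").get(0);
                }
            });

        List<String> results = new ArrayList<>();
        matches.executeAndCollect().forEachRemaining(results::add);
        System.out.println(results); // expect ["ab", "ab"]
    }
}
```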
stom` and provide a partitioner if you want to
>> control explicitly how elements are distributed to downstream tasks.
>>
>>
>>
>> *From: *Martin Frank Hansen
>> *Reply-To: *"m...@berlingskemedia.dk"
>> *Date: *Thursday, January 14, 2021 at 1:48 AM