Hello Flink dev team,
This question might have been answered already, but I want to ask again
now that 1.18 has been officially released.
What are the major differences between Flink 1.17 and 1.18? Is 1.18 faster
for stream processing?
Thank you very much.
Best,
Amir
Does the 1.18 version support the Kafka connector?
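(For context, not from this thread: as of recent releases the Kafka connector is versioned and published separately from Flink itself. A dependency sketch follows; the exact version string is an assumption and should be checked against the connector's own release page.)

```xml
<!-- Hypothetical coordinates; verify the exact connector version
     that matches your Flink release before using. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>3.0.1-1.18</version>
</dependency>
```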
On Wed, Oct 18, 2023 at 8:26 PM Jing Ge wrote:
> Hi everyone,
>
> Please review and vote on the release candidate #3 for the version
> 1.18.0, as follows:
> [ ] +1, Approve the release
> [ ] -1, Do not approve the release (please provide
Hi Flink Dev Team,
I am trying to create a Docker image. Before asking my question, let me
explain my application. My main (command) jar file:
1) has dependencies on other jar files (they are all in the same directory);
2) needs to read some arguments from a config file;
3)
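(A minimal Dockerfile sketch for the setup described above. The file names `app.jar`, `lib/`, and `app-config.yaml` are assumptions standing in for the actual jar, dependency directory, and config file; the `flink:1.18` base image is the official one from Docker Hub.)

```dockerfile
# Sketch only: adjust names/paths to the real application layout.
FROM flink:1.18

# Copy the job jar and its sibling dependency jars so they are
# picked up on the user classpath.
COPY app.jar /opt/flink/usrlib/app.jar
COPY lib/ /opt/flink/usrlib/lib/

# Ship the config file inside the image so the job can read it
# from a known path passed as a program argument at submit time.
COPY app-config.yaml /opt/flink/conf/app-config.yaml
```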
08484cab78c8bd7b74a287edf7d1f3b01/flink-state-backends/flink-statebackend-rocksdb/src/main/java/org/apache/flink/contrib/streaming/state/RocksDBListState.java#L131
>
> Best
> Yun Tang
>
> From: Amir Hossein Sharifzadeh
> Sent: Tuesday, May 23, 202
Dear Flink Dev team,
It has been a while since I started dealing with an issue that I can't
figure out myself. I spent quite a lot of time trying to solve the problem,
but I feel stuck.
I explained the problem statement and the issue here:
Thanks. If you look at the code, I am defining/creating the table as:
create_kafka_source_ddl = """
CREATE TABLE payment_msg(
    createTime VARCHAR,
    orderId BIGINT,
    payAmount DOUBLE,
    payPlatform INT,
    provinceId INT
) WITH (
tend.java:1168)
Caused by: java.lang.RuntimeException: Python process exits with code: 1
at
org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:130)
... 13 more
Thanks a lot,
Best,
Amir
On Tue, Feb 7, 2023 at 10:48 AM Amir Hossein Sharifzadeh <
amirsharifza...@gmail.com>
> you can post the full error message .
>
> Best regards,
> Yuxia
>
> --
> *From: *"Amir Hossein Sharifzadeh"
> *To: *"yuxia"
> *Cc: *"dev"
> *Sent: *Tuesday, February 7, 2023, 10:39:25 AM
> *Subject: *Re: Need help how to use Table API
tely and then insert them into those tables. The data will
> be consumed from what we implemented here.
>
> Best regards,
> Yuxia
>
> --
> *From: *"Amir Hossein Sharifzadeh"
> *To: *luoyu...@alumni.sjtu.edu.cn
> *Sent: *Sunday, 20
nightlies.apache.org/flink/flink-docs-master/docs/dev/table/tableapi/#joins
>
> Best regards,
> Yuxia
>
> - Original Message -
> From: "Amir Hossein Sharifzadeh"
> To: "dev"
> Sent: Friday, February 3, 2023, 4:45:08 AM
> Subject: Need help how to use Table API to
Hello,
I have a Kafka producer and a Kafka consumer that produce and consume
multiple data sets, respectively. You can think of two data sets here: both
have a similar structure but carry different data.
I want to use the Table API to join the two Kafka streams while I
consume them. For
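(A sketch of what such a join could look like in Flink SQL, in the same style as the `payment_msg` DDL quoted earlier in this thread. The table names `stream_a` and `stream_b` and the shared `orderId` key are assumptions for illustration, not the poster's actual schema.)

```sql
-- Hypothetical: two Kafka-backed tables with a similar structure,
-- joined on a shared key while both streams are consumed.
SELECT
    a.orderId,
    a.payAmount          AS amount_a,
    b.payAmount          AS amount_b
FROM stream_a AS a
JOIN stream_b AS b
    ON a.orderId = b.orderId;
```

Note that a regular join over two unbounded streams keeps state for both sides indefinitely; an interval join with a time bound on the join condition is often preferable for Kafka sources.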
Hi developers:
I am trying to run a sample Flink consumer (
https://nightlies.apache.org/flink/flink-docs-release-1.16/api/python//examples/datastream/connectors.html)
but I get these error messages (I am running the program on a Mac M1 and
have downgraded my Java to JDK 8: java version "1.8.0_351"):