Can anyone help me? How to connect an offline Hadoop cluster and a realtime Hadoop cluster via different Hive catalogs?

2021-08-26 Thread Jim Chen
Hi all, my Flink version is 1.13.1 and my company has two Hadoop clusters: an offline Hadoop cluster and a realtime Hadoop cluster. Now, on the realtime Hadoop cluster, we want to submit a Flink job that connects to the offline Hadoop cluster via a different Hive catalog. I use a different Hive configuration directory in h…
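A common pattern for this (a sketch only; the catalog names and `hive-conf-dir` paths below are placeholders) is to register one Hive catalog per cluster in Flink 1.13, each pointing at that cluster's Hive configuration directory, then switch between them per query:

```sql
-- Each hive-conf-dir must contain the hive-site.xml of the respective
-- cluster (with its core-site.xml/hdfs-site.xml reachable as well).
CREATE CATALOG realtime_hive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/etc/hive/conf-realtime'
);

CREATE CATALOG offline_hive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/etc/hive/conf-offline'
);

-- Switch the active catalog before querying:
USE CATALOG offline_hive;
```

The same registration can be done programmatically via `new HiveCatalog(name, defaultDatabase, hiveConfDir)` and `tableEnv.registerCatalog(...)`.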

Re: Can anyone help me? Flink consumes duplicate Kafka messages after a savepoint restart

2021-08-01 Thread Jim Chen
… -p 64 -s hdfs://ztcluster/flink_realtime_warehouse/checkpoint/UserClickLogAll/savepoint/savepoint-59cf6c-f82cb4317494 -n -c com.datacenter.etl.ods.common.mobile.UserClickLogAll /opt/case/app/realtime/v1.0/batch/buryingpoint/paiping/all/realtime_etl-1.0-SNAPSHOT.jar — Jim Chen wrote on Mon, Aug 2, 2021…

Can anyone help me? Flink consumes duplicate Kafka messages after a savepoint restart

2021-08-01 Thread Jim Chen
Hi all, my Flink job consumes Kafka topic A and writes to Kafka topic B. When I restart my Flink job via a savepoint, topic B gets some duplicate messages. Can anyone help me solve this problem? Thanks! My versions: Flink 1.12.4, Kafka 2.0.1, Java 1.8. Core code: env.enableCheckpointing(30);…
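For context, duplicates on restart are expected with the default at-least-once Kafka sink: on restore, the job replays everything since the savepoint. A sketch (broker address and topic name are placeholders) of switching the Flink 1.12 producer to EXACTLY_ONCE, which uses Kafka transactions so that downstream consumers running with `isolation.level=read_committed` skip the replayed records:

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.setProperty("bootstrap.servers", "broker:9092");
// Must not exceed the broker's transaction.max.timeout.ms (15 min default).
props.setProperty("transaction.timeout.ms", "600000");

FlinkKafkaProducer<String> producer = new FlinkKafkaProducer<>(
    "topic_B",
    (KafkaSerializationSchema<String>) (element, timestamp) ->
        new ProducerRecord<>("topic_B",
            element.getBytes(StandardCharsets.UTF_8)),
    props,
    FlinkKafkaProducer.Semantic.EXACTLY_ONCE); // default is AT_LEAST_ONCE
```

Note that exactly-once output only takes effect end-to-end if every reader of topic B sets `isolation.level=read_committed`.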

HELP!!! Flink 1.10 SQL insert into HBase fails with a validateSchemaAndApplyImplicitCast error

2020-07-15 Thread Jim Chen
Hi, I use Flink 1.10.1 SQL to connect to HBase 1.4.3. When inserting into HBase, it reports an error from validateSchemaAndApplyImplicitCast, meaning that the query schema and the sink schema are inconsistent: the query schema is Row(EXPR$0), which is all expressions, while the sink schema is Row(device_id). I don't k…
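The `EXPR$0` name points at an unaliased expression in the SELECT list: Flink auto-names such columns `EXPR$n`, which then fails to match the sink's column names. A sketch of the usual fix (table and column names below are hypothetical) is to alias every expression to the corresponding sink column:

```sql
-- Without "AS device_id" the expression column is named EXPR$0 and
-- validateSchemaAndApplyImplicitCast rejects the mismatch.
INSERT INTO hbase_sink
SELECT
  CONCAT(user_id, '_', device_type) AS device_id,  -- rowkey column
  ROW(click_count) AS cf                           -- column family as ROW
FROM source_table;
```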

HELP!!! When using Flink 1.10 to define a table with the STRING type, the query result is null

2020-07-05 Thread Jim Chen
Hi everyone! When I use Flink 1.10 to define a table, I want to define the JSON array as the STRING type, but the query result is null when I execute the program. The detailed code is as follows: package com.flink; import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment; import…
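One possible cause (a sketch, not a confirmed diagnosis; table and field names are made up): the Flink JSON format deserializes fields according to their declared types, so a field that is an array in the JSON but declared as STRING in the DDL can come back NULL instead of the raw text. Declaring the actual array structure avoids that:

```sql
CREATE TABLE events (
  user_id STRING,
  -- Declare the real structure of the JSON array rather than STRING;
  -- a type mismatch with the incoming JSON yields NULL values.
  items ARRAY<ROW<id STRING, price DOUBLE>>
) WITH (
  'connector.type' = 'kafka',
  'format.type' = 'json',
  ...
);
```

The individual elements can then be accessed with `items[1].id`, or expanded with an UNNEST/explode-style query.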

Re: How to consume Kafka from the last offset?

2020-03-26 Thread Jim Chen
…> On Thu, Mar 26, 2020 at 08:38, Jim Chen wrote: >> Thanks! >> I made a mistake. I forgot to set auto.offset.reset=false. It's my fault. >> Dominik Wosiński wrote on Wed, Mar 25, 2020 at 6:49 PM: >>> Hi Jim, …

When I use tumbling windows, some records are lost

2020-03-26 Thread Jim Chen
Hi all, when I use tumbling windows, I find some records are lost. My code is as follows: env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime); env.addSource(FlinkKafkaConsumer011…).assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor(Time.minutes(3)) {…
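A frequent cause of "lost" records with event-time tumbling windows is that elements arriving behind the watermark are silently dropped once their window has fired. A sketch (the stream, key selector, and aggregate function names are hypothetical) that keeps late data visible via allowedLateness and a side output:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.util.OutputTag;

// Anonymous subclass so the generic type survives erasure.
OutputTag<Event> lateTag = new OutputTag<Event>("late-events") {};

SingleOutputStreamOperator<Result> windowed = stream
    .keyBy(e -> e.key)
    .window(TumblingEventTimeWindows.of(Time.minutes(1)))
    .allowedLateness(Time.minutes(5))   // re-fire for late updates
    .sideOutputLateData(lateTag)        // capture what would be dropped
    .aggregate(new MyAggregate());      // hypothetical AggregateFunction

// Inspect the records that arrived too late for the window:
DataStream<Event> late = windowed.getSideOutput(lateTag);
```

If the side output is non-empty, the watermark bound (here 3 minutes in the original extractor) is smaller than the actual out-of-orderness of the data.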

Re: How to consume Kafka from the last offset?

2020-03-26 Thread Jim Chen
> Best regards, > Dom. > On Wed, Mar 25, 2020 at 11:19, Jim Chen wrote: >> Hi all, I use flink-connector-kafka-0.11 to consume Kafka 0.11. In the KafkaConsumer params, I set group.id and auto.offset.reset. In Flink 1.10, I set the kaf…

How to consume Kafka from the last offset?

2020-03-25 Thread Jim Chen
Hi all, I use flink-connector-kafka-0.11 to consume Kafka 0.11. In the KafkaConsumer params, I set group.id and auto.offset.reset. In Flink 1.10, I set kafkaConsumer.setStartFromGroupOffsets(); Then I restart the application and find the offset is not from the last position. Does anyone know wh…
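For context, a sketch of the intended setup (broker, group, and topic names are placeholders): with `setStartFromGroupOffsets()`, Flink commits offsets back to Kafka only on checkpoint completion (or via Kafka's auto-commit when checkpointing is disabled), and `auto.offset.reset` only applies when no committed offset exists for the group. So checkpointing must be enabled for a restart to resume from the last position:

```java
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

Properties props = new Properties();
props.setProperty("bootstrap.servers", "broker:9092");
props.setProperty("group.id", "my-group");
// Only consulted when the group has NO committed offset yet.
props.setProperty("auto.offset.reset", "latest");

FlinkKafkaConsumer011<String> consumer =
    new FlinkKafkaConsumer011<>("topic_A", new SimpleStringSchema(), props);
consumer.setStartFromGroupOffsets();           // the default start position
consumer.setCommitOffsetsOnCheckpoints(true);  // the default when checkpointing is on

env.enableCheckpointing(60_000); // offsets are committed on each checkpoint
env.addSource(consumer);
```

Without checkpointing (and with auto-commit off), nothing is ever committed, so every restart falls back to `auto.offset.reset`.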

Re: Flink 1.10 cluster's Received count is zero in the web UI when consuming from Kafka 0.11

2020-03-25 Thread Jim Chen
…chain after the source / before the sink, or query the numRecordsOut metric for the source / numRecordsIn metric for the sink via the web UI metrics tab or REST API. > On 25/03/2020 10:49, Jim Chen wrote: >> Hi all, when I use flink-connector-kafka-0.11 to consume Kafka 0…
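The advice above can be sketched as follows (assuming a simple source-to-sink job; `MyMap` and `sink` are placeholders): a source has no incoming network records, so a fully chained source→sink pipeline reports 0 "Received"; breaking the chain exposes the records flowing across the operator boundary:

```java
// Option 1: disable chaining for the whole job (debugging only; it adds
// serialization overhead between every pair of operators).
env.disableOperatorChaining();

// Option 2: break the chain at a single point so the web UI shows the
// record counts crossing that boundary.
stream
    .map(new MyMap())    // hypothetical MapFunction
    .startNewChain()
    .addSink(sink);
```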

Flink 1.10 cluster's Received count is zero in the web UI when consuming from Kafka 0.11

2020-03-25 Thread Jim Chen
Hi all, when I use flink-connector-kafka-0.11 to consume Kafka 0.11, the cluster web UI's Received Records is always 0. However, the log is not empty. Can anyone help me? [image: image.png]