Flink 1.19 JDBC Driver cannot find the jdbc connector

2024-05-09 Posted by McClone
I put flink-connector-jdbc into flink\lib. When I use the Flink 1.19 JDBC Driver, the jdbc connector cannot be found, but the same query works fine through sql-client.
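One thing worth trying (a hedged sketch, not a confirmed fix): the Flink JDBC driver talks to a SQL Gateway session, and a session does not always see jars that only live in flink/lib, so the connector jar can be registered explicitly with an ADD JAR statement. The gateway address and the jar path below are assumptions for a local setup.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class FlinkJdbcAddJar {
    public static void main(String[] args) throws Exception {
        // The Flink JDBC driver (flink-sql-jdbc-driver) connects to a SQL Gateway;
        // localhost:8083 is an assumption for a default local gateway.
        try (Connection conn = DriverManager.getConnection("jdbc:flink://localhost:8083");
             Statement stmt = conn.createStatement()) {
            // Register the connector jar for this session explicitly
            // (the path and version are assumptions; point this at your actual jar).
            stmt.execute("ADD JAR '/opt/flink/lib/flink-connector-jdbc-3.1.2-1.18.jar'");
            // After ADD JAR, tables using 'connector' = 'jdbc' should be discoverable
            // in this session just as they are in sql-client.
        }
    }
}
```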

How to disable operatorChaining

2021-05-31 Posted by McClone
Flink version: 1.11.2
EnvironmentSettings build = EnvironmentSettings.newInstance().inBatchMode().build();
TableEnvironment tEnv = TableEnvironment.create(build);
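A pure TableEnvironment has no disableOperatorChaining() method, but chaining can be switched off through the configuration. A minimal sketch, assuming Flink 1.11 where the option key is "pipeline.operator-chaining" (I have not verified every planner honors it in that version; later releases use "pipeline.operator-chaining.enabled"):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class DisableChaining {
    public static void main(String[] args) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance().inBatchMode().build();
        TableEnvironment tEnv = TableEnvironment.create(settings);
        // "pipeline.operator-chaining" is the config key in Flink 1.11;
        // setting it to false disables operator chaining for jobs built from this environment.
        tEnv.getConfig().getConfiguration().setBoolean("pipeline.operator-chaining", false);
    }
}
```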

Question about hive streaming

2020-10-16 Posted by McClone
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.disableOperatorChaining();
StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
tEnv.getConfig().addConfiguration(
        new Configuration()
                .set(ExecutionCheckpointingOptions.CHECKPOINTING_INTERVAL, Duration.ofSeconds(30)));
tEnv.executeSql("CREATE TEMPORARY FUNCTION TestFunca AS 'org.example.flink.TestFunca' LANGUAGE JAVA");
tEnv.executeSql("CREATE TABLE datagen (\n" +
        "  name STRING,\n" +
        "  pass STRING,\n" +
        "  type1 INT,\n" +
        "  t1 STRING,\n" +
        "  t2 STRING,\n" +
        "  ts AS localtimestamp,\n" +
        "  WATERMARK FOR ts AS ts\n" +
        ") WITH (\n" +
        "  'connector' = 'datagen',\n" +
        "  'rows-per-second' = '1',\n" +
        "  'fields.type1.min' = '1',\n" +
        "  'fields.type1.max' = '10'\n" +
        ")");

tEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
tEnv.executeSql("CREATE TABLE hive_table (\n" +
        "  user_id STRING,\n" +
        "  order_amount STRING\n" +
        ") PARTITIONED BY (dt STRING, hr STRING) STORED AS parquet TBLPROPERTIES (\n" +
        "  'sink.partition-commit.trigger' = 'partition-time',\n" +
        "  'sink.partition-commit.delay' = '1 h',\n" +
        "  'sink.partition-commit.policy.kind' = 'metastore,success-file'\n" +
        ")");

tEnv.executeSql("insert into hive_table select t1, t2, TestFunca(type1), TestFunca(type1) from datagen");

Caused by: org.apache.flink.table.api.ValidationException: Table options do not contain an option key 'connector' for discovering a connector.
    at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:321)
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:157)
    ... 18 more
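This ValidationException usually means the Hive-dialect CREATE TABLE was executed against Flink's default in-memory catalog, where every table needs a 'connector' option; a Hive table must instead be created inside a registered HiveCatalog. A hedged sketch of that setup (the catalog name and conf dir are assumptions, and "/opt/hive/conf" must contain a hive-site.xml):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogFix {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Register and activate a HiveCatalog before running the Hive-dialect DDL
        // (catalog name "myhive" and the conf dir are illustrative assumptions).
        HiveCatalog hive = new HiveCatalog("myhive", "default", "/opt/hive/conf");
        tEnv.registerCatalog("myhive", hive);
        tEnv.useCatalog("myhive");

        // Only now switch dialects and create hive_table: it lands in the Hive
        // metastore, so no 'connector' option is required and the sink is discoverable.
        tEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
    }
}
```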
Sent from Mail for Windows 10



Question about a hive streaming error

2020-08-24 Posted by McClone
Version: Flink 1.11.0


2020-08-24 13:33:03,019 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Unhandled exception.

java.lang.IllegalAccessError: tried to access class org.apache.flink.streaming.api.functions.sink.filesystem.DefaultBucketFactoryImpl from class org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder
    at org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder.<init>(HadoopPathBasedBulkFormatBuilder.java:70) ~[?:?]
    at org.apache.flink.connectors.hive.HiveTableSink.consumeDataStream(HiveTableSink.java:197) ~[?:?]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:114) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:48) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:58) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlan(StreamExecLegacySink.scala:48) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:67) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:66) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.0.jar:1.11.0]