Re: Flink 1.15: InvalidProgramException: Table program cannot be compiled. This is a bug

2022-06-09 Thread Shengkai Fang
Hi.

I opened a ticket about upgrading the version [1]. Maybe it is worth a try.

Best,
Shengkai

[1] https://issues.apache.org/jira/browse/FLINK-27995

Benenson, Michael wrote on Friday, June 10, 2022 at 04:51:

> Hi, David
>
>
>
> Hard to tell for sure, but yes, [1] could also indicate some problems with
> Janino.
>
> Both Flink 1.14.3 & 1.15.0 use Janino 3.0.11.
>
> The difference is that Flink 1.14.3 has been compiled & run on JVM 8, but
> Flink 1.15.0 has been compiled & run on JVM 11.
>
>
>
> In the revision history for Janino at [2], one comment for release 3.0.11 is:
>
>- At last, the "jdk-9" implementation works.
>
> Probably we can infer that jdk-11 does not work with Janino 3.0.11, so
> switching to the latest version, Janino 3.1.7
> <https://mvnrepository.com/artifact/org.codehaus.janino/janino/3.1.7>,
> could help.
>
>
>
> Do you know how to make a nightly Flink build with Janino 3.1.7
> <https://mvnrepository.com/artifact/org.codehaus.janino/janino/3.1.7>?
>
>
>
> [1] https://lists.apache.org/thread/9tw165cgpdqz4ron76b1ckmwm9hy4qfd
>
> [2] https://janino-compiler.github.io/janino/changelog.html
>
>
>
>
>
>
>
> *From: *David Anderson 
> *Date: *Thursday, June 9, 2022 at 12:05 PM
> *To: *Benenson, Michael 
> *Cc: *user@flink.apache.org , Deshpande, Omkar <
> omkar_deshpa...@intuit.com>, Waghulde, Suraj 
> *Subject: *Re: Flink 1.15: InvalidProgramException: Table program cannot be
> compiled. This is a bug
>
>
>
> Sorry; I guess I jumped to conclusions and got it wrong. Let's dig in
> deeper.
>
>
>
> This sounds similar to what was reported in this thread [1] where someone
> else ran into problems with Janino after upgrading from 1.14 to 1.15. Could
> this be another instance of the same issue (i.e., Flink ends up using the
> wrong version of Janino)?
>
>
>
> [1] https://lists.apache.org/thread/9tw165cgpdqz4ron76b1ckmwm9hy4qfd
>
>
>
> On Thu, Jun 9, 2022 at 8:18 PM Benenson, Michael <
> mikhail_benen...@intuit.com> wrote:
>
> Hi, David
>
>
>
> I have tried CREATE TABLE … without proc_time, but it does not help; I see
> the same exception.
>
>
> And this is a new issue in Flink 1.15.0; with Flink 1.14.3 it works fine,
> even with both event_time & proc_time in the CREATE TABLE statement.
>
> Now I use
>
> CREATE OR REPLACE TABLE input (
> event_header ROW(topic_name STRING),
> `timestamp` STRING NOT NULL,
> event_time AS TO_TIMESTAMP(fix_instant(`timestamp`),
> 'yyyy-MM-dd''T''HH:mm:ss.SSS''Z'''),
> properties ROW(company_id STRING NOT NULL, scope_area STRING, action STRING)
> NOT NULL,
> WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
> ) WITH (
>'connector' = 'kafka',
>'topic' = 'mb-1644796800-qbo',
>'properties.bootstrap.servers' = 'localhost:9092',
>'format' = 'json',
>'scan.startup.mode' = 'latest-offset',
>'json.ignore-parse-errors' = 'true',
>'json.fail-on-missing-field' = 'false'
> );
>
> And got the same exception
>
>
> java.lang.RuntimeException: Could not instantiate generated class
> 'WatermarkGenerator$0'
>
> at
> org.apache.flink.table.runtime.generated.GeneratedClass.newInstance(GeneratedClass.java:74)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
> …
> Caused by: org.apache.flink.util.FlinkRuntimeException:
> org.apache.flink.api.common.InvalidProgramException: Table program cannot
> be compiled. This is a bug. Please file an issue.
>
> at
> org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:94)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
>
> …
>
> Caused by: org.codehaus.commons.compiler.CompileException: Line 30, Column
> 75: Cannot determine simple type name "org"
>
> at
> org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:12211)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
>
>
>
> *From: *David Anderson 
> *Date: *Thursday, June 9, 2022 at 5:34 AM
> *To: *Benenson, Michael 
> *Cc: *user@flink.apache.org , Deshpande, Omkar <
> omkar_deshpa...@intuit.com>, Waghulde, Suraj 
> *Subject: *Re: Flink 1.15: InvalidProgramException: Table program cannot be
> compiled. This is a bug
>
>
>
> A Table can have at most one time attribute. In your Table the proc_time
> column is a processing time attribute, and when you define a watermark on

Re: Flink 1.15: InvalidProgramException: Table program cannot be compiled. This is a bug

2022-06-09 Thread David Anderson
Sorry; I guess I jumped to conclusions and got it wrong. Let's dig in
deeper.

This sounds similar to what was reported in this thread [1] where someone
else ran into problems with Janino after upgrading from 1.14 to 1.15. Could
this be another instance of the same issue (i.e., Flink ends up using the
wrong version of Janino)?

[1] https://lists.apache.org/thread/9tw165cgpdqz4ron76b1ckmwm9hy4qfd
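One quick way to verify which Janino build actually gets loaded is a small reflection probe like the sketch below (a diagnostic added here for illustration; the class names are standard Janino ones, everything else is hypothetical):

// Hedged diagnostic sketch: print which jar each Janino class is loaded from,
// to spot a mismatched or duplicate Janino version on the classpath.
public class JaninoVersionCheck {
    public static void main(String[] args) throws Exception {
        String[] classes = {
            "org.codehaus.janino.UnitCompiler",
            "org.codehaus.commons.compiler.CompileException"
        };
        for (String name : classes) {
            Class<?> clazz = Class.forName(name);
            java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
            // A null CodeSource means the class came from the bootstrap loader.
            System.out.println(name + " -> " + (src == null ? "unknown" : src.getLocation()));
        }
    }
}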

On Thu, Jun 9, 2022 at 8:18 PM Benenson, Michael <
mikhail_benen...@intuit.com> wrote:

> Hi, David
>
>
>
> I have tried CREATE TABLE … without proc_time, but it does not help; I see
> the same exception.
>
>
> And this is a new issue in Flink 1.15.0; with Flink 1.14.3 it works fine,
> even with both event_time & proc_time in the CREATE TABLE statement.
>
> Now I use
>
> CREATE OR REPLACE TABLE input (
> event_header ROW(topic_name STRING),
> `timestamp` STRING NOT NULL,
> event_time AS TO_TIMESTAMP(fix_instant(`timestamp`),
> 'yyyy-MM-dd''T''HH:mm:ss.SSS''Z'''),
> properties ROW(company_id STRING NOT NULL, scope_area STRING, action STRING)
> NOT NULL,
> WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
> ) WITH (
>'connector' = 'kafka',
>'topic' = 'mb-1644796800-qbo',
>'properties.bootstrap.servers' = 'localhost:9092',
>'format' = 'json',
>'scan.startup.mode' = 'latest-offset',
>'json.ignore-parse-errors' = 'true',
>'json.fail-on-missing-field' = 'false'
> );
>
> And got the same exception
>
>
> java.lang.RuntimeException: Could not instantiate generated class
> 'WatermarkGenerator$0'
>
> at
> org.apache.flink.table.runtime.generated.GeneratedClass.newInstance(GeneratedClass.java:74)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
> …
> Caused by: org.apache.flink.util.FlinkRuntimeException:
> org.apache.flink.api.common.InvalidProgramException: Table program cannot
> be compiled. This is a bug. Please file an issue.
>
> at
> org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:94)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
>
> …
>
> Caused by: org.codehaus.commons.compiler.CompileException: Line 30, Column
> 75: Cannot determine simple type name "org"
>
>     at
> org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:12211)
> ~[flink-table-runtime-1.15.0.jar:1.15.0]
>
>
>
> *From: *David Anderson 
> *Date: *Thursday, June 9, 2022 at 5:34 AM
> *To: *Benenson, Michael 
> *Cc: *user@flink.apache.org , Deshpande, Omkar <
> omkar_deshpa...@intuit.com>, Waghulde, Suraj 
> *Subject: *Re: Flink 1.15: InvalidProgramException: Table program cannot be
> compiled. This is a bug
>
>
>
> A Table can have at most one time attribute. In your Table the proc_time
> column is a processing time attribute, and when you define a watermark on
> the event_time column then that column becomes an event-time attribute.
>
>
>
> If you want to combine event time and processing time, you can use
> the PROCTIME() function in your queries without having a processing time
> attribute as one of the columns in the table.
>
>
>
> Best,
>
> David
>
>
>
> On Wed, Jun 8, 2022 at 9:46 PM Benenson, Michael <
> mikhail_benen...@intuit.com> wrote:
>
> Hi, folks
>
>
>
> *Short description*:
>
>
>
> I use the Flink 1.15.0 sql-client and a Java user-defined function in a CREATE
> TABLE … statement to get a Timestamp.
>
> It works OK if I do not use the Timestamp in the Watermark, but if it is used
> in the Watermark it causes
>
> java.lang.RuntimeException: Could not instantiate generated class
> 'WatermarkGenerator$0'
>
> …
>
> Caused by: org.apache.flink.api.common.InvalidProgramException: Table
> program cannot be compiled. This is a bug. Please file an issue.
>
> at org.apache.flink.table.runtime.generated.CompileUtils.doCompile(
> CompileUtils.java:107)
>
> …
>
> Caused by: org.codehaus.commons.compiler.CompileException: Line 30, Column
> 75: Cannot determine simple type name "org"
>
> at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:
> 12211)
>
>
>
>
>
> *Details*:
>
> java -version
>
> openjdk version "11.0.14.1" 2022-02-08 LTS
>
> OpenJDK Runtime Environment Corretto-11.0.14.10.1 (build 11.0.14.1+10-LTS)
>
> OpenJDK 64-Bit Server VM Corretto-11.0.14.10.1 (build 11.0.14.1+10-LTS,
> mixed mode)
>
>
>
> Flink 1.15.0, flink-1.15.0/bin/sql-client

Re: Flink 1.15: InvalidProgramException: Table program cannot be compiled. This is a bug

2022-06-09 Thread David Anderson
A Table can have at most one time attribute. In your Table the proc_time
column is a processing time attribute, and when you define a watermark on
the event_time column then that column becomes an event-time attribute.

If you want to combine event time and processing time, you can use
the PROCTIME() function in your queries without having a processing time
attribute as one of the columns in the table.
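A minimal sketch of this approach, assuming a standard TableEnvironment setup (table and column names follow the CREATE TABLE quoted below; the class name is illustrative):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ProctimeInQuery {
    public static void main(String[] args) {
        // Hedged sketch, not the original job: create a streaming TableEnvironment.
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());
        // ... first execute the CREATE TABLE input (... WATERMARK FOR event_time ...)
        // from this thread, without a proc_time column ...
        // Call PROCTIME() directly in the query instead of declaring a proc_time column.
        tEnv.executeSql(
                "SELECT `timestamp`, event_time, PROCTIME() AS proc_time FROM input LIMIT 10")
            .print();
    }
}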

Best,
David

On Wed, Jun 8, 2022 at 9:46 PM Benenson, Michael <
mikhail_benen...@intuit.com> wrote:

> Hi, folks
>
>
>
> *Short description*:
>
>
>
> I use the Flink 1.15.0 sql-client and a Java user-defined function in a CREATE
> TABLE … statement to get a Timestamp.
>
> It works OK if I do not use the Timestamp in the Watermark, but if it is used
> in the Watermark it causes
>
> java.lang.RuntimeException: Could not instantiate generated class
> 'WatermarkGenerator$0'
>
> …
>
> Caused by: org.apache.flink.api.common.InvalidProgramException: Table
> program cannot be compiled. This is a bug. Please file an issue.
>
> at org.apache.flink.table.runtime.generated.CompileUtils.doCompile(
> CompileUtils.java:107)
>
> …
>
> Caused by: org.codehaus.commons.compiler.CompileException: Line 30, Column
> 75: Cannot determine simple type name "org"
>
> at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:
> 12211)
>
>
>
>
>
> *Details*:
>
> java -version
>
> openjdk version "11.0.14.1" 2022-02-08 LTS
>
> OpenJDK Runtime Environment Corretto-11.0.14.10.1 (build 11.0.14.1+10-LTS)
>
> OpenJDK 64-Bit Server VM Corretto-11.0.14.10.1 (build 11.0.14.1+10-LTS,
> mixed mode)
>
>
>
> Flink 1.15.0, flink-1.15.0/bin/sql-client.sh
>
>
>
> SET 'sql-client.execution.result-mode' = 'tableau';
>
> SET 'table.exec.sink.not-null-enforcer' = 'drop';
>
>
>
> CREATE TEMPORARY FUNCTION default_catalog.default_database.fix_instant
>
> AS 'com.intuit.data.strmprocess.udf.FixInstant' LANGUAGE JAVA;
>
>
>
> CREATE OR REPLACE TABLE input (
>
>   event_header ROW(topic_name STRING),
>
>   `timestamp` STRING NOT NULL,
>
>   event_time AS TO_TIMESTAMP(fix_instant(`timestamp`),
> 'yyyy-MM-dd''T''HH:mm:ss.SSS''Z'''),
>
>   properties ROW(company_id STRING NOT NULL, scope_area STRING,
> action STRING) NOT NULL,
>
>   WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND,
>
>   proc_time AS PROCTIME()
>
> ) WITH (
>
> 'connector' = 'kafka',
>
> 'topic' = 'mb-1644796800-qbo',
>
> 'properties.bootstrap.servers' = 'localhost:9092',
>
> 'format' = 'json',
>
> 'scan.startup.mode' = 'latest-offset',
>
> 'json.ignore-parse-errors' = 'true',
>
> 'json.fail-on-missing-field' = 'false'
>
> );
>
>
>
> SELECT `timestamp`, event_time, event_header.topic_name AS topic,
> properties.company_id as company FROM input
>
>LIMIT 10
>
> ;
>
>
>
> It works fine if I comment out WATERMARK FOR event_time …
> and causes an error if WATERMARK FOR event_time is used:
>
> 2022-06-08 12:16:32
>
> java.lang.RuntimeException: Could not instantiate generated class
> 'WatermarkGenerator$0'
>
> at org.apache.flink.table.runtime.generated.GeneratedClass
> .newInstance(GeneratedClass.java:74)
>
> at org.apache.flink.table.runtime.generated.
> GeneratedWatermarkGeneratorSupplier.createWatermarkGenerator(
> GeneratedWatermarkGeneratorSupplier.java:62)
>
> at org.apache.flink.streaming.api.operators.source.
> ProgressiveTimestampsAndWatermarks.createMainOutput(
> ProgressiveTimestampsAndWatermarks.java:104)
>
> at org.apache.flink.streaming.api.operators.SourceOperator
> .initializeMainOutput(SourceOperator.java:426)
>
> at org.apache.flink.streaming.api.operators.SourceOperator
> .emitNextNotReading(SourceOperator.java:402)
>
> at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(
> SourceOperator.java:387)
>
> at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput
> .emitNext(StreamTaskSourceInput.java:68)
>
> at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor
> .processInput(StreamOneInputProcessor.java:65)
>
> at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(
> StreamTask.java:519)
>
> at org.apache.fli

Re: Table program cannot be compiled. This is a bug. Please file an issue

2021-09-13 Thread Caizhi Weng
Hi!

This is because Java has a maximum method length of 64 KB. For Flink <=
1.13 please set table.generated-code.max-length to less than 65536 (~8192
is preferred) to limit the length of each generated method.

If this doesn't help, we've (hopefully) completely fixed this issue in
Flink 1.14 by creating a brand-new module called the java-code-splitter.
For now you might want to make the length of your SQL shorter and wait for
the release of Flink 1.14.
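For reference, a minimal sketch of setting that option programmatically, assuming a TableEnvironment-based job (the same key should also be settable with a SET statement in the SQL client):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class GeneratedCodeLengthConfig {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inBatchMode().build());
        // Lower the per-method size of generated code (default is much larger),
        // so long expressions are split before hitting the JVM's 64 KB method limit.
        tEnv.getConfig().getConfiguration()
                .setInteger("table.generated-code.max-length", 8192);
        // ... define tables and run the long SQL statement afterwards ...
    }
}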

张颖 wrote on Tuesday, September 14, 2021 at 11:07 AM:

> I wrote a long SQL query, but when I explain my plan, it fails with this error:
>
> org.apache.flink.util.FlinkRuntimeException:
> org.apache.flink.api.common.InvalidProgramException: Table program cannot
> be compiled. This is a bug. Please file an issue.
> at
> org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:76)
> at
> org.apache.flink.table.runtime.generated.GeneratedClass.compile(GeneratedClass.java:77)
> at
> org.apache.flink.table.runtime.generated.GeneratedClass.getClass(GeneratedClass.java:95)
> at
> org.apache.flink.table.runtime.operators.CodeGenOperatorFactory.getStreamOperatorClass(CodeGenOperatorFactory.java:51)
> at
> org.apache.flink.client.program.topology.FlinkStreamTopology.setOperatorParameter(FlinkStreamTopology.java:75)
> at java.util.ArrayList.forEach(ArrayList.java:1257)
> at
> org.apache.flink.client.program.topology.FlinkStreamTopology.setOperatorParameters(FlinkStreamTopology.java:109)
> at
> org.apache.flink.client.program.topology.FlinkStreamTopology.updateStreamGraph(FlinkStreamTopology.java:122)
> at
> org.apache.flink.client.program.topology.FlinkStreamTopology.streamGraphTopoHandler(FlinkStreamTopology.java:115)
> at
> org.apache.flink.client.program.topology.FlinkStreamTopology.getPipelinePlanJson(FlinkStreamTopology.java:167)
> at
> org.apache.flink.client.program.AbstractFlinkTopology.getPlan(AbstractFlinkTopology.java:35)
> at
> org.apache.flink.client.program.PackagedProgramUtils.getPlanBox(PackagedProgramUtils.java:351)
> at
> org.apache.flink.runtime.webmonitor.handlers.JobPlanBoxHandler.lambda$handleRequest$4(JobPlanBoxHandler.java:138)
> at
> java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by:
> org.apache.flink.shaded.guava18.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.flink.api.common.InvalidProgramException: Table program cannot
> be compiled. This is a bug. Please file an issue.
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
> at
> org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:74)
> ... 20 common frames omitted
> Caused by: org.apache.flink.api.common.InvalidProgramException: Table
> program cannot be compiled. This is a bug. Please file an issue.
> at
> org.apache.flink.table.runtime.generated.CompileUtils.doCompile(CompileUtils.java:89)
> at
> org.apache.flink.table.runtime.generated.CompileUtils.lambda$compile$1(CompileUtils.java:74)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
> at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
> ... 23 common frames omitted
> Caused by: org.codehaus.janino.InternalCompilerException: Compiling
> "BatchCalc$21123": Code of method "split$21122$(LBatchCalc$21123;)V" of
> class "BatchCalc$21123" grows beyond 64 KB
> at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:382)
> at org.codehaus.j

Table program cannot be compiled. This is a bug. Please file an issue

2021-09-13 Thread 张颖
I wrote a long SQL query, but when I explain my plan, it fails with this error:


org.apache.flink.util.FlinkRuntimeException: 
org.apache.flink.api.common.InvalidProgramException: Table program cannot be 
compiled. This is a bug. Please file an issue.
at 
org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:76)
at 
org.apache.flink.table.runtime.generated.GeneratedClass.compile(GeneratedClass.java:77)
at 
org.apache.flink.table.runtime.generated.GeneratedClass.getClass(GeneratedClass.java:95)
at 
org.apache.flink.table.runtime.operators.CodeGenOperatorFactory.getStreamOperatorClass(CodeGenOperatorFactory.java:51)
at 
org.apache.flink.client.program.topology.FlinkStreamTopology.setOperatorParameter(FlinkStreamTopology.java:75)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at 
org.apache.flink.client.program.topology.FlinkStreamTopology.setOperatorParameters(FlinkStreamTopology.java:109)
at 
org.apache.flink.client.program.topology.FlinkStreamTopology.updateStreamGraph(FlinkStreamTopology.java:122)
at 
org.apache.flink.client.program.topology.FlinkStreamTopology.streamGraphTopoHandler(FlinkStreamTopology.java:115)
at 
org.apache.flink.client.program.topology.FlinkStreamTopology.getPipelinePlanJson(FlinkStreamTopology.java:167)
at 
org.apache.flink.client.program.AbstractFlinkTopology.getPlan(AbstractFlinkTopology.java:35)
at 
org.apache.flink.client.program.PackagedProgramUtils.getPlanBox(PackagedProgramUtils.java:351)
at 
org.apache.flink.runtime.webmonitor.handlers.JobPlanBoxHandler.lambda$handleRequest$4(JobPlanBoxHandler.java:138)
at 
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: 
org.apache.flink.shaded.guava18.com.google.common.util.concurrent.UncheckedExecutionException:
 org.apache.flink.api.common.InvalidProgramException: Table program cannot be 
compiled. This is a bug. Please file an issue.
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
at 
org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:74)
... 20 common frames omitted
Caused by: org.apache.flink.api.common.InvalidProgramException: Table program 
cannot be compiled. This is a bug. Please file an issue.
at 
org.apache.flink.table.runtime.generated.CompileUtils.doCompile(CompileUtils.java:89)
at 
org.apache.flink.table.runtime.generated.CompileUtils.lambda$compile$1(CompileUtils.java:74)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
at 
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
... 23 common frames omitted
Caused by: org.codehaus.janino.InternalCompilerException: Compiling 
"BatchCalc$21123": Code of method "split$21122$(LBatchCalc$21123;)V" of class 
"BatchCalc$21123" grows beyond 64 KB
at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:382)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:237)
at 
org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:465)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:216)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:207)
at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:75)
at 
org.apache.flink.table.runtime.generated.CompileUtils.doCompile(CompileUtils.java:86)
... 29 common frames omitted
Caused by: org.codehaus.janino.InternalCompilerException: Code of method 
"split$21122$(LBatchCalc$21123;)V" of class "BatchCalc$21123" grows beyond 64 KB
at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:1048)
at org.codehaus.janino.CodeContext.write(CodeContext.java:940)
at org.codehaus.janino.U

Fwd: Flink Table program cannot be compiled when enabling checkpointing of StreamExecutionEnvironment

2020-06-16 Thread 杜斌
-- Forwarded message -
From: 杜斌 
Date: Wednesday, June 17, 2020 at 10:31 AM
Subject: Re: Flink Table program cannot be compiled when enabling checkpointing
of StreamExecutionEnvironment
To: 


add the full stack trace here:


Caused by:
org.apache.flink.shaded.guava18.com.google.common.util.concurrent.UncheckedExecutionException:
org.apache.flink.api.common.InvalidProgramException: Table program cannot
be compiled. This is a bug. Please file an issue.
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
at
org.apache.flink.table.runtime.generated.CompileUtils.compile(CompileUtils.java:66)
... 14 more
Caused by: org.apache.flink.api.common.InvalidProgramException: Table
program cannot be compiled. This is a bug. Please file an issue.
at
org.apache.flink.table.runtime.generated.CompileUtils.doCompile(CompileUtils.java:81)
at
org.apache.flink.table.runtime.generated.CompileUtils.lambda$compile$1(CompileUtils.java:66)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
at
org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
... 17 more
Caused by: org.codehaus.commons.compiler.CompileException: Line 2, Column
46: Cannot determine simple type name "org"
at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:12124)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6746)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6507)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:6520)
at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:6486)
at org.codehaus.janino.UnitCompiler.access$13800(UnitCompiler.java:215)
at
org.codehaus.janino.UnitCompiler$21$1.visitReferenceType(UnitCompiler.java:6394)
at
org.codehaus.janino.UnitCompiler$21$1.visitReferenceType(UnitCompiler.java:6389)
at org.codehaus.janino.Java$ReferenceType.accept(Java.java:3917)
at org.codehaus.janino.UnitCompiler$21.visitType(UnitCompiler.java:6389)
at org.codehaus.janino.UnitCompiler$21.visitType(UnitCompiler.java:6382)
at org.codehaus.janino.Java$ReferenceType.accept(Java.java:3916)
at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:6382)
at org.codehaus.janino.UnitCompiler.access$1300(UnitCompiler.java:215)
at
org.codehaus.janino.UnitCompiler$33.getSuperclass2(UnitCompiler.java:9886)
at org.codehaus.janino.IClass.getSuperclass(IClass.java:455)
at org.codehaus.janino.IClass.getIMethods(IClass.java:260)
at org.codehaus.janino.IClass.getIMethods(IClass.java:237)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:492)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:432)
at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:215)
at
org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:411)
at
org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:406)
at
org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1414)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:406)
at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:378)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:237)
at
org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:465)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:216)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:207)
at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:75)
at
org.apache.flink.table.runtime.generated.CompileUtils.doCompile(CompileUtils.java:78)
... 23 more

杜斌 wrote on Wednesday, June 17, 2020 at 10:29 AM:

> Hi,
> I need help on this issue; here is what Flink reported when I enabled
> checkpointing on the StreamExecutionEnvironment:
>
> /* 1 */
> /* 2 */  public class SourceConversion$1 extends
> org.apache.flink.table.runtime.operator
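For context, the checkpoint setting mentioned above is typically enabled as in the sketch below (assuming the Flink 1.11+ Java bridge API; the class name and interval are illustrative, not from the original job):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class CheckpointEnabledTableJob {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Enabling checkpointing is the step that triggers the compile error reported above.
        env.enableCheckpointing(10_000L); // checkpoint every 10 seconds
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
        // ... register tables and run the Table/SQL program as before ...
    }
}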

Re: Table program cannot be compiled

2019-05-20 Thread Timo Walther

Hi Shahar,

yes, the number of parameters should be the issue for a cannot-compile 
exception. If you moved most of the constants into a member of the 
function, it should actually work.


Do you have a little reproducible example somewhere?

Thanks,
Timo



On 16.05.19 at 19:59, shkob1 wrote:

Hi Timo,

Thanks for the link.
I'm not sure I understand your suggestion though; is the goal here reducing the
number of parameters coming into the UDF? If that's the case I can maybe have
the tag names there, but I still need the expressions to get evaluated before
entering eval. Do you see this in a different way?

I tried moving the tag names to be a member of the function instead of a
parameter, but apparently I still have too many arguments.

Let me know if this is not what you meant.

Thanks!
Shahar



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/





Re: Table program cannot be compiled

2019-05-16 Thread shkob1
Hi Timo,

Thanks for the link.
I'm not sure I understand your suggestion though; is the goal here reducing the
number of parameters coming into the UDF? If that's the case I can maybe have
the tag names there, but I still need the expressions to get evaluated before
entering eval. Do you see this in a different way?

I tried moving the tag names to be a member of the function instead of a
parameter, but apparently I still have too many arguments.

Let me know if this is not what you meant.

Thanks!
Shahar



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/


Re: Table program cannot be compiled

2019-05-16 Thread Timo Walther

Hi,

too many arguments when calling a UDF can currently lead to "grows 
beyond 64 KB" and may also cause the GC exception. This is a known 
issue, covered in https://issues.apache.org/jira/browse/FLINK-8921.


Could you also add the tags to the function itself? Maybe as a static 
map for constant time access outside of the eval method?
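A hedged sketch of that idea (class and tag names are hypothetical, not the actual code from this thread): keep the tag names in a static member so the SQL call passes only the already-evaluated boolean conditions.

import java.util.ArrayList;
import java.util.List;
import org.apache.flink.table.functions.ScalarFunction;

public class TaggedArrayFunction extends ScalarFunction {

    // Tag names live in the function itself, not in the SQL call arguments.
    private static final String[] TAGS = {"Tag1", "Tag2" /* , ... */};

    // eval() receives only the evaluated boolean conditions, in TAGS order,
    // which roughly halves the number of arguments in the generated call.
    public String[] eval(Boolean... conditions) {
        List<String> trueItems = new ArrayList<>();
        for (int i = 0; i < conditions.length && i < TAGS.length; i++) {
            if (Boolean.TRUE.equals(conditions[i])) {
                trueItems.add(TAGS[i]);
            }
        }
        return trueItems.toArray(new String[0]);
    }
}

The SQL call would then pass only the boolean expressions, e.g. taggedArray(boolean_condition1, boolean_condition2, ...), with the tag labels resolved inside the function.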


Regards,
Timo


On 15.05.19 at 17:10, Andrey Zagrebin wrote:

Hi, I am looping in Timo and Dawid to look at the problem.

On Tue, May 14, 2019 at 9:12 PM shkob1 wrote:


BTW, looking at past posts on this issue [1], it should have been
fixed? I'm using version 1.7.2.
Also the recommendation was to use a custom function, though
that's exactly what I'm doing with the conditionalArray function [2].

Thanks!

[1]

http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/DataStreamCalcRule-1802-quot-grows-beyond-64-KB-when-execute-long-sql-td20832.html#a20841

[2]
public class ConditionalArrayFunction extends ScalarFunction {

    public static final String NAME = "conditionalArray";

    public String[] eval(Object... keyValues) {
        if (keyValues.length == 0) {
            return new String[]{};
        }
        final List<Object> keyValuesList = Arrays.asList(keyValues);
        List<String> trueItems = Lists.newArrayList();
        for (int i = 0; i < keyValuesList.size(); i = i + 2){
            final String key = (String)keyValuesList.get(i);
            final Object value = keyValuesList.get(i + 1);

            if (value != null && (boolean)value)
                trueItems.add(key);
        }
        return trueItems.toArray(new String[0]);
    }
}




--
Sent from:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/





Re: Table program cannot be compiled

2019-05-15 Thread Andrey Zagrebin
Hi, I am looping in Timo and Dawid to look at the problem.

On Tue, May 14, 2019 at 9:12 PM shkob1  wrote:

> BTW, looking at past posts on this issue [1], it should have been fixed? I'm
> using version 1.7.2.
> Also the recommendation was to use a custom function, though that's exactly
> what I'm doing with the conditionalArray function [2].
>
> Thanks!
>
> [1]
>
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/DataStreamCalcRule-1802-quot-grows-beyond-64-KB-when-execute-long-sql-td20832.html#a20841
>
> [2]
> public class ConditionalArrayFunction extends ScalarFunction {
>
>     public static final String NAME = "conditionalArray";
>
>     public String[] eval(Object... keyValues) {
>         if (keyValues.length == 0) {
>             return new String[]{};
>         }
>         final List<Object> keyValuesList = Arrays.asList(keyValues);
>         List<String> trueItems = Lists.newArrayList();
>         for (int i = 0; i < keyValuesList.size(); i = i + 2) {
>             final String key = (String) keyValuesList.get(i);
>             final Object value = keyValuesList.get(i + 1);
>
>             if (value != null && (boolean) value)
>                 trueItems.add(key);
>         }
>         return trueItems.toArray(new String[0]);
>     }
> }
>
>
>
>
> --
> Sent from:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
>


Re: Table program cannot be compiled

2019-05-14 Thread shkob1
BTW, looking at past posts on this issue [1], it should have been fixed? I'm
using version 1.7.2.
Also the recommendation was to use a custom function, though that's exactly
what I'm doing with the conditionalArray function [2].

Thanks!

[1]
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/DataStreamCalcRule-1802-quot-grows-beyond-64-KB-when-execute-long-sql-td20832.html#a20841

[2]
public class ConditionalArrayFunction extends ScalarFunction {

    public static final String NAME = "conditionalArray";

    public String[] eval(Object... keyValues) {
        if (keyValues.length == 0) {
            return new String[]{};
        }
        final List<Object> keyValuesList = Arrays.asList(keyValues);
        List<String> trueItems = Lists.newArrayList();
        for (int i = 0; i < keyValuesList.size(); i = i + 2) {
            final String key = (String) keyValuesList.get(i);
            final Object value = keyValuesList.get(i + 1);

            if (value != null && (boolean) value)
                trueItems.add(key);
        }
        return trueItems.toArray(new String[0]);
    }
}




--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/


Re: Table program cannot be compiled

2019-05-14 Thread shkob1
In a subsequent run I get:
Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method
"split$3681$(LDataStreamCalcRule$3682;)V" of class "DataStreamCalcRule$3682"
grows beyond 64 KB
 



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/


Table program cannot be compiled

2019-05-14 Thread shkob1
Hey,

While running a SQL query I get an OutOfMemoryError exception and "Table
program cannot be compiled" [2].
In my scenario I'm trying to enrich an event using an array of tags; each
tag has a boolean classification (like a WHERE clause), and with a custom
function I'm filtering the array to keep only the TRUE results.
While I cannot publish the actual query, its form is as follows:

/SELECT originalEvent, conditionalArray('Tag1', boolean_condition1, 'Tag2',
boolean_condition2).. FROM../
(more info on the use case here [1])

I do have quite a lot of those tags (126 tags now), but I had
something similar in the past without this error.
Any idea about it, and can you think of a workaround to this issue?

[1]
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Reconstruct-object-through-partial-select-query-td27782.html

[2]
*org.apache.flink.api.common.InvalidProgramException: Table program cannot
be compiled. This is a bug. Please file an issue.*
at 
org.apache.flink.table.codegen.Compiler$class.compile(Compiler.scala:36)
at
org.apache.flink.table.runtime.CRowProcessRunner.compile(CRowProcessRunner.scala:35)
at
org.apache.flink.table.runtime.CRowProcessRunner.open(CRowProcessRunner.scala:49)
at
org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
at
org.apache.flink.streaming.api.operators.ProcessOperator.open(ProcessOperator.java:56)
at
org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:424)
at
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:290)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
at java.lang.Thread.run(Thread.java:745)
*Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded*
at java.util.HashMap.newNode(HashMap.java:1734)
at java.util.HashMap.putVal(HashMap.java:641)
at java.util.HashMap.putMapEntries(HashMap.java:514)
at java.util.HashMap.putAll(HashMap.java:784)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3322)
at org.codehaus.janino.UnitCompiler.access$4900(UnitCompiler.java:212)
at
org.codehaus.janino.UnitCompiler$8.visitLocalVariableDeclarationStatement(UnitCompiler.java:3207)
at
org.codehaus.janino.UnitCompiler$8.visitLocalVariableDeclarationStatement(UnitCompiler.java:3175)
at
org.codehaus.janino.Java$LocalVariableDeclarationStatement.accept(Java.java:3348)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3174)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3231)
at org.codehaus.janino.UnitCompiler.access$3800(UnitCompiler.java:212)
at org.codehaus.janino.UnitCompiler$8.visitBlock(UnitCompiler.java:3193)
at org.codehaus.janino.UnitCompiler$8.visitBlock(UnitCompiler.java:3175)
at org.codehaus.janino.Java$Block.accept(Java.java:2753)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3174)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3267)
at org.codehaus.janino.UnitCompiler.access$4200(UnitCompiler.java:212)
at
org.codehaus.janino.UnitCompiler$8.visitIfStatement(UnitCompiler.java:3197)
at
org.codehaus.janino.UnitCompiler$8.visitIfStatement(UnitCompiler.java:3175)
at org.codehaus.janino.Java$IfStatement.accept(Java.java:2923)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3174)
at
org.codehaus.janino.UnitCompiler.buildLocalVariableMap(UnitCompiler.java:3163)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2986)
at
org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1313)
at
org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1286)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:785)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:436)
at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:212)
at
org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:390)
at
org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:385)
at
org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1405)



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/


Re: [Table API example] Table program cannot be compiled. This is a bug. Please file an issue

2018-11-30 Thread Fabian Hueske
Hi Marvin,

Can you post the query (+ schema of the tables) that led to this exception?

Thank you,
Fabian

On Fri., Nov. 30, 2018 at 10:55, Marvin777 <
xymaqingxiang...@gmail.com> wrote:

> Hi all,
>
> I have a simple test for looking at the Flink Table API and hit an exception
> reported as a bug. I wonder though if I am missing something.
>
> BTW, the example is flink-examples-table-with-dependencies.jar, and the
> version is 1.4.2.
>
> Thanks, Marvin.
>
> [image: image.png]
>
>


[Table API example] Table program cannot be compiled. This is a bug. Please file an issue

2018-11-30 Thread Marvin777
Hi all,

I have a simple test for looking at the Flink Table API and hit an exception
reported as a bug. I wonder though if I am missing something.

BTW, the example is flink-examples-table-with-dependencies.jar, and the
version is 1.4.2.

Thanks, Marvin.

[image: image.png]