TableException

2019-06-12 Thread Pramit Vamsi
Hi,

I am attempting the following:

String sql = "INSERT INTO table3 "
        + "SELECT col1, col2, window_start_time, window_end_time, "
        + "MAX(col3), MAX(col4), MAX(col5) FROM "
        + "(SELECT col1, col2, "
        + "TUMBLE_START(ts, INTERVAL '1' MINUTE) as window_start_time, "
        + "TUMBLE_END(ts, INTERVAL '1' MINUTE) as window_end_time, "
        + "FROM table1 "
        + "WHERE ... "
        + "GROUP BY TUMBLE(ts, INTERVAL '1' MINUTE), col1, col2 "
        + "UNION "
        + "SELECT col1, col2, "
        + "TUMBLE_START(ts, INTERVAL '1' MINUTE) as window_start_time, "
        + "TUMBLE_END(ts, INTERVAL '1' MINUTE) as window_end_time, "
        + "FROM table2 "
        + "WHERE ... "
        + "GROUP BY TUMBLE(ts, INTERVAL '1' MINUTE), col1, col2) "
        + "GROUP BY window_start_time, window_end_time, col1, col2";

tableEnv.sqlUpdate(sql);

I am using JDBCAppendTableSink.

Exception:
org.apache.flink.table.api.TableException: AppendStreamTableSink requires
that Table has only insert changes.

What in the query should I fix?


Re: TableException

2019-06-12 Thread JingsongLee
Hi Pramit:
AppendStreamTableSink defines an external TableSink to emit a streaming table 
with only insert changes. If the table is also modified by update or delete 
changes, a TableException will be thrown.[1]
Your SQL seems to produce update or delete changes.
You can try to use RetractStreamTableSink or UpsertStreamTableSink.
(Unfortunately, we don't have a Retract/Upsert JDBC sink yet, so you would
have to implement one yourself.)
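For reference, a minimal, untested sketch of what a hand-rolled retract sink could look like on Flink 1.8 follows. A RetractStreamTableSink receives Tuple2&lt;Boolean, Row&gt;, where the flag is true for an insert and false for a retraction; `MyJdbcRetractSinkFunction` below is a hypothetical SinkFunction you would still have to write against JDBC yourself:

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.sinks.RetractStreamTableSink;
import org.apache.flink.table.sinks.TableSink;
import org.apache.flink.types.Row;

// Sketch only: a retract-capable table sink skeleton for Flink 1.8.
public class RetractJdbcSink implements RetractStreamTableSink<Row> {

    private String[] fieldNames;
    private TypeInformation<?>[] fieldTypes;

    @Override
    public TypeInformation<Row> getRecordType() {
        return new RowTypeInfo(fieldTypes, fieldNames);
    }

    @Override
    public void emitDataStream(DataStream<Tuple2<Boolean, Row>> stream) {
        // f0 == true  -> add the row (e.g. INSERT / upsert via JDBC)
        // f0 == false -> retract the row (e.g. DELETE the previously written row)
        stream.addSink(new MyJdbcRetractSinkFunction(fieldNames)); // hypothetical SinkFunction
    }

    @Override
    public TableSink<Tuple2<Boolean, Row>> configure(
            String[] fieldNames, TypeInformation<?>[] fieldTypes) {
        RetractJdbcSink copy = new RetractJdbcSink();
        copy.fieldNames = fieldNames;
        copy.fieldTypes = fieldTypes;
        return copy;
    }

    @Override
    public String[] getFieldNames() { return fieldNames; }

    @Override
    public TypeInformation<?>[] getFieldTypes() { return fieldTypes; }
}
```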


[1]https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/sourceSinks.html#appendstreamtablesink

Best, JingsongLee




Flink 1.4 SQL API Streaming TableException

2018-03-09 Thread Pavel Ciorba
Hi everyone!

I decided to try the Time-windowed join functionality of Flink 1.4+.

My SQL query is an exact copy of the example in the documentation, and the
program reads and writes from Kafka.

I used the example from here:
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/table/sql.html#joins

Code:
https://gist.github.com/invoker27/ecb4f4b38a52642089e41f6f49886c28

Dependencies:
compile group: 'org.apache.flink', name: 'flink-table_2.11', version:
'1.4.0'
compile group: 'org.apache.flink', name: 'flink-streaming-scala_2.11',
version: '1.4.0'
compile group: 'org.apache.flink', name: 'flink-connector-kafka-0.11_2.11',
version: '1.4.0'

Error:
Exception in thread "main" org.apache.flink.table.api.TableException:
Cannot generate a valid execution plan for the given query:

FlinkLogicalJoin(condition=[AND(=($0, $5), >=($3, -($8, 1440)), <=($3,
$8))], joinType=[inner])
  FlinkLogicalTableSourceScan(table=[[TABLE1]], fields=[id, value1,
timestamp], source=[KafkaJSONTableSource])
  FlinkLogicalTableSourceScan(table=[[TABLE2]], fields=[id, value2,
timestamp], source=[KafkaJSONTableSource])

This exception indicates that the query uses an unsupported SQL feature.
Please check the documentation for the set of currently supported SQL
features.
at
org.apache.flink.table.api.TableEnvironment.runVolcanoPlanner(TableEnvironment.scala:274)
at
org.apache.flink.table.api.StreamTableEnvironment.optimize(StreamTableEnvironment.scala:683)
at
org.apache.flink.table.api.StreamTableEnvironment.writeToSink(StreamTableEnvironment.scala:251)
at org.apache.flink.table.api.Table.writeToSink(table.scala:862)
at org.apache.flink.table.api.Table.writeToSink(table.scala:830)
at
com.sheffield.healthmonitoring.join.conditionalhr.JoinSQL.main(JoinSQL.java:72)


I get the error in 1.4.0, 1.4.1 and 1.4.2, but 1.5-SNAPSHOT works.

From what I can see the feature should work in 1.4.

What might be the issue?

Thank you!


TableException: GenericTypeInfo cannot be converted to Table

2017-12-08 Thread Sendoh
Hi Flink users,

I found the workarounds to resolve this exception in scala.
https://issues.apache.org/jira/browse/FLINK-6500

But I already provide the TypeInformation when deserializing json object,
and still see this exception.
Is there anything I ignore?

The sample code
https://gist.github.com/HungUnicorn/8a5c40fcf1e25c51cf77dc24a227d6d4

Best,

Sendoh



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/


Re: Flink 1.4 SQL API Streaming TableException

2018-03-09 Thread 杨力
To use a field in a table as timestamp, it must be declared as a rowtime
attribute for the table.

1) Call env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime).
2) Call withRowtimeAttribute on KafkaJsonTableSourceBuilder.

Reference:
1.
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/table/streaming.html#time-attributes
2.
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/table/sourceSinks.html#configuring-a-processing-time-attribute
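For illustration, the two steps could look roughly like the following on Flink 1.4. The topic, properties, and schema are placeholders, and the exact builder methods may differ slightly from this sketch; the linked docs are authoritative:

```java
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// 1) Use event time for the whole job.
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

// 2) Declare the "timestamp" field as the rowtime attribute of the source,
//    so the Table API can use it in time-windowed joins.
Kafka011JsonTableSource source = Kafka011JsonTableSource.builder()
        .forTopic("events")               // placeholder topic
        .withKafkaProperties(kafkaProps)  // placeholder Properties
        .withSchema(schema)               // placeholder TableSchema
        .withRowtimeAttribute("timestamp")
        .build();
```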



Re: Flink 1.4 SQL API Streaming TableException

2018-03-09 Thread Pavel Ciorba
Bill Lee,

Man, you saved me from headbanging :) Thank you!



Re: TableException: GenericTypeInfo cannot be converted to Table

2017-12-08 Thread Sendoh
Found it. I should use .returns(typeInformation) after the map function.







Re: TableException: GenericTypeInfo cannot be converted to Table

2017-12-08 Thread Fabian Hueske
Hi,

you give the TypeInformation to your user code, but you don't expose it to
the DataStream API (the code of the FlatMapFunction is a black box for the
API).
Your FlatMapFunction should implement the ResultTypeQueryable interface
and return the TypeInformation.
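A minimal sketch of that suggestion (the class name and the target type Row are placeholders; any output type works the same way):

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.ResultTypeQueryable;
import org.apache.flink.types.Row;
import org.apache.flink.util.Collector;

// Sketch: a FlatMapFunction that exposes its produced type to the API.
public class JsonToRow implements FlatMapFunction<String, Row>, ResultTypeQueryable<Row> {

    private final TypeInformation<Row> typeInfo;

    public JsonToRow(TypeInformation<Row> typeInfo) {
        this.typeInfo = typeInfo;
    }

    @Override
    public void flatMap(String json, Collector<Row> out) {
        // ... deserialize json into a Row and collect it ...
    }

    @Override
    public TypeInformation<Row> getProducedType() {
        return typeInfo; // this is what the DataStream API cannot infer on its own
    }
}
```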

Best, Fabian



Re: TableException: GenericTypeInfo cannot be converted to Table

2017-12-08 Thread Fabian Hueske
Yes.
Adding .returns(typeInfo) works as well. :-)
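For completeness, the `.returns(...)` variant is a one-liner at the call site (the mapper class name is a placeholder):

```java
DataStream<Row> rows = jsonStream
        .flatMap(new JsonToRowFlatMap())  // any FlatMapFunction<String, Row>
        .returns(typeInfo);               // hands the Row TypeInformation to the API
```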



Re: TableException: GenericTypeInfo cannot be converted to Table

2017-12-08 Thread Sendoh
Thanks! I didn't know this works as well.

Cheers,

Sendoh


