[ 
https://issues.apache.org/jira/browse/FLINK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15046798#comment-15046798
 ] 

Timo Walther commented on FLINK-2099:
-------------------------------------

[~ovidiumarcu] It would be very nice to have someone else helping me with this 
task. Once the main issues of the Table API are solved, I think I can move the 
SQL code to flink-contrib or flink-staging, so that more people can work on it 
and open PRs against it.

[~fhueske] You are right. Supporting TPC-H queries first was also my initial 
thought, because there are fewer and simpler queries. However, we still have 
the license problem with generated data (see FLINK-705). I think we should use 
TPC-DS because there is already an Apache-licensed generator/validator tool: 
https://github.com/julianhyde/tpcds
It is available in Maven Central, so it can easily be integrated for ITCases.
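For illustration, pulling that generator into the test classpath could look 
roughly like the following. The exact groupId/artifactId/version are an 
assumption here (check Maven Central for the actual coordinates of the 
julianhyde/tpcds artifact):

```xml
<!-- Hypothetical coordinates for the julianhyde/tpcds artifact;
     verify the actual groupId and version on Maven Central. -->
<dependency>
  <groupId>net.hydromatic</groupId>
  <artifactId>tpcds</artifactId>
  <version>0.4</version>
  <scope>test</scope>
</dependency>
```

With test scope, the generator would be available to ITCases without leaking 
into Flink's runtime dependencies.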

> Add a SQL API
> -------------
>
>                 Key: FLINK-2099
>                 URL: https://issues.apache.org/jira/browse/FLINK-2099
>             Project: Flink
>          Issue Type: New Feature
>          Components: Table API
>            Reporter: Timo Walther
>            Assignee: Timo Walther
>
> From the mailing list:
> Fabian: Flink's Table API is pretty close to what SQL provides. IMO, the best
> approach would be to leverage that and build a SQL parser (maybe together
> with a logical optimizer) on top of the Table API. Parser (and optimizer)
> could be built using Apache Calcite which is providing exactly this.
> Since the Table API is still a fairly new component and not very feature
> rich, it might make sense to extend and strengthen it before putting
> something major on top.
> Ted: It would also be relatively simple (I think) to retarget Drill to Flink
> if Flink doesn't provide enough typing metadata to do traditional SQL.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
