[ 
https://issues.apache.org/jira/browse/FLINK-8492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng updated FLINK-8492:
-------------------------------
    Description: 
Given the following test, an unsupported exception will be thrown because 
multiple Calc nodes exist between the Correlate and the TableFunctionScan.
{code:java}
@Test
def testCrossJoinWithMultiFilter(): Unit = {
  val t = testData(env).toTable(tEnv).as('a, 'b, 'c)
  val func0 = new TableFunc0

  val result = t
    .join(func0('c) as('d, 'e))
    .select('c, 'd, 'e)
    .where('e > 10)
    .where('e > 20)
    .select('c, 'd)
    .toAppendStream[Row]

  result.addSink(new StreamITCase.StringSink[Row])
  env.execute()

  val expected = mutable.MutableList("Jack#22,Jack,22", "Anna#44,Anna,44")
  assertEquals(expected.sorted, StreamITCase.testResults.sorted)
}
{code}
I can see two options to fix this problem:
 # Adapt a Calcite optimization rule to merge the consecutive Calc nodes.
 # Merge the multiple Calc nodes in the Correlate conversion rule.

I prefer the second one, not only because it is easy to implement, but also 
because Flink's functionality should not depend on whether a particular 
optimization rule is present. 
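To illustrate what "merging multiple Calcs" means, the sketch below collapses two stacked filter Calcs into a single one by conjoining their predicates, mirroring the two {{where}} calls in the test above. This is a minimal standalone model, not the actual Flink or Calcite API; all names here ({{Calc}}, {{merge}}) are hypothetical.

```scala
// Minimal sketch: two consecutive filter Calcs collapse into one by
// AND-ing their predicates, so only a single Calc remains between the
// Correlate and the TableFunctionScan. Names are hypothetical, not
// actual Flink or Calcite API.
object CalcMergeSketch {
  // A Calc reduced to its filter predicate over a named row.
  final case class Calc(predicate: Map[String, Int] => Boolean)

  // Merge two stacked Calcs by conjoining their predicates.
  def merge(upper: Calc, lower: Calc): Calc =
    Calc(row => lower.predicate(row) && upper.predicate(row))

  def main(args: Array[String]): Unit = {
    val lower  = Calc(row => row("e") > 10) // .where('e > 10)
    val upper  = Calc(row => row("e") > 20) // .where('e > 20)
    val merged = merge(upper, lower)        // single Calc: e > 10 && e > 20

    println(merged.predicate(Map("e" -> 22))) // row with e = 22 passes
    println(merged.predicate(Map("e" -> 15))) // filtered out by e > 20
  }
}
```

The real fix would do the analogous merge on RexProgram-level projections and conditions rather than on plain predicates.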



> Fix unsupported exception for udtf with multi calc
> --------------------------------------------------
>
>                 Key: FLINK-8492
>                 URL: https://issues.apache.org/jira/browse/FLINK-8492
>             Project: Flink
>          Issue Type: Bug
>          Components: Table API & SQL
>            Reporter: Hequn Cheng
>            Assignee: Hequn Cheng
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
