Hi,

I'm ultimately trying to write a query in Calcite with an UNNEST and have it
translated to Spark SQL, but the resulting query is invalid.

Starting with something like the query below, which runs fine as-is on
Postgres:

SELECT t.test_name, res
FROM test_table as t,
UNNEST(t.test_array) as res

After converting it to a logical plan and passing it through
RelToSqlConverter, I get the following, which is invalid Spark SQL:

SELECT
  `$cor0`.`test_name`,
  `$cor0`.`test_array0` `res`
FROM
  test_table `$cor0`,
  LATERAL UNNEST (SELECT
      `$cor0`.`test_array`
    FROM
      (VALUES
          (0)) `t` (`ZERO`)) `t0` (`test_array`) `t00`

with the accompanying logical plan:

[Logical plan]
LogicalProject(test_name=[$1], res=[$3])
  LogicalCorrelate(correlation=[$cor0], joinType=[inner],
requiredColumns=[{2}])
    LogicalTableScan(table=[[test_table]])
    Uncollect
      LogicalProject(test_array=[$cor0.test_array])
        LogicalValues(tuples=[[{ 0 }]])

The Postgres dialect produces a similar result (it just exchanges ` for
").
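For reference, the conversion path I'm using looks roughly like this (a simplified sketch, not my exact code; schema/FrameworkConfig setup omitted, and `visitRoot` assumes a recent Calcite version):

```java
// Sketch: parse/validate the SQL, build a logical plan, then
// convert the plan back to SQL for the Spark dialect.
Planner planner = Frameworks.getPlanner(config);
SqlNode parsed = planner.parse(sql);
SqlNode validated = planner.validate(parsed);
RelNode rel = planner.rel(validated).rel;

RelToSqlConverter converter =
    new RelToSqlConverter(SparkSqlDialect.DEFAULT);
SqlNode sqlNode = converter.visitRoot(rel).asStatement();
String sparkSql = sqlNode.toSqlString(SparkSqlDialect.DEFAULT).getSql();
```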

In order to isolate whether my environment is the culprit, I've cloned
Calcite's codebase and hijacked a test under RelToSqlConverterTest:

final String sql =
    "select did from \"department\", unnest(array[1, 2, 3]) as t(did)\n";
final String result =
    sql(sql).dialect(DatabaseProduct.POSTGRESQL.getDialect()).exec();

The result I get is the following, which is invalid PostgreSQL:

SELECT t1.DID
FROM foodmart.department,
(
  SELECT col_0 AS DID
  FROM UNNEST (
    SELECT ARRAY[1, 2, 3] FROM (VALUES (0)) AS t (ZERO)
  ) AS t0 (col_0)
) AS t1
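By hand, the PostgreSQL I'd expect for that test query is simply:

```sql
-- Hand-written PostgreSQL I'd expect (table-function UNNEST in FROM):
SELECT t.did
FROM foodmart.department,
  UNNEST(ARRAY[1, 2, 3]) AS t (did)
```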

I'm fairly new to Calcite, so it's very possible this is user error. There
are currently two tests under RelToSqlConverterTest that cover UNNEST, but
they both do `from unnest(select collect...`, so I wonder whether UNNEST is
only supported in that scenario?
