yanjing.wang created CALCITE-6693:
-------------------------------------
Summary: Add Source SQL Dialect to RelToSqlConverterTest
Key: CALCITE-6693
URL: https://issues.apache.org/jira/browse/CALCITE-6693
Project: Calcite
Issue Type: Improvement
Components: core
Affects Versions: 1.38.0
Reporter: yanjing.wang
Assignee: yanjing.wang
Fix For: 1.39.0
Currently, {{RelToSqlConverterTest}} converts the original SQL to a RelNode using
the default validator config and the target dialect's type system. This is
confusing because it's unclear whether the test is meant to verify SQL
conversion between different dialects or within the same dialect. I believe it
should verify the conversion between a source and a target dialect. Therefore,
we should clearly define the source and target dialects and provide a way to
set them. The code would look like this:
{code:java}
@Test void testNullCollation() {
  final String query = "select * from \"product\" order by \"brand_name\"";
  final String expected = "SELECT *\n"
      + "FROM \"foodmart\".\"product\"\n"
      + "ORDER BY \"brand_name\"";
  final String sparkExpected = "SELECT *\n"
      + "FROM `foodmart`.`product`\n"
      + "ORDER BY `brand_name` NULLS LAST";
  sql(query)
      .sourceDialect(PrestoSqlDialect.DEFAULT)
      .withPresto().ok(expected)
      .withSpark().ok(sparkExpected);
}
{code}
We also need to set the correct null collation config based on the source
dialect, because the source and target dialects may have different null
collations.
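For example, the validator's default null collation could be derived from the
source dialect along these lines (a minimal sketch; {{getNullCollation()}} and
{{withDefaultNullCollation}} are existing Calcite API, but storing a source
dialect on the test fixture is the part this issue proposes):
{code:java}
import org.apache.calcite.sql.SqlDialect;
import org.apache.calcite.sql.dialect.PrestoSqlDialect;
import org.apache.calcite.sql.validate.SqlValidator;

// Sketch: derive the default null collation from the source dialect instead
// of using SqlValidator.Config.DEFAULT unchanged, so the RelNode reflects the
// source dialect's null ordering rather than the target's.
SqlDialect sourceDialect = PrestoSqlDialect.DEFAULT;
SqlValidator.Config validatorConfig = SqlValidator.Config.DEFAULT
    .withDefaultNullCollation(sourceDialect.getNullCollation());
{code}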
For the case where the source dialect equals the target dialect:
{code:java}
@Test void testCastDecimalBigPrecision() {
  final String query = "select cast(\"product_id\" as decimal(60,2)) "
      + "from \"product\" ";
  final String expectedRedshift = "SELECT CAST(\"product_id\" AS DECIMAL(38, 2))\n"
      + "FROM \"foodmart\".\"product\"";
  sql(query)
      .withRedshift()
      .withSourceDialectEqualsTargetDialect()
      .ok(expectedRedshift);
}
{code}
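A possible shape for this fixture method, assuming the fixture keeps its
current target-dialect field (the names here are hypothetical until the patch
lands):
{code:java}
// Hypothetical fixture method (a sketch, not the final patch): reuse the
// current target dialect as the source dialect, so the RelNode is built
// and the SQL regenerated within a single dialect.
Sql withSourceDialectEqualsTargetDialect() {
  return sourceDialect(dialect); // 'dialect' is the fixture's target dialect
}
{code}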