Xiaodong Chen created FLINK-21680:
-------------------------------------

             Summary: Table created with LIKE clause may get unexpected option "is_generic"
                 Key: FLINK-21680
                 URL: https://issues.apache.org/jira/browse/FLINK-21680
             Project: Flink
          Issue Type: Bug
    Affects Versions: 1.12.1
            Reporter: Xiaodong Chen


If we have a table in the Hive catalog and then create another table in the default Flink catalog using the LIKE clause:
{code:java}
CREATE TABLE hive_catalog.test.test_table (
    id String,
    create_time TIMESTAMP
)
WITH (
  'connector' = 'kafka',
  'topic' = 'test_table',
  'properties.bootstrap.servers' = 'kafka.address',
  'properties.group.id' = 'test',
  'format' = 'json',
  'json.ignore-parse-errors' = 'true',
  'scan.startup.mode' = 'earliest-offset'
);

CREATE TABLE test_like_table
WITH (
  'properties.group.id' = 'test_like'
)
LIKE hive_catalog.test.test_table;

{code}
When we query:
{code:java}
SELECT * FROM test_like_table

{code}
Exception found:
{code:java}
Exception in thread "main" org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'default_catalog.default_database.test_like_table'.

Table options are:

'connector'='kafka'
'is_generic'='true'
'key.format'='json'
'properties.bootstrap.servers'='kafka.address'
'properties.group.id'='test_like'
'topic'='test_table'
'value.format'='json'
        at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:122)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:254)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:100)
        at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:165)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:157)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:902)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:871)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:250)
        at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:77)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:660)
        at aiads.devops.flink.single_sql_job.Bootstrap.main(Bootstrap.java:71)
Caused by: org.apache.flink.table.api.ValidationException: Unsupported options found for connector 'kafka'.

Unsupported options:

is_generic

{code}
 

I believe it is caused by:
 # Flink uses the property '_is_generic_' to tell whether a table is Hive-compatible or generic. When a table is created in a {{HiveCatalog}}, this property is set to "true" by default.
 # When another table is created via the LIKE clause in the default Flink catalog, the option '_is_generic_' is copied along with the other options, but the Kafka connector does not recognize it and fails option validation. A possible workaround is sketched after this list.
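
Until this is fixed, one possible workaround (a sketch only, based on the documented "like options" of CREATE TABLE ... LIKE; table and option values are taken from the example above) is to prevent the source table's options from being inherited at all via {{EXCLUDING OPTIONS}} and to re-declare the Kafka options explicitly:
{code:java}
-- Workaround sketch: drop all options copied from the Hive-catalog table
-- (including the catalog-internal 'is_generic') and re-declare the connector
-- options explicitly in the WITH clause.
CREATE TABLE test_like_table
WITH (
  'connector' = 'kafka',
  'topic' = 'test_table',
  'properties.bootstrap.servers' = 'kafka.address',
  'properties.group.id' = 'test_like',
  'format' = 'json',
  'json.ignore-parse-errors' = 'true',
  'scan.startup.mode' = 'earliest-offset'
)
LIKE hive_catalog.test.test_table (
  EXCLUDING OPTIONS
);
{code}
Because {{EXCLUDING OPTIONS}} discards every option of the source table, all connector options have to be repeated, but the '_is_generic_' property is no longer carried over into the derived table.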

 



