[ https://issues.apache.org/jira/browse/FLINK-35286?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ryan Goldenberg updated FLINK-35286:
------------------------------------
    Description: 
*Problem*

Referencing a table that uses the 'hive' connector from outside of HiveCatalog fails 
with the error
{code:java}
org.apache.flink.table.api.ValidationException: Could not find any factory for 
identifier 'hive' that implements 
'org.apache.flink.table.factories.DynamicTableFactory' in the classpath{code}
For example, when using a table created with a LIKE clause
{code:java}
CREATE TABLE my_table WITH (...) LIKE hive.db.table;{code}
By contrast, the 'hive' connector is available when the same table is referenced 
through HiveCatalog.
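
A minimal sketch of the two cases, for reproduction context (the catalog DDL, the 
'hive-conf-dir' path, and the database/table names are placeholders rather than 
details taken from this report):
{code:java}
-- Register a Hive catalog (connection details are placeholders).
CREATE CATALOG hive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/path/to/hive-conf'
);

-- Works: the table is resolved through HiveCatalog.
SELECT * FROM hive.db.table;

-- Fails with the ValidationException quoted above: the derived table lives in the
-- default catalog, and factory discovery cannot find a DynamicTableFactory for
-- identifier 'hive'.
USE CATALOG default_catalog;
CREATE TABLE my_table WITH (...) LIKE hive.db.table;
SELECT * FROM my_table;{code}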

*Desired Behavior*
The 'hive' connector should be discoverable for tables outside of HiveCatalog, for 
example in the default catalog.

*Benefits*
 * Hive tables can be referenced outside of HiveCatalog without the fully qualified 
path `catalog.db.table`, which is useful when Hive is not the only catalog or data source.
 * Hive tables can be modified (e.g. their read options overridden) without changing 
the Hive metastore or relying on the per-query SQL hints documented 
[here|https://nightlies.apache.org/flink/flink-docs-release-1.18/docs/connectors/table/hive/hive_read_write/#reading]; see the sketch after this list.
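
For context, the per-query alternative mentioned in the second bullet is a 
dynamic-options hint; a hedged sketch contrasting it with the LIKE-based table this 
issue would enable ('streaming-source.enable' is taken from the linked Hive read 
docs; table names are placeholders):
{code:java}
-- Today: per-query override via a SQL hint (dynamic table options).
SELECT * FROM hive.db.table
  /*+ OPTIONS('streaming-source.enable' = 'true') */;

-- With this improvement: a reusable derived table in the default catalog carrying
-- the override, without touching the Hive metastore.
CREATE TABLE my_streaming_table WITH (
  'streaming-source.enable' = 'true'
) LIKE hive.db.table;

SELECT * FROM my_streaming_table;{code}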

> Cannot discover Hive connector outside Hive catalog
> ---------------------------------------------------
>
>                 Key: FLINK-35286
>                 URL: https://issues.apache.org/jira/browse/FLINK-35286
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / Hive
>         Environment: Flink 1.18.1
> Hive 2.3.9
> flink-sql-connector-hive-2.3.9_2.12-1.18.1.jar
>            Reporter: Ryan Goldenberg
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
