How do you ensure that the connector is actually available at runtime? Are you bundling it in a jar or putting it into Flink's lib directory?
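
If you build a fat jar, one common approach is to let the maven-shade-plugin bundle the connector and merge the META-INF/services files, since table factories are discovered via Java SPI. Below is a minimal sketch (plugin version and the excludes list are assumptions, adjust them to your build):

<!-- Sketch: bundle connector dependencies into the job jar.
     The ServicesResourceTransformer merges META-INF/services entries
     from all jars, which is how DynamicTableFactory implementations
     (e.g. the 'jdbc' factory) are found at runtime. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <excludes>
                        <!-- keep runtime-provided dependencies out of the fat jar -->
                        <exclude>org.apache.flink:flink-shaded-force-shading</exclude>
                        <exclude>org.slf4j:*</exclude>
                        <exclude>org.apache.logging.log4j:*</exclude>
                    </excludes>
                </artifactSet>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

Without the ServicesResourceTransformer the connector classes may be in the jar but the 'jdbc' factory can still stay undiscovered, because the service files get overwritten instead of merged.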

On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
Correcting subject -> Could not find any factory for identifier 'jdbc'

From: Ronak Beejawat (rbeejawa)
Sent: Tuesday, January 11, 2022 6:43 PM
To: 'd...@flink.apache.org' <d...@flink.apache.org>; 'commun...@flink.apache.org' 
<commun...@flink.apache.org>; 'user@flink.apache.org' <user@flink.apache.org>
Cc: 'Hang Ruan' <ruanhang1...@gmail.com>; Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>; Karthikeyan 
Muthusamy (karmuthu) <karmu...@cisco.com>; Krishna Singitam (ksingita) <ksing...@cisco.com>; Arun Yadav 
(aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>; Avi Sanwal (asanwal) 
<asan...@cisco.com>
Subject: what is efficient way to write Left join in flink

Hi Team,

I am getting the below exception while using the JDBC connector:

Caused by: org.apache.flink.table.api.ValidationException: Could not find any 
factory for identifier 'jdbc' that implements 
'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.

Available factory identifiers are:

blackhole
datagen
filesystem
kafka
print
upsert-kafka


I have already added the dependency for the JDBC connector in pom.xml as shown
below:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_2.11</artifactId>
    <version>1.14.2</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.41</version>
</dependency>

I referred to the release documentation for the same:
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/



Please help me with this and suggest a solution!


Thanks
Ronak Beejawat
