[ https://issues.apache.org/jira/browse/FLINK-26437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17500018#comment-17500018 ]

Arindam Bhattacharjee commented on FLINK-26437:
-----------------------------------------------

Thanks [~straw]. But now I am getting an error at the sink end -

 

SQL:

CREATE TABLE user_details_fs (
  user_id varchar,
  item_id varchar,
  category_id varchar,
  behavior varchar,
  ts TIMESTAMP(3)
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///Users/arindam.b/Documents/SparkCheckPointDirectory/user_details/',
  'format' = 'parquet'
)
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
        at org.apache.flink.formats.parquet.ParquetFileFormatFactory.getParquetConfiguration(ParquetFileFormatFactory.java:115)
        at org.apache.flink.formats.parquet.ParquetFileFormatFactory.access$000(ParquetFileFormatFactory.java:51)
        at org.apache.flink.formats.parquet.ParquetFileFormatFactory$2.createRuntimeEncoder(ParquetFileFormatFactory.java:103)
        at org.apache.flink.formats.parquet.ParquetFileFormatFactory$2.createRuntimeEncoder(ParquetFileFormatFactory.java:97)
        at org.apache.flink.table.filesystem.FileSystemTableSink.createWriter(FileSystemTableSink.java:385)
        at org.apache.flink.table.filesystem.FileSystemTableSink.createStreamingSink(FileSystemTableSink.java:192)
        at org.apache.flink.table.filesystem.FileSystemTableSink.consume(FileSystemTableSink.java:153)
        at org.apache.flink.table.filesystem.FileSystemTableSink.lambda$getSinkRuntimeProvider$0(FileSystemTableSink.java:139)
        at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.applySinkProvider(CommonExecSink.java:294)
        at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.createSinkTransformation(CommonExecSink.java:145)
        at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:140)
        at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134)
        at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToPlan$1(StreamPlanner.scala:70)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
        at scala.collection.Iterator.foreach(Iterator.scala:937)
        at scala.collection.Iterator.foreach$(Iterator.scala:937)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
        at scala.collection.IterableLike.foreach(IterableLike.scala:70)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike.map(TraversableLike.scala:233)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69)
        at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.translateAndClearBuffer(TableEnvironmentImpl.java:1510)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.execute(TableEnvironmentImpl.java:1460)
        at huangxu.chase.flinksql.demo.SqlSubmit.run(SqlSubmit.java:49)
        at huangxu.chase.flinksql.demo.SqlSubmit.main(SqlSubmit.java:24)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
        at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
        at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
        at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
        at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
        at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 43 more

 

I am running this on my local machine, but I am getting an error about the Hadoop classpath.

Can you please help me here?
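For context, the `NoClassDefFoundError: org/apache/hadoop/conf/Configuration` usually means the Hadoop client libraries are not on Flink's classpath; the parquet format needs them even when writing to a local `file://` path, and the Flink distribution does not bundle them. A common workaround, assuming a local Hadoop installation is available, is:

```shell
# The parquet format depends on Hadoop classes such as
# org.apache.hadoop.conf.Configuration, which Flink does not ship.
# If a Hadoop installation is available, export its classpath before
# starting the Flink cluster and submitting the job:
export HADOOP_CLASSPATH=$(hadoop classpath)

# Alternatively (older setups), place a Hadoop uber jar such as
# flink-shaded-hadoop-2-uber into Flink's lib/ directory and restart.
```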

 

Thanks,

Arindam

> Cannot discover a connector using option: 'connector'='jdbc'
> ------------------------------------------------------------
>
>                 Key: FLINK-26437
>                 URL: https://issues.apache.org/jira/browse/FLINK-26437
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.13.6
>            Reporter: Arindam Bhattacharjee
>            Priority: Major
>              Labels: sql-api, table-api
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Hi Team,
> When I was running SQL in Flink SQL-API, was getting the below error - 
> *Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector'='jdbc'*
>         at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:467)
>         at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:441)
>         at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:167)
>         ... 32 more
> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> Available factory identifiers are:
> blackhole
> datagen
> filesystem
> kafka
> print
> upsert-kafka
>         at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:319)
>         at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:463)
>         ... 34 more
> ------------------------
>  
> SQL I was using -
> CREATE TABLE pvuv_sink (
>   dt varchar PRIMARY KEY,
>   pv BIGINT,
>   uv BIGINT
> ) WITH (
>   'connector' = 'jdbc',
>   'url' = 'jdbc:mysql://localhost:3306/flinksql_test',
>   'table-name' = 'pvuv_sink',
>   'username' = 'root',
>   'password' = 'xxxxxx',
>   'sink.buffer-flush.max-rows' = '1'
> );



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
