Could you share the full exception message?
> On Dec 17, 2020, at 11:53 AM, 肖越 <18242988...@163.com> wrote:
>
> Sure, thanks a lot for your help. Following the link you shared, I explicitly added the jar dependency, but now I get the error: py4j.protocol.Py4JJavaError: An error
> occurred while calling None.java.net.URL. Is this because there is no corresponding Oracle connector to handle it?
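[Editor's note] An error while calling `None.java.net.URL` usually means the jar path handed to PyFlink is not a well-formed URL (e.g. it lacks the `file://` scheme), rather than a dialect problem. A minimal sketch of building a valid jar URL; the jar path below is a hypothetical example, and the commented `pipeline.jars` call is the registration step described in the PyFlink docs linked later in this thread:

```python
# Hypothetical local path to the JDBC connector jar; adjust to your setup.
jar_path = "/opt/flink/lib/flink-connector-jdbc_2.11-1.12.0.jar"

# java.net.URL rejects a bare filesystem path; it needs a scheme,
# so the path must be turned into a file:// URL first.
jar_url = "file://" + jar_path
print(jar_url)  # file:///opt/flink/lib/flink-connector-jdbc_2.11-1.12.0.jar

# The jar is then registered via the 'pipeline.jars' configuration, e.g.:
# t_env.get_config().get_configuration().set_string("pipeline.jars", jar_url)
```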
>
> On 2020-12-17 10:44:56, "Dian Fu" <dian0511...@gmail.com> wrote:
>> 1) As Leonard Xu said, the JDBC connector does not support Oracle yet.
>> 2) Judging from the error, your job fails before it even gets that far. Please check the following:
>> a. Use the JDBC connector as described in [1], i.e. 'connector' = 'jdbc' rather than 'connector.type' = 'jdbc', which is the legacy style.
>> b. The JDBC connector jar must be added explicitly as a dependency; see [2] for how to add jar dependencies in PyFlink, and [1] for which jars the JDBC connector needs.
>>
>> [1]
>> https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/jdbc.html
>>
>> [2]
>> https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/python/table-api-users-guide/python_table_api_connectors.html#download-connector-and-format-jars
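[Editor's note] Putting the advice in 2a together with [1], the DDL would look roughly like the sketch below. This is only an illustration: it uses the new-style `'connector' = 'jdbc'` options, and a MySQL URL stands in for Oracle because the Flink 1.12 JDBC connector only supports MySQL, PostgreSQL and Derby dialects; the host, database and credentials are placeholders.

```python
# Sketch of the new-style JDBC DDL from [1]. All connection details
# below are placeholders; MySQL is used because the 1.12 JDBC
# connector has no Oracle dialect.
source_ddl = """
CREATE TABLE source_table (
    id BIGINT,
    pf_id VARCHAR,
    biz_date DATE
) WITH (
    'connector' = 'jdbc',
    'url' = 'jdbc:mysql://localhost:3306/mydb',
    'table-name' = 'ts_pf_ac_yldrate',
    'username' = 'xxx',
    'password' = 'xxx'
)
"""

# In the 1.12 Table API the table is then registered and queried with, e.g.:
# t_env.execute_sql(source_ddl)
# t_env.sql_query("SELECT pf_id, biz_date FROM source_table").execute().print()
```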
>>
>>> On Dec 17, 2020, at 10:06 AM, 肖越 <18242988...@163.com> wrote:
>>>
>>> Thanks for your help. So there is no way to get a connection by defining a connector? Is there any other way?
>>>
>>> On 2020-12-17 09:55:08, "Leonard Xu" <xbjt...@gmail.com> wrote:
>>>> At the moment the JDBC connector only supports the MySQL, PostgreSQL and Derby (mainly used for testing) dialects; Oracle is not supported yet.
>>>>
>>>> Best regards,
>>>> Leonard
>>>>
>>>>> On Dec 17, 2020, at 09:47, 肖越 <18242988...@163.com> wrote:
>>>>>
>>>>> I'm new to PyFlink and am testing the PyFlink 1.12 features. I want to fetch data from an Oracle database by defining a connector,
>>>>> defined as follows:
>>>>> env = StreamExecutionEnvironment.get_execution_environment()
>>>>> env.set_parallelism(1)
>>>>> env = StreamTableEnvironment.create(
>>>>>     env,
>>>>>     environment_settings=EnvironmentSettings.new_instance()
>>>>>         .use_blink_planner().build())
>>>>> source_ddl1 = """
>>>>>     CREATE TABLE source_table (
>>>>>         id BIGINT, pf_id VARCHAR, tree_id VARCHAR, node_id VARCHAR,
>>>>>         biz_date DATE, ccy_type VARCHAR, cur_id_d VARCHAR,
>>>>>         tldrate DOUBLE, is_valid INT, time_mark TIMESTAMP
>>>>>     ) WITH (
>>>>>         'connector.type' = 'jdbc',
>>>>>         'connector.url' = 'jdbc:oracle:thin:@ip:dbnamel',
>>>>>         'connector.table' = 'ts_pf_ac_yldrate',
>>>>>         'connector.driver' = 'oracle.jdbc.driver.OracleDriver',
>>>>>         'connector.username' = 'xxx',
>>>>>         'connector.password' = 'xxx')
>>>>> """
>>>>> sql = "select pf_id,biz_date from source_table where biz_date='20160701' "
>>>>> env.sql_update(source_ddl1)
>>>>> table = env.sql_query(sql)
>>>>> env.execute("flink_test")
>>>>> The error message:
>>>>> raise java_exception
>>>>> pyflink.util.exceptions.TableException: findAndCreateTableSource failed.
>>>>> at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:49)
>>>>> at org.apache.flink.table.planner.plan.schema.LegacyCatalogSourceTable.findAndCreateLegacyTableSource(LegacyCatalogSourceTable.scala:193)
>>>>> at org.apache.flink.table.planner.plan.schema.LegacyCatalogSourceTable.toRel(LegacyCatalogSourceTable.scala:94)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
>>>>> at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
>>>>> at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:165)
>>>>> at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:157)
>>>>> at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:823)
>>>>> at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:795)
>>>>> at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:250)
>>>>> at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:78)
>>>>> at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:639)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>>> at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>>>>> at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>>>>> at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
>>>>> at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>>>>> at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
>>>>> at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
>>>>> at java.lang.Thread.run(Thread.java:748)
>>