Hi!

This error is reported by Hive. When using Hive, the relevant environment variables such as HIVE_CONF_DIR and HIVE_HOME all need to be set properly.
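For reference, a minimal sketch of pointing the catalog at the Hive conf directory explicitly instead of relying on HIVE_CONF_DIR / HIVE_HOME being picked up. The catalog name "myhive" and the path /opt/hive/conf are hypothetical, and the flink-connector-hive dependency is assumed to be on the classpath:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;
    import org.apache.flink.table.catalog.hive.HiveCatalog;

    public class HiveCatalogExample {
        public static void main(String[] args) {
            TableEnvironment tEnv =
                    TableEnvironment.create(EnvironmentSettings.newInstance().build());
            // Pass the directory containing hive-site.xml directly, so HiveConf
            // reads it from a plain local file path instead of searching the
            // classpath for a (possibly non-file) URL.
            HiveCatalog hive = new HiveCatalog(
                    "myhive",           // catalog name (hypothetical)
                    "default",          // default database
                    "/opt/hive/conf");  // directory with hive-site.xml (hypothetical)
            tEnv.registerCatalog("myhive", hive);
            tEnv.useCatalog("myhive");
        }
    }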

niko <jfq...@163.com> wrote on Fri, Jul 16, 2021 at 4:44 PM:

> Hi! I gave it a try. The connector goes through an nginx proxy, but
> running the job reported the following error.
>
> java.lang.ExceptionInInitializerError
>     at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:230)
>     at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:169)
>     at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:97)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1121)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:1019)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
>     at com.my.flink.streaming.core.execute.ExecuteSql.exeSql(ExecuteSql.java:51)
>     at com.my.flink.streaming.core.JobApplication.main(JobApplication.java:117)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
>     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:219)
>     at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
>     at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
>     at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
>     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>     at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
> Caused by: java.lang.IllegalArgumentException: URI scheme is not "file"
>     at java.io.File.<init>(File.java:421)
>     at org.apache.hadoop.hive.conf.HiveConf.findConfigFile(HiveConf.java:179)
>     at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:146)
>     ... 24 more
> On 2021-07-16 16:10:10, "Caizhi Weng" <tsreape...@gmail.com> wrote:
> >Hi!
> >
> >In theory this should work; feel free to try it. Note, however, that the
> >path given to -C must be accessible from every node. If it is a local path,
> >the connector jar must be present under that local path on every node.
> >
> >niko <jfq...@163.com> wrote on Fri, Jul 16, 2021 at 3:21 PM:
> >
> >> Can the command-line -C option be used to load a Flink SQL connector?
>
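On the -C point quoted above: -C adds URLs to the job's user classpath, and the same thing can be set programmatically through the pipeline.classpaths option. A rough sketch, assuming Flink 1.12+ and a hypothetical jar location that exists at the same path on every node:

    import java.util.Collections;

    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.configuration.PipelineOptions;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class ConnectorClasspathExample {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Counterpart of `flink run -C <url>`: each URL is added to the user
            // classpath and must be resolvable on every node, so a file:// path
            // has to exist at the same location on all machines.
            conf.set(PipelineOptions.CLASSPATHS, Collections.singletonList(
                    "file:///opt/connectors/flink-sql-connector-hive.jar")); // hypothetical
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment(conf);
            // ... register sources/sinks and execute the job as usual
        }
    }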
