[jira] [Created] (FLINK-23397) [DOCS] task_failure_recovery page returns 404

2021-07-15 Thread Dino Zhang (Jira)
Dino Zhang created FLINK-23397:
--

 Summary: [DOCS] task_failure_recovery page returns 404
 Key: FLINK-23397
 URL: https://issues.apache.org/jira/browse/FLINK-23397
 Project: Flink
  Issue Type: Bug
  Components: Documentation
Reporter: Dino Zhang
 Attachments: image-2021-07-15-17-32-39-423.png

[https://ci.apache.org/projects/flink/flink-docs-master/docs/ops/state/task_failure_recovery/]

 

!image-2021-07-15-17-32-39-423.png!

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (FLINK-20012) Hive 3.1 integration exception

2020-11-05 Thread Dino Zhang (Jira)
Dino Zhang created FLINK-20012:
--

 Summary: Hive 3.1 integration exception
 Key: FLINK-20012
 URL: https://issues.apache.org/jira/browse/FLINK-20012
 Project: Flink
  Issue Type: Bug
  Components: Connectors / Hive
Affects Versions: 1.11.2
Reporter: Dino Zhang


After adding the extra dependencies to the /lib directory, configuring the Hive conf in 
sql-client-defaults.yaml, and running ./sql-client.sh embedded, I get the following 
error:
{code:java}
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
	at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
	at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
	at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
	at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
	at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
	at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:209)
	at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:161)
	at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:378)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:626)
	at java.util.HashMap.forEach(HashMap.java:1289)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
	... 3 more
{code}
At the same time, I found that flink-1.11.2 bundles the guava-18 version, while Hive 
3.1 depends on the guava-27 version.
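The overload in the NoSuchMethodError, checkArgument(boolean, String, Object), only exists in newer Guava releases, so the error is consistent with an older Guava winning on the classpath. As a minimal sketch (GuavaProbe is a hypothetical diagnostic class, not part of Flink or Hive), one can probe via reflection which Guava is actually loaded and whether it has that overload:

{code:java}
// Hypothetical diagnostic: check which Guava Preconditions is on the
// classpath and whether it provides the exact overload from the error,
// checkArgument(boolean, String, Object).
public class GuavaProbe {
    static String probe() {
        try {
            Class<?> c = Class.forName("com.google.common.base.Preconditions");
            // Throws NoSuchMethodException on old Guava (e.g. guava-18).
            c.getMethod("checkArgument", boolean.class, String.class, Object.class);
            return "Guava with the overload, loaded from: "
                    + c.getProtectionDomain().getCodeSource().getLocation();
        } catch (ClassNotFoundException e) {
            return "Guava not on classpath";
        } catch (NoSuchMethodException e) {
            return "Guava without the overload (e.g. the bundled guava-18)";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
{code}

Running this with the same classpath as sql-client.sh would show which jar shadows the Guava 27 that Hive 3.1 expects.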

 

 


