At noon I restored all the default settings in the kylin.properties file,
and the cube then built under MR mode without any problems. This is very
strange; I think the default settings use the same Hadoop configuration,
so what difference does copying hive-site.xml into the custom Hadoop conf
directory actually make? I also dug into the Hive source code and
confirmed that the NoSuchMethodError is real: that constructor was only
introduced in the Hive 2.1.x line. Then I tried upgrading the Hive jars
under the Spark jars folder, but it still does not work. So what can I do
now?
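
A quick way to double-check the classpath theory on any node is a
reflection probe like the one below (my own throwaway sketch, not Kylin
code; the class name is arbitrary). Run it with the same classpath the job
uses: it prints which jar HiveMetaStoreClient was actually loaded from and
whether the exact constructor named in the stack trace is visible.

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaHookLoader;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;

public class MetaStoreClientProbe {
    public static void main(String[] args) throws Exception {
        Class<?> c = HiveMetaStoreClient.class;
        // Which jar the class was actually loaded from.
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        try {
            // The exact signature from the NoSuchMethodError:
            // <init>(HiveConf, HiveMetaHookLoader, Boolean)
            c.getConstructor(HiveConf.class, HiveMetaHookLoader.class, Boolean.class);
            System.out.println("constructor present - Hive jars look consistent");
        } catch (NoSuchMethodException e) {
            System.out.println("constructor missing - another Hive version shadows it");
        }
    }
}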



PS:

1. I followed the Kylin 2.0 cube-on-Spark tutorial:
https://kylin.apache.org/docs20/tutorial/cube_spark.html
2. I use Beeline instead of the Hive CLI to connect to Hive.
3. The hive-site.xml file is attached below for reference.
4. The Kylin source code that throws the error:

public void configureJob(Job job) {
    try {
        job.getConfiguration().addResource("hive-site.xml");

        HCatInputFormat.setInput(job, dbName, tableName);
        job.setInputFormatClass(HCatInputFormat.class);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
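
Note that Configuration.addResource("hive-site.xml") resolves the resource
through the classpath, so which copy of hive-site.xml wins depends on
classloader order. The throwaway sketch below (my own code, not Kylin's;
the class name is arbitrary) lists every hive-site.xml the classloader can
see, in lookup order:

import java.net.URL;
import java.util.Enumeration;
import org.apache.hadoop.conf.Configuration;

public class HiveSiteLocator {
    public static void main(String[] args) throws Exception {
        // Configuration.addResource(String) loads the named resource from
        // the classpath; print every hive-site.xml visible to the loader.
        Enumeration<URL> urls =
                Configuration.class.getClassLoader().getResources("hive-site.xml");
        while (urls.hasMoreElements()) {
            System.out.println(urls.nextElement());
        }
    }
}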



Thanks.


2018-03-27 9:47 GMT+08:00 凡梦星尘 <elkan1...@gmail.com>:

> Hi Billy:
>
> Thanks for your suggestion, I will try it. I hope to report back with good news.
>
> 2018-03-27 9:11 GMT+08:00 Billy Liu <billy...@apache.org>:
>
>> For the first question, the "NoSuchMethodError" suggests a classpath
>> conflict somewhere; try googling "HiveMetaStoreClient NoSuchMethodError".
>>
>> For the second question, to enable the dashboard, please follow the
>> http://kylin.apache.org/docs23/howto/howto_setup_systemcube.html
>> first.
>>
>> With Warm regards
>>
>> Billy Liu
>>
>>
>> 2018-03-26 17:59 GMT+08:00 凡梦星尘 <elkan1...@gmail.com>:
>> > Hi guys:
>> >
>> > Congratulations on the Kylin 2.3.0 release, with so many new features.
>> >
>> > Last weekend I tried to upgrade our test environment to this newest
>> > version. Everything went well and the sample cube built fine under MR
>> > mode, but when I switched to Spark mode the build failed. After fixing
>> > some problems I ran into a hard one that I could not resolve. The error
>> > log is below:
>> >
>> > java.lang.RuntimeException: java.io.IOException:
>> > com.google.common.util.concurrent.UncheckedExecutionException:
>> > java.lang.RuntimeException: Unable to instantiate
>> > org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:116)
>> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:121)
>> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:99)
>> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:130)
>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
>> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:67)
>> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
>> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:300)
>> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>> >         at java.lang.Thread.run(Thread.java:748)
>> > Caused by: java.io.IOException:
>> > com.google.common.util.concurrent.UncheckedExecutionException:
>> > java.lang.RuntimeException: Unable to instantiate
>> > org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:113)
>> >         ... 10 more
>> > Caused by: com.google.common.util.concurrent.UncheckedExecutionException:
>> > java.lang.RuntimeException: Unable to instantiate
>> > org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>> >         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256)
>> >         at com.google.common.cache.LocalCache.get(LocalCache.java:3985)
>> >         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4788)
>> >         at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:292)
>> >         at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:267)
>> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
>> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
>> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>> >         ... 12 more
>> > Caused by: java.lang.RuntimeException: Unable to instantiate
>> > org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>> >         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1566)
>> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:92)
>> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:138)
>> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:124)
>> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:297)
>> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:292)
>> >         at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4791)
>> >         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584)
>> >         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372)
>> >         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335)
>> >         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250)
>> >         ... 20 more
>> > Caused by: java.lang.reflect.InvocationTargetException
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> >         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1564)
>> >         ... 30 more
>> > Caused by: java.lang.NoSuchMethodError:
>> > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(Lorg/apache/hadoop/hive/conf/HiveConf;Lorg/apache/hadoop/hive/metastore/HiveMetaHookLoader;Ljava/lang/Boolean;)V
>> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:409)
>> >         ... 35 more
>> >
>> > result code:2
>> >
>> >
>> > At first I researched similar issues on the forum, but none of them
>> > matched mine. I also tried the fixes suggested there, but they did not
>> > work. Could someone give me some more useful suggestions?
>> >
>> > Thanks.
>> >
>> >
>> > PS:
>> >
>> > 1. Hadoop environment screenshot:
>> >
>> > http://77l54p.com1.z0.glb.clouddn.com/hdp_stack_versions_2.6.4.png-alias
>> >
>> > 2. Another error, from the dashboard page; details in the log below:
>> >
>> > 2018-03-26 17:44:32,580 INFO  [Query 89e44f1b-5636-4c4b-98cb-423237effe81-49] service.QueryService:428 : Using project: KYLIN_SYSTEM
>> > 2018-03-26 17:44:32,585 INFO  [Query 89e44f1b-5636-4c4b-98cb-423237effe81-49] service.QueryService:429 : The original query:  select count(*),sum(QUERY_TIME_COST)/(count(QUERY_TIME_COST)),max(QUERY_TIME_COST),min(QUERY_TIME_COST) from KYLIN.HIVE_METRICS_QUERY_QA where KDAY_DATE >= '2018-03-19' and KDAY_DATE <= '2018-03-25' and PROJECT ='LEARN_KYLIN' and EXCEPTION = 'NULL'
>> > 2018-03-26 17:44:32,583 ERROR [http-bio-7070-exec-5] controller.BasicController:61 :
>> > java.lang.NullPointerException
>> >         at org.apache.kylin.rest.util.QueryRequestLimits.<init>(QueryRequestLimits.java:107)
>> >         at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:454)
>> >         at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:390)
>> >         at org.apache.kylin.rest.controller.DashboardController.getChartData(DashboardController.java:111)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >         at java.lang.reflect.Method.invoke(Method.java:498)
>> >         at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
>> >         at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
>> >         at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
>> >         at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)
>> >         at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)
>> >         at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
>> >         at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
>> >         at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
>> >         at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
>> >         at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
>> >         at javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
>> >         at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
>> >         at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
>> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
>> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> >         at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
>> >         at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
>> >         at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:158)
>> >         at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
>> >         at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
>> >         at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
>> >         at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
>> >         at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
>> >         at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
>> >         at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
>> >         at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
>> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> >         at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
>> >         at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
>> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
>> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
>> >         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
>> >         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
>> >         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
>> >         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
>> >         at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:962)
>> >         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
>> >         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:445)
>> >         at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1115)
>> >         at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:637)
>> >         at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
>> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>> >         at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>> >         at java.lang.Thread.run(Thread.java:748)
>>
>
>
  <configuration>
    
    <property>
      <name>ambari.hive.db.schema.name</name>
      <value>hive</value>
    </property>
    
    <property>
      <name>atlas.hook.hive.maxThreads</name>
      <value>1</value>
    </property>
    
    <property>
      <name>atlas.hook.hive.minThreads</name>
      <value>1</value>
    </property>
    
    <property>
      <name>datanucleus.autoCreateSchema</name>
      <value>false</value>
    </property>
    
    <property>
      <name>datanucleus.cache.level2.type</name>
      <value>none</value>
    </property>
    
    <property>
      <name>datanucleus.fixedDatastore</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.auto.convert.join</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.auto.convert.join.noconditionaltask</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.auto.convert.join.noconditionaltask.size</name>
      <value>572662306</value>
    </property>
    
    <property>
      <name>hive.auto.convert.sortmerge.join</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.auto.convert.sortmerge.join.to.mapjoin</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.cbo.enable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.cli.print.header</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.cluster.delegation.token.store.class</name>
      <value>org.apache.hadoop.hive.thrift.ZooKeeperTokenStore</value>
    </property>
    
    <property>
      <name>hive.cluster.delegation.token.store.zookeeper.connectString</name>
      <value>xxx:2181,xxx:2181,xxx:2181</value>
    </property>
    
    <property>
      <name>hive.cluster.delegation.token.store.zookeeper.znode</name>
      <value>/hive/cluster/delegation</value>
    </property>
    
    <property>
      <name>hive.compactor.abortedtxn.threshold</name>
      <value>1000</value>
    </property>
    
    <property>
      <name>hive.compactor.check.interval</name>
      <value>300L</value>
    </property>
    
    <property>
      <name>hive.compactor.delta.num.threshold</name>
      <value>10</value>
    </property>
    
    <property>
      <name>hive.compactor.delta.pct.threshold</name>
      <value>0.1f</value>
    </property>
    
    <property>
      <name>hive.compactor.initiator.on</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.compactor.worker.threads</name>
      <value>1</value>
    </property>
    
    <property>
      <name>hive.compactor.worker.timeout</name>
      <value>86400L</value>
    </property>
    
    <property>
      <name>hive.compute.query.using.stats</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.conf.restricted.list</name>
      <value>hive.security.authenticator.manager,hive.security.authorization.manager,hive.users.in.admin.role</value>
    </property>
    
    <property>
      <name>hive.convert.join.bucket.mapjoin.tez</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.default.fileformat</name>
      <value>TextFile</value>
    </property>
    
    <property>
      <name>hive.default.fileformat.managed</name>
      <value>TextFile</value>
    </property>
    
    <property>
      <name>hive.enforce.bucketing</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.enforce.sorting</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.enforce.sortmergebucketmapjoin</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.exec.compress.intermediate</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.exec.compress.output</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.exec.dynamic.partition</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.exec.dynamic.partition.mode</name>
      <value>nonstrict</value>
    </property>
    
    <property>
      <name>hive.exec.failure.hooks</name>
      <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
    </property>
    
    <property>
      <name>hive.exec.max.created.files</name>
      <value>100000</value>
    </property>
    
    <property>
      <name>hive.exec.max.dynamic.partitions</name>
      <value>5000</value>
    </property>
    
    <property>
      <name>hive.exec.max.dynamic.partitions.pernode</name>
      <value>2000</value>
    </property>
    
    <property>
      <name>hive.exec.orc.compression.strategy</name>
      <value>SPEED</value>
    </property>
    
    <property>
      <name>hive.exec.orc.default.compress</name>
      <value>ZLIB</value>
    </property>
    
    <property>
      <name>hive.exec.orc.default.stripe.size</name>
      <value>67108864</value>
    </property>
    
    <property>
      <name>hive.exec.orc.encoding.strategy</name>
      <value>SPEED</value>
    </property>
    
    <property>
      <name>hive.exec.parallel</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.exec.parallel.thread.number</name>
      <value>8</value>
    </property>
    
    <property>
      <name>hive.exec.post.hooks</name>
      <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
    </property>
    
    <property>
      <name>hive.exec.pre.hooks</name>
      <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
    </property>
    
    <property>
      <name>hive.exec.reducers.bytes.per.reducer</name>
      <value>67108864</value>
    </property>
    
    <property>
      <name>hive.exec.reducers.max</name>
      <value>1009</value>
    </property>
    
    <property>
      <name>hive.exec.scratchdir</name>
      <value>/tmp/hive</value>
    </property>
    
    <property>
      <name>hive.exec.submit.local.task.via.child</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.exec.submitviachild</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.execution.engine</name>
      <value>mr</value>
    </property>
    
    <property>
      <name>hive.fetch.task.aggr</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.fetch.task.conversion</name>
      <value>more</value>
    </property>
    
    <property>
      <name>hive.fetch.task.conversion.threshold</name>
      <value>1073741824</value>
    </property>
    
    <property>
      <name>hive.limit.optimize.enable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.limit.pushdown.memory.usage</name>
      <value>0.04</value>
    </property>
    
    <property>
      <name>hive.map.aggr</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.map.aggr.hash.force.flush.memory.threshold</name>
      <value>0.9</value>
    </property>
    
    <property>
      <name>hive.map.aggr.hash.min.reduction</name>
      <value>0.5</value>
    </property>
    
    <property>
      <name>hive.map.aggr.hash.percentmemory</name>
      <value>0.5</value>
    </property>
    
    <property>
      <name>hive.mapjoin.bucket.cache.size</name>
      <value>10000</value>
    </property>
    
    <property>
      <name>hive.mapjoin.optimized.hashtable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.mapred.reduce.tasks.speculative.execution</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.merge.mapfiles</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.merge.mapredfiles</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.merge.orcfile.stripe.level</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.merge.rcfile.block.level</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.merge.size.per.task</name>
      <value>256000000</value>
    </property>
    
    <property>
      <name>hive.merge.smallfiles.avgsize</name>
      <value>16000000</value>
    </property>
    
    <property>
      <name>hive.merge.tezfiles</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.metastore.authorization.storage.checks</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.metastore.cache.pinobjtypes</name>
      <value>Table,Database,Type,FieldSchema,Order</value>
    </property>
    
    <property>
      <name>hive.metastore.client.connect.retry.delay</name>
      <value>5s</value>
    </property>
    
    <property>
      <name>hive.metastore.client.socket.timeout</name>
      <value>1800s</value>
    </property>
    
    <property>
      <name>hive.metastore.connect.retries</name>
      <value>24</value>
    </property>
    
    <property>
      <name>hive.metastore.execute.setugi</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.metastore.failure.retries</name>
      <value>24</value>
    </property>
    
    <property>
      <name>hive.metastore.kerberos.keytab.file</name>
      <value>/etc/security/keytabs/hive.service.keytab</value>
    </property>
    
    <property>
      <name>hive.metastore.kerberos.principal</name>
      <value>hive/_h...@example.com</value>
    </property>
    
    <property>
      <name>hive.metastore.pre.event.listeners</name>
      <value>org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener</value>
    </property>
    
    <property>
      <name>hive.metastore.sasl.enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.metastore.server.max.threads</name>
      <value>100000</value>
    </property>
    
    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://xxx:9083</value>
    </property>
    
    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/apps/hive/warehouse</value>
    </property>
    
    <property>
      <name>hive.optimize.bucketmapjoin</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.bucketmapjoin.sortedmerge</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.optimize.constant.propagation</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.index.filter</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.metadataonly</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.null.scan</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.reducededuplication</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.optimize.reducededuplication.min.reducer</name>
      <value>4</value>
    </property>
    
    <property>
      <name>hive.optimize.sort.dynamic.partition</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.orc.compute.splits.num.threads</name>
      <value>10</value>
    </property>
    
    <property>
      <name>hive.orc.splits.include.file.footer</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.prewarm.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.prewarm.numcontainers</name>
      <value>3</value>
    </property>
    
    <property>
      <name>hive.security.authenticator.manager</name>
      <value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>
    </property>
    
    <property>
      <name>hive.security.authorization.enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.security.authorization.manager</name>
      <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory</value>
    </property>
    
    <property>
      <name>hive.security.metastore.authenticator.manager</name>
      <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
    </property>
    
    <property>
      <name>hive.security.metastore.authorization.auth.reads</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.security.metastore.authorization.manager</name>
      <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
    </property>
    
    <property>
      <name>hive.server2.allow.user.substitution</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.server2.authentication</name>
      <value>NONE</value>
    </property>
    
    <property>
      <name>hive.server2.authentication.spnego.keytab</name>
      <value>HTTP/_h...@example.com</value>
    </property>
    
    <property>
      <name>hive.server2.authentication.spnego.principal</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>
    
    <property>
      <name>hive.server2.enable.doAs</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.server2.logging.operation.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.server2.logging.operation.log.location</name>
      <value>/tmp/hive/operation_logs</value>
    </property>
    
    <property>
      <name>hive.server2.max.start.attempts</name>
      <value>5</value>
    </property>
    
    <property>
      <name>hive.server2.support.dynamic.service.discovery</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.server2.table.type.mapping</name>
      <value>CLASSIC</value>
    </property>
    
    <property>
      <name>hive.server2.tez.default.queues</name>
      <value>default</value>
    </property>
    
    <property>
      <name>hive.server2.tez.initialize.default.sessions</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.server2.tez.sessions.per.default.queue</name>
      <value>1</value>
    </property>
    
    <property>
      <name>hive.server2.thrift.http.path</name>
      <value>cliservice</value>
    </property>
    
    <property>
      <name>hive.server2.thrift.http.port</name>
      <value>10001</value>
    </property>
    
    <property>
      <name>hive.server2.thrift.max.worker.threads</name>
      <value>500</value>
    </property>
    
    <property>
      <name>hive.server2.thrift.port</name>
      <value>10000</value>
    </property>
    
    <property>
      <name>hive.server2.thrift.sasl.qop</name>
      <value>auth</value>
    </property>
    
    <property>
      <name>hive.server2.transport.mode</name>
      <value>binary</value>
    </property>
    
    <property>
      <name>hive.server2.use.SSL</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.server2.zookeeper.namespace</name>
      <value>hiveserver2</value>
    </property>
    
    <property>
      <name>hive.smbjoin.cache.rows</name>
      <value>10000</value>
    </property>
    
    <property>
      <name>hive.start.cleanup.scratchdir</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.stats.autogather</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.stats.dbclass</name>
      <value>fs</value>
    </property>
    
    <property>
      <name>hive.stats.fetch.column.stats</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.stats.fetch.partition.stats</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.support.concurrency</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.tez.auto.reducer.parallelism</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.tez.container.size</name>
      <value>2048</value>
    </property>
    
    <property>
      <name>hive.tez.cpu.vcores</name>
      <value>-1</value>
    </property>
    
    <property>
      <name>hive.tez.dynamic.partition.pruning</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.tez.dynamic.partition.pruning.max.data.size</name>
      <value>104857600</value>
    </property>
    
    <property>
      <name>hive.tez.dynamic.partition.pruning.max.event.size</name>
      <value>1048576</value>
    </property>
    
    <property>
      <name>hive.tez.input.format</name>
      <value>org.apache.hadoop.hive.ql.io.HiveInputFormat</value>
    </property>
    
    <property>
      <name>hive.tez.java.opts</name>
      <value>-server -Djava.net.preferIPv4Stack=true -XX:NewRatio=8 -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+PrintGCDetails -verbose:gc -XX:+PrintGCTimeStamps</value>
    </property>
    
    <property>
      <name>hive.tez.log.level</name>
      <value>INFO</value>
    </property>
    
    <property>
      <name>hive.tez.max.partition.factor</name>
      <value>2.0</value>
    </property>
    
    <property>
      <name>hive.tez.min.partition.factor</name>
      <value>0.25</value>
    </property>
    
    <property>
      <name>hive.tez.smb.number.waves</name>
      <value>0.5</value>
    </property>
    
    <property>
      <name>hive.txn.manager</name>
      <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
    </property>
    
    <property>
      <name>hive.txn.max.open.batch</name>
      <value>1000</value>
    </property>
    
    <property>
      <name>hive.txn.timeout</name>
      <value>300</value>
    </property>
    
    <property>
      <name>hive.user.install.directory</name>
      <value>/user/</value>
    </property>
    
    <property>
      <name>hive.vectorized.execution.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.vectorized.execution.reduce.enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>hive.vectorized.groupby.checkinterval</name>
      <value>4096</value>
    </property>
    
    <property>
      <name>hive.vectorized.groupby.flush.percent</name>
      <value>0.1</value>
    </property>
    
    <property>
      <name>hive.vectorized.groupby.maxentries</name>
      <value>100000</value>
    </property>
    
    <property>
      <name>hive.warehouse.subdir.inherit.perms</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hive.zookeeper.client.port</name>
      <value>2181</value>
    </property>
    
    <property>
      <name>hive.zookeeper.namespace</name>
      <value>hive_zookeeper_namespace</value>
    </property>
    
    <property>
      <name>hive.zookeeper.quorum</name>
      <value>xxx:2181,xxx:2181,xxx:2181</value>
    </property>
    
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    
    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://xxx/hive</value>
    </property>
    
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>

    <property>
      <name>hive.metastore.metrics.enabled</name>
      <value>true</value>
    </property>

    <property>
      <name>hive.service.metrics.hadoop2.component</name>
      <value>hivemetastore</value>
    </property>

    <property>
      <name>hive.service.metrics.reporter</name>
      <value>HADOOP2</value>
    </property>
    
  </configuration>
