My environment: Hadoop 2.7.1, Kylin 1.2, Hive 1.2.1, HBase 0.98.

My Hive config:


  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://10.24.248.196:3306/hive?characterEncoding=UTF-8</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>Username to use against metastore database</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
    <description>Password to use against metastore database</description>
  </property>

  <property>
    <name>datanucleus.transactionIsolation</name>
    <value>repeatable-read</value>
  </property>

  <property>
    <name>datanucleus.valuegeneration.transactionIsolation</name>
    <value>repeatable-read</value>
  </property>

  <property>
    <name>hive.aux.jars.path</name>
    <value>file:///usr/local/hive/lib/json-serde-1.3.6-jar-with-dependencies.jar,file:///usr/local/hive/lib/gson-2.2.4.jar,file:///usr/local/hive/lib/data-hive-udf.jar</value>
    <description>The location of the plugin jars that contain implementations of user defined functions and serdes.</description>
  </property>
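
One thing I notice: the config above has no hive.metastore.uris, so Hive clients fall back to an embedded metastore that talks to MySQL directly through the javax.jdo settings. If the MapReduce job that Kylin launches does not carry the same hive-site.xml (and the MySQL driver) on its classpath, HCatalog ends up looking in a different, empty metastore and reports the intermediate table as missing. As a rough sketch only: assuming a standalone metastore service is running (the host and the default port 9083 below are placeholders, not taken from your setup), the property would look like this, and the service itself can be started with "hive --service metastore":

  <property>
    <name>hive.metastore.uris</name>
    <!-- placeholder host/port: point this at the machine actually running the metastore service -->
    <value>thrift://10.24.248.196:9083</value>
    <description>Thrift URI of the remote Hive metastore</description>
  </property>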





------------------ Original Message ------------------
From: "yu feng" <olaptes...@gmail.com>
Date: January 7, 2016 (Thursday), 11:25
To: "dev" <dev@kylin.apache.org>
Subject: Re: java.io.IOException: NoSuchObjectException(message:default.kylin_intermediate_learn_kylin



I have encountered this problem; it is most likely because your current
Hive metastore config is wrong. Could you give some more detailed
information about your environment?
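
A quick way to check is to query the metastore from the same machine and hive-site.xml that Kylin uses; for example (the pattern below just matches the intermediate table name from your log):

  hive -e "SHOW TABLES IN default LIKE 'kylin_intermediate*';"

If the table shows up in the hive CLI but the MapReduce step still fails, the job is most likely reading a different Hive configuration than the CLI.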

2016-01-07 10:23 GMT+08:00 <363938...@qq.com>:

> hi:
>   I get an error when building a cube.
> logs:
> java.io.IOException:
> NoSuchObjectException(message:default.kylin_intermediate_learn_kylin_four_20150201000000_20151230000000_8d26cc4b_e012_4414_a89b_c8d9323ae277
> table not found)
>         at
> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>         at
> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>         at
> org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:101)
>         at
> org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:77)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at
> org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
>         at
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>         at
> org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>         at
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>         at
> org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by:
> NoSuchObjectException(message:default.kylin_intermediate_learn_kylin_four_20150201000000_20151230000000_8d26cc4b_e012_4414_a89b_c8d9323ae277
> table not found)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1808)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1778)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>         at com.sun.proxy.$Proxy47.get_table(Unknown Source)
>         at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1208)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
>         at com.sun.proxy.$Proxy48.getTable(Unknown Source)
>         at
> org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
>         at
> org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
>         at
> org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>         at
> org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>         ... 13 more
