I am using Hive 0.14.0 and did not configure "hive.metastore.uris"; maybe
you need to start the Hive metastore in Hive 1.2.1.
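For reference, pointing clients at a remote metastore is done with a hive-site.xml entry like the following (the host and port are placeholders; use your own metastore server):

```xml
<property>
  <name>hive.metastore.uris</name>
  <!-- placeholder host/port; point this at your running metastore service -->
  <value>thrift://metastore-host:9083</value>
  <description>Thrift URI of the remote Hive metastore</description>
</property>
```

The standalone metastore service is then started with `hive --service metastore`.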

2016-01-07 13:28 GMT+08:00 Xiaoyu Wang <[email protected]>:

> Hi,
> I think this exception may be caused by the "hive.metastore.uris" property
> not being set in hive-site.xml.
> Kylin uses HCatalog to read Hive tables. HCatalog uses the
> "hive.metastore.uris" property to create a HiveMetaStoreClient and fetch
> the table metadata.
> If it is not set, HCatalog falls back to a local metastore, so it throws a
> NoSuchObjectException.
>
> You can configure the "hive.metastore.uris" property in hive-site.xml and
> start the Hive metastore so that HCatalog can connect to it.
>
>
>
> On 2016-01-07 12:52, yu feng wrote:
>
>> you can check whether the table "default.kylin_intermediate_
>> learn_kylin_four_20150201000000_20151230000000_8d26cc4b_e012_4414_a89b_
>> c8d9323ae277" exists in your Hive, and whether any other hive-site.xml
>> exists on your classpath. It is strange, because you could load the Hive
>> table earlier (before creating and building the cube).
>>
>> 2016-01-07 11:47 GMT+08:00 和风 <[email protected]>:
>>
>>> my env: Hadoop 2.7.1, Kylin 1.2, Hive 1.2.1, HBase 0.98.
>>>
>>>
>>> my Hive config:
>>>
>>>
>>>     <property>
>>>      <name>javax.jdo.option.ConnectionURL</name>
>>>      <value>jdbc:mysql://10.24.248.196:3306/hive?characterEncoding=UTF-8
>>> </value>
>>>      <description>JDBC connect string for a JDBC metastore</description>
>>>    </property>
>>>
>>>     <property>
>>>      <name>javax.jdo.option.ConnectionDriverName</name>
>>>      <value>com.mysql.jdbc.Driver</value>
>>>      <description>Driver class name for a JDBC metastore</description>
>>>    </property>
>>>
>>>     <property>
>>>      <name>javax.jdo.option.ConnectionUserName</name>
>>>      <value>root</value>
>>>      <description>Username to use against metastore
>>> database</description>
>>>    </property>
>>>
>>>
>>>    <property>
>>>      <name>javax.jdo.option.ConnectionPassword</name>
>>>      <value>root</value>
>>>      <description>password to use against metastore
>>> database</description>
>>>    </property>
>>>
>>>          <property>
>>>            <name>datanucleus.transactionIsolation</name>
>>>            <value>repeatable-read</value>
>>>          </property>
>>>
>>>          <property>
>>>            <name>datanucleus.valuegeneration.transactionIsolation</name>
>>>            <value>repeatable-read</value>
>>>          </property>
>>>
>>>
>>>      <property>
>>>        <name>hive.aux.jars.path</name>
>>>
>>>
>>> <value>file:///usr/local/hive/lib/json-serde-1.3.6-jar-with-dependencies.jar,file:///usr/local/hive/lib/gson-2.2.4.jar,file:///usr/local/hive/lib/data-hive-udf.jar</value>
>>>        <description>The location of the plugin jars that contain
>>> implementations of user defined functions and serdes.</description>
>>>      </property>
>>>
>>>
>>>
>>>
>>>
>>> ------------------ Original Message ------------------
>>> From: "yu feng" <[email protected]>
>>> Date: Thursday, January 7, 2016, 11:25 AM
>>> To: "dev" <[email protected]>
>>>
>>> Subject: Re: java.io.IOException:
>>> NoSuchObjectException(message:default.kylin_intermediate_learn_kylin
>>>
>>>
>>>
>>> I have encountered this problem before; it is most likely because your
>>> current Hive metastore config is wrong. Could you share some more
>>> detailed information about your environment?
>>>
>>> 2016-01-07 10:23 GMT+08:00 和风 <[email protected]>:
>>>
>>> hi:
>>>>    I get an error when building the cube.
>>>> Logs:
>>>> java.io.IOException: NoSuchObjectException(message:default.kylin_intermediate_learn_kylin_four_20150201000000_20151230000000_8d26cc4b_e012_4414_a89b_c8d9323ae277 table not found)
>>>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>>>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>>>>         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:101)
>>>>         at org.apache.kylin.job.hadoop.cube.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:77)
>>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>>         at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
>>>>         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
>>>>         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
>>>>         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
>>>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>         at java.lang.Thread.run(Thread.java:745)
>>>> Caused by: NoSuchObjectException(message:default.kylin_intermediate_learn_kylin_four_20150201000000_20151230000000_8d26cc4b_e012_4414_a89b_c8d9323ae277 table not found)
>>>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1808)
>>>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1778)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>>>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>>>>         at com.sun.proxy.$Proxy47.get_table(Unknown Source)
>>>>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1208)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>>>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
>>>>         at com.sun.proxy.$Proxy48.getTable(Unknown Source)
>>>>         at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
>>>>         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
>>>>         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>>>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>>>>         ... 13 more
>>>>
>>>

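Whether the setting discussed in this thread is actually present in a given config can be checked by parsing hive-site.xml; below is a minimal sketch (the inline XML string stands in for a real config file, which is an assumption for illustration):

```python
import xml.etree.ElementTree as ET


def metastore_uris(hive_site_xml):
    """Return the value of hive.metastore.uris from hive-site.xml text,
    or None if the property is not set."""
    root = ET.fromstring(hive_site_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") == "hive.metastore.uris":
            return prop.findtext("value")
    return None


# A minimal config missing the property, like the one posted in the thread.
conf = (
    "<configuration><property>"
    "<name>javax.jdo.option.ConnectionURL</name>"
    "<value>jdbc:mysql://localhost/hive</value>"
    "</property></configuration>"
)
print(metastore_uris(conf))  # None -> HCatalog would fall back to a local metastore
```

If this returns None for the hive-site.xml that Kylin picks up, HCatalog will not be able to reach a remote metastore, matching the NoSuchObjectException reported above.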