If the cube has no data, the first thing to check is the Hive source table.

Kylin runs a "hive -e" shell command to generate the intermediate flat table
in the first step of the cube job. If your "hive" command can read the right
table, Kylin should be able to as well.
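For example, a quick sanity check from the machine where Kylin runs (the
table name and date filter below are taken from your earlier mail; adjust
them as needed) would be:

    hive -e "SELECT COUNT(*) FROM WLT_PARTNER.PARTNER_USR_DOC_BASIC_INFO_FT0_S
             WHERE PT_LOG_D >= '2016-03-01' AND PT_LOG_D < '2016-04-13';"

If this returns 0 (or fails) on the Kylin node, the problem is in the
Hive/HDFS setup rather than in Kylin itself.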

The intermediate Hive table is dropped at the end of the cube build, but you
can rebuild the cube and check the intermediate table content before the
build finishes. Or you can directly run the Hive SQL that creates the
intermediate table (it can be found in the "Parameters" of the first step).
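For instance, while a rebuild is still running (or after re-creating the
table with that SQL), a simple count should show whether any rows were
produced; here I'm assuming the intermediate table name from your job and
that it was created in the default Hive database:

    hive -e "SELECT COUNT(*) FROM kylin_intermediate_cube8_20160301000000_20160413000000;"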

With the intermediate table, you can check whether it has data. If it has no
data, check the filter conditions further; sometimes a date format mismatch
causes this.
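A simple way to spot such a mismatch, assuming PT_LOG_D is your partition
column, is to compare a few actual values against the format used in the
filter:

    hive -e "SELECT DISTINCT PT_LOG_D FROM WLT_PARTNER.PARTNER_USR_DOC_BASIC_INFO_FT0_S LIMIT 10;"

If the values come back as, say, '20160301' while the filter uses
'2016-03-01', the WHERE clause will match nothing.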

If the intermediate table has data while the cube has none, that is a real
problem, but I have hardly ever seen that happen.

2016-04-27 10:08 GMT+08:00 bitbean <bitb...@qq.com>:

> Not quite a match.
>
>
> My case: the local Hive and the remote Hive share the metadata database, so
> my local Hive can see the fact table on the remote HDFS.
>
>
> I use the fact table on the remote HDFS. My trouble now is that Kylin
> builds the cube in 5 minutes, but the resulting HTable has no data.
>
>
> I guess Kylin doesn't fetch data from the remote HDFS, and Kylin doesn't
> tell me that it can't fetch the data.
>
>
>
>
> So can you help me resolve this?
>
>
>
>
> ------------------ Original Message ------------------
> From: "ShaoFeng Shi" <shaofeng...@apache.org>
> Sent: Tuesday, April 26, 2016, 5:42 PM
> To: "dev" <dev@kylin.apache.org>
>
> Subject: Re: multi hadoop cluster
>
>
>
> Will this match your case?
> https://issues.apache.org/jira/browse/KYLIN-1172
>
> 2016-04-26 16:55 GMT+08:00 bitbean <bitb...@qq.com>:
>
> > Hi all,
> >
> >      I am encountering a problem with multiple Hadoop clusters.
> >
> >
> >      Kylin submits jobs to YARN on one HDFS, but my fact table is on
> > another HDFS. The two Hadoop clusters use the same MySQL to store the
> > metadata.
> >
> >
> > So when I build the cube, the first step creates the intermediate table
> > and inserts data from the fact table.
> >
> >
> >     But I can't access the fact table in Kylin's Hive.
> >
> >
> >      For example, the first step is as below:
> >
> >
> > "kylin_intermediate_cube8_20160301000000_20160413000000 SELECT
> > PARTNER_USR_DOC_BASIC_INFO_FT0_S.PHONE_PROVINCE_IND
> > FROM WLT_PARTNER.PARTNER_USR_DOC_BASIC_INFO_FT0_S as
> > PARTNER_USR_DOC_BASIC_INFO_FT0_S
> > WHERE (PARTNER_USR_DOC_BASIC_INFO_FT0_S.PT_LOG_D >= '2016-03-01' AND
> > PARTNER_USR_DOC_BASIC_INFO_FT0_S.PT_LOG_D < '2016-04-13')"
> >
> >
> >
> >      Table "PARTNER_USR_DOC_BASIC_INFO_FT0_S" is located at
> > "hdfs://hadoop2NameNode/wlt_partner/PARTNER_USR_DOC_BASIC_INFO_FT0_S",
> >
> >
> > but "kylin_intermediate_cube8_20160301000000_20160413000000"  locate
> > "hdfs://bihbasemaster/"
> >
> >
> > They are different clusters.
> >
> >
> >     The current situation is that there is no error in the Web UI at
> > step 1.
> >
> >
> >      When the cube build is done, there is nothing in the HTable, so what can I do?
>
>
>
>
> --
> Best regards,
>
> Shaofeng Shi
>



-- 
Best regards,

Shaofeng Shi
