Hi Samuel,
Can you give us some detailed logs, so we can dig into the root cause?
On Fri Feb 20 2015 at 2:44:32 AM Samuel Bock <[email protected]>
wrote:
> Hello all,
>
> We are in the process of evaluating Kylin for use as an OLAP engine. To
> that end, we are trying to get a minimum viable setup with a representative
> sample of our data in order to gather performance metrics. We have Kylin
> running against a 10-node cluster, the provided cubes build successfully,
> and the system seems functional. Attempting to build a simple cube against
> our data results in an OutOfMemoryError in the Kylin server process (so far
> we have given it up to a 46 GB heap). I was wondering if you could give me
> some guidance as to likely causes, and any configurations I'm likely to have
> missed, before I start diving into the source. I have changed the
> "dictionary" setting to false, as recommended for high-cardinality
> dimensions, but have not changed configuration significantly apart from
> that.
>
> For reference, the sizes of the Hive tables we're building the cubes from:
> dimension table: 25,399,061 rows
> fact table: 270,940,921 rows
>
> (And as a note, there are no pertinent log messages except to indicate that
> it is in the Build Dimension Dictionary step)
>
> Thank you,
> sam bock
>
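For readers following the thread: the "dictionary" setting Sam mentions lives in the cube descriptor's rowkey section. A hedged sketch of what that fragment can look like in early Kylin cube-descriptor JSON (the column name is illustrative, and the exact field names and accepted values may differ between Kylin versions, so treat this as approximate rather than a verified reference):

```json
{
  "rowkey": {
    "rowkey_columns": [
      {
        "column": "USER_ID",
        "dictionary": "false",
        "length": 20
      }
    ]
  }
}
```

The idea is that disabling dictionary encoding for a high-cardinality dimension avoids building an in-memory dictionary over tens of millions of distinct values (the likely OOM source in the Build Dimension Dictionary step); a fixed-length encoding (the `length` field above) is then typically used for that column instead.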