Re: hive benchmark

2015-08-11 Thread Noam Hasson
Sure,

Even a single node can support it; it's all a question of processing
time.
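To make that concrete, here is a rough back-of-envelope capacity check. The 500-byte average row size and the HDFS default replication factor of 3 are assumptions for illustration, not figures from this thread:

```python
# Rough storage estimate for 1 billion rows on a 6-node cluster.
records = 1_000_000_000
bytes_per_record = 500        # assumed average row size (text format, uncompressed)
replication = 3               # HDFS default replication factor
nodes = 6
disk_per_node_gb = 2000       # 2 TB per node, from the cluster description

raw_gb = records * bytes_per_record / 1e9   # data size before replication
on_disk_gb = raw_gb * replication           # actual HDFS footprint
capacity_gb = nodes * disk_per_node_gb      # total cluster disk

print(f"~{raw_gb:.0f} GB raw, ~{on_disk_gb:.0f} GB on disk, "
      f"{capacity_gb} GB total capacity")
```

Under those assumptions the data occupies roughly 1.5 TB of a 12 TB cluster, so storage is not the bottleneck; a columnar format such as ORC with compression would shrink the footprint further, and query/insert time becomes the real constraint.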


On Tue, Aug 11, 2015 at 9:31 AM, siva kumar  wrote:

> Hi Folks,
>   I need to insert 1 billion records into Hive, and
> here are my cluster details.
>
> 1. A 6-node Hadoop cluster.
> 2. 16GB RAM on each node.
> 3. 2TB Hard-disk on each node.
>
> Is this configuration suitable for storing 1 billion records? If not, what
> else do we need in order to store and read 1 billion records?
>
> Thanks and regards,
> siva
>
>



hive benchmark

2015-08-10 Thread siva kumar
Hi Folks,
  I need to insert 1 billion records into Hive, and here
are my cluster details.

1. A 6-node Hadoop cluster.
2. 16GB RAM on each node.
3. 2TB Hard-disk on each node.

Is this configuration suitable for storing 1 billion records? If not, what
else do we need in order to store and read 1 billion records?

Thanks and regards,
siva