Thank you very much, Marco! Really appreciate your support.

On Thu, Jan 22, 2015 at 10:57 PM, Marco Shaw <marco.s...@gmail.com> wrote:

> (Starting over...)
>
> The best place to look for the requirements is the individual
> documentation page of each technology.
>
> As an absolute minimum, I would suggest 50GB of disk space and at least
> 8GB of memory.
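>
> If it helps, here is a minimal sketch in Python for checking those
> minimums on the target host (my own illustration, not from any project
> docs; the sysconf calls assume a POSIX/Linux machine):
>
>     import shutil, os
>
>     MIN_DISK_GB = 50  # suggested minimum disk space, from above
>     MIN_MEM_GB = 8    # suggested minimum memory, from above
>
>     disk_free_gb = shutil.disk_usage("/").free / 1e9
>     # SC_PAGE_SIZE * SC_PHYS_PAGES = total physical RAM (POSIX only)
>     mem_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9
>
>     print("Free disk: %.1f GB (want >= %d)" % (disk_free_gb, MIN_DISK_GB))
>     print("Total RAM: %.1f GB (want >= %d)" % (mem_gb, MIN_MEM_GB))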
>
> "Architecting" a solution like you are looking for is very complex.  If
> you are just looking for a proof-of-concept consider a Docker image or
> going to Cloudera/Hortonworks/MapR and look for their "express VMs" which
> can usually run on Oracle Virtualbox or VMware.
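>
> Once such a proof-of-concept environment is up, a quick PySpark smoke
> test could look like the sketch below (assuming pyspark is already on
> the Python path inside the VM or container):
>
>     from pyspark import SparkContext
>
>     # Local mode with 2 worker threads; no cluster required.
>     sc = SparkContext("local[2]", "poc-smoke-test")
>     rdd = sc.parallelize(range(100))
>     print(rdd.sum())  # expect 4950
>     sc.stop()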
>
> Marco
>
>
> On Thu, Jan 22, 2015 at 7:36 AM, Sudipta Banerjee <
> asudipta.baner...@gmail.com> wrote:
>
>>
>>
>> Hi Apache-Spark team ,
>>
>> What are the system requirements for installing Hadoop and Apache Spark?
>> I have attached a screenshot of GParted.
>>
>>
>> Thanks and regards,
>> Sudipta
>>
>>
>>
>>
>> --
>> Sudipta Banerjee
>> Consultant, Business Analytics and Cloud Based Architecture
>> Call me +919019578099
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
>


-- 
Sudipta Banerjee
Consultant, Business Analytics and Cloud Based Architecture
Call me +919019578099
