Hey Marco, if you’re primarily interested in trying Spark, you can also just 
get a binary build from Apache: http://spark.apache.org/downloads.html. You 
only need Java on your machine to run it. To see it work with the rest of the 
Hadoop ecosystem components it is probably better to use a VM.

Matei
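
The suggestion above can be sketched as a few shell commands. The version and
archive URL below are illustrative (check the downloads page for the actual
release); only a local JVM is needed to start spark-shell:

```shell
#!/bin/sh
# Sketch: fetch and run a prebuilt Spark binary locally.
# SPARK_VERSION and the URL are assumptions -- pick the real release
# from http://spark.apache.org/downloads.html.
SPARK_VERSION="0.9.1"
PKG="spark-${SPARK_VERSION}-bin-hadoop2"
URL="http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${PKG}.tgz"

echo "Would fetch: ${URL}"
# Uncomment to actually download, unpack, and launch the shell:
# wget "${URL}" && tar xzf "${PKG}.tgz"
# cd "${PKG}" && ./bin/spark-shell    # needs only Java on the machine
```

Running in local mode this way sidesteps the vendor VMs entirely, which is
handy when you only want to try the Spark API rather than the full Hadoop
ecosystem.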

On May 15, 2014, at 1:54 PM, Stephen Boesch <java...@gmail.com> wrote:

> 
> Hi Marco,
> Hive itself is not working in the CDH5.0 VM (due to FileNotFoundExceptions 
> on the third-party jars).  While you did not mention using Shark, you may 
> want to keep that in mind. I will try out Spark-only commands later today 
> and report what I find. 
> 
> 
> 2014-05-14 5:00 GMT-07:00 Marco Shaw <marco.s...@gmail.com>:
> Hi,
>  
> I've wanted to play with Spark.  I wanted to fast track things and just use 
> one of the vendor's "express VMs".  I've tried Cloudera CDH 5.0 and 
> Hortonworks HDP 2.1.
>  
> I've not written down all of my issues, but for certain, when I try to run 
> spark-shell it doesn't work.  Cloudera seems to crash, and both complain when 
> I try to use "SparkContext" in a simple Scala command.
>  
> So, just a basic question on whether anyone has had success getting these 
> express VMs to work properly with Spark *out of the box* (HDP does require 
> you to install Spark manually).
>  
> I know Cloudera recommends 8GB of RAM, but I've been running it with 4GB.
>  
> Could it be that 4GB is just not enough and is causing these issues, or have 
> others had success using these Hadoop 2.x pre-built VMs with Spark 0.9.x?
>  
> Marco
> 
