You don't need HDFS or virtual machines to run Spark. You can just download it, 
unzip it and run it on your laptop. See 
http://spark.apache.org/docs/latest/index.html.
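
For example, with the prebuilt package unpacked and ./bin/spark-shell started, a 
first local session might look like the sketch below (the events.log path is just 
a placeholder, not a file from this thread):

    // spark-shell already provides a SparkContext named sc.
    // Read a plain-text log file from the local filesystem; no HDFS involved.
    val events = sc.textFile("file:///home/me/events.log")

    events.count()                   // total number of log lines
    events.take(5).foreach(println)  // peek at the first few events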

Matei


> On Feb 6, 2015, at 2:58 PM, David Fallside <falls...@us.ibm.com> wrote:
> 
> King, consider trying the Spark Kernel (https://github.com/ibm-et/spark-kernel), 
> which will install Spark etc. and provide you with a Spark/Scala Notebook in 
> which you can develop your algorithm. The Vagrant installation described in 
> https://github.com/ibm-et/spark-kernel/wiki/Vagrant-Development-Environment 
> will have you quickly up and running on a single machine without having to 
> manage the details of the system installations. There is also a Docker 
> version, 
> https://github.com/ibm-et/spark-kernel/wiki/Using-the-Docker-Container-for-the-Spark-Kernel, 
> if you prefer Docker.
> Regards,
> David
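
(As a rough illustration of what a cell in such a Spark/Scala notebook might 
contain, assuming the kernel, like spark-shell, exposes a ready-made 
SparkContext as sc, and with a hypothetical events.log path and event marker:)

    // Count only the door-open events in the log file.
    val opened = sc.textFile("file:///home/me/events.log")
      .filter(_.contains("DOOR_OPEN"))
      .count()

    println(s"door-open events: $opened")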
> 
> 
> King sami <kgsam...@gmail.com> wrote on 02/06/2015 08:09:39 AM:
> 
> > From: King sami <kgsam...@gmail.com>
> > To: user@spark.apache.org
> > Date: 02/06/2015 08:11 AM
> > Subject: Beginner in Spark
> > 
> > Hi,
> > 
> > I'm new to Spark, and I'd like to install Spark with Scala. The aim is 
> > to build a data processing system for door events. 
> > 
> > The first step is to install Spark, Scala, HDFS, and the other required tools.
> > The second is to build the algorithm program in Scala that can process 
> > a file of my data logs (events).
> > 
> > Could you please help me install the required tools (Spark, 
> > Scala, HDFS) and tell me how I can execute my program on the input 
> > file.
> > 
> > Best regards,
> 
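
For the second step King describes (a standalone Scala program that processes the 
event log), a minimal sketch might look like the following; the class name, log 
path, and line format are assumptions, not details from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    object DoorEvents {
      def main(args: Array[String]): Unit = {
        // Run locally on all cores; no cluster or HDFS required.
        val conf = new SparkConf().setAppName("DoorEvents").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // args(0): path to the event log, e.g. file:///home/me/events.log
        // Assume each line starts with a door id, e.g. "door42 OPEN 2015-02-06T14:58".
        val perDoor = sc.textFile(args(0))
          .map(line => (line.split(" ")(0), 1))
          .reduceByKey(_ + _)

        perDoor.collect().foreach { case (door, n) => println(s"$door: $n events") }
        sc.stop()
      }
    }

Packaged into a jar with sbt, such a program can be launched with spark-submit, 
passing the log file path as its first argument.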
