Hi,
  Thanks for the quick response. Is there a simple way to write and deploy
apps on Spark? Here is what I have so far:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object HelloWorld {
  def main(args: Array[String]) {
    println("Hello, world!")
    // "localhost" is not a valid master URL; use "local" to run on one
    // machine, or "spark://host:port" to run on a standalone cluster
    val sc = new SparkContext("local", "wordcount", args(0), Seq(args(1)))
    val file = sc.textFile(args(2))
    // split each line into words, pair each word with a count of 1,
    // then sum the counts per word
    file.flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .saveAsTextFile(args(3))
  }
}
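
(The four arguments I pass are the Spark home directory, the path to my
application jar, an input file, and an output directory.)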

I set up the build path and added the Spark jars to the project, and the
errors went away (though Eclipse was not offering the RDD methods in
autocomplete; typing "file." suggested generic options like arr, asof,
etc. instead of map, etc.).
But now that I have built a jar, how do I run it against Spark? In Hadoop,
all I need to do is create the jar and run: hadoop jar jarname classname
args...
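
For reference, I exported the jar from Eclipse, but I assume an sbt build
along these lines would be equivalent (the version strings are my guesses
for the 0.9.0-incubating release):

    // rough sketch of a build.sbt; versions below are assumptions
    name := "hello-world"

    version := "0.1"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"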

Also, what if I have to include third-party libraries? In Hadoop I used to
implement ToolRunner and use the -libjars option. Any suggestions?
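
From the API docs it looks like the jars parameter of the SparkContext
constructor (or sc.addJar) might play the role of -libjars by shipping
jars to the workers. A sketch of what I mean, with placeholder paths:

    // sketch only: the master URL and all paths below are placeholders
    val sc = new SparkContext(
      "spark://master:7077",                       // cluster master (placeholder)
      "wordcount",
      "/opt/spark",                                // Spark home on the workers (placeholder)
      Seq("helloworld.jar", "lib/thirdparty.jar")) // app jar plus extra libraries
    // or add jars after the context is created:
    sc.addJar("lib/another-dependency.jar")

Is that the intended approach?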


On Mon, Mar 17, 2014 at 9:57 AM, Matei Zaharia <matei.zaha...@gmail.com> wrote:

> Look at the "running the examples" section of
> http://spark.incubator.apache.org/docs/latest/index.html, there's a
> script to do it.
>
> On Mar 17, 2014, at 9:55 AM, Chengi Liu <chengi.liu...@gmail.com> wrote:
>
> > Hi,
> >   I compiled the Spark examples and I see that there are a couple of jars:
> > spark-examples_2.10-0.9.0-incubating-sources.jar
> > spark-examples_2.10-0.9.0-incubating.jar
> > If I want to run an example using these jars, which one should I run, and
> > how do I run them?
> > Thanks
>
>
