SparkContext.addFile() and FileNotFoundException

2014-04-07 Thread Thierry Herrmann
Hi, I'm trying to use SparkContext.addFile() to propagate a file to the worker nodes in a standalone cluster (2 nodes: 1 master, 1 worker connected to the master). I don't have HDFS or any distributed file system; I'm just playing with basic stuff. Here's the code in my driver (actually a spark-shell runnin…
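A minimal sketch of the pattern this question is aiming at, assuming a hypothetical lookup file under /tmp: the file is shipped with SparkContext.addFile(), and each task resolves its node-local copy by file name with SparkFiles.get() rather than reusing the driver-side path (which need not exist on the workers).

```scala
import java.io.{File, PrintWriter}
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

object AddFileDemo {
  def main(args: Array[String]): Unit = {
    // local[2] stands in for the standalone cluster in the question.
    val conf = new SparkConf().setAppName("AddFileDemo").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Hypothetical two-line lookup file created on the driver.
    val f = new File("/tmp/lookup.txt")
    new PrintWriter(f) { write("alpha\nbeta\n"); close() }

    // Distribute the file to every node that runs tasks for this job.
    sc.addFile(f.getAbsolutePath)

    val lineCounts = sc.parallelize(1 to 2).map { _ =>
      // On an executor, resolve the local copy by *name*, not driver path.
      val localPath = SparkFiles.get("lookup.txt")
      scala.io.Source.fromFile(localPath).getLines().size
    }.collect()

    // Each task should see the same 2-line file.
    println(lineCounts.mkString(","))
    sc.stop()
  }
}
```

A FileNotFoundException typically appears when a task opens the original driver-side path directly instead of going through SparkFiles.get().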

Re: How to index each map operation????

2014-04-01 Thread Thierry Herrmann
I'm new to Spark, but isn't this a pure Scala question? The following seems to work in the spark-shell:

$ spark-shell
scala> val rdd = sc.makeRDD(List(10, 20, 30))
rdd: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[8] at makeRDD at <console>:12
scala> var cnt = -1
cnt: Int = -1
scala> val rdd2 …
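One caution about the approach above: a driver-side `var cnt` incremented inside a closure only appears to work in the local shell; on a real cluster each executor mutates its own serialized copy. RDD.zipWithIndex is the cluster-safe way to get a per-element index. A minimal sketch (names here are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object IndexDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("IndexDemo").setMaster("local[2]")
    val sc = new SparkContext(conf)

    val rdd = sc.makeRDD(List(10, 20, 30))

    // zipWithIndex pairs each element with its position in the RDD,
    // computed consistently across partitions.
    val indexed = rdd.zipWithIndex.map { case (v, i) => s"$i:$v" }.collect()
    println(indexed.mkString(", "))  // 0:10, 1:20, 2:30

    sc.stop()
  }
}
```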