Re: Mapping with extra arguments

2014-08-21 Thread TJ Klein
Thanks for the nice example.

Re: Mapping with extra arguments

2014-08-21 Thread TJ Klein
Thanks. That's pretty much what I need.

Re: Mapping with extra arguments

2014-08-20 Thread Mayur Rustagi
You can add the extra data as part of your RDD, so that the output of one map operation generates the input of your next map operation. Of course, the logic that produces that data has to be part of the map itself. Another way is a nested def, along the lines of:

    def factorial(number: Int): Int = {
      def factorialWithAccumulator(accumulator: Int, number: Int): Int = {
        if (number == 1) accumulator
        else factorialWithAccumulator(accumulator * number, number - 1)
      }
      factorialWithAccumulator(1, number)
    }
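A minimal PySpark sketch of the first suggestion, assuming a local Spark setup; scale_factor and the sample data are made-up illustrations. The extra value is emitted as part of each record, so the next map already receives it as input:

    from pyspark import SparkContext

    sc = SparkContext("local", "extra-args-via-rdd")
    rdd = sc.parallelize([1, 2, 3, 4])

    scale_factor = 10  # hypothetical extra argument produced earlier in the job

    # First map: attach the extra value to every record.
    with_extra = rdd.map(lambda x: (x, scale_factor))

    # Second map: its input already carries the extra argument.
    scaled = with_extra.map(lambda pair: pair[0] * pair[1])

    print(scaled.collect())  # [10, 20, 30, 40]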

Re: Mapping with extra arguments

2014-08-20 Thread zzl
    def foo(extra_arg):
        ....
        def bar(row):
            # your code here; extra_arg is available inside bar
        return bar

Then pass foo(extra_arg) to Spark's map function.

-- Best Regards!

On Thursday, August 21, 2014 at 2:33 PM, TJ Klein wrote:
> Hi,
>
> I am using Spark in Python. I wonder if there is a possibility for passing extra arguments to the mapping function.
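A runnable PySpark version of the closure approach described above; the names make_mapper and offset are hypothetical and only illustrate the pattern of returning a one-argument function with the extra argument baked in:

    from pyspark import SparkContext

    def make_mapper(offset):
        # Returns a function of one row; offset is captured by the closure
        # and shipped to the workers along with the function.
        def mapper(row):
            return row + offset
        return mapper

    sc = SparkContext("local", "extra-args-via-closure")
    rdd = sc.parallelize([1, 2, 3])
    print(rdd.map(make_mapper(offset=100)).collect())  # [101, 102, 103]

An alternative with the same effect is functools.partial(my_func, extra_arg), provided the extra argument comes first in my_func's signature.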