ani, Jayesh wrote:
>>
>>> Yes, as you found, in local mode, Spark won’t serialize your objects. It
>>> will just pass the reference to the closure. This means that it is possible
>>> to write code that works in local mode, but doesn’t when you run
>>> distributed.
>>
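The local-vs-distributed difference described in the reply can be reproduced without Spark: in distributed mode a closure must pass through Java serialization, and a lambda that captures a non-serializable enclosing instance fails at exactly that step, while in local mode the in-process reference just works. A minimal sketch using plain JVM serialization (the `Holder` class here is a hypothetical stand-in; a class rather than an `object` is used so that reading the field forces the lambda to capture `this`):

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object SerializationSketch {
  // A lambda defined inside a class method that reads a field must
  // capture `this`; if the class is not Serializable, serializing the
  // lambda fails -- roughly what happens when Spark ships a closure
  // to executors in distributed mode.
  class Holder {                             // not Serializable
    val num = 5
    def makeClosure: Int => Int = _ + num    // captures `this` to read `num`
  }

  // True if `f` survives plain Java serialization.
  def serializable(f: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(f)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val closure = new Holder().makeClosure
    println(closure(1))             // calling it in-process works: prints 6
    println(serializable(closure))  // serialization fails: prints false
  }
}
```

In an actual distributed Spark job this same failure typically surfaces as a "Task not serializable" `SparkException` when the driver cleans and serializes the closure.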
>> *From: *Sheel Pancholi
>> *Date: *Friday, February 26, 2021 at 4:24 AM
>> *To: *user
>> *Subject: *[EXTERNAL] Spark closures behavior in local mode in IDEs
Hi,

I am observing weird behavior of Spark and closures in local mode on my machine vs. a 3-node cluster (Spark 2.4.5).
Following is the piece of code:

object Example {
  val num = 5
  def myfunc = {
    sc.parallelize(1 to 4).map(_ + num).foreach(println)
  }
}
I expected this to fail regardless since
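The usual way to make code like the snippet above behave the same in local and distributed mode is to copy the referenced field into a local val before building the closure, so the closure captures only that value rather than the enclosing instance. A sketch of that pattern using plain JVM serialization instead of Spark (the `LocalCopySketch` and `Holder` names are hypothetical; a class stands in for the enclosing scope):

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object LocalCopySketch {
  // Hypothetical stand-in for the scope enclosing a Spark closure.
  class Holder {                         // not Serializable
    val num = 5
    def capturing: Int => Int = _ + num  // captures `this` to read `num`
    def localCopy: Int => Int = {
      val n = num                        // copy the field into a local val
      _ + n                              // captures only the Int value
    }
  }

  // True if `f` survives plain Java serialization -- roughly the test
  // a closure must pass before Spark can ship it to executors.
  def serializable(f: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(f)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val h = new Holder
    println(serializable(h.capturing))  // false: drags in the Holder instance
    println(serializable(h.localCopy))  // true: captures just an Int
  }
}
```

In Spark code the same idea is `val localNum = num` inside the method, then `map(_ + localNum)`, which keeps the enclosing object out of the shipped closure.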