Thanks Miguel...will have a read.
Thanks Jacek...that looks incredibly useful.
:)
Subject: Re: Simulate serialization when running local
From: mig...@zero-x.co
Date: Sun, 14 Aug 2016 21:07:41 -0700
CC: as...@live.com; user@spark.apache.org
To: ja...@japila.pl
Hi Ashic,
Absolutely
Hi Ashic,
Yes, there is one - local-cluster[N, cores, memory] - that you can use
to simulate a Spark cluster locally with N workers, each with the given
number of cores and amount of memory (in MB).
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L2478
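A minimal sketch of how that could look (the app name and RDD contents are illustrative): because local-cluster spawns separate worker JVMs, task closures really do get serialized, so "Task not serializable" errors that local[4] hides will surface here.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SerializationCheck {
  def main(args: Array[String]): Unit = {
    // local-cluster[2, 1, 1024]: 2 workers, 1 core each, 1024 MB per worker.
    // Each worker runs in its own JVM, unlike local[4] where everything
    // shares one JVM and closures are never actually serialized.
    val conf = new SparkConf()
      .setAppName("serialization-check")
      .setMaster("local-cluster[2, 1, 1024]")
    val sc = new SparkContext(conf)

    try {
      // A non-serializable object captured by this closure would now fail
      // with org.apache.spark.SparkException: Task not serializable.
      val sum = sc.parallelize(1 to 10).map(_ * 2).sum()
      println(sum)
    } finally {
      sc.stop()
    }
  }
}
```

One caveat: local-cluster mode is used mainly by Spark's own test suite and may need a local Spark distribution to be available, so it is not always a drop-in replacement for local[4] from a bare sbt/IDE run.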
Pozdrawiam,
Jacek Laskowski
https://medium
Hi,

Is there a way to simulate "networked" Spark when running local (i.e.
master=local[4])? Ideally, some setting that'll ensure any "Task not
serializable" errors are caught during local testing? I seem to vaguely
remember something, but am having trouble pinpointing it.

Cheers,
Ashic.