It doesn't.
However, if you have a very large number of keys, a small number of which are
very large, you can do one of the following:
A. Use a custom partitioner that counts the number of items per key and
avoids putting large keys together; alternatively, if feasible (and
needed), include part
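The salting idea behind approach A can be sketched without any Spark dependency. This is only an illustration of the partitioning logic, not a real `org.apache.spark.Partitioner` implementation; `largeKeys`, `numPartitions`, and the record index are hypothetical inputs you would supply from your own key statistics.

```scala
// Sketch of approach A: spread known-large ("hot") keys across partitions by
// salting them with a per-record index, so no single partition receives all
// of a hot key's rows. In a real Spark job you would subclass
// org.apache.spark.Partitioner and pass it to partitionBy/reduceByKey.
object SaltedPartitioning {
  val numPartitions = 8
  // Hypothetical set of keys known (e.g. from a counting pass) to be large.
  val largeKeys: Set[String] = Set("hotKey")

  // Deterministic partition for a (key, recordIndex) pair: normal keys hash
  // as usual; large keys mix in the record index so their rows spread out
  // over all partitions.
  def partitionFor(key: String, recordIndex: Long): Int = {
    val base = ((key.hashCode % numPartitions) + numPartitions) % numPartitions
    if (largeKeys.contains(key))
      ((base + (recordIndex % numPartitions)) % numPartitions).toInt
    else
      base
  }

  def main(args: Array[String]): Unit = {
    val hot  = (0L until 1000L).map(i => partitionFor("hotKey", i)).toSet
    val cold = (0L until 1000L).map(i => partitionFor("coldKey", i)).toSet
    println(s"hot key spread over ${hot.size} partitions")
    println(s"cold key stays on ${cold.size} partition(s)")
  }
}
```

The cost of salting is that a downstream aggregation must combine the per-salt partial results for each large key, so it only pays off when the skew is severe.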
On Friday, August 21, 2015 1:53 PM, Virgil Palanciuc virg...@gmail.com wrote:
Hi Akhil,
I'm using Spark 1.4.1.
The number of executors is not in the command line, nor in
getExecutorMemoryStatus
(I already mentioned that I tried that; it works in spark-shell but not when
Is there any reliable way to find out the number of executors
programmatically, regardless of how the job is run? A method that
preferably works for Spark standalone, YARN, and Mesos, regardless of
whether the code runs from the shell or not?
Things that I tried that don't work:
- -archives.us.apache.org/mod_mbox/spark-user/201411.mbox/%3ccacbyxk+ya1rbbnkwjheekpnbsbh10rykuzt-laqgpdanvhm...@mail.gmail.com%3E
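For reference, `sc.getExecutorMemoryStatus` returns a `Map[String, (Long, Long)]` keyed by `"host:port"`, and the driver's own block manager appears in it, which is why a common heuristic is `size - 1`. It is also populated asynchronously as executors register, which is one plausible reason it behaves differently in spark-shell than in a freshly submitted job. The Spark-free sketch below only demonstrates the counting logic over a made-up status map; the hostnames and numbers are invented.

```scala
// Spark-free sketch of counting executors from a map shaped like the result
// of sc.getExecutorMemoryStatus: "host:port" -> (max memory, remaining
// memory). The driver has an entry too, hence the size - 1 heuristic.
// The sample data below is entirely hypothetical.
object ExecutorCount {
  def countExecutors(memoryStatus: Map[String, (Long, Long)]): Int =
    math.max(memoryStatus.size - 1, 0) // subtract the driver's own entry

  def main(args: Array[String]): Unit = {
    val mb = 1024L * 1024
    val sample = Map(
      "driver-host:41234" -> (512L * mb,  256L * mb),
      "worker-1:51234"    -> (1024L * mb, 900L * mb),
      "worker-2:51235"    -> (1024L * mb, 870L * mb)
    )
    println(s"executors: ${countExecutors(sample)}")
  }
}
```

Because registration is asynchronous, calling this immediately after creating the SparkContext may see only the driver's entry, so any code relying on it needs to either wait or tolerate an initially low count.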
On Aug 21, 2015 7:42 AM, Virgil Palanciuc vir...@palanciuc.eu wrote:
Is there any reliable way to find out the number of executors
programmatically - regardless of how
Hi,
The Spark documentation states that "if accumulators are created with a
name, they will be displayed in Spark’s UI":
http://spark.apache.org/docs/latest/programming-guide.html#accumulators
Where exactly are they shown? I may be dense, but I can't find them on the
UI from http://localhost:4040