On Sat, Oct 5, 2013 at 7:15 PM, Aaron Babcock aaron.babc...@gmail.com wrote:
hmm, that did not seem to do it.
Interestingly, the problem only appears with
rdd.take(1);
rdd.collect() works just fine.
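[A side note on the take/collect distinction above: take(n) runs a partial job that scans only as many partitions as needed to produce n rows, whereas collect() materializes every partition on the driver, so the two can exercise the driver's network setup differently. A minimal Python sketch of the partial-scan idea -- an illustration only, not Spark's actual code:]

```python
# Toy stand-ins for RDD.take and RDD.collect, illustrating the semantic
# difference the thread observes (NOT Spark's implementation).
def take(partitions, n):
    out = []
    for part in partitions:          # scan partitions in order
        for row in part:
            out.append(row)
            if len(out) == n:        # stop early: later partitions untouched
                return out
    return out

def collect(partitions):
    # exhaustive scan of every partition
    return [row for part in partitions for row in part]

parts = [[1, 2], [3, 4], [5]]
print(take(parts, 1))    # -> [1]
print(collect(parts))    # -> [1, 2, 3, 4, 5]
```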
On Sat, Oct 5, 2013 at 4:49 PM, Aaron Davidson ilike...@gmail.com wrote:
You might try also setting the driver host in
SPARK_JAVA_OPTS as well,
e.g.,
-Dspark.driver.host=192.168.250.47

On Sat, Oct 5, 2013 at 2:45 PM, Aaron Babcock aaron.babc...@gmail.com
wrote:
Hello,
I am using spark through a vpn. My driver machine ends up with two ip
addresses, one routable from the cluster and one not.
Things generally work when I set the SPARK_LOCAL_IP environment
variable to the proper ip address.
However, when I try to use the take function, ie: myRdd.take(1),
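[Collecting the two settings discussed in this thread into one spark-env.sh sketch -- 192.168.250.47 is the cluster-routable VPN address from Aaron Davidson's example; substitute your own:]

```shell
# spark-env.sh on the driver machine (sketch based on this thread's
# suggestions; replace the address with your cluster-routable IP)
export SPARK_LOCAL_IP=192.168.250.47                          # bind driver sockets to the routable interface
export SPARK_JAVA_OPTS="-Dspark.driver.host=192.168.250.47"   # advertise that address to the cluster
```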
Hi,
Does anyone have any experience using JMX and VisualVM instead of YourKit
to remotely profile Spark workers?
I tried the following in spark-env.sh but I get all kinds of failures when
workers spawn:
SPARK_JAVA_OPTS="-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=9000"
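[One plausible cause of those failures -- an assumption, since the thread does not show the error text: SPARK_JAVA_OPTS is applied to every JVM the worker spawns, so each executor on a host tries to bind JMX port 9000 and all but the first fail. A sketch that scopes JMX to the daemon processes only, via SPARK_DAEMON_JAVA_OPTS:]

```shell
# spark-env.sh -- sketch, assuming the spawn failures come from executors
# colliding on a fixed JMX port; this enables JMX on the worker daemon only
SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9000 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
```

With authentication and SSL off, keep the port reachable only over the VPN or an SSH tunnel before pointing VisualVM at worker-host:9000.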